US Politics Mega-thread - Page 1694

Now that we have a new thread, in order to ensure that this thread continues to meet TL standards and follows the proper guidelines, we will be enforcing the rules in the OP more strictly. Be sure to give them a complete and thorough read before posting!

NOTE: When providing a source, please provide a very brief summary of what it's about and what purpose it adds to the discussion. The supporting statement should clearly explain why the subject is relevant and needs to be discussed. Please follow this rule especially for tweets.

Your supporting statement should always come BEFORE you provide the source.


If you have any questions, comments, concerns, or feedback regarding the USPMT, then please use this thread: http://www.teamliquid.net/forum/website-feedback/510156-us-politics-thread
Slydie
Profile Joined August 2013
1929 Posts
July 20 2019 11:28 GMT
#33861
On July 20 2019 19:53 Uldridge wrote:
I think capitalism will eat itself once the automation / AI revolution is in full swing. Either that or we'll have a harsh dystopian outcome. The reaction of the demographic at risk during that time needs to be the correct one... because only they will have the power to change their fate.


We'll be OK; we have a lot of experience in inventing new jobs when old ones disappear or are moved to low-cost countries.

Capitalism isn't going anywhere. You can try to limit how much power the richest have, but I fail to see how strict regulations can fully replace market mechanisms. The market will always find a way, as all the failed attempts have shown so far.

Another unresolvable problem is the potential corruption of a socialist system and the fact that it is essentially anti-democratic. You can say capitalism exploits human weaknesses, but strict socialism fails to take them into account and is doomed to fail.

(I am not talking about social democratic solutions which are firmly within regulated capitalism.)
Buff the siegetank
Nebuchad
Profile Blog Joined December 2012
Switzerland12379 Posts
July 20 2019 11:44 GMT
#33862
On July 20 2019 14:37 CosmicSpiral wrote:
While historical accounts give the impression neoliberalism was a natural progression from capitalism and liberalism, it represented a marked break from both traditions. It was the brainchild of a few public intellectuals who aggressively promoted it through think tanks and adviser positions; its key representatives viewed liberalism and democracy as antinomies, although they didn't proclaim so outside of the MPS; the epistemological justification for ceding judgment to the market would have been utterly alien to Smith or any Enlightenment-influenced philosopher.

It's late over here. I'll fully elaborate on each post tomorrow.


I understand that it's a break from tradition, it's just one that I find utterly logical, and one that I would expect to happen given the principles of capitalism. It's on a really basic level; we give the main share of power to a group of people, and after a while, theories of economics that favor this group of people even more start appearing and gaining prominence (as they are backed by some of the people who have the most influence and the most power). That makes a lot of intuitive sense to me regardless of what Smith thought.

You can find the ancestor of trickle-down economics before the 1929 crisis. It wasn't full neoliberalism, as there was no aspect of globalization, but that makes sense as well given how nations operated then and how they operate now.

Also I would definitely agree that there is a tension between democracy and liberalism.
No will to live, no wish to die
Gahlo
Profile Joined February 2010
United States35165 Posts
July 20 2019 11:59 GMT
#33863
On July 20 2019 20:28 Slydie wrote:
We'll be ok, we have a lot of experience in inventing new jobs when old ones disappear or are moved to lowcost countries.

Capitalism goes nowhere. You can try to limit how much power the richest have but I fail to see how strict regulations can fully replace market mechanisms. The market will always find a way, as all the failed attempts have shown so far.

Another unresolveable problem is potential corruption of the socialist system and that it is essentially anti democratic. You can say capitalism can exploit human weaknesses but strict socialism fails to take them into account and are doomed to fail.

(I am not talking about social democratic solutions which are firmly within regulated capitalism.)

The market has already failed many, many people immensely.
JimmiC
Profile Blog Joined May 2011
Canada22817 Posts
Last Edited: 2019-07-20 14:14:22
July 20 2019 13:28 GMT
#33864
--- Nuked ---
Gorsameth
Profile Joined April 2010
Netherlands22029 Posts
July 20 2019 14:15 GMT
#33865
On July 20 2019 22:28 JimmiC wrote:
So have attempts at communism. And saying the people who have tried are the problem is lazy thinking. There are cultural and human issues at play that have consistently caused it to fail the very people it is meant to help. It sounds great to say you are going to change the incentives of society so people strive for different things. But this is a super hard thing to do even on a super small scale; how do you do it on a super large scale? What do you do with the people who don't want the change? Or can't change?
The problem is human nature, so it's not something we are going to fix long-term until we get rid of the problem.
I for one look forward to our AI overlord.
It ignores such insignificant forces as time, entropy, and death
GreenHorizons
Profile Blog Joined April 2011
United States23552 Posts
July 20 2019 14:19 GMT
#33866
On July 20 2019 20:59 Gahlo wrote:
The market has already failed many, many people immensely.

There's an ongoing issue of people conflating capitalism and markets as if they were one and the same, and they aren't. You can have markets without capitalism, and I think some people don't make that distinction in their understandings/arguments.
"People like to look at history and think 'If that was me back then, I would have...' We're living through history, and the truth is, whatever you are doing now is probably what you would have done then" "Scratch a Liberal..."
JimmiC
Profile Blog Joined May 2011
Canada22817 Posts
July 20 2019 14:23 GMT
#33867
--- Nuked ---
GreenHorizons
Profile Blog Joined April 2011
United States23552 Posts
Last Edited: 2019-07-20 14:28:46
July 20 2019 14:27 GMT
#33868
On July 20 2019 23:15 Gorsameth wrote:
The problem is human nature so its not something we are going to fix long term until we get rid of the problem.
I for one look forward to our AI overlord.


When we do research looking for the traits people claim to be "human nature" we discover time and again it's mostly learned behavior that can be changed by changing the circumstances/learning.

A new set of studies provides compelling data allowing us to analyze human nature not through a philosopher’s kaleidoscope or a TV producer’s camera, but through the clear lens of science. These studies were carried out by a diverse group of researchers from Harvard and Yale—a developmental psychologist with a background in evolutionary game theory, a moral philosopher-turned-psychologist, and a biologist-cum-mathematician—interested in the same essential question: whether our automatic impulse—our first instinct—is to act selfishly or cooperatively.

This focus on first instincts stems from the dual process framework of decision-making, which explains decisions (and behavior) in terms of two mechanisms: intuition and reflection. Intuition is often automatic and effortless, leading to actions that occur without insight into the reasons behind them. Reflection, on the other hand, is all about conscious thought—identifying possible behaviors, weighing the costs and benefits of likely outcomes, and rationally deciding on a course of action. With this dual process framework in mind, we can boil the complexities of basic human nature down to a simple question: which behavior—selfishness or cooperation—is intuitive, and which is the product of rational reflection? In other words, do we cooperate when we overcome our intuitive selfishness with rational self-control, or do we act selfishly when we override our intuitive cooperative impulses with rational self-interest?

To answer this question, the researchers first took advantage of a reliable difference between intuition and reflection: intuitive processes operate quickly, whereas reflective processes operate relatively slowly. Whichever behavioral tendency—selfishness or cooperation—predominates when people act quickly is likely to be the intuitive response; it is the response most likely to be aligned with basic human nature.

The experimenters first examined potential links between processing speed, selfishness, and cooperation by using 2 experimental paradigms (the “prisoner’s dilemma” and a “public goods game”), 5 studies, and a total of 834 participants gathered from both undergraduate campuses and a nationwide sample.
Each paradigm consisted of group-based financial decision-making tasks and required participants to choose between acting selfishly—opting to maximize individual benefits at the cost of the group—or cooperatively—opting to maximize group benefits at the cost of the individual. The results were striking: in every single study, faster—that is, more intuitive—decisions were associated with higher levels of cooperation, whereas slower—that is, more reflective—decisions were associated with higher levels of selfishness. These results suggest that our first impulse is to cooperate—that Augustine and Hobbes were wrong, and that we are fundamentally “good” creatures after all.


www.scientificamerican.com
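For readers unfamiliar with the paradigm, here is a minimal illustrative sketch in Python of the public goods game payoff structure the excerpt describes; the endowment, multiplier, and group size below are assumed for illustration and are not taken from the study.

# Minimal sketch of one public goods game round (parameters assumed for
# illustration, not taken from the study): each player keeps whatever they
# do not contribute; the common pool is multiplied and split evenly, so
# contributing helps the group but costs the individual.
def public_goods_payoffs(contributions, endowment=10.0, multiplier=2.0):
    """Return each player's payoff for one round."""
    pool = sum(contributions) * multiplier
    share = pool / len(contributions)
    return [endowment - c + share for c in contributions]

# Full cooperation beats universal defection for everyone...
print(public_goods_payoffs([10, 10, 10, 10]))  # -> [20.0, 20.0, 20.0, 20.0]
# ...but a lone free-rider does even better, at the cooperators' expense.
print(public_goods_payoffs([0, 10, 10, 10]))   # -> [25.0, 15.0, 15.0, 15.0]

The dilemma the researchers exploit is visible in the numbers: cooperating maximizes the group total, while defecting maximizes the individual payoff, so which impulse wins under time pressure is an empirical question.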
"People like to look at history and think 'If that was me back then, I would have...' We're living through history, and the truth is, whatever you are doing now is probably what you would have done then" "Scratch a Liberal..."
Mohdoo
Profile Joined August 2007
United States15725 Posts
July 20 2019 14:30 GMT
#33869
Seizing a UK tanker feels like a profoundly bad idea for Iran. If Europe gets on board with "Fuck Iran", it is lights out. It would be a huge victory for Trump if Europe were to get legitimately pissed at Iran.
Gorsameth
Profile Joined April 2010
Netherlands22029 Posts
July 20 2019 14:34 GMT
#33870
On July 20 2019 23:27 GreenHorizons wrote:
When we do research looking for the traits people claim to be "human nature" we discover time and again it's mostly learned behavior that can be changed by changing the circumstances/learning.
What do you think is more likely?
All of humanity stops being selfish, or we develop an AI advanced enough to govern for us.
It ignores such insignificant forces as time, entropy, and death
Nebuchad
Profile Blog Joined December 2012
Switzerland12379 Posts
July 20 2019 14:38 GMT
#33871
On July 20 2019 23:34 Gorsameth wrote:
What do you think is more likely?
All of humanity stops being selfish, or we develop an AI advanced enough to govern for us.


I don't understand why you think we need to remove selfishness
No will to live, no wish to die
Gorsameth
Profile Joined April 2010
Netherlands22029 Posts
July 20 2019 14:40 GMT
#33872
On July 20 2019 23:38 Nebuchad wrote:
I don't understand why you think we need to remove selfishness
Because it's only a matter of time until those who want to abuse a system get into a position where they can abuse that system for themselves at the cost of others.
It ignores such insignificant forces as time, entropy, and death
JimmiC
Profile Blog Joined May 2011
Canada22817 Posts
July 20 2019 14:42 GMT
#33873
--- Nuked ---
Nebuchad
Profile Blog Joined December 2012
Switzerland12379 Posts
July 20 2019 14:45 GMT
#33874
On July 20 2019 23:40 Gorsameth wrote:
Because its only a matter of time until those who want to abuse a system get into a position where they can abuse that system for themselves at the cost of others.


But... that's mostly okay. And the few ways that aren't okay can simply be illegal.
No will to live, no wish to die
GreenHorizons
Profile Blog Joined April 2011
United States23552 Posts
Last Edited: 2019-07-20 14:46:50
July 20 2019 14:45 GMT
#33875
On July 20 2019 23:34 Gorsameth wrote:
What do you think is more likely?
All of humanity stops being selfish, or we develop an AI advanced enough to govern for us.


Depends on how literally you mean that?

"all of humanity" isn't going to do anything except maybe go extinct, A selfish/corrupt society doesn't develop an AI that isn't also "selfish" and corrupt.

Banking on AI seems completely irrational from every angle to me, whereas "all of humanity stops being selfish" isn't happening under any system and capitalism has failed miserably to mitigate that selfishness as we have a handful of people with the majority of the worlds resources leading us straight into catastrophe that threatens the species (granted we're stubborn survivors) so they can be a bit wealthier tomorrow than they were today. So the objection "but there's still selfishness" doesn't make sense to me as damning.
"People like to look at history and think 'If that was me back then, I would have...' We're living through history, and the truth is, whatever you are doing now is probably what you would have done then" "Scratch a Liberal..."
Gorsameth
Profile Joined April 2010
Netherlands22029 Posts
July 20 2019 14:48 GMT
#33876
On July 20 2019 23:45 Nebuchad wrote:
But... that's mostly okay. And the few ways that aren't okay can simply be illegal.
But that is pretty much where we are now.
And apparently things are not ok.
It ignores such insignificant forces as time, entropy, and death
Nebuchad
Profile Blog Joined December 2012
Switzerland12379 Posts
July 20 2019 14:56 GMT
#33877
On July 20 2019 23:48 Gorsameth wrote:
But that is pretty much where we are now.
And apparently things are not ok.


I perceive your objection to be similar to saying that in order to move from an authoritarian system of government to a democratic system of government (representative democracy), we need to get rid of power hunger in human nature. Otherwise people who are power hungry will try to game the representative system.

Like... yeah, it's true, they will, and they have. But we can still establish a representative system without eliminating power hunger, and that's still an improvement over an authoritarian system.
No will to live, no wish to die
GreenHorizons
Profile Blog Joined April 2011
United States23552 Posts
July 20 2019 14:59 GMT
#33878
On July 20 2019 23:56 Nebuchad wrote:
I perceive your objection to be similar to saying that in order to move from an authoritarian system of government to a democratic system of government (representative democracy), we need to get rid of power hunger in human nature. Otherwise people who are power hungry will try to game the representative system.

Like... yeah, it's true, they will, and they have. But we can still establish a representative system without eliminating power hunger, and that's still an improvement over an authoritarian system.


His country does still have a king?
"People like to look at history and think 'If that was me back then, I would have...' We're living through history, and the truth is, whatever you are doing now is probably what you would have done then" "Scratch a Liberal..."
Gorsameth
Profile Joined April 2010
Netherlands22029 Posts
July 20 2019 15:01 GMT
#33879
On July 20 2019 23:45 GreenHorizons wrote:
Depends on how literally you mean that?

"all of humanity" isn't going to do anything except maybe go extinct, A selfish/corrupt society doesn't develop an AI that isn't also "selfish" and corrupt.

Banking on AI seems completely irrational from every angle to me, whereas "all of humanity stops being selfish" isn't happening under any system and capitalism has failed miserably to mitigate that selfishness as we have a handful of people with the majority of the worlds resources leading us straight into catastrophe that threatens the species (granted we're stubborn survivors) so they can be a bit wealthier tomorrow than they were today. So the objection "but there's still selfishness" doesn't make sense to me as damning.
My point is that the system you come up with after your revolution is going to suffer from the same problem, that of selfish people exploiting it for themselves at the cost of everyone else.
Because it's going to be a system run by people, and it's only a matter of time until those people are selfish and greedy ones (and that time is likely going to be immediately).

Capitalism has a lot of big flaws and I would love to change it into something better, but what reason do I have to stand with you on the barricades to bring it toppling down when the replacement is going to be the same or worse?
If society is going to roll the dice, I would want more assurances than a shrug that it's going to do something.

And in my opinion the answer to that is Artificial Intelligence: less chance of greed, selfishness, and corruption (if done properly and with an advanced enough AI, which we don't have yet).
It ignores such insignificant forces as time, entropy, and death
Gorsameth
Profile Joined April 2010
Netherlands22029 Posts
July 20 2019 15:06 GMT
#33880
On July 20 2019 23:56 Nebuchad wrote:
I perceive your objection to be similar to saying that in order to move from an authoritarian system of government to a democratic system of government (representative democracy), we need to get rid of power hunger in human nature. Otherwise people who are power hungry will try to game the representative system.

Like... yeah, it's true, they will, and they have. But we can still establish a representative system without eliminating power hunger, and that's still an improvement over an authoritarian system.
Sure, but it's a risk every time, and often in the past the situation for people was bad enough that they were willing to take that risk because it couldn't get worse.
I don't feel like that is the case now. Things can get a lot worse, and the last few decades have plenty of examples of it, and for most people it's simply not worth taking that risk at the moment.
It ignores such insignificant forces as time, entropy, and death