US Politics Mega-thread - Page 1694

Now that we have a new thread, in order to ensure that this thread continues to meet TL standards and follows the proper guidelines, we will be enforcing the rules in the OP more strictly. Be sure to give them a complete and thorough read before posting!

NOTE: When providing a source, please provide a very brief summary of what it's about and what purpose it adds to the discussion. The supporting statement should clearly explain why the subject is relevant and needs to be discussed. Please follow this rule especially for tweets.

Your supporting statement should always come BEFORE you provide the source.


If you have any questions, comments, concerns, or feedback regarding the USPMT, then please use this thread: http://www.teamliquid.net/forum/website-feedback/510156-us-politics-thread
Slydie
Profile Joined August 2013
1915 Posts
July 20 2019 11:28 GMT
#33861
On July 20 2019 19:53 Uldridge wrote:
I think capitalism will eat itself once the automation / AI revolution is in full swing. Either that or we'll have a harsh dystopian outcome. The reaction of the demographic at risk during that time needs to be the correct one... because only they will have the power to change their fate.


We'll be OK; we have a lot of experience in inventing new jobs when old ones disappear or are moved to low-cost countries.

Capitalism isn't going anywhere. You can try to limit how much power the richest have, but I fail to see how strict regulations can fully replace market mechanisms. The market will always find a way, as all the failed attempts have shown so far.

Another unresolvable problem is the potential corruption of a socialist system, and that it is essentially anti-democratic. You can say capitalism can exploit human weaknesses, but strict socialism fails to take them into account and is doomed to fail.

(I am not talking about social democratic solutions which are firmly within regulated capitalism.)
Buff the siegetank
Nebuchad
Profile Blog Joined December 2012
Switzerland12172 Posts
July 20 2019 11:44 GMT
#33862
On July 20 2019 14:37 CosmicSpiral wrote:
While historical accounts give the impression neoliberalism was a natural progression from capitalism and liberalism, it represented a marked break from both traditions. It was the brainchild of a few public intellectuals who aggressively promoted it through think tanks and adviser positions; its key representatives viewed liberalism and democracy as antinomies, although they didn't proclaim so outside of the MPS; the epistemological justification for ceding judgment to the market would have been utterly alien to Smith or any Enlightenment-influenced philosopher.

It's late over here. I'll fully elaborate on each post tomorrow.


I understand that it's a break from tradition; it's just one that I find utterly logical, and one that I would expect to happen given the principles of capitalism. It's on a really basic level: we give the main share of power to a group of people, and after a while, theories of economics that favor this group of people even more start appearing and gaining prominence (as they are backed by some of the people who have the most influence and the most power). That makes a lot of intuitive sense to me regardless of what Smith thought.

You can find the ancestor of trickle-down economics before the 1929 crisis. It wasn't full neoliberalism, as there was no aspect of globalization, but that makes sense as well given how nations operated then and how they operate now.

Also I would definitely agree that there is a tension between democracy and liberalism.
No will to live, no wish to die
Gahlo
Profile Joined February 2010
United States35143 Posts
July 20 2019 11:59 GMT
#33863
On July 20 2019 20:28 Slydie wrote:
Show nested quote +

The market has already failed many, many people immensely.
JimmiC
Profile Blog Joined May 2011
Canada22817 Posts
Last Edited: 2019-07-20 14:14:22
July 20 2019 13:28 GMT
#33864
--- Nuked ---
Gorsameth
Profile Joined April 2010
Netherlands21667 Posts
July 20 2019 14:15 GMT
#33865
On July 20 2019 22:28 JimmiC wrote:
Show nested quote +

So have attempts at communism. And saying that the people who have tried it are the problem is lazy thinking. There are cultural human issues at play that have consistently caused it to fail the very people it is meant to help. It sounds great to say you are going to change the incentives of society so people strive for different things. This is a super hard thing to do even on a super small scale; how do you do it on a super large scale? What do you do with the people who don't want the change? Or can't change?
The problem is human nature, so it's not something we are going to fix long-term until we get rid of the problem.
I for one look forward to our AI overlord.
It ignores such insignificant forces as time, entropy, and death
GreenHorizons
Profile Blog Joined April 2011
United States23221 Posts
July 20 2019 14:19 GMT
#33866
On July 20 2019 20:59 Gahlo wrote:
Show nested quote +

The market has already failed many, many people immensely.

There's an ongoing issue of people conflating capitalism and markets to be one and the same, and they aren't. You can have markets without capitalism, and I think some people don't make that distinction in their understanding or arguments.
"People like to look at history and think 'If that was me back then, I would have...' We're living through history, and the truth is, whatever you are doing now is probably what you would have done then" "Scratch a Liberal..."
JimmiC
Profile Blog Joined May 2011
Canada22817 Posts
July 20 2019 14:23 GMT
#33867
--- Nuked ---
GreenHorizons
Profile Blog Joined April 2011
United States23221 Posts
Last Edited: 2019-07-20 14:28:46
July 20 2019 14:27 GMT
#33868
On July 20 2019 23:15 Gorsameth wrote:
The problem is human nature, so it's not something we are going to fix long-term until we get rid of the problem.
I for one look forward to our AI overlord.


When we do research looking for the traits people claim to be "human nature," we discover time and again that it's mostly learned behavior that can be changed by changing the circumstances/learning.

A new set of studies provides compelling data allowing us to analyze human nature not through a philosopher’s kaleidoscope or a TV producer’s camera, but through the clear lens of science. These studies were carried out by a diverse group of researchers from Harvard and Yale—a developmental psychologist with a background in evolutionary game theory, a moral philosopher-turned-psychologist, and a biologist-cum-mathematician—interested in the same essential question: whether our automatic impulse—our first instinct—is to act selfishly or cooperatively.

+ Show Spoiler +
This focus on first instincts stems from the dual process framework of decision-making, which explains decisions (and behavior) in terms of two mechanisms: intuition and reflection. Intuition is often automatic and effortless, leading to actions that occur without insight into the reasons behind them. Reflection, on the other hand, is all about conscious thought—identifying possible behaviors, weighing the costs and benefits of likely outcomes, and rationally deciding on a course of action. With this dual process framework in mind, we can boil the complexities of basic human nature down to a simple question: which behavior—selfishness or cooperation—is intuitive, and which is the product of rational reflection? In other words, do we cooperate when we overcome our intuitive selfishness with rational self-control, or do we act selfishly when we override our intuitive cooperative impulses with rational self-interest?

To answer this question, the researchers first took advantage of a reliable difference between intuition and reflection: intuitive processes operate quickly, whereas reflective processes operate relatively slowly. Whichever behavioral tendency—selfishness or cooperation—predominates when people act quickly is likely to be the intuitive response; it is the response most likely to be aligned with basic human nature.

The experimenters first examined potential links between processing speed, selfishness, and cooperation by using 2 experimental paradigms (the “prisoner’s dilemma” and a “public goods game”), 5 studies, and a total of 834 participants gathered from both undergraduate campuses and a nationwide sample.
Each paradigm consisted of group-based financial decision-making tasks and required participants to choose between acting selfishly—opting to maximize individual benefits at the cost of the group—or cooperatively—opting to maximize group benefits at the cost of the individual. The results were striking: in every single study, faster—that is, more intuitive—decisions were associated with higher levels of cooperation, whereas slower—that is, more reflective—decisions were associated with higher levels of selfishness. These results suggest that our first impulse is to cooperate—that Augustine and Hobbes were wrong, and that we are fundamentally “good” creatures after all.


www.scientificamerican.com
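For readers unfamiliar with the paradigms named in the excerpt, here is a minimal, illustrative Python sketch of a public goods game's payoff structure; the group size, endowment, and multiplier are assumptions chosen for illustration, not the parameters used in the cited studies.

```python
# Illustrative public goods game payoffs (assumed parameters, not the studies' own).
# Each player keeps whatever they don't contribute and receives an equal share
# of the contributed pool after it is multiplied.

def public_goods_payoffs(contributions, endowment=10.0, multiplier=2.0):
    n = len(contributions)
    pool = sum(contributions) * multiplier
    share = pool / n
    return [endowment - c + share for c in contributions]

# One defector among three full cooperators: the defector earns the most individually.
print(public_goods_payoffs([0.0, 10.0, 10.0, 10.0]))  # [25.0, 15.0, 15.0, 15.0]

# But a fully cooperative group beats a fully selfish one per head,
# because 1 < multiplier < group size.
print(public_goods_payoffs([10.0] * 4))  # [20.0, 20.0, 20.0, 20.0]
print(public_goods_payoffs([0.0] * 4))   # [10.0, 10.0, 10.0, 10.0]
```

This is the tension the excerpt describes: contributing nothing maximizes individual payoff at the group's expense, while contributing everything maximizes the group's payoff at individual cost.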
"People like to look at history and think 'If that was me back then, I would have...' We're living through history, and the truth is, whatever you are doing now is probably what you would have done then" "Scratch a Liberal..."
Mohdoo
Profile Joined August 2007
United States15686 Posts
July 20 2019 14:30 GMT
#33869
Seizing a UK tanker feels like a profoundly bad idea for Iran. If Europe gets on board with "Fuck Iran", it is lights out. It would be a huge victory for Trump if Europe were to get legitimately pissed at Iran.
Gorsameth
Profile Joined April 2010
Netherlands21667 Posts
July 20 2019 14:34 GMT
#33870
On July 20 2019 23:27 GreenHorizons wrote:
Show nested quote +

When we do research looking for the traits people claim to be "human nature," we discover time and again that it's mostly learned behavior that can be changed by changing the circumstances/learning.

Show nested quote +
What do you think is more likely?
All of humanity stops being selfish, or we develop an AI advanced enough to govern for us.
It ignores such insignificant forces as time, entropy, and death
Nebuchad
Profile Blog Joined December 2012
Switzerland12172 Posts
July 20 2019 14:38 GMT
#33871
On July 20 2019 23:34 Gorsameth wrote:
Show nested quote +
What do you think is more likely?
All of humanity stops being selfish, or we develop an AI advanced enough to govern for us.


I don't understand why you think we need to remove selfishness
No will to live, no wish to die
Gorsameth
Profile Joined April 2010
Netherlands21667 Posts
July 20 2019 14:40 GMT
#33872
On July 20 2019 23:38 Nebuchad wrote:
Show nested quote +
I don't understand why you think we need to remove selfishness
Because it's only a matter of time until those who want to abuse a system get into a position where they can abuse that system for themselves at the cost of others.
It ignores such insignificant forces as time, entropy, and death
JimmiC
Profile Blog Joined May 2011
Canada22817 Posts
July 20 2019 14:42 GMT
#33873
--- Nuked ---
Nebuchad
Profile Blog Joined December 2012
Switzerland12172 Posts
July 20 2019 14:45 GMT
#33874
On July 20 2019 23:40 Gorsameth wrote:
Show nested quote +
Because it's only a matter of time until those who want to abuse a system get into a position where they can abuse that system for themselves at the cost of others.


But... that's mostly okay. And the few ways that aren't okay can simply be illegal.
No will to live, no wish to die
GreenHorizons
Profile Blog Joined April 2011
United States23221 Posts
Last Edited: 2019-07-20 14:46:50
July 20 2019 14:45 GMT
#33875
On July 20 2019 23:34 Gorsameth wrote:
Show nested quote +
What do you think is more likely?
All of humanity stops being selfish, or we develop an AI advanced enough to govern for us.


Depends on how literally you mean that?

"All of humanity" isn't going to do anything except maybe go extinct. A selfish/corrupt society doesn't develop an AI that isn't also "selfish" and corrupt.

Banking on AI seems completely irrational from every angle to me, whereas "all of humanity stops being selfish" isn't happening under any system, and capitalism has failed miserably to mitigate that selfishness: a handful of people hold the majority of the world's resources and are leading us straight into a catastrophe that threatens the species (granted, we're stubborn survivors) so they can be a bit wealthier tomorrow than they were today. So the objection "but there's still selfishness" doesn't strike me as damning.
"People like to look at history and think 'If that was me back then, I would have...' We're living through history, and the truth is, whatever you are doing now is probably what you would have done then" "Scratch a Liberal..."
Gorsameth
Profile Joined April 2010
Netherlands21667 Posts
July 20 2019 14:48 GMT
#33876
On July 20 2019 23:45 Nebuchad wrote:
Show nested quote +
But... that's mostly okay. And the few ways that aren't okay can simply be illegal.
But that is pretty much where we are now.
And apparently things are not ok.
It ignores such insignificant forces as time, entropy, and death
Nebuchad
Profile Blog Joined December 2012
Switzerland12172 Posts
July 20 2019 14:56 GMT
#33877
On July 20 2019 23:48 Gorsameth wrote:
Show nested quote +
But that is pretty much where we are now.
And apparently things are not ok.


I perceive your objection to be similar to saying that in order to move from an authoritarian system of government to a democratic system of government (representative democracy), we need to get rid of power hunger in human nature. Otherwise people who are power hungry will try to game the representative system.

Like... yeah, it's true, they will, and they have. But we can still establish a representative system without eliminating power hunger, and that's still an improvement over an authoritarian system.
No will to live, no wish to die
GreenHorizons
Profile Blog Joined April 2011
United States23221 Posts
July 20 2019 14:59 GMT
#33878
On July 20 2019 23:56 Nebuchad wrote:
Show nested quote +
I perceive your objection to be similar to saying that in order to move from an authoritarian system of government to a democratic system of government (representative democracy), we need to get rid of power hunger in human nature. Otherwise people who are power hungry will try to game the representative system.

Like... yeah, it's true, they will, and they have. But we can still establish a representative system without eliminating power hunger, and that's still an improvement over an authoritarian system.


His country does still have a king?
"People like to look at history and think 'If that was me back then, I would have...' We're living through history, and the truth is, whatever you are doing now is probably what you would have done then" "Scratch a Liberal..."
Gorsameth
Profile Joined April 2010
Netherlands21667 Posts
July 20 2019 15:01 GMT
#33879
On July 20 2019 23:45 GreenHorizons wrote:
Show nested quote +
Depends on how literally you mean that?

"All of humanity" isn't going to do anything except maybe go extinct. A selfish/corrupt society doesn't develop an AI that isn't also "selfish" and corrupt.

Banking on AI seems completely irrational from every angle to me, whereas "all of humanity stops being selfish" isn't happening under any system, and capitalism has failed miserably to mitigate that selfishness: a handful of people hold the majority of the world's resources and are leading us straight into a catastrophe that threatens the species (granted, we're stubborn survivors) so they can be a bit wealthier tomorrow than they were today. So the objection "but there's still selfishness" doesn't strike me as damning.
My point is that the system you come up with after your revolution is going to suffer from the same problem, that of selfish people exploiting it for themselves at the cost of everyone else.
Because it's going to be a system run by people, and it's only a matter of time until those people are selfish and greedy ones (and that time is likely going to be immediately).

Capitalism has a lot of big flaws, and I would love to change it into something better, but what reason do I have to stand with you on the barricades to bring it toppling down when the replacement is going to be the same or worse?
If society is going to roll the dice, I would want more assurance than a shrug that it's going to do something.

And in my opinion the answer to that is artificial intelligence: less chance of greed, selfishness, and corruption (if done properly and with an advanced enough AI, which we don't have yet).
It ignores such insignificant forces as time, entropy, and death
Gorsameth
Profile Joined April 2010
Netherlands21667 Posts
July 20 2019 15:06 GMT
#33880
On July 20 2019 23:56 Nebuchad wrote:
Show nested quote +

I perceive your objection to be similar to saying that in order to move from an authoritarian system of government to a democratic system of government (representative democracy), we need to get rid of power hunger in human nature. Otherwise people who are power hungry will try to game the representative system.

Like... yeah, it's true, they will, and they have. But we can still establish a representative system without eliminating power hunger, and that's still an improvement over an authoritarian system.
Sure, but it's a risk every time, and often in the past the situation for people was bad enough that they were willing to take that risk because it couldn't get worse.
I don't feel like that is the case now. Things can get a lot worse, the last few decades offer plenty of examples of it, and for most people it's simply not worth taking that risk at the moment.
It ignores such insignificant forces as time, entropy, and death