IBM Watson Computer Beats Jeopardy Pros - Page 10

Forum Index > General Forum
Nizaris
Profile Joined May 2010
Belgium, 2230 Posts
Last Edited: 2011-02-16 12:11:38
February 16 2011 11:35 GMT
#181
On February 16 2011 19:57 BlackJack wrote:
Show nested quote +
On February 16 2011 19:39 igotmyown wrote:
For matches against the machine, if a contestant buzzes in within some small time period (like .25 seconds) and the computer does as well, someone should be randomly selected to answer.


Yeah, these matches are kind of stupid. They didn't prove they could build a computer that can beat a human at Jeopardy; all they've proven is that they've built a robot that can click a buzzer faster than a human.

The competitive aspect is almost irrelevant. It can understand a sentence in 2-3 seconds; that's all that matters.

The "wait for the host to finish reading the question" rule is dumb, however. They should let people buzz whenever they feel like it.
Believer
Profile Joined March 2010
Sweden, 212 Posts
February 16 2011 11:42 GMT
#182
+ Show Spoiler +

On January 20 2011 14:52 SpoR wrote:
In preliminary testing rounds.


http://www.tgdaily.com/sustainability-brief/53584-ibms-watson-computer-beats-human-players-in-jeopardy

Show nested quote +
After years of planning, IBM's learning, human-aware computer Watson was put to a competition like no other - a match of Jeopardy against quiz show heavyweights Ken Jennings and Brad Rutter. The result - Watson won. Barely.

The match, which Watson has been training for since 2009, was officially announced last year. At the end of last week, the multi-episode feature where Watson faces off against Jennings and Rutter was filmed.

But right before that, all three competed in a trial run at IBM's headquarters in New York State. The trial lasted as long as a normal game of Jeopardy would before its first commercial break - in other words, about enough time for the contestants to get through half of a round.

Right before the last clue of the round, Jennings and Watson were tied at $3,400. However, Watson chimed in to answer the final question and correctly identified the children's book Harold and the Purple Crayon. That set him ahead to $4,400. Rutter trailed at $1,200.

The full-length Jeopardy matches have been filmed, but no one is allowed to discuss the results. They'll be aired on TV next month, and at that time we'll really know who wins in the battle of man versus machine.

Should be interesting. I'm sure everyone is aware of the Deep Blue Project (also by IBM) which beat chess pro Kasparov decades ago.




Show nested quote +
IBM and the producers of quiz show Jeopardy announced Tuesday that an IBM computer known as "Watson" will compete against two of the show's most successful contestants in February 2011.

Watson, named after IBM founder Thomas J. Watson, will go up against Ken Jennings and Brad Rutter (left) on February 14, 15, and 16 in two matches over three days. Jennings won 74 games in a row during the 2004-2005 season, taking home more than $2.5 million. Rutter is Jeopardy's highest-earning player, winning more than $3.25 million during several appearances in 2002 and 2005.

The grand prize for the Watson-Jennings-Rutter matchup will be $1 million, with second place winnings of $300,000 and a $200,000 third prize. Jennings and Rutter will donate 50 percent of their winnings to charity, while IBM will donate 100 percent of Watson's cash.

Getting Watson to the Jeopardy stage has taken several years. Many clues in Jeopardy rely on subtle word play, irony, and riddles, something at which humans excel but that computers have difficulty understanding. Essentially, IBM had to figure out how to get Watson to think.

"After four years, our scientific team believes that Watson is ready for this challenge based on its ability to rapidly comprehend what the Jeopardy clue is asking, analyze the information it has access to, come up with precise answers, and develop an accurate confidence in its response," Dr. David Ferrucci, head of the Watson research team, said in a statement. "Beyond our excitement for the match itself, our team is very motivated by the possibilities that Watson's breakthrough computing capabilities hold for building a smarter planet and helping people in their business tasks and personal lives."

In a video about Watson's journey (below), Ferrucci said the nature of Jeopardy is "going to drive the technology in the right direction."

"It's got the broad domain aspect, asks all kinds of things, which was one of the challenges we really wanted to take on," he said. "It had the confidence aspect; don't answer unless you think you're right. You also had to do it really quickly."

IBM said the technology used by Watson could be helpful in areas like healthcare, to help accurately diagnose patients, to improve online self-service help desks, to provide tourists and citizens with specific information regarding cities, or prompt customer support via phone.

To prepare, Watson played more than 50 "sparring games" against former Jeopardy champions. Watson also took and passed the same Jeopardy test administered to all potential contestants.

In the video, Harry Friedman, executive producer of Jeopardy, said when IBM first approached the show, producers were intrigued but were also concerned about it being viewed as a stunt or gimmick.

"But this was different. This was the notion of knowledge acquired by a computer against knowledge acquired and displayed by the best Jeopardy players," Friedman said. "This could be something important, and we want to be a part of it."

Friedman and other producers first watched Watson in action in December 2009, when it sparred against two other human contestants.

Watson is powered by an IBM POWER7 server, which is optimized to handle the massive number of tasks that Watson must perform at rapid speeds, IBM said. The machine also has a number of proprietary technologies that handle concurrent tasks and data while analyzing information in real time.


http://www.pcmag.com/article2/0,2817,2374331,00.asp

http://www.youtube.com/watch?v=_1c7s7-3fXI

There are also a ton more videos about Watson on YT

This is the most in-depth video I found so far. It describes the algorithm and abilities of the computer.
http://www.youtube.com/watch?v=3G2H3DZ8rNc

UPDATE 2/15/11

+ Show Spoiler [Day 1 Episode] +

http://www.youtube.com/watch?v=BfNBWJTGEEA

http://www.youtube.com/watch?v=TFe2pJETNuw


+ Show Spoiler [Day 2 Episode] +

http://www.youtube.com/watch?v=PHhDLUVAtqU

Pt. 2 coming soon


+ Show Spoiler [Day 3 Episode] +

Coming Soon


Nova special
http://www.youtube.com/watch?v=GyIf5oIjIC8
dunno if a full upload to YT is up yet, but you can watch on the NOVA site here
http://www.pbs.org/wgbh/nova/tech/smartest-machine-on-earth.html

+ Show Spoiler [Mon Feb 14th] +
Watson $5,000 , Ken $2,000, Brad $5,000
WOW

+ Show Spoiler [Tues Feb 15th] +
Watson $35,734 , Ken $4,800, Brad $10,400
OMG!


+ Show Spoiler [Wed Feb 16th] +
Watson ?k , Ken?k, Brad ?k
?????

http://www.jeopardy.com/showguide/whentowatch/



OP's post above.

Deep Blue did not fairly beat Garry Kasparov; read up on the controversy of that match and you can clearly see that it did not.

As for the Jeopardy machine, how can someone ever believe we can beat a computer in a quiz?
Errare humanum est, ignoscere divinum
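Ferrucci's line in the quoted article, "don't answer unless you think you're right," boils down to a confidence threshold on whether to buzz. A toy expected-value version of that rule (the values and penalty here are illustrative only; the real DeepQA confidence estimation was far more elaborate):

```python
def should_buzz(confidence, clue_value, wrong_penalty=1.0):
    """Decide whether to buzz, as a simple expected-value rule.

    A correct answer wins `clue_value`; a wrong one loses the same amount
    (Jeopardy deducts the clue's value). This threshold rule is only an
    illustration of "don't answer unless you think you're right" -- it is
    not Watson's actual decision logic.
    """
    expected = confidence * clue_value - (1 - confidence) * clue_value * wrong_penalty
    return expected > 0

print(should_buzz(0.8, 400))  # True: expected value is positive
print(should_buzz(0.4, 400))  # False: staying silent is better
```

With a symmetric penalty the rule reduces to "buzz only when confidence exceeds 50%"; a harsher `wrong_penalty` pushes the threshold higher.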
Bigpet
Profile Joined July 2010
Germany, 533 Posts
Last Edited: 2011-02-16 11:56:35
February 16 2011 11:56 GMT
#183
On February 16 2011 20:27 W2 wrote:
I am having a hard time understanding this. With Wikipedia and Google, I don't think it is possible to get an answer wrong. Of course the computer would win.


Again: it has no internet access, and the real feat here is not matching answers to keywords but actually parsing the syntax and semantics of the question to come up with the answer.

Go ahead and google the questions that were asked; that won't be very successful. Maybe the right answer is somewhere among the first results, but you'd still have to know which word on the result pages is the answer.
I'm NOT the caster with a similar nick
3loodMoon
Profile Joined February 2011
Thailand, 13 Posts
February 16 2011 12:23 GMT
#184
WOW
I am stunned... I heard of Watson a while back, now I see what it can do... the amount of work just to get Watson to understand the question and then answer it must have been nuts.

Good Job IBM
Go Watson!
trainRiderJ
Profile Joined August 2010
United States, 615 Posts
Last Edited: 2011-02-16 15:01:58
February 16 2011 15:01 GMT
#185
IBM hurt themselves in a way by doing so well with Watson. People don't understand how hard this actually was.
turdburgler
Profile Blog Joined January 2011
England, 6749 Posts
February 16 2011 15:25 GMT
#186
Teaching computers to understand the meaning of language is a huge step in AI. If you have ever dreamed of having a robot that will do your housework, it needs to understand what you mean when you say something.
Welmu
Profile Blog Joined November 2009
Finland, 3295 Posts
February 16 2011 15:34 GMT
#187
Whoa, impressive! I wonder how the computer will know which ones have already been taken out^^
Progamer | twitter.com/welmu1 | twitch.com/Welmu1
kainzero
Profile Blog Joined January 2009
United States, 5211 Posts
February 16 2011 15:54 GMT
#188
On February 16 2011 20:42 Believer wrote:
As for the Jeopardy machine, how can someone ever believe we can beat a computer in a quiz?

Well, Jeopardy questions are structured very strangely, so processing them is actually quite a significant feat that people REALLY underestimate here.

But it also seems like buzzing speed is an issue here. I knew that would come into play.
Cobalt
Profile Joined April 2008
United States, 441 Posts
February 16 2011 15:58 GMT
#189
I am so excited for tonight. Some IBM guys are coming to my university to talk about Watson today, then showing tonight's episode.
Alejandrisha
Profile Blog Joined July 2010
United States, 6565 Posts
February 16 2011 16:47 GMT
#190
I first read this as "IBM Watson Computer Beats Protoss" O_O been reading too much TL lately I suppose
get rich or die mining
TL+ Member
PartyBiscuit
Profile Joined September 2010
Canada, 4525 Posts
February 16 2011 17:17 GMT
#191
I agree that it made it look too easy. I tuned out midway, not just because they gave a lot of profile information in both episodes, but because Watson dominating the 2nd day got a bit dull. It seemed most of them knew the answers; the main problem is that they can't outbuzz Watson.
the farm ends here
semantics
Profile Blog Joined November 2009
10040 Posts
February 16 2011 17:31 GMT
#192
On February 17 2011 01:47 Alejandrisha wrote:
I first read this as "IBM Watson Computer Beats Protoss" O_O been reading too much TL lately I suppose

I'd be interested if IBM could make a bot that doesn't map hack, mineral hack, etc., and that can beat most people.
LittLeD
Profile Joined May 2010
Sweden, 7973 Posts
Last Edited: 2011-02-16 17:50:46
February 16 2011 17:38 GMT
#193
This is cool as hell. If you fully understand the meaning of this, it blows your mind. We're coming closer and closer to having machines that can interact just like humans and fully understand our language, both body and speech.
On February 17 2011 02:31 semantics wrote:
Show nested quote +
On February 17 2011 01:47 Alejandrisha wrote:
I first read this as "IBM Watson Computer Beats Protoss" O_O been reading too much TL lately I suppose

I'd be interested if IBM could make a bot that doesn't map hack, mineral hack, etc., and that can beat most people.

When we reach the point where computers fully understand the world around them and interact accordingly, we will have computers that outmatch humans in video games, no question about it.
☆Grubby ☆| Tod|DeMusliM|ThorZaiN|SaSe|Moon|Mana| ☆HerO ☆
YejinYejin
Profile Blog Joined July 2009
United States, 1053 Posts
Last Edited: 2011-02-16 18:02:17
February 16 2011 18:01 GMT
#194
Yeah, I'd say the importance of Watson is not a computer's skill at Jeopardy. It's the fact that it can understand natural language that we take for granted as humans.

Now, I'm not completely sure how the Jeopardy buzzer system works. I played Quizbowl in high school, and that's what I'm familiar with. In Quizbowl, the reader reads the question, and at any point, you are allowed to buzz in. If you buzz in while he is still in the middle of the question, he will stop, and you don't get to hear any more of it. Then you answer, and if you're wrong, your team is locked out, and he reads the full question for the other team.

In Jeopardy, of course, Trebek always finishes the question. I'm wondering if the contestants are allowed to buzz-in during the question, and then Trebek finishes, and whoever buzzed in first gets to answer. The alternative is that no one is allowed to buzz until Trebek is done reading, and after that point, whoever buzzes first gets to answer.

The fact is that I knew about 80% of the answers that came up in the video for Double Jeopardy part 1 before Trebek was done reading the question, and were I the one playing, I'd attempt to buzz in as soon as I was allowed to. I'd assume that Rutter and Jennings therefore also know them, because they're definitely both more knowledgeable than me. The only reason Watson is winning by such a large margin, then, is that he's winning on the buzzer races.

That doesn't take away anything from the incredible feat that IBM has accomplished. The simple fact that he answers the questions properly is astounding. I'm just saying that people shouldn't look at the scores and say, "HOHOHO Watson so good, Jennings and Rutter = stoopid!!!!" Frankly, I don't think his knowledge outclasses their knowledge; only his reflexes do.

And I just realized that I've been using "his" as a pronoun for Watson. Interesting.
안지호
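The buzzer-race argument above, that Watson's margin comes from reflexes rather than knowledge, can be made concrete with a toy simulation. All timings below are invented for illustration; they are not real Jeopardy measurements:

```python
import random

def buzzer_race(trials=10_000, machine_ms=10.0,
                human_mean_ms=140.0, human_sd_ms=30.0, seed=0):
    """Fraction of clues the machine wins when everyone knows the answer.

    The machine buzzes a fixed `machine_ms` after the enable signal; each
    human's reaction time is normally distributed around `human_mean_ms`.
    All numbers are made up for illustration.
    """
    rng = random.Random(seed)
    wins = 0
    for _ in range(trials):
        human = max(0.0, rng.gauss(human_mean_ms, human_sd_ms))
        if machine_ms < human:
            wins += 1
    return wins / trials

print(buzzer_race())  # with these invented numbers, essentially 1.0
```

Even a modest fixed-latency edge wins nearly every race against normally distributed human reaction times, which is the point being made: when all three contestants know the answer, the score gap measures the buzzer, not the knowledge.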
BlackJack
Profile Blog Joined June 2003
United States, 10574 Posts
Last Edited: 2011-02-16 18:16:19
February 16 2011 18:10 GMT
#195
On February 16 2011 20:35 Nizaris wrote:
Show nested quote +
On February 16 2011 19:57 BlackJack wrote:
On February 16 2011 19:39 igotmyown wrote:
For matches against the machine, if a contestant buzzes in within some small time period (like .25 seconds) and the computer does as well, someone should be randomly selected to answer.


Yeah, these matches are kind of stupid. They didn't prove they could build a computer that can beat a human at Jeopardy; all they've proven is that they've built a robot that can click a buzzer faster than a human.

The competitive aspect is almost irrelevant. It can understand a sentence in 2-3 seconds; that's all that matters.

The "wait for the host to finish reading the question" rule is dumb, however. They should let people buzz whenever they feel like it.


Well, Alex has to finish reading the question for the TV audience; otherwise the show would be unwatchable. If they let people lock in an answer before Alex finishes, they'd all just spam the buzzer non-stop, because these guys know the answer to over 90% of the questions, so they would start buzzing before they even know what the question is.
EscPlan9
Profile Blog Joined December 2006
United States, 2777 Posts
February 16 2011 18:13 GMT
#196
On February 16 2011 16:01 aztrorisk wrote:
Watson is a major disappointment.

Ken > Watson

First of all, Watson probably hacked the Jeopardy system. It's not that hard to program a computer to access the Jeopardy system if it is connected directly to it. This is why it selected the Daily Double questions out of the blue. Come on, it picked something for 800 and the rest for 200. Then it randomly picks something for 600, and then picks normally again once all the Daily Doubles are gone.

Ken > Watson


Troll? I mean how can you seriously think that what Watson is doing is "hacking the Jeopardy system"?
Undefeated TL Tecmo Super Bowl League Champion
lixlix
Profile Blog Joined December 2009
United States, 482 Posts
February 16 2011 20:57 GMT
#197
I'm surprised by the number of people who think it's difficult to come up with an AI to beat FLASH without mineral hacks or map hacks. It's amazingly trivial. Maybe not trivial for you or me, but with research and money on par with Watson, or even 1/10 of Watson, it is fairly trivial.

I mean, Deep Blue already beat Kasparov. In chess you don't even have the mechanical disadvantages that human BW players have against AI BW players.
Xapti
Profile Joined April 2010
Canada, 2473 Posts
February 16 2011 21:24 GMT
#198
People are talking about buzzing speed, but I'd say that's a more minor issue than another one.
The biggest issue to me is that Watson can start analyzing the question the instant it's displayed, while a human has to take the time to read it.
"Then he told me to tell you that he wouldn't piss on you if you were on fire" — "Well, you tell him that I said that I wouldn't piss on him if he was on Jeopardy!"
SpoR
Profile Blog Joined November 2010
United States, 1542 Posts
Last Edited: 2011-02-16 21:31:05
February 16 2011 21:29 GMT
#199
On February 16 2011 20:42 Believer wrote:
+ Show Spoiler +
OP's post above.

Deep Blue did not fairly beat Garry Kasparov; read up on the controversy of that match and you can clearly see that it did not.

As for the Jeopardy machine, how can someone ever believe we can beat a computer in a quiz?

I know all about it.
Kasparov claimed that there were grandmasters in a hidden server room inputting their selected moves, and then Deep Blue continued with its brute-force methods while putting more weight on the GMs' choices.

Regardless, a few years later Kasparov played another chess program, Deep Junior: http://en.wikipedia.org/wiki/Junior_(chess)
In 2003 Deep Junior played a 6-game match against Garry Kasparov that resulted in a 3-3 tie. It won a 2006 match with Teimour Radjabov.

People get so defensive about machines/computers besting them. The machine isn't smarter or more intelligent; it just brute-forces a giant knowledge database and does it really, really fast. If humans had access to all that plus our own brains, we would win uncontested every time.
Ultimately, that is the point: to show how much we can do with computers, so we can apply these things in other areas and have a better world.
A man is what he thinks about all day long.
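SpoR's "brute-force a giant knowledge database, fast" framing can be sketched as a naive lookup-and-rank over stored snippets. This toy scorer uses made-up data and simple word overlap; it is nothing like the real DeepQA pipeline, which combined hundreds of evidence scorers:

```python
def score_candidates(clue, knowledge):
    """Rank candidate answers by naive word overlap with the clue.

    `knowledge` maps a candidate answer to a short text snippet about it.
    A toy illustration of lookup-and-rank over a knowledge base, not
    Watson's actual method.
    """
    clue_words = set(clue.lower().split())
    ranked = sorted(
        knowledge.items(),
        key=lambda kv: len(clue_words & set(kv[1].lower().split())),
        reverse=True,
    )
    return [answer for answer, _ in ranked]

# Tiny invented knowledge base for the demo.
kb = {
    "Harold and the Purple Crayon": "children's book about a boy named harold drawing with a purple crayon",
    "Deep Blue": "ibm chess computer that beat kasparov",
}
print(score_candidates("this children's book features a boy and his purple crayon", kb)[0])
# prints "Harold and the Purple Crayon"
```

The hard part Watson solved is everything this sketch skips: turning a pun-laden clue into something that can be matched at all, and estimating how confident the top match really is.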
NEOtheONE
Profile Joined September 2010
United States, 2233 Posts
Last Edited: 2011-02-16 22:28:31
February 16 2011 22:05 GMT
#200
So Ken Jennings had an epic quote today in his Final Jeopardy response: "(I for one welcome our new computer overlords)."

+ Show Spoiler +
Watson wins with a two-game total of over $77,000
Abstracts, the too long didn't read of the educated world.