Samanth Subramanian also notes how consistently Watson beat the humans to the buzzer. He provides some info on how the Jeopardy rules work:
... The Jeopardy buzzer system is set up such that you can’t buzz until Alex has finished reading the entire question in his slow, perennially wry drawl; if you do, you’re locked out, and you have to watch in frustration as somebody else answers. So I figured that computer and humans both had sufficient time to take the question in and work out whether they knew the answer or not. To boot, Watson didn’t buzz electronically; he (it? his highness? Master?) had to physically depress a buzzer like Ken and Brad.
It begins to make sense to me now. When a question took Alex a long time to read, Watson got the end-of-question signal electronically, and probably had an advantage over the humans, who got it visually. When the questions were short, on the other hand, the humans had the advantage. I don't know if this interpretation is correct, but take a look at the first part of Day 3. There was one category, "Actors who direct," in which the clues were just movie titles, and therefore short. And guess what? The human contestants got to the buzzer faster than Watson -- on all five questions in that category.
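The interpretation above can be sketched as a toy simulation. All the latency numbers below are hypothetical assumptions chosen only to illustrate the argument (Watson's electronic signal plus mechanical press is fast and consistent; a human reacting to Alex finishing a long clue is slow, while a human who has pre-read a short clue can time the buzz almost perfectly):

```python
import random

# Hypothetical latencies in seconds -- illustrative assumptions, not measurements.
WATSON_BUZZ = 0.001 + 0.008   # electronic end-of-question signal + mechanical plunger
HUMAN_REACT = 0.150           # reacting to Alex finishing a long clue
HUMAN_ANTICIPATE = 0.000      # timing the buzz on a short, pre-read clue
JITTER = 0.010                # trial-to-trial human variability

def human_beats_watson(short_clue, trials=10_000, seed=0):
    """Fraction of trials in which the human buzzes before Watson."""
    rng = random.Random(seed)
    base = HUMAN_ANTICIPATE if short_clue else HUMAN_REACT
    wins = 0
    for _ in range(trials):
        # Real Jeopardy locks you out for buzzing early; for simplicity this
        # sketch just clamps an early buzz to the instant of the signal.
        human = max(0.0, base + rng.gauss(0, JITTER))
        if human < WATSON_BUZZ:
            wins += 1
    return wins / trials
```

Under these made-up numbers, `human_beats_watson(short_clue=False)` comes out near zero while `human_beats_watson(short_clue=True)` is well above half -- the same asymmetry the "Actors who direct" category suggests.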
There has been some commentary on Watson and what it means for natural language processing. Ben Zimmer's piece at The Atlantic is about the best. Here's an excerpt where he red-flags a bunch of wild claims made in a pre-show hype ad by Dave Ferrucci, who led the Watson team.
I first encountered IBM's hype about the tournament last month, during the NFL's conference championship games, when Dave Ferrucci, the ebullient lead engineer on the project, showed up in commercial breaks to tell us about the marvels of Watson. One commercial intriguingly opens with Groucho Marx telling his classic joke: "One morning I shot an elephant in my pajamas. How he got into my pajamas, I don't know." Ferrucci then begins: "Real language is filled with nuance, slang and metaphor. It's more than half the world's data. But computers couldn't understand it." He continues: "Watson is a computer that uncovers meaning in our language, and pinpoints the right answer, instantly. It uses deep analytics to answer questions computers never could before, even the ones on Jeopardy!" Then a Jeopardy! clue is displayed: "Groucho quipped, 'One morning I shot' this 'in my pajamas.'"
Now, that's a provocative set of claims. Watson's performance in the tournament (despite a few howlers along the way) clearly demonstrates that it is very skilled in particular types of question-answering, and I have no doubt it could handle that Groucho clue with aplomb. But does that mean that Watson "understands" the "nuance, slang and metaphor" of natural language? That it "uncovers meaning in our language"? Depends what you mean by "meaning," and how you understand "understanding."
And Zimmer punctures the hype with this verdict:
... [F]or all of the impressive NLP programming that has gone into Watson, the computer is unable to penetrate the semantics of language, or comprehend how meanings of words are shot through with allusions to human culture and the experience of daily life.
Finally, a link to a post-show discussion held at IBM, moderated by Stephen Baker, author of Final Jeopardy: Man vs. Machine and the Quest to Know Everything.