Thursday, February 24, 2011


I watched the artificial intelligence Watson compete at Jeopardy. It’s pretty fascinating how good it is, though it has a circuitry advantage in that its metal mind works faster than the human mind. At first glance, it seems close to passing the Turing test. If you watch it, though, it’s pretty clear that Watson doesn’t understand English.

I mean, its interface is in English. You ask it questions (or technically answers, this being Jeopardy) in English, and it responds in English. But it doesn’t understand it. There is no comprehension there. Humans just made a device that can analyze a string of words and figure out the most likely response in many situations, which creates the illusion of comprehension.

A philosopher named John Searle came up with the concept of the Chinese room to illustrate how artificial intelligences can never be truly self-aware like human beings. I disagree with the “never” part, but the Chinese room is apt for discussing modern-day AIs. Basically, the idea is that there’s a person who doesn’t know Chinese sitting in a room. Outside, a person who does know Chinese slips messages written in Chinese into the room. The person inside matches each message against a book that lists the appropriate responses, copies the listed response onto paper, and slips the paper out the door. Assuming the book is comprehensive enough, the person outside may never become aware that the person inside doesn’t know Chinese. This is how it is with AIs, which answer messages in the way they are programmed to but have no understanding of the language.
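The room’s mechanics can be sketched in a few lines of code. This is a toy illustration with a made-up rule book, not how Watson actually works (Watson ranks candidate answers statistically over huge text corpora), but the point is the same: the program produces fluent-looking responses without understanding either side of the exchange.

```python
# A toy "Chinese room": match the incoming message against a rule book
# and copy out the listed response. The program "speaks" without
# understanding. The rule book entries here are hypothetical.

RULE_BOOK = {
    "ni hao": "ni hao",                # a greeting gets a greeting back
    "ni chi le ma": "chi le, xiexie",  # "have you eaten?" -> "yes, thanks"
}

def room(message: str) -> str:
    """Look the message up in the book and copy out the response."""
    # Fall back to "ting bu dong" ("I don't understand") for unknown input.
    return RULE_BOOK.get(message.strip().lower(), "ting bu dong")

print(room("Ni hao"))        # a fluent-seeming reply, from pure lookup
print(room("what is art?"))  # no entry in the book, so the fallback
```

As long as the book covers the questions people actually ask, the person outside the room has no way to tell lookup apart from comprehension.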

One of the Jeopardy clues was looking for something like “What is modern art?” and had specific wording like “art period”. Watson, however, could only recognize that the clue was about cubism, and it answered “What is Picasso?” even though the wording specified a time period, not a person. Another clue was looking for a character from a song, and Watson could only come up with the song featuring the character. It didn’t understand the words.

I think Watson is interesting as a publicity stunt, but it presents a false image of what it is. People look at it and think this is some advanced computer well on the way to becoming Johnny 5. It’s not. It’s an advanced toy.
