"That September 11th thing was amazing, wasn't it?" - A human would get the reference, a machine is much less likely to. Generally the best way would be to pose something where you can expect a decent emotional response from a human, but where the programmer may have not thought of that situation. You look like a jackass if it is a human, though.
– TZHX Apr 1 '11 at 9:27


I believe that 70% of the Earth's population (including children, adults, the elderly, etc.) would not get that reference, and 30% would not get it even if you said it in their native language. We are not being fair to the machines.
– Job Apr 1 '11 at 15:39

26 Answers

EDIT: It occurred to me that Douglas Hofstadter has written a delightful piece on this exact subject (including the highest-rated answer here), and I found an online version at http://www.cse.unr.edu/~sushil/class/ai/papers/coffeehouse.html. The scenario in the "Post Scriptum" section, where he tries to unmask Nicolai, is an especially fantastic read. I believe I originally read this in Metamagical Themas.

Semantically it's a sensible question, and a computer would probably try to answer it, but a human being would just say, "Aw, c'mon... how the hell would I know?"

Anything with that pattern works: a question that is linguistically, semantically, and culturally sensible, but one that no real person would ask or answer. (This can be done without straying into deeply personal areas; in fact, the computer might be programmed to handle those with "that's private".)

You're in a desert, walking along in the sand, when all of a sudden you look down and you see a tortoise. It's crawling toward you. You reach down, you flip the tortoise over on its back. The tortoise lays on its back, its belly baking in the hot sun, beating its legs trying to turn itself over, but it can't, not without your help. But you're not helping. Why is that?

Note: in Turing's original proposal, the test was an "imitation game": an interrogator questions a man and a woman and tries to tell which is which, and the computer then takes the man's part, pretending to be the woman. Turing predicted that within about fifty years a machine could play well enough that an average interrogator would have no more than a 70% chance of making the right identification after five minutes of questioning.

A woodchuck could chuck no amount of wood since a woodchuck couldn't chuck wood.
– Ant Apr 1 '11 at 9:46


But if a woodchuck could chuck and would chuck some amount of wood, what amount of wood would a woodchuck chuck?
– Kristof Claes Apr 1 '11 at 12:21


Even if a woodchuck could chuck wood and even if a woodchuck would chuck wood, should a woodchuck chuck wood?
– bastibe Apr 1 '11 at 12:50


A woodchuck would chuck all the wood he could chuck if a woodchuck could chuck wood. (According to the tongue twister, although the paper "The Ability of Woodchucks to Chuck Cellulose Fibers" by P.A. Paskevich and T.B. Shea in Annals of Improbable Research vol. 1, no. 4, pp. 4-9, July/August 1995, concluded that a woodchuck can chuck 361.9237001 cubic centimeters of wood per day.)
– GSto Apr 1 '11 at 13:36

I don't think it's a good question. For one thing, both "What is a Turing test?" and "I don't know" would be perfectly legitimate answers from most humans, and a trivial choice for a machine.
– Péter Török Apr 1 '11 at 12:24


Well then you could explain what it is in simple terms, and then ask again.
– Mr. Shickadance Apr 1 '11 at 12:33


The point here is that the machine would enter an endless loop and overflow its stack, hence being revealed!
– Philippe Apr 1 '11 at 12:36


@Philippe, unless it has been taught to ask difficult questions on programmers.stackexchange.com...
– user1249 Apr 1 '11 at 18:25

Essentially, this means it will always take a series of questions, and subsequent analysis of the answers, to establish whether the anonymous entity at the other end of the line is a human being. A single question will not achieve this.

I suppose you could ask "Will you meet me in the car park in 2 minutes?" and then see what turns up.