>Simulating a brain is no good; a simulation of rain doesn't
>make you wet:

Just as simulating a calculation is no good; a simulated calculation
doesn't give you the correct answer. Oh, wait.
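
A minimal sketch of the point, in Python (the function names and the 8-bit
width are illustrative assumptions, nothing from Searle): step through the
rules of an adder circuit and the "simulation" hands you the genuine sum.

    # A simulated calculation yields the real answer.
    # We "simulate" a ripple-carry adder gate by gate; the output
    # is the correct sum, not a mere picture of one.

    def full_adder(a, b, carry):
        # one rule-governed cell: pure boolean bookkeeping
        s = a ^ b ^ carry
        carry_out = (a & b) | (carry & (a ^ b))
        return s, carry_out

    def add(x, y, bits=8):
        carry, total = 0, 0
        for i in range(bits):
            s, carry = full_adder((x >> i) & 1, (y >> i) & 1, carry)
            total |= s << i
        return total

    assert add(19, 23) == 42   # the simulation really adds

Nothing about running the rules merely *represents* addition; running them
is adding.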

> just as obviously as in the original
>example, he still doesn't understand the stories he is asked about.

Only by Searle's masking construction. The task is made impossible *for a
man in such a Room*, as defined. But as almost everyone who's studied this
topic agrees, shrink the man to the scale of a neuron or a cortical column
or whatever's most salient, add a hundred billion more, and the whole
composite system of microscopic men and their rules is *of course* able to
perform the task. Hence:

>It seems highly implausible to
>attribute understanding to an arbitrary 'system' made up of the conjunction
>of the man and some rules.

On the contrary: suitably scaled, such a 'system' is exactly what a
conscious human brain, together with its inputs and outputs, *is*.
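
The composition point is easy to make concrete. A toy sketch in Python
(the threshold units and weights are illustrative stand-ins for the
microscopic rule-followers, my assumption, not Searle's setup): each unit
follows one trivial local rule, none of them computes XOR, yet the
assembled system does.

    # Each unit follows one trivial rule: fire iff weighted input
    # meets a threshold. None of them "does" XOR; the system does.

    def unit(inputs, weights, threshold):
        return int(sum(i * w for i, w in zip(inputs, weights)) >= threshold)

    def xor(a, b):
        h1 = unit([a, b], [1, 1], 1)       # fires on a OR b
        h2 = unit([a, b], [1, 1], 2)       # fires on a AND b
        return unit([h1, h2], [1, -1], 1)  # OR but not AND

    assert [xor(a, b) for a, b in
            [(0, 0), (0, 1), (1, 0), (1, 1)]] == [0, 1, 1, 0]

Scale the same move up by ten orders of magnitude and "a man and some
rules" stops sounding arbitrary.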

>If necessary, the man can memorise the rules:
>then the whole 'system' is in his memory, but he still doesn't understand
>the Chinese.

Absurd. No individual *can* "memorise the rules" (or rather the Vast
content of the Room's stored information as well as the syntactic rules) in
Searle's parody, and yet *everyone's neural apparatus* does so in the real
world; that's what learning to talk *is*.

>[So these systems must be ruled out as conscious machines]

Rather, we see that consciousness *is* just (the inwardness of) such a system.