Ghosts, computers, and Chinese Whispers

Toby Howard

This article first appeared in Personal Computer World
magazine, November 1996.

"CLICK HERE to upload your soul" was one of the
tamer headlines seen recently in reports describing the "new
research direction" of British Telecom's research labs at
Martlesham Heath. BT, it was reported by such authorities as
The Guardian, Reuters, Time, and the Electronic Telegraph, is
embarking on a huge new research project. Funded to the tune
of 25 million pounds, an eight-man team is developing a "Soul
Catcher" memory chip. The chip will be implanted behind
a person's eye, and will record all the thoughts and experiences
of their lifetime.

The idea that we might migrate minds from brains into silicon
has been around for a while, and is usually referred to as "uploading".
This might be achieved destructively, by visiting each neuron
in turn, determining its particular characteristics and interconnections,
and then replacing it with an equivalent microscopic computer
based on nanotechnology; or, perhaps preferable from the patient's
point of view, by non-destructively scanning the structure of
the living brain and reproducing it elsewhere in silicon, like
copying a complex drawing using a pantograph.

The amount of data involved would be immense, since the hardware
of our nervous system is believed to comprise around 10^12
neurons with 10^15 synaptic interconnections. But the
capacity of silicon for storing information is increasing at
an almost unbelievable rate. "Moore's Law", first expressed
by Intel co-founder Gordon Moore in 1965, stated that the density
of chips roughly doubles every year. Although the doubling period
since the early 1970s is now more like 18 months, we are still
seeing an explosive growth of chip-power. If the trend continues
into the next century we might expect by 2025 to store ten million
megabytes (ten terabytes, or 10^13 bytes) on a single
chip. If so, we might hope to record information about a brain's
physical structure in a few of these chips.
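The extrapolation is simple enough to sketch in a few lines of Python. The starting point (roughly 8 megabytes, a 64-megabit memory chip of the mid-1990s) is my assumption, not a figure from BT; the 18-month doubling period and the 2025 horizon are the ones quoted above:

```python
# Back-of-envelope extrapolation of chip capacity under "Moore's Law".
# Assumed starting point: a 1996 memory chip holding about 8 megabytes
# (64 megabits); assumed doubling period: 18 months.

def projected_capacity_bytes(start_year, end_year, start_bytes,
                             doubling_period_years=1.5):
    """Capacity after repeated doublings between the two years."""
    doublings = (end_year - start_year) / doubling_period_years
    return start_bytes * 2 ** doublings

capacity_2025 = projected_capacity_bytes(1996, 2025, 8 * 10**6)
print(f"{capacity_2025 / 10**12:.1f} terabytes")
```

The answer comes out at a few terabytes -- the same order of magnitude as the 10-terabyte chip the BT researchers projected, and sensitive, as all such extrapolations are, to the assumed starting capacity and doubling period.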

But to talk about uploading thoughts and memories is quite
another matter. When we talk about brains and minds, we must
confront the classical "mind-body problem". We know
that our brain is a collection of biological structures of (so
far, at least) unimaginable complexity. Outwardly this "brain-stuff",
once described by computing pioneer Alan Turing as like a bowl
of cold porridge, is unmistakably physical. Deeper, a stained
microscopic section of brain matter reveals a riot of interconnected
neurons that looks like squashed rhubarb. How can this biology
conjure or host the mystery of human consciousness? How can our
minds influence the matter in the universe? Imagine: you're in
the pub, and you fancy a bottle of beer. Within moments the barman
obliges. Your mind has somehow caused uncountable billions of
atoms in the universe -- atoms in your muscles, your throat,
the air, the barman's ears, his brain, his muscles, the fridge,
the bottle, the opener, the glass -- to directly respond to your
will. Quite a trick for porridge and rhubarb.

What is this "you" that has such power to disrupt
the universe? Are "you" some ethereal entity -- a ghost,
a soul -- operating the controls of the brain machine? Or is
the machine itself just so complex that in our inability to understand
its activity we seek refuge in the idea of a separate "soul"?
For proponents of "Strong Artificial Intelligence",
the answer is clear: human consciousness really is nothing more
than the algorithmic bubbling of cold porridge, and in fact any
sufficiently complex algorithm, running on any kind of machine,
will lead inexorably to thinking and consciousness. Or so they
say. A fierce debate rages over this claim.

If the "Strong AI" researchers are right, then we
should one day expect to see computers which behave as if they
are conscious. (It's the "as if" here which so antagonises
the philosophers). But the possibility of creating conscious
machines raises serious ethical questions. What rights would
a conscious machine possess? Would concerned individuals form
a Royal Society for the Prevention of Cruelty to Machines? Would
those of a religious bent demand that a conscious machine be
taught how to worship God? What if conscious machines did not
like us, and turned nasty?

But we must, I regret, be sceptical futurists, and return
to earth. Is BT really trying to cache our consciousnesses onto
chips stuffed in our heads? "No!", says Chris Winter,
the BT researcher quoted in the press as heading the "team
of eight Soul Catcher scientists". "We are not building
anything!", he told me. The whole story is a media invention,
developed like a game of Chinese Whispers from its origins in
an after-dinner press briefing Winter gave to local journalists,
intending to enthuse them about the future-looking work at BT
Labs. Winter's research group had simply undertaken a "technology
trend analysis" to speculate on the future capacity of silicon,
and using Moore's Law had estimated the 10 terabyte chip by 2025.
To illustrate the immensity of such storage, Winter compared
it with a back-of-an-envelope guestimate of the volume of data
input through a person's sensory organs in an average 70-year
lifespan -- 10 terabytes. The press took it from there.
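Winter's comparison is easy to reproduce. Taking the two figures quoted above, 10 terabytes and a 70-year lifespan, the implied average rate of sensory input works out to a few kilobytes per second (the per-second figure is my derivation, not Winter's):

```python
# Reversing the guesstimate: if a lifetime's sensory input totals
# 10 terabytes over 70 years, what average data rate does that imply?

SECONDS_PER_YEAR = 365.25 * 24 * 3600

lifetime_bytes = 10 * 10**12   # the article's 10-terabyte figure
lifespan_years = 70

rate = lifetime_bytes / (lifespan_years * SECONDS_PER_YEAR)
print(f"average input rate: {rate:.0f} bytes per second")
```

That works out at roughly 4.5 kilobytes per second -- a surprisingly modest trickle, which is what makes the guesstimate seem plausible at first glance.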

Since current research into neuro-computational cochlear implants
for the deaf and retinal implants for the blind is proving successful,
perhaps in the distant future something like the Soul Catcher
will become a reality. However, the vision of Bob Hoskins saying
"It's good to upload" makes me reach for another universe-changing
beer. For more on uploading, visit www.aleph.se/Trans/Global/Uploading.