Residual self-image is a concept popularised by the Matrix films, and it has
since entered popular culture. The idea is that you are always the same person
when interacting in ego-space: even without your own physical body in your
vision, your mind conjures up a duplicate that still looks like, talks like,
feels and moves like you.

The theory, it seems, was half-right, but it was also very, very wrong. You
conjure up a replacement body all right, but it is not necessarily the same you
as you started with. Put into another's form, you react to and integrate with
that body, treating it as you.

Experiments at the end of 2008 by cognitive neuroscientists at the Swedish
medical university Karolinska Institutet (KI) have opened up quite a can of
worms with regard to the meaning of identity, particularly as it relates to
simstim and to virtual reality experiences. Several experiments conducted by
Valeria Petkova and Henrik Ehrsson have been very successful at making subjects
perceive the bodies of mannequins and other people as their own, and identify
with them as such.

"Joe opens the sauna-cabinet like a big clamshell standing on end with
a lot of funny business inside. Our girl shucks her shift and walks into
it bare, totally unembarrassed. Eager.

She settles in face-forward, butting jacks into sockets. Joe closes it
carefully onto her humpback. Clunk. She can't see in there, or hear, or
move. She hates this minute. But she loves what comes next!

Joe's at his console and the lights on the other side of the glass wall
come up. A room is on the other side, all fluff and kicky bits, a girly
bedroom. In the bed is a small mound of silk with a rope of yellow hair
hanging out. The sheet stirs and gets whammed back flat.

Sitting up in the bed is the darlingest girl you've EVER seen. She quivers
- porno for angels. She sticks both her little arms straight up, flips
her hair, looks around full of sleepy pazazz. Then she can't resist rubbing
her hands down over her minibreasts and belly. Because, you see, it's
the godawful P. Burke who is sitting there hugging her perfect girl-body,
looking at you out of delighted eyes."

Source: The Girl Who Was Plugged In

They achieved this through perspective manipulation: placing sensory hardware
designed for augmented reality on the other person's body, and re-routing the
volunteers' senses through it. They saw with the other person's eyes, as far as
their brains could tell. Likewise, they heard with that person's ears, and felt
with their muscles, at least as far as their brains could tell. This was enough
to make the brain believe they were that other person. The perspective shift
was strong enough that they could shake hands with their actual body, and not
break the illusion even whilst looking it in the eye.

The strength of the illusion was confirmed when the subjects' bodies showed
signs of extreme stress when a knife was held to the body they perceived to
be theirs and threatening gestures were made - even when that body was most
definitely not their own. The stress response was more akin to a personal
survival reaction than an empathic reaction.

The fundamental importance of these experiments could not possibly be overstated.

It seems the experience of 'being' a particular body is adaptive, even
throughout adulthood, and appears to stem from the point of self-focus - the
localisation of self within a given environment.

In other words, you can adapt to being in any body, and treat it as if it always
was you, within a short space of time.

The Experiments

The experiments have much in common with the rubber hand illusion technique,
which has been observed before.

Rubber hand studies were first developed for use in VR in 2007, in work
presented by the non-profit science society the American Association for the
Advancement of Science (AAAS). The illusion had been noted earlier, in the
case of prosthetic hands.

The illusion occurs when a person's unseen hand is stroked synchronously
with a visible fake hand, and the person is then asked to point to his or
her own hand. Subjects invariably err in the direction of the fake hand,
attributing it to their own body.
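As a rough illustration - not part of the study's methodology, and with purely
hypothetical numbers - the pointing error can be expressed as a 'drift' toward
the fake hand:

```python
# Hypothetical sketch: quantifying the pointing error ("proprioceptive
# drift") described above. Positions are 1-D coordinates in centimetres
# along the table edge; all figures are illustrative, not from any study.

def proprioceptive_drift(real_hand: float, fake_hand: float,
                         pointed_at: float) -> float:
    """Fraction of the real-to-fake distance that the subject's pointing
    has shifted toward the fake hand (0 = no illusion, 1 = full shift)."""
    span = fake_hand - real_hand
    if span == 0:
        return 0.0
    return (pointed_at - real_hand) / span

# A subject whose real hand sits at 0 cm, with the rubber hand 15 cm away,
# points to 4.5 cm: a 30% drift toward the fake hand.
drift = proprioceptive_drift(real_hand=0.0, fake_hand=15.0, pointed_at=4.5)
print(drift)  # 0.3
```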

Top: Subject dons an immersive HMD that subsumes their visual and auditory
senses, replacing the natural input with cameras and microphones on the dummy.

Bottom: The dummy is touched with a pen within the subject's altered
self-view, at the same time and place as their physical body is also touched.

Image credit: PLoS ONE

This time, the experiments were much more involved than just a hand: an entire
false body was used. In the first set of experiments, two fake bodies were
used as housing for the subject's sense of self. A synchronous one, which
corresponded roughly to the form of the human body, and an asynchronous one
which did not.

The synchronous form was a plastic display mannequin, whilst the asynchronous
was a green plastic crate.

In both cases, the subjects donned an HMD (head-mounted display) intended
to completely block out their natural senses of sight and hearing. Everything
they saw came through the LCD screens of the HMD, and everything they heard
through its speakers.

The HMD was paired with cameras affixed to the dummy over the position of
the dummy's eyes, whilst the speakers were paired with microphones over the
dummy's ears. Each camera and microphone fed the corresponding HMD screen or
speaker, so that the individual gained the perspective of seeing and hearing
through the dummy's eyes and ears.
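The routing itself is simple to picture: each sensor on the dummy feeds the
matching output on the subject's HMD, same side for same side. The sketch
below is purely illustrative (not the study's actual software), with a dict
standing in for the real hardware:

```python
# Illustrative sketch of the dummy-to-HMD sensory routing: the left camera
# feeds the left eye screen, the right microphone feeds the right speaker,
# and so on. Names and structure are hypothetical.

DUMMY_TO_HMD = {
    "left_camera":  "left_screen",
    "right_camera": "right_screen",
    "left_mic":     "left_speaker",
    "right_mic":    "right_speaker",
}

def route_frame(sensor: str, payload: bytes, outputs: dict) -> None:
    """Deliver one captured video frame or audio buffer to its paired
    HMD output."""
    outputs[DUMMY_TO_HMD[sensor]] = payload

# Stub HMD: a dict standing in for the real display/speaker hardware.
hmd_outputs = {}
route_frame("left_camera", b"frame-0", hmd_outputs)
route_frame("right_mic", b"audio-0", hmd_outputs)
print(hmd_outputs["left_screen"])    # b'frame-0'
print(hmd_outputs["right_speaker"])  # b'audio-0'
```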

To complete the illusion, the subject and mannequin stood in the same position,
both looking down. A researcher then pressed the dummy's abdomen with a pen,
within the camera's field of view. At the exact same time, she pressed the subject's
abdomen with another pen, out of camera view.

The net result was that the subject saw the pen press against 'their' body
and, at the same time, felt it in that location. As with the rubber hand
illusion before it, this new rubber body illusion worked, and in all cases
the subjects found themselves thinking of the mannequin body as theirs.
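The binding depends on the seen touch and the felt touch arriving close
together in time. A minimal sketch of that constraint follows; the 200 ms
window is an illustrative assumption, not a figure from the paper:

```python
# Hedged sketch of the synchrony constraint: the seen touch and the felt
# touch must land within a narrow time window for the brain to bind them
# into a single event. The window value below is assumed for illustration.

SYNC_WINDOW_S = 0.2  # assumed integration window, in seconds

def touches_bind(t_seen: float, t_felt: float,
                 window: float = SYNC_WINDOW_S) -> bool:
    """True if the visual and tactile events are close enough in time to
    be experienced as one touch on 'my' body."""
    return abs(t_seen - t_felt) <= window

print(touches_bind(10.00, 10.05))  # True  - synchronous condition
print(touches_bind(10.00, 11.00))  # False - asynchronous control
```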

The effect was much less pronounced, although still evident to a degree, when
the box was used instead of a mannequin - perhaps due to the total alienness
of such a body-form.

The pen stimulation continued for two minutes with each subject, with a
variety of positions and strokes utilised - each time exactly the same area,
pressure and pattern on both the faux and actual body. At the end of the
session, the subjects each completed a questionnaire on their experiences,
the results of which are shown below:

The Questionnaire Results: Source: PLoS ONE

An interesting thing to note immediately is that there is no sensation of
actually being the false body when it is just too different from a
recognisable human form - echoes of the uncanny valley in visualisation. If
it is not nearly realistic enough to approach what your brain understands as
human-like, no synthesis is achieved. However, if the form looks at least
vaguely human, with hands, feet and torso all in relatively the right places,
that is enough for a powerful effect of presence.

Likewise, the feeling of the pen against the avatar being a pen against the
self is nearly overwhelming when the avatar looks like an actual body.

No-one felt naked, because of course they could still feel their clothes
against their actual body, and it is the sensory data the body receives which
drives this.

Swapping body with another person

Being poked with a pen is all very well, but for a full SimStim or VR
immersion experience, rather more sensation is necessary. What happens if the
body moves? Does the perception of it still being you hold up when it is
moving about?

The hope was to find that the perception of being localised in another
human's body would hold up during the performance of various tasks, but there
was no way of knowing without experimentation whether it would.

As with the mannequin, a researcher wore the camera/microphone rig, whilst
each subject wore the linked HMD. The subjects could see their own body from
the perspective of the researcher, standing opposite. They extended their
right arm, took hold of the researcher's hand and shook it. Their own physical
body provided the motion and the sensation of the handshake, which they saw
from the perspective of the owner of the hand being shaken.

Shaking Hands: Source: PLoS ONE

The arrangement allowed the subjects to view and recognise their own physical
body from a distance. Yet, as before, the visual and auditory centre of self,
along with the feeling of having an arm shaken, with the visual imagery
putting that hand exactly where it should be, conspired to engender a feeling
of shaking a familiar friend's hand, as opposed to their own body's.

In addition, the effect worked regardless of differences in skin colour, gender,
height and age between researcher and subject. Gender and racial identification
factors were found to be unimportant when perceiving a body as one's own.

Meanings

The sheer scope of this discovery - like the similar work done before it,
slowly edging up in totality of sensation - is one of those times when even a
VR veteran is stunned at the implications.

Yes, we now have empirical proof that when enough sensory modality is applied
to a faux body, the person's sense of self shifts to being in that body, in
much the same way as cyberpunk novels and films have predicted.

It is also immensely interesting that there is indeed a similar line of
acceptance for potential physical bodies as there is for visual memes. The
uncanny valley has a counterpart: the less familiar a body is, the less likely
the mind is to naturally bond with it. A completely different human body will
work, as will a display mannequin to a markedly lesser extent. However, a
table, a box, a chair - these things will not.

When VR sensory modality advances enough to be able to place this research
into public worlds, we will experience interesting parallels that could not
be monitored here. A centaur is not like a table or box, yet it is not like
a human form either - at least not entirely. Will the part that is like the
human form be enough to counteract that which is not? It is this researcher's
considered opinion that it will fall somewhere in the valley: accepted, but
not fully, and perhaps, just perhaps, taking time to become accustomed to.

Likewise, new hope is breathed into SimStim, that concept of taking a
piggy-back ride through the senses of another person. If a neural-feedback
recording device such as those used in novels like Neuromancer or films like
Strange Days is ever developed - something which is fairly likely - it will,
in actuality, work as advertised, with the viewer's brain automatically
bonding a sense of self with the sensory data supplied.

In fact, when shown the research, Kynan Eng, a researcher in neuroinformatics
at the University of Zurich, Switzerland, offered the professional opinion
that, after a stimulus of an appropriate kind and duration, people could
"produce measurable ownership responses to any virtual or real object, such
as an often-used tool".

The Race Card

More profound, perhaps, for the general populace is the interesting tack this
opens up on race relations. As experiments to date have shown, the brain
adapts regardless of the skin colour of the body it perceives as its own.
Take, for example, someone who is not African-American but who has an
unreasoning bigotry towards African-Americans. With the aid of a
self-perception relocation system such as this, you could have them look into
a mirror, move about a room in choreographed fashion, perhaps look at others,
all through what their brain is continually, subconsciously telling them is
their own African-American physical body.

The effect would most likely require a greater degree of mimicry, born of
increased kinaesthetic sensory feedback - the ability of the volunteer to
move their body in mimicry of how the subject moved theirs. Alternatives
include robotic substitute bodies and highly realistic mocap avatars, of
course.

Profound Disability goes Bye bye

At the start of this report, we had a quote from "The Girl Who Was Plugged
In", the 1973 novella by James Tiptree, Jr, a pen name for psychologist
Alice Sheldon.

In it, P. Burke, a severely disfigured young lady for whom normal function in
society was all but impossible, was plugged into life support, her brainstem
attached to a computer which mediated two-way communication with a stunningly
beautiful waldo body - a body with no mind of its own, but which became
P. Burke in every way that mattered.

The technology level demonstrated in that story is far beyond what we
currently have, but, combined with the research outlined here, the exact same
setup could be used with a robot body, and a sense of self shifted into it no
differently than with the mannequin. Able to see, hear, talk and move via the
robot, individuals too badly disfigured or disabled to achieve all their hopes
and dreams due to the visual prejudices of others could be granted a level of
normal life above what is currently available.

"Now let's get one thing clear. P. Burke does not feel her brain
is in the next room, she feels it's in that sweet little body. When you
wash your hands, do you feel the water is running on your brain? Of course
not. You feel the water on your hand, although the 'feeling' is actually
a potential-pattern flickering over the electrochemical jelly between
your ears. And it's delivered there via the long circuits from your hands.
Just so, P. Burke's brain in the cabinet feels the water on her hands
in the bathroom. The fact that the signals have jumped across space on
the way in makes no difference at all. If you want the jargon, it's known
as eccentric projection or sensory reference and you've done it all your
life. Clear?"

Source: The Girl Who Was Plugged In

Telepresence with full Presence

In the same way as it works for disability, telepresence takes on a whole new
meaning when you can guarantee that the interface you are using to put
yourself in a remote location will feel like it is you there. It opens up the
possibility of teams of robot waldoes used by educational institutes to give
students who cannot physically attend the presence that being in a classroom
allows; lecturers from across the globe who feel like they are standing
before the class; doctors and on-scene disaster experts; commanders who gain
first-hand perspectives of the battlefield; robot soldiers whose operators
see and hear as if through their own bodies.

We now know that the technology works - that the brain processes data by
perceived location, not by physical location. In a very real sense, one of
the greatest psychological humps is behind us, and we know the technology
will function as we hoped. Now, we 'just' have to bring the actual
interaction and synchronisation technologies into being.

Final Notes

This work was supported by grants from the Swedish Medical Research Council,
the Swedish Foundation for Strategic Research, the Human Frontier Science Programme,
and the European Research Council.

None of these bodies had a role in study design, data collection and
analysis, decision to publish, or preparation of the original paper,
available below.