SECOND LIFE is an online “virtual world” that enables users to create a customized avatar, or digital persona, through which they can interact with one another. It has become incredibly popular since its launch just over 6 years ago, with millions of “residents” now using it regularly to meet others, socialize and even to have virtual sex. Second Life is now filled with virtual communities and institutions – it has businesses and universities, and its own virtual economy.

Now, imagine a futuristic version of Second Life, in which avatars can transfer sensations to the bodies of their users. Such a scenario may seem far-fetched, but a team of European researchers has just taken us one step closer to it. They demonstrate a perceptual illusion in which a computer-generated virtual body can be made to feel like one’s real body, so that one feels sensations from it and responds to it as if it were real.

The new work, led by Mel Slater of the Experimental Virtual Environments for Neuroscience and Technology (EVENT) lab at the Universitat de Barcelona in Spain, builds on earlier studies demonstrating that the brain’s representation of the body can be manipulated very easily. The first of these, published in 1998, described the rubber hand illusion, whereby the brain incorporates a prosthetic limb into the body (or takes “ownership” of it), so that stimuli applied to the prosthesis are perceived to originate from one’s real hand. More recently, it has been found that this effect can be induced for the whole body, resulting in the body swap illusion, in which subjects perceive the body of another person as their own, so that tactile sensations appear to arise from the other person’s body rather than their own.

Following on from this earlier work, Slater and his colleagues now show that these illusions can be easily reproduced in virtual reality. The virtual hand illusion was induced in almost exactly the same way as the rubber hand illusion. Participants placed an arm out of sight behind a screen, and looked at a three-dimensional computer-generated image of an arm, which was projected onto a large screen directly in front of them (below). An electronic wand was then used to stroke the participants’ arms. In some cases, the movements of the wand were synchronized with those of a small yellow ball on the screen, so that the participants saw the ball touching the virtual arm as they felt their real arm being touched in the same way. In others, the movements of the ball on the screen were pre-recorded, and did not correspond to the touches applied to their real arm.

During the condition in which the movements of the wand and the ball were synchronized, but not in the asynchronous condition, the participants reported that the virtual arm felt like their own, and that the sensations they experienced were caused by the ball on the screen rather than by the wand. Five minutes into each trial, the virtual arm was rotated back and forth; at the same time, electrodes placed on the participants’ real arms recorded electrical activity from the muscles, as if the real arms were also moving. The brain had incorporated the virtual arm into its representation of the body; in other words, the participants had taken full ownership of it.

In a second set of experiments, the participants wore a data glove which detects finger movements and transfers them to a computer in real time. In one condition, the data were used to control the movements of the virtual arm, while in another, the virtual arm performed a series of pre-recorded movements which did not correspond to those of the participants’ hands. Only when the movements of the real and virtual arms were the same did the participants report feeling that their arm was located where the virtual arm was, or that the virtual arm felt like their own.

Finally, the researchers found that the illusion could also be induced when the virtual arm was controlled by a non-invasive brain-computer interface (BCI, bottom right panel in the figure above). Participants were first trained to use the BCI to open and close the virtual hand, by imagining that they were performing the movements. The BCI records the electrical activity associated with this motor imagery, and translates it into computer commands which can be used to control the movements of the virtual hand. In one condition, the movements of the virtual arm were synchronous with the participants’ motor imagery; in the other, they were not. Again, only in the synchronous condition did the participants report a sense of ownership over the virtual limb, and only in that condition was electrical activity from the arm muscles recorded. The illusion was somewhat weaker, but nevertheless still robust.

The illusion has not yet been extended to a full virtual body, but there is good reason to believe that this is possible. In the body swap illusion, ownership of the participants’ entire body was transferred to that of another person, and out-of-body experiences have been induced using a similar method (both involve viewing one’s own body from a third-person perspective). This strongly suggests that the illusion of ownership over an entire virtual body could easily be induced, and Slater and his colleagues are already working towards this, using head-mounted, movement-tracking displays to project virtual bodies into the spaces normally occupied by real ones.

And what of Second Life avatars that can be perceived as real? These findings show that this is possible, at least in theory. Some of the necessary technologies are already available, but in practice it cannot currently be achieved. Sensations could be transferred from the avatar to the user by a full-body haptic suit, but the experience would only feel “real” if the suit could also transmit the movements of the user to the avatar, precisely and in real time. Immersive avatars are, therefore, still a long way off. Meanwhile, control of virtual bodies with BCIs could prove to be very useful in enabling fully paralyzed patients to communicate with others in virtual environments.
