In Alice’s Adventures in Wonderland, the titular heroine quaffs a potion that shrinks her down to the size of a doll, and eats a cake that makes her grow to gigantic proportions. Such magic doesn’t exist outside of Lewis Carroll’s imagination, but there are certainly ways of making people think that they have changed in size.

There’s nowhere in the world better at creating such illusions than the lab of Henrik Ehrsson at Sweden’s Karolinska Institute. In a typical experiment, a volunteer wears a virtual reality headset while being stroked. She’s lying down and looking towards her feet, but she doesn’t see them. Instead, the headset shows her the legs of a mannequin lying next to her.

As she watches, Bjorn van der Hoort, one of Ehrsson’s former interns, uses two rods to stroke her leg and the leg of the mannequin at the same time. This simple trick creates an overwhelming feeling that the mannequin’s legs are her own. If the legs belong to a Barbie, she feels like she’s the size of a doll. If the legs are huge, she feels like a 13-foot giant.

Van der Hoort performed this illusion on almost 200 people. Questionnaires revealed that they did indeed think of the mannequins as their own body parts. Familiar objects didn’t break the spell. When van der Hoort threatened the mannequins’ legs with a knife, the volunteers’ skin broke into a worried sweat, as if their real bodies were in danger. If he touched the doll’s legs with a pencil or his finger, the recruits thought they were being prodded by giant objects. Rather than feeling like dolls in a normal world, they felt like normal people in a giant world.

Life doesn’t come with scale bars for us to measure size against. Instead, we work out the size of objects with the help of cues in the environment, like how far away they are or what’s around them. The height of a tree is easier to judge when there’s a person next to it. But these cues aren’t everything – we also judge the size of objects according to how big we feel.

Van der Hoort demonstrated this by showing the recruits a series of cubes, held at a constant distance from the cameras. If they were in the doll’s body, they thought that the cubes were larger and further away than when they were “placed” in a normal-sized mannequin. In the body of a giant, they thought objects were closer and smaller. When they were asked to stand and walk (eyes closed) to where they thought the objects were, they covered more ground if they were under the doll illusion than if they were under the giant one.

Even though the volunteers were seeing exactly the same things at the same distances under the same lighting, their own perceived proportions changed their judgments of size. “The sense of one’s own body affects how we visually experience the world,” says van der Hoort.

You might think that they just used the size of their own ‘legs’ to measure everything else against, but that’s not entirely the case. They needed to feel like they owned the doll or giant legs for the illusion to work. If van der Hoort broke the spell by stroking the volunteers and their mannequins out of sync, the effect vanished.

For the illusion to work, the brain must combine the information it gets from vision, touch and other senses to create a mental depiction of the body. Based on this study, it seems that this depiction then interacts with the visual signals we get about the world around us.

Van der Hoort didn’t look at which parts of the brain are responsible, but he thinks that the posterior parietal cortex might be involved. This region sits towards the back of the brain and it’s sometimes disrupted by epileptic fits or migraines. When that happens, people sometimes experience “Alice in Wonderland syndrome”, where they feel like they’re growing or shrinking in size, with the world around them getting smaller or bigger accordingly.

It’s possible that people will fall for the illusion no matter what size the mannequins are, as long as their body parts are all in the right proportions. That could have important technological implications. Van der Hoort says, “The present results provide the proof of concept that this could work with very small or very large humanoid robots. A surgeon could experience a full-body illusion of ‘being’ a microrobot performing surgery inside the patient’s body, or an engineer could perceive ownership of a gigantic humanoid robot repairing deep-sea oil-drilling devices.” This, after all, is surely the ultimate goal of science – working towards a glorious future when people can drive giant robot suits…

To me the creepiest thing is that ten minutes in the lab can override thirty or forty years of experience. That suggests that the brain’s body model has little persistence, and is updated or rebuilt continuously from sensory data.

The experiments seem to have been done with people who are passively receiving the stimulus. The examples of future use (surgeon and engineer) would require active participation in the illusion. I sense a disconnect here.

What the brain knows about the body is indeed being rebuilt continuously using perceptual information. The process is called calibration, and is critical – every time you pick something up the dynamics of your body change, and if you couldn’t keep up you would be screwed. It is of course then ‘hard to break the illusion’ because it’s not, strictly speaking, an illusion: it’s recalibration, and while it’s being maintained by information it will persist (and even has a level of stability if it’s not being maintained).

The essence of a calibration process is taking a measurement and placing it on some scale. The visual perception of distance is intrinsically unitless and so of no use; calibration uses other perceptual information to provide an action relevant scale. So you perceive the length of your arm via vision and dynamic touch, and then measure distance via vision using that arm length as your ruler. You end up perceiving ‘can I reach it?’, not some abstract number.
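That “arm length as ruler” idea can be sketched numerically. This is a toy illustration of my own, with made-up arm lengths and a crude reach threshold – it isn’t the study’s model – but it shows how the same physical distance comes out very different once it’s expressed in body-relative units:

```python
# Toy sketch (illustrative only, numbers invented): rescale a raw
# distance into action-relevant units, using arm length as the "ruler".

def distance_in_arm_lengths(physical_distance_m, arm_length_m):
    """Express a distance in units of the perceiver's own arm length."""
    return physical_distance_m / arm_length_m

# The same 0.6 m gap, measured against three different "bodies":
for body, arm_m in [("doll", 0.07), ("adult", 0.7), ("giant", 2.8)]:
    units = distance_in_arm_lengths(0.6, arm_m)
    reachable = units <= 1.0  # crude 'can I reach it?' judgment
    print(f"{body}: {units:.1f} arm lengths, reachable={reachable}")
```

In doll units the gap is over eight arm lengths (out of reach); in giant units it’s a fraction of one – which matches the finding above that the same cubes seem further away to a “doll” and closer to a “giant”.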

So Ed: “Life doesn’t come with scale bars for us to measure size against.”
Actually it does: ourselves. The perception-action system has access to information about ourselves as well as everything else, and it perceives the world in terms of our ability to act on it – we perceive affordances. This ability changes (as you pick things up, or become fatigued, etc) hence the calibration process must be an ongoing one – a static ‘brain model’ would never work.

This study is an interesting example – but people have been manipulating perception via recalibration for years. Geoff Bingham, at IU, has done a lot of this with prehension (he’s demonstrated things like the dynamic stability of recalibration); Daniel Wolpert is famous for his distance perception studies too.

Very cool. It suggests that in the future, if virtual reality technology actually gets good, it will work extremely well: people will really feel like they’re “in” the VR, if the visual and tactile feedback is set up properly.

What would be really cool is if we are all in VR now and will find out some time in the future that there is in fact a technological ‘afterlife’ not based on myths but based on technology of the universe we were not aware of hahah.

I remember reading a paper about the possibility of our universe being a simulation.

I always know when my kid is growing really fast because he starts to bump into everything. I figure his brain is working out his new proportions. Really interesting that we are so constantly updating that info, though I wonder if it happens more quickly to large changes than small ones – something like faster pain reception if it’s really harmful. Perhaps the brain goes, “Whoa guys, something has really changed. We had better devote some resources to this new info.”