Try mixed reality, where the virtual and real collide

Virtual reality is so 2013 – a new, immersive blending of physical and virtual worlds suggests we could one day live our lives in "mixed reality"

I SLIP on the virtual reality headset, expecting the real world to vanish. Instead, something odd happens: I can still see the room I'm in, but it's slightly blurry. I look down at my body – everything's where it should be. With the click of a button the floor suddenly becomes cluttered with virtual objects and an avatar appears before me, extending her arm to shake my hand. She's trying to blend in with the real people but I can tell she's computer-generated. Then everything goes black and white, and suddenly the entire scene feels real.

This is a new type of augmented reality (AR) known as mixed reality (MR), and a system that creates it is being demonstrated for the first time at an event at the graphics company Inition in London. The idea is to show how seamlessly the physical and digital worlds can be blended.

The setup uses an Oculus Rift headset, known for its ability to immerse people in virtual reality. But Will Steptoe of University College London has designed an attachment that pipes in real-time video of the real world, then augments that view. "In tablet-based AR, the user holds a window on to the mixed reality so there is a clear disconnect between what is physical and what is virtual," says Steptoe. With the new rig, a person sees the real world from their normal, embodied perspective (see video).

To blend the two spaces, Steptoe's system applies filters available in image editing software to the combined world. The filters make everything appear slightly cartoonish, but they hide the imperfections of virtual objects, making it hard to tell what's real and what's virtual.
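The article doesn't say which filters Steptoe uses, but the principle can be sketched with a simple stand-in: posterisation, which quantises colours to a few levels. Applied to the combined frame, it degrades the camera image and the rendered overlay by the same amount, so both take on the same slightly cartoonish look. Everything here (the synthetic frame, the fake virtual object, the choice of filter) is illustrative, not Steptoe's actual pipeline.

```python
import numpy as np

def posterize(frame: np.ndarray, levels: int = 4) -> np.ndarray:
    """Quantise each colour channel to a few levels, giving the
    slightly cartoonish look that hides rendering imperfections."""
    step = 256 // levels
    return (frame // step) * step + step // 2

# Composite a synthetic "camera" frame with a rendered overlay,
# then filter the combined image so both look equally stylised.
camera = np.random.randint(0, 256, (120, 160, 3), dtype=np.uint8)
overlay = np.zeros_like(camera)
overlay[40:80, 60:100] = (200, 50, 50)           # a fake virtual object
mask = overlay.any(axis=2, keepdims=True)        # where the overlay is drawn
combined = np.where(mask, overlay, camera)
stylised = posterize(combined)                   # one filter over both worlds
```

The key design point is that the filter runs on the already-combined frame: there is no per-pixel record of what was real and what was rendered, so the two become hard to tell apart.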

Steptoe thinks the system will transform multiplayer gaming, allowing faraway friends to beam into a shared reality in which virtual players are indistinguishable from people. It could also work for telepresence systems, which allow colleagues in separate places to work together in a common space. Current systems still require interaction through a screen, which limits the experience, says Steptoe. "In 2D, you can't truly share a space."

The prototype system has its flaws: its object-recognition software keeps track of the floor and walls but cannot locate people or objects in three dimensions. This causes a discrepancy when, say, you wave your hand in front of you. If it passes between you and the avatar, the hand should block part of her from view; instead, the avatar is drawn over it and your hand is obscured (see picture). Steptoe says mounting a Kinect depth camera on the headset should fix the problem.
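The fix the article describes amounts to a per-pixel depth test: with a depth camera supplying the distance to each real surface, the compositor can show whichever surface is nearer, real or virtual. A minimal sketch, using toy arrays rather than real camera data:

```python
import numpy as np

def composite_with_depth(real_rgb, real_depth, virt_rgb, virt_depth):
    """Per-pixel depth test: whichever surface is nearer wins, so a real
    hand raised in front of an avatar correctly hides it."""
    nearer_real = (real_depth < virt_depth)[..., np.newaxis]
    return np.where(nearer_real, real_rgb, virt_rgb)

# Toy example: an avatar rendered 2 m away; a hand occupies part of the
# frame at 0.5 m and should therefore occlude it.
real_rgb = np.full((4, 4, 3), 10, dtype=np.uint8)     # camera image
virt_rgb = np.full((4, 4, 3), 200, dtype=np.uint8)    # rendered avatar
real_depth = np.full((4, 4), np.inf)                  # no real surface...
real_depth[1:3, 1:3] = 0.5                            # ...except the hand
virt_depth = np.full((4, 4), 2.0)
frame = composite_with_depth(real_rgb, real_depth, virt_rgb, virt_depth)
```

Without the real-world depth channel, the system can only draw virtual objects on top of the video feed, which is exactly the hand-behind-the-avatar artefact described above.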

Still, inhabitants of MR can manipulate virtual objects. The most compelling of these, Steptoe says, is a virtual display. With the push of a button, you can summon up a device in the palm of your hand, allowing you to browse the internet without having a real device. "In this display, you can simulate other displays," says Steptoe. "You can mimic a tablet or a mobile phone and get it to hover anywhere."

In the future, Steptoe envisions the MR setup running on a lighter headset such as Google Glass, whose small display might prove less intrusive than that of the Oculus Rift, which covers the wearer's face. And improvements in graphics and display resolution should make the MR experience more coherent.

Back in the real world, I reach for my cellphone but realise the battery is dead. I catch Steptoe immersed in his headset, browsing the New Scientist website on a virtual computer screen. "You can't forget it like an iPad, or run out of batteries," he says with a smile.

This article appeared in print under the headline "The virtual, in reality"

Out-of-body experiences

Despite the rise of videoconferencing and apps like FaceTime, virtual systems still cannot match meeting in person. A European Union project called Beaming aims to change that by placing people in a virtual location where they can interact in a way that feels just like the real world.

Will Steptoe's system is one way of doing this (see main story), but other projects are already giving people a physical form at their destination, allowing them to "inhabit" the body of a robot, say. New insights into how the brain represents the body are helping to make such embodiment more realistic. Beaming is focusing on systems for remote teaching, virtual conferences and rehab for patients in remote areas.
