Hands-On with the Perception Neuron Motion Capture VR System

Virtual reality has come a long way in just a handful of years. This time last year, I was quick to write off the experiences that “state-of-the-art” VR peripherals were providing. From my perspective, strapping a screen on someone’s face and giving it motion sensors does not an immersive, true VR experience make. I still believe that to a certain degree, but companies have taken big steps toward improving the overall experience.

One of those companies is Perception Neuron. In September of 2014 they raised over half a million dollars on Kickstarter to fund the development and distribution of their cutting-edge motion capture sensors. Part of what made their crowdfunding so successful, garnering well over double the asking amount, is the level of interactivity offered.

The limitation of a headset is that it’s just that – on your face. You may be able to move your head around to simulate the act of looking around a virtual environment, but chances are you would still be using a traditional controller, or a basic motion controller like the PlayStation Move. Perception Neuron completely redefines how we interact with games in a virtual space.

At GDC 2015, I went hands-on with the Perception Neuron. As I sat down at their booth on the Expo Floor – a booth surrounded by interested convention attendees – I met with their PR representatives to get all the up-front information I needed. They had me wear a thin plastic glove over my hand (presumably to limit the sensors’ contact with moisture) and then slipped a cloth glove on over it. As you can see in the images, this black glove had the fingertips cut off, with sensors placed on each individual finger.

Because so many small sensors work in conjunction, the game tracks the movement of each of my fingers and the hand itself in real-time. I put the Oculus Rift on over my head and am immediately taken aback by the bright and vibrant carnival setting I seem to be standing within. Thanks to the glove’s sensors, I can see a digital representation of my hand in the game as I physically reach my actual hand out in front of my face.

I do the natural human thing and wiggle my fingers around, spread them out, move each of them one at a time, and make a fist to test everything out. To my surprise, it reads my movements accurately. It takes some getting used to at first, since the glove doesn’t provide any haptic feedback. There is no tension or resistance as I touch objects in the game – I can only tell if I am holding something based on the visuals. It’s an unfortunate but completely understandable limitation of the technology.

When you’re completely immersed in a VR game, it’s easy to forget where you’re physically located in the real world. I turned my head from side to side, inspecting my surroundings. I saw buckets labeled with different scores, a pile of cans waiting to be knocked over, and a ball waiting to be picked up. I reached down with my hand – both in the game and in the real world – and closed my fingers around the ball to pick it up. I didn’t press a button or make a gesture to signify I was picking it up; I just literally reached down and grabbed it. Once I got acclimated to the experience, I was landing balls in the buckets with ease.

The final part of my GDC session with Perception Neuron involved putting on an entire full-body set of sensors. This covered not only my arms and hands, but my chest, head, back, waist, legs, feet, and everything in between. Once it was all calibrated and synced to my orientation, I was able to move about freely, and every single joint movement was perfectly replicated on the screen.

The gameplay applications of something like this are hard to quantify. Perhaps it could be used to capture a player’s body type so their in-game character matches their real-life counterpart, or in some type of Kinect Sports-style mini-game collection. The more practical use, of course, is motion capturing actors in a more affordable and accessible way – which would be great for indie studios looking to increase the production quality of their games.

Overall, I came away very impressed with my time at the Perception Neuron booth at GDC. I was surprised by how well it all worked and pleased with how easily I acclimated to the system. Once I got comfortable, I quickly forgot where I was and simply got immersed in the game. The possibilities for this type of technology are close to endless.

David Jagneaux Senior Editor Experienced and educated are often not words used to describe David, but he throws them around anyways. He has been playing games since before he could walk and loves to share his thoughts and opinions on the industry as a whole.