Oculus Rift + Kinect + Razer Hydra + PS Move demo released!

Over the past months we’ve been adding new features to RUIS for Unity, including Oculus Rift and Razer Hydra support. Our just-released TurboTuscany demo showcases the new capabilities of RUIS:

Video part 2:

TurboTuscany features a first-person view with a Kinect-controlled full-body avatar and four methods for six-degrees-of-freedom (6DOF) head tracking:

Oculus Rift + Kinect

Oculus Rift + Razer Hydra

Oculus Rift + Razer Hydra + Kinect

Oculus Rift + PlayStation Move (+ Kinect)

Head tracking with Razer Hydra

It makes a difference to see your own body in the virtual world; it strengthens the sense of presence. Those of you with a Kinect: try pushing and kicking stuff, or even climbing the ladder with your hands. You can take steps freely inside Kinect’s range, and when you need to go further, just use a wireless controller to walk or run like you would in any normal game. We blend your Kinect-captured pose with Mecanim animation, so while you’re running and your legs follow a run animation clip, you can still flail your arms and upper body around as you like.
(Kinect users will need to install the 32-bit Windows version of OpenNI 1.5.4.0. See the readme that comes with the demo for other details.)

Positional head tracking with Kinect alone is quite rough, so try the Razer Hydra + Kinect or PS Move option if you can.

RUIS for Unity may support OpenNI 2 via ZigFu in the future. Even better would be if PrimeSense made their own OpenNI 2 Unity plugin with a less restrictive license. Right now the focus is on Kinect 2 (hopefully we’ll get the beta version this autumn).
Our development is a bit slow, since it’s just two of us working in our limited spare time (we just used up the last of our funding).

Windows 8:
Windows 8 doesn’t like unsigned drivers. People with Windows 8 have successfully installed OpenNI and gotten TurboTuscany to work with the following procedure:
1. Uninstall OpenNI, NiTE, and the Kinect driver.
2. Press Windows key + R to open the Run prompt and enter: shutdown.exe /r /o /f /t 00
3. Select Troubleshoot.
4. Select Advanced options.
5. Select Windows Startup Settings, then Restart.
6. Choose the option to disable driver signature enforcement.
7. Reinstall OpenNI (32-bit version), NiTE, and the Kinect driver.
“I followed those instructions and it worked for me. I suspect that only the driver needs to be re-installed. I also had to go into Device Manager and make sure it was pointed at the right drivers, because I had installed the OpenNI SDK drivers and they don’t work with Processing.”

Kinect for Windows:
Microsoft released Kinect for Windows and the Kinect SDK, but they are not compatible with OpenNI. The kinect-mssdk-openni-bridge is an experimental module that connects the Kinect SDK to OpenNI and allows Kinect for Windows users to use OpenNI applications. This bridge _might_ get the TurboTuscany demo to work with Kinect for Windows: https://code.google.com/p/kinect-mssdk-openni-bridge/

1) No, the grandma model uses a more complex skeletal structure that came with it when we downloaded it from Mixamo. In our RUIS for Unity toolkit the developer can use pretty much any human-like model and map it to Kinect. The current RUIS toolkit package in the download section is buggy and lacks documentation, but we intend to upload a more polished version by the 10th of February.

Shouldn’t it be possible to attach two calibrated flat camera boards with fisheye lenses onto the front of the Rift? That could be useful for turning the Rift into a heads-up display that also shows the real scene in front of the user. And you could attach a PrimeSense board between them to get additional 3D data.
Looking into your real room and turning it into a Command and Conquer battlefield could be a next step: you command your units around with hand gestures and voice commands, placing buildings on the couch, connecting it to the coffee table with a virtual bridge, etc. Endless possibilities.

I was an adviser on a master’s thesis project where two fisheye cameras were used with an Oculus Rift; it was pretty sweet.
Researchers have combined head-mounted displays with real-time Kinect 3D reconstruction. For now I could find only this video, where only the hands are reconstructed with Kinect: https://www.youtube.com/watch?v=R0-dsbeasgA&t=6m11s
But yeah, this is all cool stuff 🙂

I’m wondering something about the wonderful grandma model and how it moves. I see there’s a walking animation attached to it, for when you’re moving about. I see that you can make her walk using either gestures with your arms or pushing buttons on a controller. Yet, when you kick aside that barrel with your legs, your Kinect-tracked legs become the main controller of the model. Is this a completely trivial thing or did you have to introduce conditional statements to check when to apply either animation or map skeleton tracking to the model?

The “long distance walking” in the demo is indeed triggered with controller buttons (PS Navigation controller / Razer Hydra / gamepad). Most of the time the character is 100% Kinect controlled, but when one of those walk buttons is pressed, then our scripts start blending in a walking animation loop on the legs of the character. Personally I see this as a viable alternative to the cumbersome VR treadmill controllers.

This was not trivial to implement even with Unity’s Mecanim (using only animation layers was not enough), because at run time our script dynamically creates two copies of the skeleton rig (one Kinect-animated, one walk-cycle-animated) that we blend into a third rig on every frame update.
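Our actual implementation is C# on top of Mecanim, but the per-joint idea can be sketched in language-neutral Python. Everything below (joint names, the weight parameter, the dictionary data layout) is illustrative only, not RUIS’s real API:

```python
# Hypothetical sketch of the per-frame rig blending described above:
# leg joints take a weighted mix of the Kinect pose and the walk-cycle
# pose, while the upper body stays fully Kinect-driven.

def lerp(a, b, t):
    """Linear interpolation between two 3D vectors."""
    return tuple(x + (y - x) * t for x, y in zip(a, b))

def blend_rigs(kinect_pose, walk_pose, walk_weight):
    """Blend two skeleton poses (joint name -> position) into a third.

    walk_weight = 0.0 -> pure Kinect pose, 1.0 -> pure walk animation.
    """
    leg_joints = {"left_hip", "left_knee", "left_foot",
                  "right_hip", "right_knee", "right_foot"}
    blended = {}
    for joint, kinect_pos in kinect_pose.items():
        if joint in leg_joints:
            blended[joint] = lerp(kinect_pos, walk_pose[joint], walk_weight)
        else:
            blended[joint] = kinect_pos  # upper body: flail freely
    return blended

# Example: walk button half-pressed-in (weight 0.5); the knee is blended,
# the hand is left under Kinect control.
kinect = {"left_knee": (0.0, 0.0, 0.0), "left_hand": (0.3, 1.2, 0.1)}
walk = {"left_knee": (1.0, 1.0, 1.0), "left_hand": (0.0, 1.0, 0.0)}
out = blend_rigs(kinect, walk, 0.5)
print(out["left_knee"])  # (0.5, 0.5, 0.5)
print(out["left_hand"])  # (0.3, 1.2, 0.1)
```

A real version would also blend joint rotations (e.g. quaternion slerp) rather than just positions, which is what working against Mecanim’s rigs involves.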

Now that we have created such functionality, anyone can just download RUIS for Unity and use the MecanimBlendedCharacter prefab (see the OculusRiftExample or KinectTwoPlayers scenes). You can even replace the default Constructor guy model with your own humanoid model, although some joint-position tuning is most likely required if you want the Kinect control to be just right.

Currently there is no walking gesture in RUIS. I think you’re referring to this part of the video: https://www.youtube.com/watch?v=-YYiTkf3sDs&t=0m50s
There I’m using an Xbox 360 controller to make the character walk (in other parts I use PS Navigation controller and Razer Hydra). I was just moving my arms for show 🙂
You could try creating your own walking gesture algorithm that tracks the Kinect hand positions.
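If you wanted to experiment with that, a deliberately naive sketch in Python might look like the following. The thresholds, the frame format, and the function itself are made up for illustration; they are not part of RUIS:

```python
# Illustrative-only walking-gesture idea: treat the user as "walking"
# when both hands swing forward/back (z axis) with enough amplitude and
# are currently moving in opposite directions, like a natural arm swing.

def is_walking(left_z_history, right_z_history, amplitude=0.15):
    """Detect an arm-swing walk from recent forward (z) hand positions.

    Each history is a list of recent z coordinates (metres), newest last.
    """
    if len(left_z_history) < 2 or len(right_z_history) < 2:
        return False
    left_swing = max(left_z_history) - min(left_z_history)
    right_swing = max(right_z_history) - min(right_z_history)
    if left_swing < amplitude or right_swing < amplitude:
        return False
    # Opposite instantaneous direction => out-of-phase swing
    left_dir = left_z_history[-1] - left_z_history[-2]
    right_dir = right_z_history[-1] - right_z_history[-2]
    return left_dir * right_dir < 0

# Hands swinging out of phase with ~0.4 m amplitude -> walking
print(is_walking([0.0, 0.2, 0.4, 0.2], [0.4, 0.2, 0.0, 0.2]))  # True
# Hands at rest -> not walking
print(is_walking([0.0, 0.0], [0.0, 0.0]))  # False
```

In practice you would feed this from the Kinect skeleton’s hand joints each frame and smooth the signal to avoid jitter triggering false steps.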