Yesterday I took part in a FIVR (Finland VR) meeting and got the chance to try out the HTC Vive and other VR technology that I hadn’t tried before. The FIVR group started meeting in autumn 2014, and within one year the number of participants grew from four people to 40. This is mostly thanks to the Finnish game company Mindfield Games, who have been very active in organizing the events.

The meeting and our demo were featured in the TV news and in an online article by the Finnish public broadcasting company YLE.

Valve’s Portal VR demo for the HTC Vive is the best VR demo that I’ve seen (disclaimer: I haven’t tried the Oculus Rift CV1 yet). While interaction-wise it’s nothing special, Valve’s high production values have resulted in an audio-visually beautiful piece set in the familiar Aperture Science environment that is loved by many. A big part of the experience is the HTC Vive hardware, which performs exceptionally well; the 110-degree, 2160×1200@90 Hz HMD takes immersion one step further, and I have yet to see such precise, low-latency 6DOF tracking, all working at room scale.

We don’t have an HTC Vive yet, but we hope to get one in the future. We are still working on a Kinect 2 + Oculus Rift demo, and after that we’re planning to modify the RUIS code architecture so that it will be easier to add support for upcoming VR input devices (e.g. Oculus Touch, Sixense STEM).

Unity 5 allows the use of Kinect 2 and Razer Hydra without a Pro license, so it makes sense to update your RUIS project to Unity 5. In May or June we will put out an official RUIS for Unity 5 release. Meanwhile, you can use the guide below to upgrade RUIS to work in Unity 5.

Update instructions
1. Create a backup of your project.
2. If you have modified any RUIS scripts, prefabs, or scenes, you need to create duplicates of them, because the RUIS files with the original names will be overwritten by the RUIS update.
3. Install Unity 5. If you are a Windows user and intend to use Kinect 1, you should install the 32-bit Editor (Additional Downloads, For Windows link): http://unity3d.com/get-unity/download?ref=personal
4. Open your project with Unity 5, and upgrade the project when Unity asks to do so.
5. When “API Update Required” appears, choose the option “I Made a Backup, Go Ahead!”.
6. The project should now be open. Ignore any errors in the Console; open the RUISunity1071_Unity5.unitypackage file in Explorer/Finder and import everything.
7. After importing, delete the following files and bundles in \Assets\Plugins: KinectForUnity.dll, libOculusPlugin.so, OculusPlugin.dll, sixense.dll, sixense.bundle. Do NOT touch any files in the Android, Metro, OculusPlugin.bundle, x86, and x86_64 subfolders.
8. Everything should work now, unless some of your own scripts or assets are broken.

P.S. If anyone knows a math library for C# that supports (or can be easily modified to support) in-place matrix operations for multiplication, addition, inversion, and transposition of 4-by-4 matrices, let me know! The matrix library that we currently use (and Unity’s Matrix4x4) allocates new memory on every matrix operation, which causes very frequent garbage collection and results in a noticeable performance loss when Kinect rotation smoothing is enabled in RUIS.
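To illustrate what I mean by “in-place”: instead of every operation returning a freshly allocated matrix, results are written into a caller-supplied buffer that can be reused frame after frame. A minimal sketch (all names hypothetical, not from any particular library) for row-major 4-by-4 matrices stored as float[16]:

```csharp
// Hypothetical sketch: 4x4 matrix operations that write into a preallocated
// result buffer, so no garbage is generated per call.
public static class Mat4
{
    // result = a * b. Note: result must not alias a or b.
    public static void MultiplyInto(float[] a, float[] b, float[] result)
    {
        for (int row = 0; row < 4; ++row)
            for (int col = 0; col < 4; ++col)
            {
                float sum = 0f;
                for (int k = 0; k < 4; ++k)
                    sum += a[4 * row + k] * b[4 * k + col];
                result[4 * row + col] = sum;
            }
    }

    // result = transpose(a). Note: result must not alias a.
    public static void TransposeInto(float[] a, float[] result)
    {
        for (int row = 0; row < 4; ++row)
            for (int col = 0; col < 4; ++col)
                result[4 * col + row] = a[4 * row + col];
    }
}
```

Allocating the result buffers once and reusing them across frames would keep the smoothing loop allocation-free, which is exactly what we’re after.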

Aalto University’s virtual reality course started earlier this month, and to get things rolling we have just released the latest version of RUIS for Unity. RUIS 1.07 adds new features and fixes many issues of our previous release, which we admittedly rushed out to be in time for the Spatial User Interaction 2014 conference in October. Among other things, the bug fixes in RUIS 1.07 restore positional head tracking for the Oculus Rift DK1 using Kinect 1/2, PS Move, and Razer Hydra.

For Kinect 2 we have added avatar joint filtering and a fist gesture that can be used to grab and manipulate objects. Developers with an Oculus Rift but without a Kinect get to choose whether the Rift’s orientation rotates only the avatar’s head, the whole body, or the walk-forward direction.

A Kinect 2 tracked player grabs the hammer with a fist gesture.

Unity Free users can now use the Oculus Rift, but they need to import OculusUnityIntegration.unitypackage and overwrite the existing files. Due to the early state of the Oculus Unity integration, there are also other considerations, which you can check in the “Known issues” section of our readme. These issues, including “judder” in the Unity Editor, should be alleviated as new versions of the Oculus SDK are released.

We’ve performed proper testing this time, and RUIS for Unity 1.07 holds together well and is shaping up nicely. There is still jaggedness in the motion of Kinect-controlled avatars and motion controllers, which seems to be related to Unity’s frame updates and the irregular device refresh rate. We hope to fix that for the next release.

Below you can see VR applications created by my students with our RUIS for Unity toolkit. Most of the applications featured the Oculus Rift DK1, Kinect, and PlayStation Move controllers.

This year and in 2015, the virtual reality course is organized at Aalto University under the name Experimental User Interfaces. A new course starts in January 2015.

A video of the student-created applications is available:

Wheelchair Hero

An empowering first-person game using the Oculus Rift, where the player controls a wheelchair by spinning wheels that have PS Move controllers attached to them.

Flying Game

A two-player co-operative game, where the player with the Oculus Rift and a rifle sneaks around a city, using a laser to light up targets that can be destroyed by the second player, who pilots an attack helicopter.

Virtual Curling

A two-player co-operative curling simulator where one player is the curler who “throws” the stones, and the other player acts as a sweeper, affecting the trajectory of each stone while it slides on the ice.

Lazerzilla

The player’s avatar is a giant cyber-lizard, who uses his claws and “laser breath” to destroy skyscrapers while fighting human soldiers, tanks, and helicopters.

COVRSCPG

A co-operative two-player game in the spirit of Super Monkey Ball and Marble Madness; one player controls a size-varying ball from a first-person Oculus Rift view, while the other player uses a god-view to help him advance through obstacle courses.

Runner

Two players compete to see who can travel farther along a snowy path filled with dangers.

Everything you see above was created by students with little or no prior experience in creating VR applications. Five out of the six applications featured two different display systems: the Oculus Rift and two stereo 3D screens (for the audience and/or the second player).

In the course the students were free to create any kind of application, and for some reason everyone chose to develop games 🙂

We have also added a process for calibrating the transformation matrix between several different sensor pairs (see the above image). This makes it possible to use Kinect 1, Kinect 2 (Win8 only), Oculus DK2, and PS Move in the same coordinate system, even if the individual sensors are some distance apart or oriented in different directions (the sensors’ view frustums need to partially overlap though). In other words, if you have calibrated the Oculus Rift DK2 and Kinect 2, the Kinect 2 avatar’s head and body are correctly aligned with the head-tracked position of the Oculus Rift DK2 when you are using RUIS prefabs, and you will see your whole body in virtual reality!
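As a rough illustration of what the calibration buys you (a sketch, not RUIS’ actual API): once the calibration has produced a 4-by-4 matrix T that maps one sensor’s coordinates into another’s, every tracked point only needs one matrix-vector multiply to land in the shared coordinate system:

```csharp
// Hypothetical sketch: express a point tracked by sensor A (e.g. a Kinect 2
// joint) in sensor B's coordinate system (e.g. Rift DK2 tracking space),
// using the row-major 4x4 calibration matrix T obtained for the pair A -> B.
public static class SensorCalibration
{
    // point and result are xyz triplets; the point is treated as a
    // homogeneous coordinate with w = 1.
    public static void TransformPoint(float[] T, float[] point, float[] result)
    {
        for (int row = 0; row < 3; ++row)
        {
            result[row] = T[4 * row + 0] * point[0]
                        + T[4 * row + 1] * point[1]
                        + T[4 * row + 2] * point[2]
                        + T[4 * row + 3]; // translation column, w = 1
        }
    }
}
```

Because the avatar joints and the HMD position end up in the same coordinate system, the Kinect-tracked body and the Rift-tracked head line up without any per-application fiddling.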

There are still some issues that will be fixed in the next RUIS release. For example, Kinect 2 joint data is not smoothed yet, and the joints have a noticeable amount of jitter. The “Known issues” section in RUIS’ readme file lists a few more rough edges.

We just received our Oculus Rift Development Kit 2 (DK2) head-mounted display and are thrilled to report our experiences with it.

The Rift DK2 at front left, its position-tracking camera at front center, and two Kinect 2 sensors behind them.

The DK2 comes bundled with an infrared webcam that tracks the Rift’s position (and most likely helps to correct yaw drift in orientation as well). My first question upon unboxing the DK2 was “Where the infrared LEDs at??”

So I pointed Kinect 2’s infrared camera at it, and took the below picture:

The LEDs appear as overexposed white blobs in the infrared image.

It seems that the LEDs sit below the Rift’s exterior, which is made of a (special?) plastic that lets the IR spectrum through but absorbs visible light, hiding the nasty insides.

DK2 demo experiences

The Oculus demo scene is best for checking out the tracking and image quality, as the scene is peaceful and its 3D objects are simple and elegant. Cyber Space is a virtual amusement park ride for those of us who want to explore our cyber-sickness limits, Horse World Online is only for the most hard-core horsie fans, and Chilling Space has a calm atmosphere (though we didn’t notice how positional tracking was employed).

DK2 has been out for a relatively short time, and I’m not aware of any killer apps for it yet. Personally I’m looking forward to the DK2 version of the Senza Peso opera.

In many ways the Oculus Rift DK2 is superior to the DK1: head position tracking is responsive and accurate, which is integral to immersion and to minimizing nausea. While the screen-door (pixel border) effect is still noticeable, it’s a minor nuisance given the major improvements in other areas. The DK2’s resolution is higher, its OLED display produces a better color range, and its image is crisp because the motion blur from slow pixel switching times has been reduced (except for blacks). The tracking latency is low as it should be, and the low-persistence technique really seems to do the trick, considerably reducing cyber-sickness.

Meant for each other?

That’s it for now, we’ll get back to combining DK2 with Kinect 2! It’s wonderful stuff, keep your eyes on us!

I participated in the IEEE Virtual Reality 2014 conference, held March 29th – April 2nd in Minneapolis. Eager beavers can jump straight to the link below to see a list of the best papers and demos at the conference: http://ieeevr.org/2014/awards.html

The sense of touch is one of the major senses and perhaps the most challenging one to provide with convincing virtual sensations. Currently, haptic feedback is missing from most virtual reality applications. Reactive Grip could change that for many applications: it is a cheap and simple haptic technology that could be integrated into any number of modern game controller variations. The handle of Reactive Grip utilizes four sliding contactor plates whose movement conveys the sense of inertia of a virtual object. Examples include gun recoil (kickback), the struggle of a fish caught on a fishing rod, or the hit of a sword against another virtual object. I tried a bunch of demos that included those examples. Tactical Haptics has close ties with Sixense, and in the prototype motion tracking is handled via Razer Hydra controllers.

Reactive Grip has its limitations: because it is a game controller, Tactical Haptics’ device can only approximate sensations from rigid, hand-held objects such as virtual gun grips, fishing rods, steering wheels, and other tools. For most games and applications this should be enough, though. Reactive Grip is a mechanical device, and I wonder how robust it can be; traditionally, haptic devices break easily.

The funny thing is that if you close your eyes while using the controllers, the haptic feedback alone doesn’t convey what you are doing in the virtual world, due to the vagueness and low fidelity of the haptic effect. But when combined with audiovisual cues, the different perceptions merge together gracefully, providing more immersion than any of the cues alone. Most importantly, the haptic feedback doesn’t contradict the audiovisual cues but rather supports them.

Tactical Haptics ran a Kickstarter campaign last autumn that unfortunately didn’t reach its goal. People really need to try this controller to see its potential. Tactical Haptics’ invention could, for the first time, bring haptic feedback to the masses, especially if one of the major console manufacturers were to adopt it.

The acquisition of Oculus VR by Facebook was a big news topic throughout the conference. As such, it was a pity that we didn’t get to see the Crystal Cove or DK2 prototypes of the Rift. Vicon was in talks with Oculus VR about bringing the DK2 to the conference, but at the time Oculus had canceled public demonstrations of the DK2 due to the Facebook buyout. That’s what I heard, anyway. Palmer Luckey was also supposed to participate in the conference, but apparently the Facebook acquisition and the related death threats to Oculus staff got in the way.

Several times I witnessed Oculus’ HMD referred to as the Facebook Rift and the FaceRift. Perhaps there was slight bitterness in the air regarding the 2-billion-dollar buyout? This is understandable, as traditionally VR hasn’t been a very lucrative business, and suddenly seasoned VR researchers and practitioners see a VR company go from zero to hero in less than two years.

I talked to a person who had tried Sony’s Morpheus, the DK2, and Valve’s prototype. In his opinion, the DK2 and Morpheus were very close to each other performance-wise. He liked Valve’s prototype the best though, because of its wide positional tracking, implemented with camera-based inside-out tracking of fiducial markers. With Michael Abrash joining Oculus, hopefully the good features of Valve’s prototype will find their way into future Oculus HMDs.

The University of Minnesota presented a bunch of their VR-related projects to the conference audience. The most interesting one was a high-resolution, wide-FOV HMD built from an iPad mini and a 3D-printed frame. In their demo, up to 6-8 people wore the HMDs, inhabiting the same virtual place simultaneously while being tracked over a large area by a commercial optical tracker.

The HMD utilized high-quality glass optics (~$40 a piece) to spread the iPad mini’s 2048×1536 resolution over a FOV similar to the Oculus Rift’s. Needless to say, the image was much crisper than the Rift’s, although the iPad’s orientation tracking was slightly less responsive than that of the Rift. Overall, I was very impressed with this HMD!

P.S. I also visited the Kinect 2 Developer Preview Program Breakfast that was co-organized with Microsoft’s Build conference in San Francisco. Microsoft hopes to start selling Kinect 2 for Windows in the summer, and we developers with the preview version should get a Kinect 2 Unity plugin even before that.

Last week Oculus VR was acquired by Facebook for 2 billion dollars, which is the biggest move in the virtual reality industry that we have seen. I speculate that this was at least partially influenced by Sony finally getting serious about head-mounted displays with their Morpheus HMD.

Another news item of (almost) similar proportions is that the latest version of RUIS for Unity is out 🙂 The Oculus Rift package has been updated to version 0.2.5c and several bugs have been fixed. So what can you do with RUIS for Unity? Use RUIS’ Wand prefabs to easily bring interaction via input devices like Razer Hydra, Kinect, and PlayStation Move to your application, configure multiple mono or stereo displays in Unity through RUIS’ DisplayManager, or use the MecanimBlendedCharacter prefab to blend real-time Kinect body tracking with Mecanim animation of your 3D avatar.

Speaking of the Oculus Rift, apparently some people experience 150 ms of latency in certain applications built with Unity. Jason Jerald found out that this can be remedied by commenting out a single line inside the SetMaximumVisualQuality() function of the OVRCameraController.cs script.

The year 2014 looks very promising for virtual reality; a new version of the Oculus Rift is coming out, along with a plethora of VR peripherals like the Sixense STEM and Virtuix Omni. Developing applications that use these devices means that middleware and software toolkits like RUIS will have an even more important role in the future, as developers want to combine different devices or develop at higher levels of abstraction.

Valve and Sony are working on their own head-mounted displays, and who knows what surprises this year has in store for us! [update: it seems Valve is not making their own HMD after all] VR gaming is far from becoming mainstream, however, and I suspect it will be 2015 at the earliest before indie developers start to make serious profit with games that exclusively require VR peripherals. I don’t expect established game companies to develop big-budget VR-only games in the near future. What about games that have both a traditional UI and a VR user interface, then? I have my reservations; getting two interfaces to work in one game while sharing gameplay mechanics and so on requires a lot of work and is likely to dilute both experiences, if not botch at least one of them altogether.

New virtual reality course

Starting this January, we will run our virtual reality course for the 4th time at Aalto University (we started organizing it in 2011). Student teams will develop virtual reality applications using the Oculus Rift, Kinect, PS Move, and other peripherals. Check out the projects from the previous year. Any interested Aalto University students should keep an eye on the course homepage, and note the new course name: Experimental User Interfaces. I also have access to Kinect 2, which will be supported in some future version of RUIS for Unity.

And speaking of further development of RUIS: since autumn we’ve been working at our own pace to improve RUIS for Unity, with the aim of releasing it in the Unity Asset Store. We have been improving documentation, adding essential features, fixing bugs, and making RUIS easier to use. The work has been slower than we anticipated and we missed our planned release date, as I’m busy writing publications for my PhD and Mikael has been focusing on his Master’s thesis. It’s coming, however, with all the features that we used to combine the Oculus Rift with Kinect, PS Move, and Razer Hydra in our TurboTuscany demo. And then some 🙂