I worked through a bunch of issues and glitches over the last couple of weeks, so tonight I was finally able to play Skyrim in FriiSpace. I barely know the Skyrim controls and the FriiSpace motion system is not yet precise enough to handle fast-paced combat, so I stayed up in the mountains and avoided people as best I could. But it still made for an amazing experience. Free walking is just such an intuitive interface - I would go for 10 minutes at a time without even having to hit a button. You get so used to the normal sensations of locomotion that I found myself trying (and failing) to crouch and duck around things. At one point I was up on a ledge and saw a deer below me, and I just naturally went to one knee and tried to crane my neck to see through some foliage. It's a shame the game interface is not flexible enough to support that granularity of motion. Also, redirected motion works really well! Before, when I was specifically testing it, I could sense it. But when you trust the system and just concentrate on your surroundings and on moving through complex terrain, you don't notice it at all. It is really strange when you finally take off your headset - you have to look around for a few moments to figure out where the heck you are. A last observation: moving steeply uphill and downhill is a strange sensation. You are looking up but your legs are not climbing, and it's a bit confusing.

That sounds like some nice progress, brant! Redirected motion in particular sounds like something that should only work well in theory, so I'm kind of fascinated to hear of it actually working well in real life.

How bad was the dissonance when you ran into something like an unexpected collision volume in-game but kept walking in real life? Also, have you thought about trying out FriiSpace in a homespun application where you can test it with a 1:1 mapping rather than keyboard/game pad emulation? While perhaps not useful for retrofitting games, I think it'd be a good exercise for the technology.

BillRoeske wrote:

Redirected motion in particular sounds like something that should only work well in theory, so I'm kind of fascinated to hear of it actually working well in real life.

It's funny because initially the things that I thought were going to be simple about this project have turned out to be much more challenging than I expected, while the thing that seemed the most experimental and high-risk (redirected motion) turned out to be the simplest. I can honestly say that redirected motion is the best functioning and most impactful part of the system. But the code is almost embarrassingly simple and took me just a couple of evenings to work out - a rare and satisfying experience for me as a coder. I think people are turned off by the amount of space that you need to operate in, but it really is worth it. It has to be experienced to be appreciated.

BillRoeske wrote:

How bad was the dissonance when you ran into something like an unexpected collision volume in-game but kept walking in real life?

Not too bad right now. The controls are a bit soft and "cartoonish". I think if you had finer control you might get some perceptual dissonance and even VR sickness, but since they are so far from perfect it doesn't get that weird. The thing that kept happening to me, though, is that I kept unintentionally walking through plants and tree branches since the motion control is a bit wonky - and in 3D that makes you flinch.

BillRoeske wrote:

Also, have you thought about trying out FriiSpace in a homespun application where you can test it with a 1:1 mapping rather than keyboard/game pad emulation? While perhaps not useful for retrofitting games, I think it'd be a good exercise for the technology.

Right now, I am just trying to get a simple, low-cost system working that people can use for gaming. It works in my favor right now that the game control interfaces are so limited, because the simple version of this system is so inherently imprecise anyway. I have a lot of ideas for much more complex setups and multi-sensor fusions that would allow for more precision and more degrees of freedom. Those would be fun to integrate with some custom rendering software to see how closely you could match reality. I might get into that kind of thing a bit down the road.

@cybereality: Those in-game videos are really hard to put together, and right now there is not much new to show. The motion control is identical to the last video I uploaded. A redirected motion video would just be a whole lot more of the same. When the motion tracking improves I will post another video. However I have thought about posting a satellite view showing the redirected path that I take during a game session.

I've been meaning to post a GPS track of my redirected walking for some time now - so here it is. Since I live in Texas, we have football fields everywhere, and they make perfect locations for this setup. This represents about 20 minutes of exploration in Skyrim, walking from Helgen to Halldir's Cairn - maybe a couple of kilometers in both the real world and the game world. I killed a wolf and 3 bandits along the way. There's nothing really new here from my last post, but I hope it gives an idea of how the system works. Mostly the guidance is imperceptible. You can see a few places where I got close to the edge and the system pushed me away from it. I got pretty close to the southern goal post at one point before the system thankfully guided me away. The steering is noticeable at those points, but just for a few moments. There is also one spot on the left side where I actually did cross out of bounds, which is why it's great to have a running track around the field - you can feel the physical boundary with your feet. For the vast majority of the time, however, you cannot tell that you are being guided, and the system does a great job of keeping you in bounds. I play at night in pitch black and have learned to trust the system and let myself become completely involved in the environment - an amazing sensation. When I take off my HMD, I have absolutely no idea where I am on the field. So this backs up the academic research that I have read: it takes about a 150 x 150 ft (roughly 45 x 45 m) area to implement redirected walking well.

I have been doing a lot of other stuff with this project but I can't give too many details. Mostly I have been optimizing, testing, and replacing pieces of hardware to help improve the performance and usability of the system - everything from sensors, to displays (ST1080), to physical wiring, to tracking algorithms. A LOT has changed from my initial design plans at the beginning of this thread, but it's all starting to come together (slowly but surely).

I REALLY want to get this into some other people's hands soon, but at this point it's just not ready to let out of the lab into the wild. My plan is still to release a commercial version of the system sometime early next year. The project has been re-christened as Red Rovr and I have a blank website up in anticipation of that: http://www.red-rovr.com/

However, I want to have some sort of beta this year, and since I think this product could complement the Rift so well, I am considering having a freeware/demo version first that is customized and hard-coded specifically for the Rift and a single game - Skyrim being the likely candidate. There are still some crucial pieces of hardware that the user would have to provide (backtop, sensors, etc.), but the software would be free and the cost of the hardware is not very high. I just think it would be so cool to have that one killer demo that you could wow your friends with. Doom 3 BFG is going to be insanely cool, but being able to physically explore Skyrim with the Rift will just be completely over-the-top (I hope).

Last edited by brantlew on Fri Jul 20, 2012 9:23 am, edited 1 time in total.

Great work Brant! For the moment, I assume that you are spoofing keyboard input to get it working with Skyrim. In the future, will your software have an interface to the actual position data, for those of us who want to make stand-alone demos?

@FingerFlinger: That was the intent - but I'm not sure if the first versions will allow that. It also depends on the quality of the signals that I can get out of it. Directional accuracy is one of my biggest problems, so limiting it to only 8 directions masks a lot of those errors. In the coming months I want to experiment with swapping the mouse/keyboard interface for the Xbox 360 gamepad interface. The gamepad interface allows analog direction and speed, so I can test how accurate the signals are along those dimensions. Unfortunately this comes at the expense of head-tracking accuracy, because gamepads only model angular velocity instead of angular position. I am really curious to see what's more important for realism: accurate motion simulation or accurate head simulation. I suspect the latter, which means that keyboard/mouse would be the default. And of course that also makes it compatible with every FPS since Quake 1.


My group did some of the academic work on redirected walking and I'm really impressed by your set-up: I haven't seen many people actually try this scale before. I gather you are just doing a constant rate of turn whilst moving? There is lots of work, outside VR, on how much constant rotation people can detect, and I think the research predicted you would need an area about the size you are using. However, it's easy to generate degenerate paths where you would walk out of the tracked space if you had constant-rate turning, so you will either need a (virtual) safety net, or you'll end up with very strong steering in certain situations, which could cause discomfort. You can, however, significantly shrink the space if you steer more aggressively depending on the acceleration of the head. The basic theory about this was pointed out in Sharif Razzaque's PhD thesis at UNC-CH (where redirected walking was introduced), but I haven't seen much development of it, other than to note that you can significantly up the rate of turn in situations where the head is accelerating (to tens of degrees per second).
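The head-acceleration idea can be sketched in a few lines. This is only an illustrative model (the rates and the threshold below are made-up numbers, not values from the thesis): the injected rotation ramps from a barely perceptible baseline up toward tens of degrees per second as the head turns faster.

```python
import math

def turn_gain(head_speed,
              base=math.radians(1.5),       # steady injected turn (rad/s), a guess
              max_rate=math.radians(20.0),  # "10s of degrees per second"
              thresh=math.radians(60.0)):   # head speed where masking saturates, a guess
    """Scale the injected world-rotation rate with head angular speed.

    While the head is turning quickly, sensitivity to an added rotation
    drops, so more redirection can be hidden in the motion.
    """
    frac = min(abs(head_speed) / thresh, 1.0)
    return base + frac * (max_rate - base)
```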

If you release this, I can think of half a dozen colleagues of mine who would get one so they could study human locomotion (and play some games no doubt). How accurate do you think you can push the head-tracking?

This is exactly the type of research I want to do with the system once Brant lets us play with it. Do you have any other resources about what specific work has already been done in this field?

One thing I am really curious about is whether it is possible to steer the user toward a particular region of the play area, in order to interact with props or particular terrain, like a staircase or doorway. Towards that end, I will need to develop a quiver of techniques for more aggressively steering the user.

@profvr: Thanks. I read all the academic literature I could find on the subject. A lot of it seems to deal with optimized paths in special situations where the virtual environment is known in advance (i.e. 90-degree turns, etc.). For the general case, it seems that constant turning is the only viable solution.

I knew about the methods for accelerated turning during head motion. When I was working within basketball-court-sized areas I was planning to implement it that way. But my curiosity and the challenge of it led me to aggressively push the scale of the system instead. Once I was able to achieve football-field-sized areas, I just decided to keep the algorithm simple with a constant turning rate. Besides, I saw some potential pitfalls to linking turning rates to head rotation. Since the tension of your neck and shoulder muscles serves as a fixed reference point, you would be able to detect that the world was rotating if you turned your head from side to side and back again. In the pathological scenario, you could make the world spin around you by shaking your head back and forth.

At the size and parameters I have now, I only encounter a boundary about once every 6 minutes (give or take). My simple solution is just to amplify the turn and aggressively steer the player back into bounds. It's noticeable but infrequent enough that it barely detracts from the overall experience. And currently there are plenty of other technical glitches that are much more annoying and attention grabbing than boundary enforcement.
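For the curious, the whole scheme really is only a few lines of logic. Here is a rough Python sketch of constant-rate redirection with amplified boundary steering - the constants and names are illustrative, not my actual tuned values:

```python
import math

def redirect_yaw(pos, heading, dt,
                 base_rate=math.radians(2.0),  # constant injected turn (rad/s), illustrative
                 boost=4.0,                    # amplification near the boundary
                 half_size=22.0,               # half-width of a square play area (m)
                 margin=5.0):                  # distance from the edge where boosting kicks in
    """Extra virtual-world yaw (radians) to inject this tick.

    Far from the boundary: a gentle constant turn the player cannot
    perceive. Near the boundary: amplify the turn, signed so the
    player's heading curves back toward the center of the play area.
    """
    x, y = pos
    dist_to_edge = half_size - max(abs(x), abs(y))
    if dist_to_edge > margin:
        return base_rate * dt
    to_center = math.atan2(-y, -x)  # bearing from the player toward the center
    err = math.atan2(math.sin(to_center - heading),
                     math.cos(to_center - heading))
    sign = 1.0 if err >= 0 else -1.0
    return sign * base_rate * boost * dt
```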

How much can I improve the performance of the system? I'm not sure. There is still a good bit of work left to do on the sensor fusion and signal processing algorithms. But since I am just trying to support PC games right now, the game interface itself is a big limiting factor: I don't have to work as hard to support the coarse granularity of the keyboard and mouse. Later, however, I would like to expand the project goals to see if I can achieve finer-grained control. Sorry if I am being vague. In a few months I will be able to talk in more detail.

Last edited by brantlew on Sat Jul 21, 2012 12:26 am, edited 1 time in total.

One idea I had for hacking analog motion into games designed for keyboards was to strobe the key presses you send into the game at a high frequency, and with a duty cycle that matches the unit vector of the player's motion, then scale it by their speed. So for instance, to achieve a heading of 37 degrees, your duty cycle for the 'Forward' key would be 1, and for the 'Left' key, it would be .75. Then scale that vector by the player's speed.

I never got around to trying it out, and whether or not it would actually work completely depends on how a particular game captures user input.
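To make the idea concrete, here is an untested Python sketch of what I have in mind - the key names, direction convention, and strobe frequency are just placeholders:

```python
import math

def key_duty_cycles(heading_deg):
    """Duty cycles for the WASD keys to approximate a heading given in
    degrees clockwise from straight ahead. The dominant axis gets a duty
    cycle of 1 and the other axis is scaled relative to it."""
    rad = math.radians(heading_deg)
    fwd, side = math.cos(rad), math.sin(rad)
    m = max(abs(fwd), abs(side))
    return {'W': max(fwd, 0.0) / m, 'S': max(-fwd, 0.0) / m,
            'D': max(side, 0.0) / m, 'A': max(-side, 0.0) / m}

def key_down(key, t, duties, freq=20.0):
    """Square-wave strobe: should `key` be held at time t (seconds)?"""
    phase = (t * freq) % 1.0
    return phase < duties.get(key, 0.0)
```

For the 37-degree example this gives 'W' a duty cycle of 1 and the strafe key about .75, as above; scaling by the player's speed could then just multiply all the duty cycles.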

Last edited by FingerFlinger on Fri Jul 20, 2012 11:38 pm, edited 1 time in total.

@Krenzo: It's the motion that's the issue. If I could write directly into a motion vector then that would be cool, but I wouldn't even know where to begin to find that. It's not like hacking into the rendering pipeline (e.g. TriDef). Those hacks change the viewport but not the character position in the game model.

@FingerFlinger: That's a pretty good idea. I have no idea if it would work or not. It's worth a shot though because it would be pretty easy to implement.


I'd be worried about weird effects when you overrun the input buffer. You could also introduce some nasty lag as the buffer fills up faster than it's consumed by the application. It probably depends on how the game uses the inputs from the keyboard, since it may only read them once a frame, in which case the frequency would have to be much slower than the framerate for it to work. I'd love to see the results of any tests though.

@brantlew: I'm sure lots of people would be interested in the control system. I wouldn't worry too much about shaking the head and returning to the same direction; in our experience this is not a problem if your control feedback is reasonable. You would in any case only apply the accelerated turning during certain frequencies of head turn where the vestibular and visual cues are not sensitive (roughly, "looking around" rather than "shaking the head").

Of course there are lots of other interesting control problems such as which direction to steer them to avoid the boundaries. I am not up to date, but turning to steer them through the center of the tracked region is almost certainly not optimal.

Yeah, that's pretty much my concern as well. A lower frequency can still be useful for isolating chest and head orientation. It wouldn't be terribly smooth, but you could then walk forward and turn your head without changing the direction of ingame motion.

@FingerFlinger: Well, it was an interesting idea but unfortunately it just doesn't seem to work out. I did a quick test with a constant W key and a 50% pulsed D key, which should move you about 26.6 degrees clockwise (arctan 0.5). It works at low frequencies - moving the character in the correct direction, but with a shaky-cam effect. As I increase the frequency I cannot get rid of the vibrational qualities. Instead, the game poller just starts missing pulses and I start losing the angle - and it happens irregularly as the poller and key-pulser move in and out of phase. So the end effect is that the intended angle is way off, plus there is a vibration in the camera.

It also behaves completely differently from engine to engine. In Skyrim, a frequency of about 12 Hz was the maximum I could push it and maintain the correct angle (with a huge shaky effect). With the Avalanche engine, however, I could not push it past 4 Hz because it employs a small acceleration effect.

Possibly as a transitional effect but I wouldn't count on it. For example if you were walking forward and made a 90 degree continuous pan with your head, you might be able to hit some of the transitional angles with this pulsing effect. But I wouldn't want to linger too long at one of these angles because I think the vibrational effect would be annoying at best, and nauseating at worst. And there is a very good chance it would just look really bad - even during transition.
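The sampling problem is easy to reproduce in a toy simulation. This is only an illustrative model of a game that reads the keyboard once per frame, not my actual test setup:

```python
def measured_duty(pulse_hz, frame_hz=60.0, duty=0.5, seconds=5.0):
    """Strobe a key as a square wave and sample it once per frame, the
    way many game engines poll input. Returns the fraction of frames on
    which the key appeared to be held."""
    frames = int(frame_hz * seconds)
    held = sum(1 for i in range(frames)
               if (i / frame_hz * pulse_hz) % 1.0 < duty)
    return held / frames
```

At a 4 Hz strobe the game sees close to the intended 50% duty, but push the strobe toward the frame rate (say 40 Hz against 60 fps) and the sampled duty drifts to around 67% - the same kind of irregular angle error I was seeing.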

That's very cool, and I know where I could use it. If you turn around 10 times on a soccer field you could walk for a kilometer. The immersion would be very nice - imagine running away at top speed from a monster when you've run out of ammunition... It would be cool if there were some kind of virtual reality theme parks, with halls full of omnidirectional treadmills where you pay 10 dollars for an hour, and big open courts where you fight against other gamers.


I haven't calculated how far I walk, but I sometimes go for several kilometers on the field. Running would be cool but you would really want some compact, ruggedized hardware like a smart phone instead of a backpack and laptop to do that. Plus you would need to significantly increase the size of the play area to accommodate the increased player speed. But it would be a great way to exercise. It would be ironic if hard-core video gaming could make you physically fit.

It would be so cool if you could do a 2-hour workout and game at the same time. I am currently looking for a used treadmill on eBay. I will also buy an Arduino controller and try to control the motor by foot contact: if you raise one foot it would automatically start to move, and if you put both down it would stop. Sounds very easy... For moving left and right I would use the Nunchuk controller on a Wii gun. Later I could locate the tip of the gun with Leap Motion.


The real irony is that in 50 years, people will be thinking how mad it was that we had to sit around and twiddle gamepads to play our games, getting fatter and fatter while doing it.

I haven't really updated in a while, but I have been hard at work on the project. After QuakeCon, I decided to design the system specifically for the Rift and have spent the last couple of months reworking and optimizing for that platform. Designing for a specific hardware platform has really helped solve some issues I was having with the older design, and the performance is getting better each month. I get a pre-Kickstarter Rift in November and I am anxious to do the final integration. But I'm a little nervous, because it's either going to be the most nauseating activity you can do with a Rift, or the most immersive experience that you can have outside of a military installation.

Right now I am wrapping up a major round of improvements, so I can send some private demo kits out to a couple of forum members at the end of this month. And then I need to rewrite and polish the prototype software as a professional application. So the goal is still to release the product shortly after the Rift. Maybe in December, but most likely at the start of the next year.

I also want to film another set of in-game videos soon so people can better understand what it's like. I am a huge advocate for natural motion systems now; I feel like they complete and amplify the experience a lot. Notice that the Oculus guys often have to seat or restrain players because the tendency to start moving is so strong. So that's what I'm trying to offer - the ability to just naturally follow your instincts. But I am having a hard time convincing anyone (even on this VR forum) that it's worth it to go out and try this.

There is definitely a scaling problem with this technology. Once VR is mega-popular, we clearly can't have everybody running around soccer fields; there aren't enough of them. But until that point, this will be the best omni-motion solution available to the average person.

One of the things that I want to experiment with before release is scaled turning to see how much I can comfortably reduce the playing field. It's still going to be a big area. I suspect that a basketball court will be too small, but you might be able to squeeze it into 2 side-by-side basketball courts (or roughly 100' x 100'). That is only partially helpful I realize, but any amount you can reduce the playing field opens up more options for people.

I suspect that you can train the player to accept more aggressive turning as they become used to the setup, also. Perhaps you could globally scale the "turning factor" with the amount of time they've spent playing, up to maximum level, or at least allow user adjustments.

brantlew wrote:

I get a pre-Kickstarter Rift in November and I am anxious to do the final integration. But I'm a little nervous because it's either going to be the most nauseating activity you can do with a Rift, or it will be the most immersive experience that you can have outside of a military installation.

I know this is a challenge with VR... everything has to be right, or it is useless.

One way to address some weakness in a system is to accept a less complicated set-up. I have always thought that one way to address the issue of trying to allow someone to walk, is to give up walking for the idea of riding a vehicle. An electric mobility scooter can become an ATV you are riding, or even a tank.

I wonder if it would be safer to use a mobility scooter than actually walking. This way you don't fall down if your sense of balance is upset for any reason.

Note that I am suggesting only that you keep this scooter idea in mind as you design your system, so that you don't accidentally rule out this sort of fallback because of some technical limitation. For example, if your system depends upon footsteps to allow someone to advance, can you turn that feature off? It also may benefit those who have disabilities if they can use a scooter.

This next comment is more speculative. A future alternative system may depend upon stationary simulators. Things like the car simulators. It would be good for your system to integrate with those as well. I am not sure what sort of data exchange would be needed.

I guess what I am getting at with the last few comments is that your software would ideally allow for input about the type of system it is being used with. E.g. Free Walking, Walking in place, Moving vehicle, Stationary Vehicle. Are there other categories?

cadcoke5 wrote:

I guess what I am getting at with the last few comments is that your software would ideally allow for input about the type of system it is being used with. E.g. Free Walking, Walking in place, Moving vehicle, Stationary Vehicle. Are there other categories?

@cadcoke5: That's funny - I was just thinking about a stationary mode today after looking closer at the Wizdish. I think my motion software and that device could complement each other well, since I handle the separation of torso and head. That's something that I have not seen yet in the Wizdish demos. I haven't thought about vehicles, but I don't see any reason it couldn't be done. In fact, the problem is simpler because there would be less noise in the signal.

A small update. One of the things that will be great about the Rift is that game developers will combine the analog motion controls of the Xbox 360 controller with the absolute-coordinate head tracking of the Rift. Until now, I had to interface the game with either the mouse/keyboard or the Xbox 360 controller (but never both simultaneously). Mouse emulation offers fairly accurate 1:1 head motion, but the WASD keys only allow for 8 discrete directions. This makes motion in the game a bit clunky sometimes and can ruin the illusion of control - especially if your head is just slightly off center as you are walking. It feels like the character is drunk sometimes. The Xbox 360 controller, on the other hand, has analog direction control, but the head motion is velocity-based and so very difficult to synchronize properly with your real head. Here is a video demonstrating the difference between the two interfaces. Here, I kept the head vector fixed to demonstrate directional motion, but the same issues apply if the body vector is fixed and the head is moving (looking around while walking).

But now with the Rift, I can get the best of both worlds: perfect control of the head along with analog control of the motion vector. This should make Rift-enabled games operate much more smoothly than unsupported games. Unfortunately, for games that do not directly support the Rift, I will probably default to WASD and you will have to deal with a somewhat drunken character.
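For those unsupported games, the WASD fallback boils down to snapping the desired motion vector to the nearest of 8 key combinations. A quick sketch of the idea (the names and layout here are illustrative, not my actual code):

```python
# The 8 directions WASD can express, every 45 degrees clockwise from forward.
KEY_COMBOS = ['W', 'WD', 'D', 'SD', 'S', 'SA', 'A', 'WA']

def wasd_for_heading(heading_deg):
    """Quantize a desired motion direction (degrees clockwise from the
    body's forward vector) to the nearest expressible key combination."""
    idx = int(round((heading_deg % 360.0) / 45.0)) % 8
    return KEY_COMBOS[idx]
```

Any heading gets rounded by up to 22.5 degrees, which is exactly where the drunken-character feel comes from.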

PS. Interfacing the game with the Xbox 360 emulator is not simple. A device driver must be written to accomplish this, since Microsoft does not supply an API for it. Currently I am using a third-party prototype driver, but I will be writing my own driver soon and will open-source it with FreePIE to support gamepad emulation.

Last edited by brantlew on Mon Sep 10, 2012 9:01 am, edited 2 times in total.
