In Back to the Future's version of 2015, a couple of snot-nosed kids compare Wild Gunman to a "baby's toy" because "you have to use your hands." In recent years, new controllers like the Kinect have brought us closer to that imagined hands-free future. But even though you aren't holding anything, you still tend to end up waving your upper appendages in front of a camera. For true hands-free gaming, you're better off looking to the eye-tracking system Tobii is demonstrating at CES this week.

Unlike more general-purpose camera systems, Tobii's gaze-trackers illuminate users' eyes with infrared light, then use two sensors to capture images at 30 frames per second of the glint as it bounces off the back of the retina and the cornea. Using that data, the system can build a real-time 3D model of a user's eye and track its position through three-dimensional space to accurately determine where the user is looking on the screen (after a one-time calibration).
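Tobii hasn't published the details of its algorithm, but the role of that one-time calibration can be illustrated with a common textbook simplification: fit a polynomial that maps an eye-feature vector (such as the pupil-to-glint offset) to screen coordinates, using the points the user stares at during calibration. A minimal sketch in Python; the feature model and function names are illustrative, not Tobii's:

```python
import numpy as np

def fit_gaze_mapping(eye_features, screen_points):
    """Fit a second-order polynomial mapping from 2D eye-feature
    vectors (e.g. pupil-to-glint offsets) to screen coordinates,
    using calibration samples, via least squares."""
    x, y = eye_features[:, 0], eye_features[:, 1]
    # Design matrix: [1, x, y, xy, x^2, y^2] for each sample
    A = np.column_stack([np.ones_like(x), x, y, x * y, x**2, y**2])
    coeffs, *_ = np.linalg.lstsq(A, screen_points, rcond=None)
    return coeffs

def predict_gaze(coeffs, feature):
    """Map a single eye-feature vector to a screen point."""
    x, y = feature
    a = np.array([1.0, x, y, x * y, x**2, y**2])
    return a @ coeffs
```

A nine-point calibration grid, like the one the demo used, gives enough samples to solve for the six coefficients per screen axis.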

Tobii has been making larger, more specialized gaze-trackers for about ten years, and has sold thousands of units, primarily to the research community and as aids for the severely disabled. But the company is getting ready to make the leap into the consumer space with its latest device, the IS20, a USB peripheral that's less than seven inches wide and one inch tall. That's five times smaller than the version built into a laptop that was shown off at last year's CES. It's also much more energy efficient, drawing just three watts of power.

At a pre-CES demo of the IS20, the creators showed how the device could be used to quickly navigate a webpage as you scan it with your eyes, or to zoom in on your hometown in Google Maps. But what most interested me was an asteroid-destroying demo that was probably the first game I've ever played that didn't require my hands, feet, or voice. Just holding my gaze on a moving asteroid for less than a second was enough to tell the game to send out a laser pulse and shatter it into smaller pieces that ricocheted around the screen, as you can see in the video below.

Don't be fooled by my finger on the space bar at the beginning... that was just to start the game.

The new control method required a bit of concentration, and the game's lack of on-screen cues made it a little difficult to understand why the system didn't always register my focused gaze. All in all, though, it was still a pretty incredible experience. Sitting still, with my hands at my side, and watching on-screen asteroids explode with just a glance was the closest thing to effective mind-controlled gaming that I've ever experienced.
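The dwell-to-fire mechanic in that asteroid demo, holding your gaze near a target until a timer elapses, can be sketched like this (the radius and dwell time here are guesses for illustration, not Tobii's actual values):

```python
import math

DWELL_SECONDS = 0.8   # hold time needed to register a "shot" (assumed)
GAZE_RADIUS_PX = 60   # how close the gaze must stay to the target (assumed)

class DwellSelector:
    """Fire when gaze samples stay within GAZE_RADIUS_PX of a target
    for DWELL_SECONDS of continuous time."""
    def __init__(self):
        self.dwell_start = None

    def update(self, gaze, target, timestamp):
        dist = math.hypot(gaze[0] - target[0], gaze[1] - target[1])
        if dist > GAZE_RADIUS_PX:
            self.dwell_start = None   # gaze wandered off; reset the timer
            return False
        if self.dwell_start is None:
            self.dwell_start = timestamp
        return timestamp - self.dwell_start >= DWELL_SECONDS
```

Feeding this 30 Hz gaze samples, as the IS20 produces, would fire after roughly 24 consecutive on-target frames; any stray sample resets the clock, which may be why my gaze sometimes failed to register.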

A spokesman for Tobii sees gaming as one of the primary industries to make use of gaze-tracking on a commercial level, alongside computer-aided design, financial services, and military applications. Developers like Zynga and Blizzard are already using Tobii hardware and software in their internal testing to help refine their interfaces by tracking where players are looking as they play games. As for using a gaze-tracker as a dedicated game controller, Tobii has developed customized demos that use the feature to enhance games like Minecraft and StarCraft.

Tobii is reaching out to game makers as it begins to sell the $995 developer version of the IS20, and the company plans to put out a limited-edition run of 5,000 consumer-grade devices in the fall, targeting professionals and "prosumer"-level computer users. Early on, the device will likely face the same chicken-and-egg problem as any new control peripheral: game developers don't want to invest resources in supporting a device no one has, and gamers don't want to buy a device that no game developer supports. But the spokesperson I talked to optimistically suggested that a $21 million investment by Intel last year is helping catapult Tobii to a point where it could be built into practically every PC within two years' time.

If that comes to pass, don't be too surprised if the "Controls" section of future gaming configuration screens starts including a "gaze tracker" tab alongside the now-standard "mouse/keyboard" and "gamepad" options.

Promoted Comments

Sounds interesting, but I suspect 30 frames per second is not going to be enough in some of the faster-paced scenarios. Also, having to focus for 1 second doesn't sound like a long time on paper, but in practice it will limit usefulness. Although there are plenty of scenarios where this kind of tracking could enhance a game. For example, targeting enemies or incoming projectiles in a simulator like MechWarrior, maybe.

Tobii makes eyetrackers with 60 Hz and 120 Hz tracking rates (the same rates as everyone else's eyetrackers, too). The IS20 system is available in an embedded version (30 Hz) and a "host" version that runs at 60 Hz. But at this rate of improvement, I am sure that if there is a need for 60 Hz, it will be standard by next year. (If I read it correctly, the embedded version does all of the tracking with its own hardware; the host version runs software on the computer, so it will put more processing load on the computer.)

There's also IS30 hardware which is suitable for wider displays. The IS20 is suitable for laptop-sized screens.

There are actually quite a few companies making eyetrackers, but Tobii is a leading name in the field and makes very nice equipment, especially for integrated applications (e.g. built into a display). There are others that are better at other applications, like tracking gaze around a room. What has been interesting to watch is how Tobii has been shifting attention from research applications to consumer applications. I don't know if it is really ready for prime time yet, but I'm sure it will be eventually, and it sure looks like when it does, it will be Tobii that is the go-to brand, not Applied Science Laboratories or SensoMotoric Instruments or OptiTrack or SR Research, etc.

This is very exciting tech. The biggest issue I see with it is that it's great for tracking focus but not so great for deciding when to 'click' on something. There are lots of issues with using a set time-frame to determine when to activate the icon you're staring at. For example, if you stop to think about what to do next for a second, it might (in certain use cases) fire up a program you didn't particularly want to use right then.

Combined with this, however, I can see some serious potential for touch-free computing. Simply look around on the screen to 'move your mouse', and then twiddle your fingers in the air to 'click'. The possibilities are endless...

I hope this develops into real, accessible hardware that will finally add to or change the default keyboard+mouse control scheme. For non-disabled people, I doubt it will completely replace the keyboard and mouse, but it can add another dimension to interfacing with a computer. What I find to be the biggest problem so far is the "clicking": IMHO it can't be time-based. If it's possible, they should make it so you can set up wink patterns and bind them to actions. Wink left eye then right eye = left click; right eye then left eye = right click. No one winks their eyes in such a manner normally, and even if it sounds a bit weird to do, I am pretty sure that after 2-3 days of use you'd get used to it.

Also, the OS (and thus games) should not tie the mouse cursor to the eye-tracker. They are not the same controller, and to me it makes no sense to have them tied together; it almost defeats the purpose. The tracker should have a behavior similar to mice, but it shouldn't move the mouse cursor.
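The wink-pattern binding suggested above amounts to a tiny event state machine. A hypothetical sketch (the 0.6-second pairing window and the event names are arbitrary assumptions, not any vendor's API):

```python
# Map ordered wink pairs to actions; extendable to longer patterns.
PATTERNS = {("L", "R"): "left_click", ("R", "L"): "right_click"}
WINDOW = 0.6  # max seconds between the two winks of a pair (assumed)

class WinkClicker:
    def __init__(self):
        self.last = None  # (eye, timestamp) of the previous unpaired wink

    def wink(self, eye, timestamp):
        """Feed one wink event ('L' or 'R'); return an action name
        if this wink completes a pattern, else None."""
        if self.last and timestamp - self.last[1] <= WINDOW:
            action = PATTERNS.get((self.last[0], eye))
            if action:
                self.last = None  # pattern consumed; start fresh
                return action
        self.last = (eye, timestamp)
        return None
```

The pairing window is what separates deliberate wink sequences from ordinary blinking, which closes both eyes at once and never produces an L-then-R or R-then-L pair.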

I've worked with research grade eyetrackers, and often, the calibration process is a pain in the ass, really.

swimboy wrote:

What about people with glasses? Can it see through the glare off the lenses?

Glasses don't really tend to affect the tracking very much.

Most of the ones I've worked with have relied on IR reflections ("glints") from the retina, and on a more sophisticated multi-camera/multi-angle version, by also visually tracking facial features like the edges of eyebrows, corners of the mouth, nose etc. to better reference the eyes. It's pretty sophisticated stuff, and as a result, doesn't work a fair amount of the time on high-end kit costing >$50K. I've also encountered occasional people who simply couldn't be eye-tracked.

Essentially, while this tech could be amazingly useful, the critical point is: can I reliably sit down in front of it and have it accurately track my gaze in less than 20 seconds, every time?

If not, you're far better off with a mouse unless it's really, really necessary - e.g. getting the price down significantly would be fantastic for people with physical disabilities.

If this is reasonably fast and accurate, coupled with a trigger it would make an excellent controller for FPS games. Picture just aiming with your sight and a click for shooting, plus strafe/look with a mouse.

It would change strategy in online FPS.

Unfortunately, many shooters model gravity effects on bullets, so... maybe? You would need to train yourself to visually lead targets, which is a lot harder than it sounds, I would think, since your eyes will instinctively be attracted to movement. It also ignores another problem: you can only "glance" so far. After that you need to either rotate your head briefly or turn your whole body (the Oculus Rift excels at head tracking, but even it needs a separate turning control).

I've been fortunate to have access to such technology until last year. Things are getting better but there are some issues. The biggest one I encountered was eye and neck fatigue.

When you stare intently, you are exerting more control than normal over your eyes. From my experience, your eyes will get very tired and will water less. You need to dutifully remember to take breaks so as not to hurt your eyes.

As for the neck fatigue, it takes some practice because at first you stiffen up when using the system. This is where systems are getting better, by allowing more movement away from center. The author even said in the story how he was "sitting still," a very unnatural thing to do for more than a short while.

The big benefit I can see is it being used as an enhancement. By knowing where you are looking, the system/game can make adjustments such as panning/tilting in the area of the gaze. Another one could be adaptive graphics quality (the edges of our vision are not sharp, and some cycles could be spared when you aren't looking at the corners). I'm sure there are other enhancements possible from gaze tracking.

A final thing: it's not a fast system to stare and 'click'. You need the threshold high enough to prevent frequent accidental clicks when reading, but low enough to be usable. This is usually a second or two. Most people mouse faster than that.
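The "adaptive graphics quality" idea mentioned above, now usually called foveated rendering, boils down to scaling render quality by a pixel's or tile's distance from the gaze point. A toy heuristic (all numbers here are illustrative, not from any shipping renderer):

```python
import math

def render_quality(pixel, gaze, full_res_radius=200.0):
    """Return a resolution scale factor for a pixel/tile given the
    current gaze point: full quality near the fovea, falling off
    toward the periphery, floored at a minimum quality."""
    dist = math.hypot(pixel[0] - gaze[0], pixel[1] - gaze[1])
    if dist <= full_res_radius:
        return 1.0  # inside the foveal region: render at full quality
    # Degrade smoothly with eccentricity, never below a quarter quality
    return max(0.25, 1.0 / (1.0 + (dist - full_res_radius) / full_res_radius))
```

A real engine would apply a factor like this per screen tile when choosing shading rates or render-target resolution, spending GPU cycles only where the player is actually looking.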

If this is reasonably fast and accurate, coupled with a trigger it would make an excellent controller for FPS games. Picture just aiming with your sight and a click for shooting, plus strafe/look with a mouse.

It would change strategy in online FPS.

Gorilla Arm, meet Fatigued Eyes.

Mhh... you already have to look at your enemy when targeting with a mouse/gamepad/whatever. I don't see how that would be tiring. Just look at the threat and hit the trigger. It's not like you would have to think about it; it's something that you do mechanically.

I think you'd be surprised at how much you look at places on the screen *besides* the exact spot the reticle is pointed.

Combined with this, however, I can see some serious potential for touch-free computing. Simply look around on the screen to 'move your mouse', and then twiddle your fingers in the air to 'click'. The possibilities are endless...

Holy crap, yes, these two things need to be merged right now. I don't know which company is bigger or has more funding, but seriously, put these two together and computing as we know it will change as drastically as when Apple introduced the mouse. I say that as someone old enough to have bought the first Mac in 1984. This is even more exciting than that was.

How I wish this technology could be miniaturized and incorporated into autofocus glasses -- I have three sets of glasses now (trifocals, glasses for computer work, and simple reading glasses). If the system knew where (how far) I was looking, maybe it could autofocus for me. Wow -- that would be great in and of itself.

Kyle Orland / Kyle is the Senior Gaming Editor at Ars Technica, specializing in video game hardware and software. He has journalism and computer science degrees from University of Maryland. He is based in Pittsburgh, PA.