How: To drive a wheelchair with your eyes (Generation 2)

In my previous post, I covered how Microsoft found the motivation to help create technology for people living with ALS and how we started our first project: creating a wheelchair you can drive with your eyes. Now let's look at how we do it and what's next. I'm describing a slightly different implementation than what was done in the //oneweek hackathon -- this is a 'generation 2' prototype that we've created here in Microsoft Research as part of the work of our new team.

Control Cable

We created a custom cable to control the wheelchair. To Windows, the cable looks like a USB virtual serial port; on the other end, it switches the pins on the Omni input channel, which in turn drives the wheels of the chair. We started with an 'old school' DB9 F/F serial cable, which we chopped in half.

In the middle of the cable we added an Arduino, which then connects via USB to the Surface.

Based on the Omni's port layout, we wired the cable to the pins of the Arduino.

The only tricky part of this work was realizing that the Omni is made to take input from 'old school' physical switches, which read as 'on' when shorted to ground and 'off' when open, i.e. when there is no continuity to ground or power.

This means that for the Arduino to turn an Omni pin 'on', you call pinMode(pin, OUTPUT) and then digitalWrite(pin, LOW). To turn an Omni pin 'off', you call pinMode(pin, INPUT), which leaves the pin floating (tri-stated).

Software

We flashed the Arduino with Firmata, a very simple 'remote control' protocol that lets a PC turn the Arduino's pins on and off. Firmata ships with the Arduino development environment under File -> Examples -> Firmata -> StandardFirmata.

We spoke to Firmata over the virtual serial port using an existing library, Firmata.NET.

We wrote an app that takes eye gaze input and turns it into commands for the Arduino. The app was based on the EyeX SDK for .NET with a simple Windows Presentation Foundation UI (literally four buttons on a screen). The code to control the Arduino is very simple: some initialization logic plus one short method per direction, such as Forward.

Mounting System

Eye gaze only works when the eyes and the eye gaze sensor (the bar on the bottom of the Surface in the picture above) are precisely aligned. Standard mounting systems for tablets like the Surface are well engineered and lightweight, but they weren't made to keep the Surface from vibrating while the wheelchair is in motion. As such, we created a custom mounting system built from 80/20 and 3D printed parts in our machine shop. I'll cover this in more detail in future posts.

What we've learned

We recently visited Steve Gleason in New Orleans to learn more about his day to day life and to see if we're on the right track. We were in for a surprise: Steve can already drive his wheelchair. His Omni control interface has a simple one-button system for driving the chair, which Steve can operate via a head switch. However, he doesn't drive today, because he can't do so safely.

You see, like many people living with ALS, Steve has a ventilator. To stay comfortable and ensure best operation, he is seated at a steep incline. This limits what he can see around him. And with a young boy in the house (and his toys...), it's all too easy to make a mistake which might have serious consequences.

So it turns out that being able to control the wheelchair with eye gaze is only the first step. Being able to safely navigate is just as important.

For More Information

Our goal is to provide enough information that you will be able to replicate our work. I'll continue to post details here as we progress.