Mark Anthony Riccobono, who is blind, drove a modified Ford Escape hybrid at Daytona International Speedway, steering around obstacles. He navigated using feedback from the car's laser sensors and cameras, installed by a team of researchers from Virginia Tech and the company Torc Technologies. PopSci spoke by phone to Riccobono, the executive director of the National Federation of the Blind's Jernigan Institute, which researches new technology for the blind.

How did you know when to turn?
The car’s laser range-finding sensors sent data to a computer, which in turn sent me directional information. I was wearing gloves with motors in the knuckles. When I needed to make a sharp left, the motor on the left pinky vibrated. For a more gradual turn, the index finger, and so forth.
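The finger mapping he describes can be sketched in code. This is an illustrative sketch only, not the Virginia Tech team's actual software; the finger assignments and angle thresholds below are assumptions chosen to match the pattern in the interview (index finger for gentle turns, pinky for sharp ones).

```python
def finger_for_turn(steering_angle_deg):
    """Return (hand, finger) to vibrate for a given steering angle.

    Negative angles steer left, positive steer right. Sharper turns
    map to fingers farther from the index finger, so a hard left
    lands on the left pinky. Thresholds are hypothetical.
    """
    hand = "left" if steering_angle_deg < 0 else "right"
    sharpness = abs(steering_angle_deg)
    if sharpness < 10:
        finger = "index"   # gentle turn
    elif sharpness < 25:
        finger = "middle"
    elif sharpness < 40:
        finger = "ring"
    else:
        finger = "pinky"   # sharp turn
    return hand, finger
```

A hard left of 50 degrees would cue the left pinky; a slight 5-degree correction to the right would cue the right index finger.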

[Photo: Vibrating gloves pick up steering clues from an onboard computer. Courtesy National Federation of the Blind/Virginia Tech]

Did you have a sense of how fast you were going?
My top speed was 25 or 26 miles per hour, and I had a general feel for what the vehicle was doing. A strip on my seat gave me a more exact impression, kind of like looking at a speedometer. The strip also contained motors—it was like a massage chair. If I felt vibrations around my knees, I knew I needed to speed up. As I accelerated toward what the car considered a proper speed, the vibrations moved up my leg, toward my back.
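The seat strip he describes works like a one-dimensional gauge: the farther below the car's intended speed he was, the lower on his body the vibration sat, climbing toward his back as he closed the gap. A minimal sketch of that logic, assuming hypothetical zone names and a hypothetical target speed (the real hardware's zones and control loop are not described in the interview):

```python
# Seat-strip zones from lowest (farthest below target speed) to highest.
ZONES = ["knees", "thighs", "hips", "lower_back", "upper_back"]

def seat_zone(current_mph, target_mph):
    """Return which seat-strip zone should vibrate, or None.

    At or above the target speed nothing vibrates. Below it, the
    vibration starts at the knees and moves up the body as the
    driver accelerates toward the target.
    """
    if current_mph >= target_mph:
        return None
    fraction = current_mph / target_mph        # 0.0 up to (not including) 1.0
    index = int(fraction * len(ZONES))         # 0 = knees, 4 = upper back
    return ZONES[min(index, len(ZONES) - 1)]
```

With a target of 25 mph, a near-standstill driver feels the knees zone, while a driver at 24 mph feels the cue up near the back, just as Riccobono describes.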

What would you do to make cars even better in the future?
The car I drove has just two interfaces [the gloves and the driver’s-seat strip]; I can imagine adding an auditory component that would provide advance information about oncoming turns or traffic. Like a GPS system: “Approaching a left turn in 500 feet.”

And someday, blind drivers could get enough data to make decisions for themselves, rather than following the car’s directions. But the Department of Motor Vehicles probably isn’t going to change its vision requirements anytime soon. In the meantime, are there other applications for this technology?
Right now driving is purely visual, which is limiting, when you think about it. The work we’re doing could encourage non-visual innovation. Everyone has a blind spot. Maybe there’s an auditory solution to that problem. And if it’s raining hard or there’s a blizzard, drivers can’t see the road clearly. The interfaces we’ve developed could help make driving easier and safer.