Heads-up display promises to help the near-blind navigate

Portland, Ore. -- University of Washington students have turned a pair of Elvis Costello-style eyeglasses and a backpack into a system that helps the near-blind navigate around stationary objects. Their Wearable Low Vision Aid projects icons on top of obstacles seen in a heads-up display, using a laser diode and a vibrating crystal fiber made from components that cost less than $1.

"It's like having another set of eyes looking out for you," said Eric Seibel, the research professor and principal investigator on the National Science Foundation grant funding the project. The system, designed by seven graduate and undergraduate students at the university's Human Interface Technology Laboratory, was shown at the recent Society for Information Display Conference in Seattle.

The system projects a bright warning icon, visible even to the legally blind, into the eye so as to illuminate just the part of the retina where the approaching obstacle is imaged. As a consequence, approaching obstacles are brightly highlighted, making them easy for the blind person to avoid, even if his or her vision is too poor to discern what they are.

"We are very hopeful that the prototype will become successful, once it has been engineered into a smaller, less expensive package," said National Science Foundation program officer Gil Devey, an expert on disabilities research. "For now, anyway, these students have demonstrated that a vibrating optic fiber can project an image onto the retina in response to approaching obstacles that really helps the near-blind to get around."

By directing the output of a laser diode with a vibrating optical fiber, mounted on a pair of glasses, a rasterized image can be produced in the eye that is overlaid on top of the normal scene. That is the technology behind the "heads-up" display, so named because the users can keep their heads up, rather than having to look down at a computer screen. Heads-up displays now overlay tactical information atop images in fighter pilots' cockpits, for example.

Scaling the size

The UW students spent the last four years reducing the setup's size to fit a backpack computer and spectacles that contain the heads-up display. Also attached to the eyeglasses, on the opposite side from the vibrating optical fiber, is an array of 24 infrared emitters ringing a video camera that is sensitive to infrared. The computer deduces when obstacles are getting near and "writes" a warning icon on the wearer's retina atop the obstacle's location.

"We chose not to add any audible cues, since audio is already one of the key senses people with vision disabilities use to detect obstacles," UW's Seibel said. "Instead we augment the user's impaired visual system with more easily seen laser light." The fiber vibrates both vertically and horizontally, enabling it to scan an icon onto any part of the retina and thus track obstacles in real time. The scanning fiber display, mounted inside a 1.5-inch tube, uses two bimorph piezoelectric actuators, the optical fiber and lenses. One end of a small-core fiber (chemically etched to reduce its diameter) is attached to the laser diode; the other is driven by a fast-scan piezoelectric actuator (horizontal, 3 kHz) and an orthogonally mounted slow-scan piezoelectric actuator (vertical, 60 Hz). An antivibration piezoelectric actuator, operating 180° out of phase with the fast actuator, cancels the 3-kHz "hum" (attenuating it by 14 dB).
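As a rough illustration of the scan geometry described above, the two actuator frequencies can be modeled as a pair of sinusoids moving the fiber tip; the waveform shapes and amplitudes below are assumptions for the sketch, not details from the article:

```python
import math

F_FAST = 3000.0   # fast-scan (horizontal) actuator frequency, Hz (from the article)
F_SLOW = 60.0     # slow-scan (vertical) actuator frequency, Hz (from the article)

def scan_position(t, amp_x=1.0, amp_y=1.0):
    """Idealized fiber-tip position at time t for a resonant raster scan.

    Assumes a simple sinusoidal drive on both axes; the actual drive
    waveforms and amplitudes are not given in the article.
    """
    x = amp_x * math.sin(2 * math.pi * F_FAST * t)
    y = amp_y * math.sin(2 * math.pi * F_SLOW * t)
    return x, y

# Each vertical pass (1/60 s) contains 3000/60 fast sweeps, so one
# slow-scan period lays down 50 horizontal lines.
lines_per_frame = F_FAST / F_SLOW
```

With these figures, the fast axis completes 50 horizontal sweeps for every vertical pass, which is why the fast actuator must run at resonance while the slow one merely steps the beam down the retina.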

The scanned image is reflected into the eye by a small optical beam splitter, which enables the overlay mode whereby the icons show up atop the obstacles. The laser diodes used were bright enough for the icons to be overlaid even on brightly lit daylight scenes. However, the researchers are currently upgrading to a light-emitting-diode source.

The prototype's light source is a 3-milliwatt red laser diode with a 633-nanometer wavelength, driven at only 5 percent of its rated power to create an intensity of 230 µW per square centimeter. The icon area ends up being just 4 x 3 mm, with 100 x 28 pixels refreshed 30 times per second. For the future, the students are experimenting with a blue 458-nm LED that initial tests indicate may be more soothing to the user's eye.
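The stated resolution and refresh rate imply a modest pixel budget; a quick back-of-envelope check (assuming each pixel is drawn once per refresh, which the article does not spell out):

```python
# Icon raster parameters from the article
WIDTH_PX, HEIGHT_PX = 100, 28   # icon resolution in pixels
REFRESH_HZ = 30                 # refreshes per second

pixels_per_second = WIDTH_PX * HEIGHT_PX * REFRESH_HZ   # total pixels drawn per second
lines_per_second = HEIGHT_PX * REFRESH_HZ               # horizontal lines drawn per second

print(pixels_per_second)  # 84000
print(lines_per_second)   # 840
```

At 84,000 pixels per second, the display is far below video rates, consistent with its role as a warning overlay rather than a full image source.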

On the opposite side of the glasses from the scanner is the color video camera, fitted with a filter that blocks visible light and ringed by 24 infrared LEDs that illuminate the scene. Custom circuitry then enables the computer to compare alternating frames, one illuminated by the LEDs and the other by the ambient light in the scene, so that the system can sense obstacles up to 12 feet away. In essence, the system subtracts the ambient-light image from the IR image to arrive at a differential measurement.
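The frame comparison described above can be sketched as a per-pixel subtraction of the ambient frame from the IR-lit frame. The function name `ir_differential` is hypothetical, and the real system does this in custom circuitry rather than software:

```python
import numpy as np

def ir_differential(ir_frame, ambient_frame):
    """Subtract the ambient-light frame from the IR-illuminated frame.

    Widening to int16 before subtracting avoids uint8 wraparound;
    negative results (ambient brighter than IR) are clipped to zero.
    A sketch of the described comparison, not the students' actual pipeline.
    """
    diff = ir_frame.astype(np.int16) - ambient_frame.astype(np.int16)
    return np.clip(diff, 0, 255).astype(np.uint8)
```

Pixels that stay bright in the differential image are those lit mainly by the LED ring, i.e., surfaces close enough to return significant IR.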

One main task the UW students completed was coding the software that determines when an obstacle is approaching. Since closer objects reflect more of the IR light than distant ones, the software can judge distances and, by comparing consecutive frames to determine when objects grow in size, deduce when a collision is imminent. Then, the system projects the warning icon until the person steps away.
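The approach logic can be sketched along the lines the article describes: bright pixels in the differential image indicate nearby surfaces, and growth of that bright region between consecutive frames signals an approaching obstacle. The threshold and growth factor below are illustrative assumptions, not the students' actual values:

```python
import numpy as np

NEAR_THRESHOLD = 180   # hypothetical brightness cutoff for "near" pixels
GROWTH_FACTOR = 1.2    # hypothetical frame-to-frame growth signalling approach

def obstacle_approaching(prev_diff, curr_diff):
    """Flag an approaching obstacle when the bright (near) region grows
    between two consecutive differential frames.

    prev_diff, curr_diff: uint8 arrays from the IR-minus-ambient step.
    """
    prev_area = int(np.count_nonzero(prev_diff > NEAR_THRESHOLD))
    curr_area = int(np.count_nonzero(curr_diff > NEAR_THRESHOLD))
    # An obstacle must already be visible, and its apparent size must grow.
    return prev_area > 0 and curr_area >= GROWTH_FACTOR * prev_area
```

When this condition holds, the real system would look up the bright region's position and scan the warning icon onto the corresponding spot on the retina.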

The researchers are partnering with visually impaired volunteers, who are helping them develop a set of icons to represent common street hazards.