Thumbs up for gesturing with time-of-flight cameras?

ToF cameras from Infineon provide depth data as well as conventional 2D imagery.

Gestures can easily be understood by ToF systems from Visteon.

Time-of-flight (ToF) cameras are ready to let drivers control some of the many options of today’s infotainment systems with a mere wave of their hand. ToF-based systems can also monitor drivers to see if they’re drowsy or not watching the road, which could reduce accidents when automated driving systems need input from the driver.

These cameras work like sonar or radar, sending out a pulse of light and measuring the time until the reflected light returns. That lets the camera build on traditional imagery to create a 3D model, so the human-machine interface can understand subtle hand gestures. The added depth data also makes it simpler to determine whether drivers are paying attention to the road or whether they’re drowsy. ToF cameras are useful outside the cabin as well.
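The underlying ranging arithmetic can be sketched in a few lines. This is an illustration of the principle only: production ToF sensors typically measure the phase shift of modulated light rather than timing a raw pulse, and the function name below is an invention for this example.

```python
# Illustrative sketch of the time-of-flight ranging principle.
# Real ToF sensors usually infer distance from the phase shift of
# modulated light, not from directly timing a single pulse.

C = 299_792_458.0  # speed of light in m/s

def distance_from_round_trip(t_seconds: float) -> float:
    """Distance to an object given the round-trip travel time of light."""
    # The light travels out to the object and back, so halve the path.
    return C * t_seconds / 2.0

# An object 1 m away returns light after roughly 6.7 nanoseconds:
t = 2.0 / C  # round-trip time for a 1 m distance
print(distance_from_round_trip(t))  # → 1.0
```

The nanosecond-scale round trips are why per-pixel timing circuits, rather than conventional image processing alone, are needed to recover depth.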

“This basic technology is similar for internal vehicle applications like gesture detection and for external applications like parking assist,” said Paul Morris, Innovation Project Manager at Visteon Corp. “The primary differences would be optical aspects such as distance to object, field of view, etc.”

Numerous companies are ramping up, primarily for cameras inside the vehicle. Denso, Delphi and Continental are all working actively on systems that use the cameras. SoftKinetic, a Sony company focused on gesture recognition software, is targeting automotive environments in partnership with Melexis.

The march towards autonomous driving is a key factor behind many ToF-based strategies. The cameras can watch drivers’ eyes to see where they’re looking, and they can detect drowsiness when drivers blink continuously or their heads droop.

“The first implementations are used for gesture control of infotainment, following the usual path of appearing first in luxury class vehicles,” said Martin Lass, Product Marketing Manager at Infineon Technologies AG. “But principal applications will be in highly automated driver-assist and self-driving cars, where the car needs to know the state of the driver and needs to be able to alert the driver to take back control.”

He noted that in addition to driver attentiveness, this can also enable additional comfort features, such as precise display of heads-up information.

While there’s optimism, some concerns could slow acceptance. Even proponents say that it may take time for some users to embrace gesture input, noting that acceptance in consumer applications has been slow. Some companies don’t think consumers will want to use gestures unless early usage in cars shows real benefits.

Preh Inc., for example, has demonstrated HMIs that include gesturing, but has no active development projects at this time. Others are more bullish. Some OEMs may follow an approach being tried on the BMW 7 Series, which touts gesturing as an alternative to other control techniques.

“We would expect most OEMs to introduce it gradually, making it available as an additional input option, rather than fully replacing other options like touchscreens or buttons,” Morris said. “At venues such as CES 2016 and Auto China 2016 in Beijing, we demonstrated a concept vehicle incorporating ToF technology that has become a centerpiece for discussions with customers and partners about future technology trends.”

Any rollout will face both business and technical challenges. On the business side, initial offerings have to prove that consumers want another control technique. OEMs must also demonstrate that the systems don’t generate false positives when people talk with their hands. Problems that draw criticism could impact future plans.

On the technical side, one challenge is that most infrared lasers used to provide light for ToF sensing in consumer electronics do not yet meet automotive temperature requirements. Visteon avoids that issue by using near-infrared wavelengths.

“The system uses near-infrared illumination and a sensor array that provides high-accuracy distance measurement data,” Morris said. “These measurements occur in parallel in each pixel, providing one amplitude and one distance value per pixel for each frame, at multiple frames per second. This data is used by computer vision software algorithms to identify features, track motion, and ultimately determine a wide variety of hand gestures.”
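The per-pixel output Morris describes (one distance value per pixel per frame) can feed quite simple gesture logic. The sketch below is a hypothetical toy, not Visteon’s actual algorithm: it segments pixels closer than a threshold (a hand in front of the camera), tracks the centroid across frames, and calls a horizontal shift a swipe. Frame layout, thresholds, and gesture rules are all assumptions.

```python
# Hypothetical sketch: turning per-pixel depth frames into a swipe
# gesture. Thresholds and gesture rules are illustrative assumptions.

def hand_centroid(depth_frame, max_range_m=0.5):
    """Return the (row, col) centroid of pixels nearer than max_range_m,
    or None if nothing is within range."""
    pts = [(r, c)
           for r, row in enumerate(depth_frame)
           for c, d in enumerate(row)
           if 0.0 < d < max_range_m]
    if not pts:
        return None
    n = len(pts)
    return (sum(p[0] for p in pts) / n, sum(p[1] for p in pts) / n)

def detect_swipe(centroids, min_shift=2.0):
    """Classify a left/right swipe from centroid motion across frames."""
    xs = [c[1] for c in centroids if c is not None]
    if len(xs) < 2:
        return None
    shift = xs[-1] - xs[0]
    if shift > min_shift:
        return "swipe_right"
    if shift < -min_shift:
        return "swipe_left"
    return None

# Two tiny 4x4 depth frames: a "hand" at 0.3 m moving left to right
# against a 2 m background.
far = 2.0
f1 = [[far] * 4 for _ in range(4)]; f1[1][0] = f1[2][0] = 0.3
f2 = [[far] * 4 for _ in range(4)]; f2[1][3] = f2[2][3] = 0.3
print(detect_swipe([hand_centroid(f1), hand_centroid(f2)]))  # → swipe_right
```

Production systems replace this threshold-and-centroid step with trained computer-vision models, but the pipeline shape is the same: segment by depth, track motion, classify.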

Incorporating illumination is important because ambient light levels constantly change as vehicles move.

A ToF-based 3D camera captures accurate depth data and a grey-scale picture that can be analyzed. This combination of features lets systems monitor multiple functions with just one camera.

“For example, depth data determines exact head positioning and/or the position of other in-cabin objects,” Lass said. “Standard 2D-algorithms also analyze the grey-scale picture to enable functions such as eye-lid closure detection or passenger identification. All of this is possible in daylight, dark and heavily changing light conditions.”

Hardware and software factors

Beyond the inherent lighting issues, the technical hurdles are largely akin to those of other camera systems. Hardware and software designs must be symbiotic: alterations to one can alter the performance of the other. Hardware factors such as depth-map calculation, data post-processing and camera calibration are important, while the application software algorithms also influence overall camera performance, cost and size.

Once these issues are addressed, ToF camera installation is straightforward because these are monocular systems, with little risk of de-calibration due to vibration or thermal bending.

“Once properly specified, the physical implementation of a completed ToF camera into the car is rather simple because there is no extra calibration necessary,” Lass said. “The depth camera only requires one end-of-line calibration at the camera module maker [Tier 1], which is very easy compared to other depth technologies. Calibration time is less than 10 s for a consumer ToF camera module.”

Lengthy automotive development and production cycles have long prevented automakers and startups from working together. While that’s changed a bit, many young companies still find it difficult to work with OEMs.