Melexis and SoftKinetic have teamed up to create time of flight cameras that can detect driver gestures.

Time-of-flight cameras detect drivers’ gestures, aid in autonomy

2014-09-04
Terry Costlow

Two major design trends, the push to provide more exotic human-machine interfaces (HMIs) and the march toward autonomous driving, are sparking interest in camera technologies not currently used in vehicles. Time-of-flight (ToF) cameras provide gesture recognition for HMIs and tell autonomous controllers whether drivers are watching the road with their hands on the wheel.

ToF cameras work much like sonar or radar, emitting a light pulse and measuring how long the light takes to return. From those per-pixel delays the camera builds a 3D model of the scene. The sensors can discern subtle human gestures as well as the shape, size, and behavior of objects and people inside the car. The technology is drawing increased interest from automotive developers, particularly those creating HMIs.
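The ranging principle described above reduces to distance = c·t/2, since the measured delay covers the round trip out to the surface and back. A minimal sketch (illustrative only, not Melexis or SoftKinetic code; all names are hypothetical):

```python
# Speed of light in meters per second
C = 299_792_458.0

def tof_distance(round_trip_seconds: float) -> float:
    """Convert a ToF pixel's measured round-trip delay into distance.

    The light travels to the reflecting surface and back, so the
    one-way distance is half the total path: d = c * t / 2.
    """
    return C * round_trip_seconds / 2.0

# A driver's hand 0.5 m from the camera returns light in roughly
# 3.3 nanoseconds -- the timing resolution a ToF sensor must achieve.
round_trip = 2 * 0.5 / C
print(f"{tof_distance(round_trip):.3f} m")  # → 0.500 m
```

Repeating this conversion for every pixel in the sensor array is what yields the depth map, and in practice sensors infer the delay indirectly from the phase shift of a modulated light source rather than timing individual pulses.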

“Comfort will be the entry point,” said Gaetan Koers, Melexis’ Product Line Manager. “If the driver’s or passenger’s hand moves toward the HVAC or radio control, the system will provide feedback. Gestures like turning a knob can be used to adjust the temperature or radio volume.”

Melexis recently unveiled an automotive grade MLX75023 ToF sensor that’s bundled with software from SoftKinetic, a company that specializes in 3D vision and gesture recognition solutions.

“SoftKinetic developed a library that’s tuned to the automotive context, which is quite different than other consumer markets,” Koers said.

The linkup is yet another step in the convergence of consumer and automotive technologies. ToF cameras for consumer applications are made by Texas Instruments and STMicroelectronics, both of which could readily turn to automotive uses. Along with reliability, operation in bright sunlight is a primary design concern for transportation cameras.

“The challenging part is to work in high levels of sunlight,” Koers said. “We’re at 120 kilolux, which is the amount of light you get at noontime in the Sahara.”

However, that’s only one of the challenges that must be met. Gesture recognition is still in its early stages, and, much as with voice recognition at a similar stage of development, unresolved problems raise questions about its role as an automotive technology.

“The jury is still out,” said Ian Riches, Global Automotive Practice Director at Strategy Analytics. “The second-generation Kinect sensor supplied with Microsoft’s Xbox One console uses a time-of-flight camera for its range imaging, but overall impressions seemed mixed as to the usefulness and accuracy of this particular implementation. Strategy Analytics does not see ToF entering automotive until the long-term, 2020 at the earliest.”

If that timetable plays out, the growing role of autonomous systems could be another driver of adoption. In the initial phases of autonomous driving, vehicle control modules will have to determine whether the driver is distracted or actively involved in driving. Many systems are expected to activate advanced driver assistance functions more quickly when they see that the driver isn’t paying attention.

“Before the vehicle takes over something like braking, you may want to know if the driver is looking at the road or at the passenger or the radio,” Koers said.

He has a more bullish outlook on ToF’s adoption than Strategy Analytics’ Riches.

“We expect projects to reach production next year,” Koers said. “This is mostly driven by European OEMs. A lot of European carmakers feel cameras are limiting their ability to roll out more advanced ADAS systems and more advanced cockpit designs.”