Q&A

Perceptions In Autonomous Driving: Embedding A Multi-Sensor Solution Into A Single Microcontroller Platform

In the not-so-distant future, a person driving home from work might glance at the car in the next lane, only to realize that no one is driving it – no human, that is. With major companies already deploying functional autonomous systems for deliveries, a sight like this is well on its way to becoming “the norm.” Today, however, seeing a self-driving car on the road next to you would be a rather novel experience.

There remain many questions about autonomous systems, concerning everything from how different embedded sensor and perception technologies will work together, to how safe these systems really are. These are the types of questions that the LETI research institute of CEA Tech attempted to address when creating its sensor fusion solution intended for autonomous driving applications.

LETI’s SigmaFusion embedded sensor fusion solution is designed to be compatible with any type of sensor, deliver fast and accurate environmental perception in real time, and transform distance data into clear information about the driving environment. The institute recently announced that SigmaFusion has been embedded into Infineon Technologies’ AURIX TC29x platform to enable automotive developers to control powertrain, body, safety, and ADAS applications with a single microcontroller family.

I recently interviewed Julien Mottin, a research engineer in sensor fusion / automotive perception at LETI, about SigmaFusion and the processes and challenges involved in its design. We also delved into the rationale behind embedding it into the AURIX microcontroller platform and what the future has in store for these combined systems.

PHOTONICS ONLINE — LETI claims SigmaFusion provides the most accurate environmental perception and real-time performance in a mass-market controller — What was the process involved in designing and developing the embedded sensor-fusion solution?

Julien Mottin — LETI is devoting a large part of its research toward the miniaturization and integration of various electronic components and systems. It has an entire division working on hardware/software design and integration in multiple application domains, but especially in embedded computing for automotive. Today, the automotive sector is facing very important challenges to progress towards autonomous driving, electric vehicles, and shared mobility.

Among many technical challenges, environment perception remains a key function for future intelligent vehicles. Today, perception relies on multiple heterogeneous sensors with various operating performances and precision. The computing demand for gathering and understanding all the sensors’ measurements is very high, and many chip vendors are proposing novel architectures with more and more memory and computing capability (e.g., NVIDIA Xavier, which offers 10 TFLOPS at 10 W).

Another fundamental constraint for automotive applications is the safety requirements. Perception is a critical task for autonomous driving; it has to be assessed and guaranteed. Today, the automotive platforms already certified for critical functions (ISO 26262 ASIL-D) typically are microcontroller platforms.

So, we started wondering, “how much of the fusion process would be feasible to integrate on a microcontroller?” We wanted to stick with a sound and widely used fusion framework — not re-invent a novel fusion theory — so we focused on the hardware/software integration.

PO — How is SigmaFusion able to convey information about the driving environment 100 times more efficiently than comparable systems?

JM — We have developed a novel arithmetic that is specifically tuned to compute Bayesian fusion. Its major benefit is that it can be implemented using only integer instructions – so, it can be executed by a very large class of processors, including microcontrollers.

It can be proven mathematically equivalent to a regular, floating point implementation. In addition, the numerical error induced by the quantization procedure of SigmaFusion can be fully handled and bounded by the user or the application.

Along with the fusion software, SigmaFusion also contains a set of tools to interface with various sensors, such as LiDARs, radars, time-of-flight cameras, stereo vision, etc.

PO — What challenges did LETI encounter in making SigmaFusion compatible with any type of sensor?

JM — As long as the sensor delivers spatial information (distance to a target, position of a detected object), it can be interfaced with SigmaFusion. At LETI, we have technology experts in vision, lasers, detectors, and radio frequency (RF). We have been working across many research fields in a transversal way to be compatible with many sensors.

PO — What efforts were involved in tailoring SigmaFusion to various platforms, as well as different vehicle types and models?

JM — As a research & technology organization (RTO), we don’t deliver products. Rather, we build IP and transfer it to industrial partners. It is then our mission to help them build the final solutions, but in the end, it is their job to tailor the IP to specific product configurations.

On our side, we have been working on an architecture that ensures a clear separation between core features and platform-specific features, with proper abstraction layers.

PO — How does SigmaFusion ensure the same safety features and performance levels for different vehicle types and models?

JM — The overall safety can only be evaluated for a specific use-case or application by the supplier of the function. At LETI, we have been putting efforts into guaranteeing the predictability of the execution of the SigmaFusion software. We can specify, test, and prove the software behavior for many application configurations.

PO — What were the considerations when deciding to embed the SigmaFusion solution into Infineon Technologies’ AURIX TC29x microcontroller platform?

JM — We wanted to demonstrate that SigmaFusion can run on an ISO 26262 ASIL-D platform, so we picked a widely used, automotive-grade ECU – the Infineon AURIX TC29x.

PO — What applications do you see in the future for the SigmaFusion solution — outside of autonomous and driver-assist vehicles — as it is embedded in the AURIX TC29x microcontroller family? Do any of these applications have a timeline?

JM — We are currently evaluating SigmaFusion for drone applications, especially in assessing the flight plan and avoiding unexpected obstacles.

We also demonstrated SigmaFusion on a STMicroelectronics STM32F7 platform at CES earlier this year. So, apart from critical applications, we could also embed SigmaFusion into consumer platforms for wearables or drone applications. We have, for example, started an EU-funded program called INSPEX, where we will use SigmaFusion on a very-low-power microcontroller to perform perception tasks to assist visually impaired people. Preliminary results should be available at the beginning of 2018.

PO — Are there any other current or future endeavors LETI is pursuing in this field?

JM — We are currently looking to see if parts of SigmaFusion could be integrated directly in sensor modules, allowing them to preprocess data at the sensor level and reduce the data bandwidth required on the car network.

About the Author:

Marissa Stonefield is a Web Content Specialist for Photonics Online, RF Globalnet, and Med Device Online. She graduated from Messiah College with a B.A. in marketing, and from the Community College of Beaver County with an A.B. in business administration. She has previously worked as a content writer for Vanko Trading Inc., and as a journalist for Examiner.com, and Weddings Year Round Magazine in Lancaster, Pa.
