TI Launches Purpose-Built SoCs for Automotive Vision

TOKYO -- As Advanced Driver Assistance Systems (ADAS) move onto center stage for safety and automation -- two features critical to the automotive industry’s future -- Texas Instruments on Wednesday, October 16, rolled out a new family of SoCs purpose-built for automotive vision-processing applications.

TI’s new family of ADAS SoCs, based on its heterogeneous system architecture, is designed to run high-performance vision processing at low power, as demanded by carmakers and tier-one automotive suppliers.

TI joins leading vision-processor companies such as CogniVue, Mobileye, Freescale (which licensed CogniVue’s Image Cognition Processing IP), and STMicroelectronics (which has worked with Mobileye on two generations of vision processors).

TI hopes to differentiate its new family of ADAS SoCs, designated TDA2x, through a flexible and scalable platform.

Its design goal is to enable a broad range of ADAS applications -- everything from a stereo front camera to side cameras, surround view, and sensor fusion. The common architecture should make the SoCs easy to design into a variety of vehicles, from entry-level cars to high-end luxury models, claimed Brooke Williams, ADAS business manager at TI.

The building blocks of the TDA2x include two C66x DSP cores, up to four vision accelerators, two ARM Cortex-A15 cores, two ARM Cortex-M4 cores, internal memory, as many as six camera inputs, and multiple display outputs.

Building blocks inside TI's ADAS SoC.
(Source: Texas Instruments)

The TDA2x is designed to offer the real-time vision analytics engine needed to analyze each video camera frame and extract the correct information for intelligent decision-making.
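To put that real-time requirement in concrete terms, here is a minimal sketch of a per-frame loop in plain C. It assumes a 30 fps camera, which leaves a budget of roughly 33 ms per frame; the run_analytics() function and its frame-count trigger are hypothetical placeholders for illustration, not TI’s software.

```c
#include <stdbool.h>
#include <stdio.h>

/* Illustrative only: at 30 fps, the whole capture -> analyze -> act
 * chain has a ~33 ms budget per frame. */
#define FRAME_BUDGET_US 33333u

typedef struct { bool obstacle; } analysis_t;

/* Placeholder stage; a real system would run this on the SoC's
 * vision accelerators and DSPs. */
static analysis_t run_analytics(int frame_no)
{
    analysis_t a = { .obstacle = (frame_no % 30 == 0) };
    return a;
}

int main(void)
{
    for (int frame = 0; frame < 90; frame++) {   /* 3 s of video */
        analysis_t a = run_analytics(frame);
        if (a.obstacle)
            printf("frame %d: obstacle, react within %u us\n",
                   frame, FRAME_BUDGET_US);
    }
    return 0;
}
```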

TI explained:

It [vision processing for ADAS] not only needs enormous computing capacity to process data in the split-second intervals required to allow a fast-moving vehicle to make the correct maneuver, it also needs wide I/O to feed the vision analytics engine inputs from multiple cameras to allow simultaneous correlation. Low power, low latency and reliability are also key aspects of the automotive vision systems.

Critical to the TDA2x are “purpose-built vision accelerators,” called Embedded Vision Engines (EVEs), said TI’s Williams. Developed to complement the DSPs, EVEs run on a “low to mid-level kernel.” Each EVE core features special instruction sets to “make the vision accelerator ADAS friendly,” he added. The EVE core, for example, offers eight times the compute performance of a Cortex-A15 (with Neon) at the same power budget, according to TI.
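As an illustration of that division of labor (not TI’s actual EVE instruction set or libraries), the sketch below splits a toy pipeline the same way: a regular, per-pixel gradient kernel of the kind a wide-SIMD vision accelerator handles well, followed by a thresholded summary of the sort a DSP would consume.

```c
#include <stdint.h>
#include <stdio.h>

/* "Accelerator-style" stage: a low-level, data-parallel kernel
 * computing a horizontal gradient for every interior pixel. */
static void gradient_x(const uint8_t *in, int16_t *out, int w, int h)
{
    for (int y = 0; y < h; y++)
        for (int x = 1; x < w - 1; x++)
            out[y * w + x] = (int16_t)in[y * w + x + 1]
                           - (int16_t)in[y * w + x - 1];
}

/* "DSP-style" stage: reduce the gradient map to a feature count. */
static int count_edges(const int16_t *grad, int n, int16_t thresh)
{
    int edges = 0;
    for (int i = 0; i < n; i++)
        if (grad[i] > thresh || grad[i] < -thresh)
            edges++;
    return edges;
}

int main(void)
{
    enum { W = 64, H = 48 };
    static uint8_t img[W * H];
    static int16_t grad[W * H];

    for (int i = 0; i < W * H; i++)          /* synthetic ramp frame */
        img[i] = (uint8_t)((i % W) * 4);

    gradient_x(img, grad, W, H);             /* accelerator stage */
    printf("edge pixels: %d\n",
           count_edges(grad, W * H, 4));     /* DSP stage */
    return 0;
}
```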

Scalability
The flexible architecture of the TDA2x allows an ADAS designer to adjust the number of EVE and DSP cores, depending on the application. Entry-level applications, such as high-beam assist, might need only one EVE and one DSP core, while park-assist applications based on surround view demand graphics overlay and multiple camera inputs in the ADAS SoC. Further, certain high-end ADAS applications require fusing independent but redundant information sourced from cameras and radar; hence the dual A15 cores in the ADAS SoCs become critical.
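A small configuration table makes the scaling idea concrete. The application rows echo the examples above, but the per-row core counts and the table structure are hypothetical illustrations, not TI’s software or published configurations.

```c
#include <stdio.h>

/* Hypothetical mapping of ADAS applications to EVE/DSP/A15 resources;
 * numbers are illustrative, following the examples in the article. */
typedef struct {
    const char *application;
    int eve_cores;   /* vision accelerators */
    int dsp_cores;   /* C66x DSPs           */
    int a15_cores;   /* fusion / overlay    */
    int cameras;
} adas_profile_t;

static const adas_profile_t profiles[] = {
    { "high-beam assist (entry)",   1, 1, 1, 1 },
    { "surround-view park assist",  2, 2, 1, 4 },
    { "camera+radar sensor fusion", 4, 2, 2, 6 },
};

int main(void)
{
    for (size_t i = 0; i < sizeof profiles / sizeof profiles[0]; i++)
        printf("%-28s EVE:%d DSP:%d A15:%d cams:%d\n",
               profiles[i].application,
               profiles[i].eve_cores, profiles[i].dsp_cores,
               profiles[i].a15_cores, profiles[i].cameras);
    return 0;
}
```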
