Driver Assistance: Radar or Vision?

PARIS — The message to consumers from the European Microelectronics Summit held here Thursday, Sept. 26, is this: Expect the emergence of vehicle-to-vehicle requirements and the build-out of vehicle-to-infrastructure communication. Drivers can also depend on their cars to help them avoid crises on the road.

But the emphasis is on advanced driver assistance systems (ADAS) rather than self-driving cars. Driving the business of automotive electronics suppliers today is none other than ADAS, according to the summit's presenters.

STMicroelectronics, for one, is advancing ADAS solutions by adding more intelligence to the company's image sensors. ST is developing a black-and-white image sensor capable of capturing red, making it easier for computer vision systems to see traffic lights, stop signs, and tail lights, EE Times has learned.

Freescale Semiconductor, meanwhile, is offering what the company claims to be the industry's most integrated system-level ADAS solution for automotive radar, leveraging state-of-the-art Si and SiGe technologies.

During his presentation at the conference, Martin Duncan, business unit director responsible for ADAS and microcontrollers at STMicroelectronics, predicted that ADAS will become "more and more pervasive on the full range of cars from economy to luxury."

Fueling the trend are more stringent safety performance requirements advanced by the European New Car Assessment Program (Euro NCAP) and other regulators. Euro NCAP, based in Brussels, is backed by several European governments and by the European Union.

Carmakers, in hopes of high safety ratings from Euro NCAP, are working hard to make radar and vision sensors standard features on all vehicles. "We can envision at a certain point, it will become mandatory by law to install ADAS systems on all new cars as it happened in the past for ABS [anti-lock braking system] and airbag," Duncan said.

Radar or vision?
Carmakers have a number of choices when it comes to technologies enabling ADAS. They include short-range radar (SRR), long-range radar (LRR), ultrasonic, and vision.

Among the industry players, Freescale Semiconductor is focused on radar solutions, while ST currently holds the lion's share of the automotive vision sensor market.

[Figure: Automotive Radar Applications. Source: STMicroelectronics]

Asked about the future of ADAS and if there may be one winning ADAS technology, Gerard Maniez, director of the automotive segment for Western Europe at Freescale, said, "You really need to look at the whole package -- including both radar and vision." Euro NCAP isn't mandating either radar or vision. Nor is it asking carmakers to have both.

A carmaker can rely on a more advanced radar system combined with a lighter vision system or, conversely, choose to go with a much more advanced vision sensor integrated with a lighter version of radar system.

ST's Duncan stressed the point that there are certain things a radar/lidar cannot do. These include tasks like detecting lane markings and other road information, detecting and reading traffic signs, reliably detecting pedestrians, and performing lighting functions such as controlling the high/low beams.

On the other hand, there are a number of jobs vision technologies can't handle. Seeing through rain or after dark might be possible, but snow and fog are a tougher proposition. Dirt renders vision sensors blind. Unlike radar, vision technology can't see very far. LRR can comfortably handle between 30 and 150 meters, and SRR can detect objects within 30 meters.
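The range figures cited above imply a simple coverage model for the two radar classes. A minimal sketch of that model, in Python, using only the distances from the article (the function name and thresholds are illustrative, not any vendor's API):

```python
# Illustrative coverage bands from the article:
# SRR detects objects within about 30 m; LRR handles roughly 30-150 m.
SRR_MAX_M = 30.0
LRR_MAX_M = 150.0

def radar_classes_covering(distance_m):
    """Return which radar classes can plausibly see an object at this range."""
    covering = []
    if distance_m <= SRR_MAX_M:
        covering.append("SRR")
    if SRR_MAX_M <= distance_m <= LRR_MAX_M:
        covering.append("LRR")
    return covering

print(radar_classes_covering(10))   # ['SRR']
print(radar_classes_covering(100))  # ['LRR']
```

At the 30 m boundary both classes overlap, which is the point of pairing them: neither band alone spans the full 0-150 m envelope.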

The savings in collision insurance and actual collision costs are offset by the cost of replacing the HUD when a stone hits the windshield (three times a year on some freeways due to construction trucks). But that's OK. I want mine soon, as I am 77 and will need lots of DA in a few years. If we do not follow up on this, we are missing the fact that every country has an aging population that STILL WORKS, and mass transport systems in many countries do not go from home to work in reasonable time. Not to mention those of us who like to take road-based vacations rather than flying in autonomous planes that take off and land at nightmare airports.

With autonomous cars rented in each city, tuned for that city's infrastructure, we can move this technology along city by city, state by state, with a few long-distance standards groups interfacing all the local options. Almost like our phone systems that mix and match broadband, cellular, and whatever is happening at 60 GHz. But phones do not kill us... unless we do not have DA in place when that text message comes.

If a separate driver's ed course is required, the technology will be a failure for two reasons. First, information retention from driver's education courses is minimal (and legacy drivers don't need to take it when they buy a new car). Second, these features will be car-model-specific for a long time.

That said, some of the new safety features do have a rocky road of introduction. I suspect we all remember the first time we drove a car with anti-skid brakes and they engaged. I thought the brakes were failing and pumped them to compensate. Especially as a business traveler who is exposed to many different rental cars during the course of a year, I don't know the repertoire of safety features on a car until they are invoked.

I think this driver assistance is going to require some kind of "head-up display" on the windshield that, from the driver's viewpoint, is simply an enhanced view of what they're already seeing. At most, perhaps this display will work like the headlights' high-beam switch to flip between views (just as the rear-view mirror flips between day and night views). We will need to be sure that the display doesn't become so fascinating that it becomes a distraction...

The challenge, therefore, is to integrate the various information channels and provide a composite overlaid output which is informative (not overwhelming) to the driver in real time.

That is going to be a real challenge. The more I think about ADAS, the more convinced I am that we will need a separate driver's ed course -- just so that we understand what to expect and what not to expect from all the so-called safety features ADAS offers.

As other writers have observed, radar and optical imaging have different strengths and weaknesses. It reminds me of Google Maps: the street map, the satellite view, and the traffic overlay together provide insights and navigational information that isn't available from any single channel. The radar may be very useful at alerting drivers to the car coming in from the right, while the optical system may be very useful in alerting the driver to a yellow light that is on the verge of turning red. Of course this must all be overlaid with common sense. Having a green light doesn't make it right to cross an intersection - you may be "dead right" if some idiot is running the red light on the intersecting street. We also must ensure that a sudden splash of mud on the sensors doesn't "blind" a driver who, however unwisely, is driving in the dark depending on the instruments rather than his own view of the road.
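The fusion idea sketched in this comment can be illustrated in a few lines of Python. This is a hedged toy model, not anyone's production ADAS: detections from the two channels are merged, the highest-confidence report per object wins, and vision reports are dropped when the camera is blinded (the mud scenario above), so radar alerts still get through. All names and confidence values here are made up for illustration.

```python
from dataclasses import dataclass

@dataclass
class Detection:
    source: str       # "radar" or "vision" (illustrative labels)
    kind: str         # e.g. "vehicle_right", "traffic_light"
    confidence: float

def fuse(detections, vision_healthy=True):
    """Merge channels into one alert list, highest confidence first.

    When the camera is blinded (e.g. mud on the lens), vision reports
    are discarded so the radar channel alone still produces alerts.
    """
    usable = [d for d in detections if d.source != "vision" or vision_healthy]
    best = {}
    for d in usable:
        # Keep the most confident report for each object kind.
        if d.kind not in best or d.confidence > best[d.kind].confidence:
            best[d.kind] = d
    return sorted(best.values(), key=lambda d: -d.confidence)

alerts = fuse([
    Detection("radar", "vehicle_right", 0.9),
    Detection("vision", "traffic_light", 0.8),
    Detection("vision", "vehicle_right", 0.5),
])
print([a.kind for a in alerts])  # ['vehicle_right', 'traffic_light']
```

A real system would fuse at the track level with timestamps and spatial association, but even this toy shows the two design points the comments raise: redundancy across channels and graceful degradation when one channel fails.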

Definitely, that's the current mainstream thinking in the U.S. Many in the automotive industry lean toward putting more intelligence into cars (i.e., ADAS). Especially under the current political and financial climate in the U.S. today, that's a good bet.

On the other hand, some countries in Europe, and Japan (up to a point), are driving toward a more balanced approach -- smart infrastructure and smart cars.

There is huge geographical diversity in the approaches to this emerging field.

You will not put an infrastructure-only network into place. That still leaves the person to do as they will behind the wheel. You presume too much that the driver is the center point of this design change. The US government has already stated that, as a prelude to autonomous vehicles, vehicles will have to communicate with the infrastructure and vehicle to vehicle. The 2014 model year will bring to market the first multi-sensor vehicles -- that is, optical and radar. This is a typical product development cycle where you allow the sensor technology to mature. Both networks will be required. Optical cannot see the stop sign behind overgrown vegetation or know a stop sign has been knocked down. This is where the vehicle is pulling GIS information from the cloud (vehicle to infrastructure) and reading internal network data and acting on it. Whatever the operator costs are, they will not be enough to mitigate city congestion and the other factors looming.