Tesla Autopilot And Other Braking Systems Are Blind To Parked Fire Trucks

It’s not just Tesla Autopilot that fails to see stationary objects.

As we previously reported, a Tesla Model S crashed into the rear of a parked fire truck near Los Angeles this week. According to the driver, the vehicle was in Autopilot mode, a semi-autonomous driving feature that assists the driver under certain conditions (but requires continuous driver engagement and hands on the wheel).

Of course, people were shocked that an advanced vehicle would just fail to notice a huge red truck in its path. Even if Autopilot wasn’t engaged, shouldn’t the car’s standard Automatic Emergency Braking (AEB) kick in? The answer is … not necessarily.

Tesla Autopilot 2.0 is designed to see at least one car ahead of the vehicle you’re following, so it may lessen this problem. However, if what’s ahead is simply a stationary object, you may be out of luck.

The AEB may well have assisted: the driver was said to have been traveling 65 mph, yet there was no substantial damage and no injuries. AEB’s job is not to completely stop the vehicle prior to a crash, but to slow it down enough to lessen the impact.

The automaker has yet to confirm whether or not Tesla Autopilot was engaged. Tesla also hasn’t specified if AEB kicked in or not. The company does offer the following warning in its manual (via Wired):

“Traffic-Aware Cruise Control cannot detect all objects and may not brake/decelerate for stationary vehicles, especially in situations when you are driving over 50 mph (80 km/h) and a vehicle you are following moves out of your driving path and a stationary vehicle or object is in front of you instead.”

Let’s think this over for just a minute. If your car hits the brakes every time there’s a piece of debris on the road, or it sees any type of stationary object, this could actually cause an accident, or give the driver whiplash. False positives can be a real problem for such technology. All current automatic emergency braking and adaptive cruise control systems are designed to be “blind” to various fixed objects. If not, the system wouldn’t be able to do its job.

According to Wired, Volvo’s Pilot Assist system is much the same. The vehicle’s manual explains that not only will the car fail to brake for a sudden stationary object, it may actually race toward it to regain its set speed:

“Pilot Assist will ignore the stationary vehicle and instead accelerate to the stored speed. The driver must then intervene and apply the brakes.”

Radar detects the speed of objects. It also sees medians, road signs, traffic lights, etc. While it’s good at detecting something that’s moving, it’s not so good at “seeing” the fixed objects, and programmers are making sure that it doesn’t react to something stationary. This is a good thing, however, since it would cause significant havoc if cars on the freeway were constantly slamming on their brakes for no reason.
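To make the “blind by design” point concrete, here is a toy sketch of how a Doppler-based filter drops stationary returns. This is an illustration only, not any vendor’s actual logic; all names and the tolerance value are assumptions.

```python
# Toy sketch of why an ACC/AEB radar pipeline discards stationary
# returns. The radar reports each target's range and relative
# (closing) speed; a target whose relative speed is roughly the
# negative of our own speed is standing still, and classic systems
# drop it to avoid false positives on signs, bridges, and medians.

def filter_radar_targets(targets, own_speed_mps, tolerance_mps=1.0):
    """Keep only targets that appear to be moving relative to the road.

    targets: list of dicts with 'range_m' and 'rel_speed_mps'
             (negative rel_speed = target closing on us).
    """
    tracked = []
    for t in targets:
        # Ground speed of the target = our speed + its relative speed.
        ground_speed = own_speed_mps + t["rel_speed_mps"]
        if abs(ground_speed) > tolerance_mps:
            tracked.append(t)  # a moving vehicle: track it
        # else: stationary clutter (sign, bridge, parked truck): ignored
    return tracked

# A parked fire truck 60 m ahead while we drive 29 m/s (~65 mph):
targets = [
    {"range_m": 60.0, "rel_speed_mps": -29.0},  # stationary truck
    {"range_m": 40.0, "rel_speed_mps": -4.0},   # slower car ahead
]
print(filter_radar_targets(targets, own_speed_mps=29.0))
```

Note that the parked truck, closing at exactly our own speed, is indistinguishable from roadside clutter in this scheme; only the slower moving car survives the filter.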

Cruise Automation Chevrolet Bolt EV with lidar

What’s the solution?

Wired explains:

“The long term solution is to combine several sensors, with different abilities, with more computing power. Key amongst them is lidar. These sensors use lasers to build a precise, detailed map of the world around the car, and can easily distinguish between a hub cap and a cop car. The problem is that compared to radar, lidar is a young technology. It’s still very expensive, and isn’t robust enough to survive a life of hitting potholes and getting pelted with rain and snow. Just about everybody working on a fully self-driving system—the kind that doesn’t depend on lazy, inattentive humans for support—plans to use lidar, along with radar and cameras.”

“Everybody” doesn’t include Tesla CEO Elon Musk. Musk has repeatedly made clear his lack of commitment to lidar. Is this because he’s well aware of the above?

Lidar is new tech, and it doesn’t hold up to weather conditions. Not to mention, it’s currently pricey. One would think that once the tech is more heavily tested, improved, made to be more durable, and is cost-effective, Musk would have no grounds to argue against it. However, at this time, Tesla won’t be moving to lidar. Nonetheless, Tesla vehicles have been spotted out testing with lidar in place.

Lidar or not, you also won’t easily find a system in any of today’s cars that is sure to stop for a parked fire truck. Cruise Automation’s autonomous Chevrolet Bolt uses lidar, and while it didn’t crash into a parked taco truck (albeit at a very slow speed), it was clearly confused by the presence of the stationary obstacle.

Yes – exactly. The driver needs to stay alert and not ‘check out’, and is ultimately responsible for controlling the vehicle. The problem with Autopilot – or any system that is not fully autonomous – is that it encourages the driver to check out to some degree, as drivers learn to trust the car more than they should. I’m convinced that AP is a waste in its current state – other than as a toy to ‘show off’ to your friends. Either the human or the car should have full autonomy – not a flawed mix of human and car.

“I’m convinced that AP is a waste in its current state – other than as a toy to ‘show off’ to your friends. Either the human or the car should have full autonomy – not a flawed mix of human and car.”

While your argument is logically valid, it fails a reality check. The NHTSA says that Tesla cars with Autopilot + AutoSteer installed have a 40% lower accident rate than Tesla cars without those installed.

I think what your argument fails to take into account is that the average driver doesn’t maintain full awareness of the road at all times. People get distracted, and not just by texting on a cell phone. I once rear-ended another car because I was looking for a street I was not familiar with, and I spent too many seconds trying to read a street sign instead of noticing that a car had stopped in my lane to make a left turn.

I don’t dispute the NHTSA report. However, there are very effective solutions to prevent rear-ending someone that don’t require AP. Until we reach higher levels of and ultimately full autonomy, I maintain that the best combination of driver and car to prevent accidents is one in which the driver is in full control, and the car plays a secondary role by assisting the driver with warnings, automatic braking, etc. Tesla AP flips those roles so that the car is in control and the human becomes the backup in assisting AP. That also can be effective. But when human drivers start to believe that AP can solely handle the driving duties and they check out mentally, redundancy is lost and safety suffers. My worry is that more Tesla drivers are becoming too comfortable with AP, and are not playing the necessary role of vigilant backup.

“The data (supplied by Tesla based on airbag deployment) show that the Tesla vehicles’ crash rate dropped by almost 40 percent after Autosteer installation.”

“Approximately one-third of the subject vehicles accumulated mileage prior to Autopilot installation. The crash rates are for all miles travelled before and after Autopilot installation and are not limited to actual Autopilot use.”

The actual rates are 1.3 crashes per million miles before Autosteer and 0.8 crashes per million miles after Autosteer.

Based on Figure 11, “Crash Rates in MY 2014-16 Tesla Model S and 2016 Model X Vehicles Before and After Autosteer Installation.”

There is NO DIRECT comparison of vehicles before and after Autopilot; the report gives only a total/diluted crash rate per miles traveled that includes both groups of cars.

That rate is ONLY based on airbag deployments, which AEB can reduce by lowering crash speed.

It doesn’t separate the two groups of vehicles out in a direct comparison.
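The arithmetic behind the “almost 40 percent” claim does check out against the rates quoted above:

```python
# Checking NHTSA's "almost 40 percent" figure using the rates quoted
# above (airbag-deployment crashes per million miles, before and
# after Autosteer installation).

before = 1.3  # crashes per million miles, pre-Autosteer
after = 0.8   # crashes per million miles, post-Autosteer

reduction = (before - after) / before
print(f"{reduction:.1%}")  # -> 38.5%, i.e. "almost 40 percent"
```

The point above still stands, though: this is a pooled rate, not a controlled before/after comparison of the same vehicles.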

One more thing where GM seems to have the upper hand on Tesla in addition to autonomous driving tech. The Bolt/Ampera-E looks like it would have no issues detecting a stopped car (or firetruck) with its AEB system:

A low-speed system is not the same as a high-speed one. The Bolt EV doesn’t even have adaptive cruise control, and if it did, it would not have stopped for the truck, since ACC only detects changes in the speed of moving vehicles. It’s counterintuitive, but the form of radar used in ACC doesn’t detect stopped objects.

The better contrast would be to Super Cruise used on the Cadillac CT6. AFAIK that’s the most advanced cruise available in a production car. While I’m not sure, my guess is it would have handled this situation and stopped. But that’s a different and more advanced technology.

“One more thing where GM seems to have the upper hand on Tesla in addition to autonomous driving tech. The Bolt/Ampera-E looks like it would have no issues detecting a stopped car (or firetruck) with its AEB system:”

Excuse me, but are you seriously trying to tell us that the Bolt EV will never (or even almost never) get into an accident involving it hitting a parked car or, in fact, any stationary solid object?

Because that’s what you’re claiming.

Hopefully it’s not necessary to point out it’s ridiculously easy to prove this is factually incorrect.

This accident happened at 65 mph. No Bolt has AEB that operates at that speed; the Bolt only has what GM calls “low speed AEB”. Here is what GM says about the Bolt’s low-speed AEB:

“The system works when driving in a forward gear between 8 km/h (5 mph) and 80 km/h (50 km/h (50 mph). It can detect vehicles up to approximately 60 m (197 ft).”

Even worse, that low-speed AEB isn’t even standard like in the Tesla. It is only available as an extra cost option. And then only in the highest trim. And then only after you buy two additional packages.

So the vast majority of Bolts don’t even have ANY AEB AT ALL, and the Bolts that do have AEB wouldn’t have been active at that speed.

You are trying to say that a system that either wouldn’t exist or wouldn’t have been active in the Bolt is better than a system that may very well have applied the brakes and slowed the vehicle enough that the accident wasn’t deadly.
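The manual’s numbers can be sketched as a simple activation check. This is illustrative only; the function and threshold names are assumptions, not GM’s actual software.

```python
# A hedged sketch of the speed gating GM's manual describes for the
# Bolt's low-speed AEB: active only between 5 mph and 50 mph, with a
# detection range of roughly 60 m. Names here are illustrative.

def low_speed_aeb_active(speed_mph, target_range_m):
    """Return True if the low-speed AEB could act on this target."""
    in_speed_window = 5.0 <= speed_mph <= 50.0
    in_detection_range = target_range_m <= 60.0
    return in_speed_window and in_detection_range

# The crash reportedly happened at 65 mph:
print(low_speed_aeb_active(65.0, 30.0))  # -> False: above the window
print(low_speed_aeb_active(40.0, 30.0))  # -> True
```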

True, but most people would argue that obstruction-sensing tech is an overall net positive for safety. Crashes like this get all kinds of press, but what about the crashes where there was no such technology and the technology would have prevented the crash? What about all of the accidents that were avoided because this technology was in place? Those never make the news. In short, we are seeing a skewed view of how this innovative and soon-to-be-ubiquitous tech fails instead of the countless lives it has saved and will save. It is more fun and satisfying for the haters to say “see, it sucks, Tesla sucks, it’s not ready for prime time” instead of “wow, it isn’t perfect, but we are going to be safer due to all of these innovations.” I can’t wait until all cars have this technology.

I agree with you on welcoming the technology.
I just think naming it Autopilot was way over the top and a really stupid decision. It just begs for bad press when something goes amiss, even when it’s working as designed. That hurts the progress of the whole industry.

“…we are seeing a skewed view of how this innovative and soon to be ubiquitous tech fails instead of the countless lives it will and has saved.”

Exactly, and thank you.

This is the same stupid argument that opponents of mandatory seat belt laws used, in the days before air bags. “Well, what if you’re trapped in a burning car by a jammed seat belt, and you can’t escape? Wearing a seat belt might kill you!”

Perhaps there have been very rare cases where someone was in an accident severe enough to jam his seat belt but leave him conscious and able to climb out of the car, but the odds are far, far greater that (in a car without air bags) you’ll simply be killed if you’re in a severe accident and you’re not wearing a seat belt.

I disagree. Autopilot was perfectly named, considering its origins in helping pilots fly aircraft over long distances. And just as airbags, which can sometimes cause unwanted injuries, are clearly a net benefit in cars, so is this technology.

“A known flaw that Tesla chose to ship with, causing at least one AP death in China.”

As usual with posts from “Six (Pretend) Electrics”, this is a lie or at best a half-truth.

A family in China sued Tesla, claiming — without evidence — that a driver was killed operating a car under Autopilot (or more precisely, AutoSteer). The family refused to allow Tesla to examine the car to find out if AutoSteer was actually engaged at the time of the accident.

And even if AutoSteer was activated, so what? The driver is warned repeatedly to observe the road and be ready to take over at any time. In fact, Autopilot occasionally tests to see if the driver is paying attention, and if he’s not, it pulls over to the side of the road and stops.

Use of AutoSteer does not remove the responsibility of driving from the driver.

The radar used for adaptive cruise control is not terribly robust. It’s a crippled form of Doppler that detects only relative speeds; hence it won’t “see” a stationary object. Contrary to what the article implies, this is a limitation, not a feature. Assuming it’s half functional, there shouldn’t be a problem distinguishing between a piece of road debris and a fire truck.

The bigger problem is that AP apparently doesn’t work. AP is supposedly a camera-based system. Given that viewing conditions were good, if the system worked, then the car would have literally “seen” the truck. Note this is not the same as the tractor-trailer crash in Florida, where the light reflecting off the trailer may have confused the camera. No excuses here.

“The bigger problem is that AP apparently doesn’t work. AP is supposedly a camera based system. Given viewing conditions were good, if the system worked then the car would have literally ‘seen’ the truck.”

Wow! That has so many errors of fact and logic that I hardly know where to start.

1. Cameras don’t literally “see” anything. Nor does the software optical object recognition system used by Autopilot to (unreliably) detect objects. Cameras may work like eyes, but it’s not your eye that “sees” something; it’s your brain with its highly evolved and very complex visual cortex, which interprets visual images. A cortex that is utterly lacking in computers.

2. This has little or nothing to do with “viewing conditions”. This has to do with the fact that current cars, even ones with driver assist features such as “Autopilot”, don’t have the kind of situational awareness that will be needed for full autonomy.

3. Proper situational awareness in a self-driving car will require a LiDAR based SLAM* system. No production car, including Tesla cars, currently uses LiDAR.

*SLAM stands for Simultaneous Localization And Mapping technology, a process whereby a robot or a device can create a map of its surroundings, and orient itself properly within this map in real time.

4. If DonC is correct in stating that the radars in cars only detect the relative speed difference in moving objects, and don’t detect stationary objects at all, then they are not even capable of providing a SLAM “view” of the environment. They are just intended to detect certain things under certain conditions.

5. Autopilot works reasonably well for what it is intended to do. The problem is that a lot of people can’t understand that there are varying degrees of autonomy; it’s not an all-or-nothing thing. Just because a Tesla car equipped with Autopilot + AutoSteer can perform lane-keeping reasonably well under most circumstances, and also does reasonably well at following the car ahead, that doesn’t mean it can “see” anything in the way humans (and other animals) see things. Its sensors can detect certain things, and it can (apparently reliably) read traffic signs. But it does not actually “see” the world around it, and potentially that includes a fire truck stopped in your lane.

You stated: “The problem is that a lot of people can’t understand that there are varying degrees of autonomy; it’s not an all-or-nothing thing. Just because a Tesla car equipped with Autopilot + AutoSteer can perform lane-keeping reasonably well under most circumstances, and also does reasonably well at following the car ahead, that doesn’t mean it can “see” anything in the way humans (and other animals) see things. Its sensors can detect certain things, and it can (apparently reliably) read traffic signs. But it does not actually “see” the world around it, and potentially that includes a fire truck stopped in your lane.”

You are making my case – although I would dispute that it does these tasks reasonably well. You are perfectly describing a system that is not ready for use by the general public. In other words, it is not ready for prime time. See the link for a compilation of problems – from a single driver – with the system Tesla calls ‘Autopilot’.

We are currently in that uncomfortable middle ground where cars have achieved some autonomy, but full autonomy has not arrived. I believe the experience of the airline industry – which has already been on this path far longer – is most instructive.

Lidar is not a new tech. The military has used it for decades. When I served, it was used in airborne laser scanning of the ground, for mapping and strategic reasons. It was also used in automatic fire stations/sentinels that scanned set sectors for enemy soldiers or vehicles. At the same time, it had to see the difference between a soldier and a moose, so it did not slay all the wildlife passing it in the woods.
Not just to save the wildlife, but to save the ammo for the enemy.

It is also used in some missiles, troop transporters and so on.

New in civilian cars yes, but the technology has been used for decades.

LiDAR is currently being used in test vehicles and prototype self-driving cars, but not — so far as I know — in any production cars.

I think we’ll start seeing the first production cars equipped with LiDAR within the next couple of years or so. Tesla is currently resisting use of LiDAR in their production cars, but I expect that to change.

Tesla cars equipped with Autopilot do use ultrasonic detectors, but they are only effective at very short ranges; as I recall, less than 20′.

More sophisticated forms of radar are available, such as high-resolution radar, although I don’t think that tech is being used in cars. All around, LiDAR is best, and prices for solid-state LiDAR scanners have recently dropped to where it’s reasonable to think automakers can start putting them into mass-produced cars.

This has killed, and will kill, more people; it is only a question of time. And it is not only Tesla. I recently watched a test drive of a new Audi A4 on YouTube, equipped with a similar autopilot system along with road-sign reading. It drove on the highway at 130 km/h; then the road-sign detection system spotted a 70 km/h speed limit sign on a small road running parallel to the highway, which resulted in rather hard braking. No one expects a car to drop from 130 km/h to 70 km/h for no reason on the highway, so the autopilot could very well have caused an accident. I guess a few casualties are needed “for science.” As long as those victims are single male nerds, I take it society will tolerate this rather well; a mother with two kids would be another matter as far as public perception is concerned.

It’s notable that the only people here actually suggesting that Tesla should stop using Autopilot are those who have established reputations as Tesla Hater cultists: Steve, bro1999, Six Electrics, and you, Another Euro.

Distracted, angry, etc., humans are bad drivers, to be sure, but humans as a general rule are, I would say, capable of exceptional driving. Ever seen stunt car driving? Or people with a spotless accident record despite sharing the road with idiots?

The thing about humans is we’re capable of driving well (at least some of us), but even as a good driver it’s hard not to get distracted or make a rare mistake, especially after driving for long periods of time, or if you’re tired.

I’m generally a good driver, but I had an incident in a mall last year where I came up to a stop sign, looked both ways, and then hesitated because I couldn’t decide which way I should go. Then when I had decided I almost drove out in front of an oncoming car.

The trouble was in my mind I had already looked, so I didn’t think to look again. But a few seconds had passed as I hesitated–more time than I realized–and things had changed. The moral of the story is that everyone makes mistakes, even the best of us.

I have no real doubt I am more aware of the road than a computer can be–what’s going on in multiple lanes, whether another car is acting erratically or looks like it’s “thinking of changing lanes”. I’ve had situations where I was able to anticipate a driver doing something weird and avoid danger just by observing how they were driving.

For example, one time a car turned the corner in a wide right turn, going across two lanes. That and just how the guy was driving gave me the impression that he just wasn’t with it. Sure enough a few seconds later, he moved over into my lane literally right out in front of me, even though I was moving much more rapidly. But because I’d already recognized him as acting strangely, I was prepared and smoothly changed lanes to get out of his way. In the end I’m not sure the guy ever realized I was there or that I’d just avoided an accident for us.

That’s something a computer just can’t do, at least not any time soon, perhaps not ever. But on the other side of the coin, even if the computer is only 80% as good, it’s 80% as good 100% of the time, whereas sometimes I might be less than that. Like in the mall, where I got distracted for just a few short seconds.

Nope, not at all. The human eye can’t see in the dark, can be blinded by glare reflecting off polished surfaces, and is subject to optical illusions. Furthermore, it’s not the human eye that gives humans superior vision; it’s the highly developed visual cortex in the human brain.

It’s absurd to talk about making self-driving cars that perceive the environment as humans do. Computers aren’t equipped with our “wetware”, and the goal of those designing self-driving cars isn’t to mimic human driving — it’s to produce cars which drive more safely than humans do. Far more safely.

Part of that safety is going to come from using LiDAR-based SLAM* systems to perceive the environment, vastly superior to the use of stereo camera images and unreliable optical object recognition software. Such software is a rather poor substitute for the visual cortex in the human brain, and anyway active scanning with LiDAR or radar is far preferable to, and much simpler and more reliable than, passive scanning using video cameras, which requires use of software to (unreliably) interpret visual images. Active scanning has the added advantage that it works just as well in the dark, unlike the human eye.

*SLAM stands for Simultaneous Localization And Mapping technology, a process whereby a robot or a device can create a map of its surroundings, and orient itself properly within this map in real time.
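To make that footnote concrete, here is a toy illustration of the “mapping” half of a SLAM system. The pose is assumed known here, which is exactly what real SLAM must estimate simultaneously; real systems also use probabilistic cell updates. This is the bare idea only, with illustrative names.

```python
# Toy occupancy-grid mapping: given the car's pose (x, y, heading)
# and a set of lidar range returns, mark the cells where the beams
# terminated as occupied. Real SLAM estimates the pose at the same
# time as it builds the map; here the pose is given.

import math

def update_occupancy_grid(grid, pose, scan, cell_size=1.0):
    """grid: set of (ix, iy) occupied cells.
    pose: (x, y, heading_rad); scan: list of (bearing_rad, range_m)."""
    x, y, heading = pose
    for bearing, rng in scan:
        # Project each lidar return into world coordinates.
        hx = x + rng * math.cos(heading + bearing)
        hy = y + rng * math.sin(heading + bearing)
        grid.add((int(hx // cell_size), int(hy // cell_size)))
    return grid

grid = set()
# Car at the origin heading along +x; an obstacle about 10 m ahead
# seen by three beams:
scan = [(math.radians(a), 10.0) for a in (-5, 0, 5)]
update_occupancy_grid(grid, (0.0, 0.0, 0.0), scan)
print(sorted(grid))
```

The payoff is that a parked fire truck simply shows up as a cluster of occupied cells directly in the planned path, regardless of whether it is moving.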

Then you need to read DonC’s posts here. If he’s right, then radars used in current cars are not designed to detect stationary objects at all.

I think that many or most people who are not computer programmers (as I am) fail to understand that computers, even when connected to cameras, simply don’t have any situational awareness. They don’t “see” anything. The radar systems in cars equipped with AEB can only detect certain things: the things that they are designed and built to detect.

Apparently that does not include fire trucks or other large obstacles stopped in the lane of traffic where you are driving.

Seems like a protocol is needed for emergency vehicles to broadcast some kind of signal to autonomous/semi-autonomous vehicles to alert them of their presence. There could also be a version engaged for stalled vehicles on the side of the road. Just as emergency blinkers warn humans that something is amiss and that they should drive by with caution, engaging the emergency blinkers could broadcast a signal recognized by the autonomous systems in passing cars.
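As a rough illustration of what such a protocol might look like: the hazard vehicle broadcasts its position and status, and an approaching car decides whether the hazard is relevant. All message fields, function names, and thresholds here are hypothetical, not any existing V2X standard.

```python
# A sketch of a hypothetical V2X-style "hazard beacon": an emergency
# (or stalled) vehicle broadcasts position and status; an approaching
# car checks whether the hazard lies roughly ahead and within range.

import json
import math

def make_hazard_beacon(lat, lon, heading_deg, kind):
    """kind: e.g. 'emergency_stopped' or 'stalled_shoulder'."""
    return json.dumps({
        "type": "hazard_beacon",
        "kind": kind,
        "lat": lat,
        "lon": lon,
        "heading_deg": heading_deg,
    })

def hazard_is_relevant(beacon_json, own_lat, own_lon, own_heading_deg,
                       max_range_m=300.0):
    """Crude relevance check: hazard roughly ahead and within range."""
    b = json.loads(beacon_json)
    # Equirectangular approximation; fine at these short distances.
    dlat = math.radians(b["lat"] - own_lat)
    dlon = math.radians(b["lon"] - own_lon) * math.cos(math.radians(own_lat))
    dist_m = 6_371_000 * math.hypot(dlat, dlon)
    bearing = math.degrees(math.atan2(dlon, dlat)) % 360
    ahead = abs((bearing - own_heading_deg + 180) % 360 - 180) < 45
    return ahead and dist_m <= max_range_m

beacon = make_hazard_beacon(34.0010, -118.4000, 90.0, "emergency_stopped")
# Approaching from about 200 m south, heading north (0 degrees):
print(hazard_is_relevant(beacon, 33.9992, -118.4000, 0.0))  # -> True
```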

That certainly will be needed in the future, when semi-self-driving cars achieve true Level 3 or better autonomy. But at the current state of development (Tesla Autopilot is Level 2 with some aspects of Level 3), it would be like equipping a Roomba with radar. The radar isn’t going to be of any benefit in a robot that’s designed and built to only stop or change direction when it bumps into things, and cars with mere Level 2 autonomy aren’t equipped to detect stationary obstacles and steer around them.

Level 1 _ Driver Assistance

System capability: Under certain conditions, the car controls either the steering or the vehicle speed, but not both simultaneously. • Driver involvement: The driver performs all other aspects of driving and has full responsibility for monitoring the road and taking over if the assistance system fails to act appropriately. • Example: Adaptive cruise control.

Level 2 _ Partial Automation

System capability: The car can steer, accelerate, and brake in certain circumstances. • Driver involvement: Tactical maneuvers such as responding to traffic signals or changing lanes largely fall to the driver, as does scanning for hazards. The driver may have to keep a hand on the wheel as a proxy for paying attention. • Examples: Audi Traffic Jam Assist, Cadillac Super Cruise, Mercedes-Benz Driver Assistance Systems, Tesla Autopilot, Volvo Pilot Assist.

Level 3 _ Conditional Automation

System capability: In the right conditions, the car can manage most aspects of driving, including monitoring the environment. The system prompts the driver to intervene when it encounters a scenario it can’t navigate. • Driver involvement: The driver must be available to take over at any time. • Example: Audi Traffic Jam Pilot.

Level 4 _ High Automation

System capability: The car can operate without human input or oversight but only under select conditions defined by factors such as road type or geographic area. • Driver involvement: In a shared car restricted to a defined area, there may not be any. But in a privately owned Level 4 car, the driver might manage all driving duties on surface streets then become a passenger as the car enters a highway. • Example: Google’s now-defunct Firefly pod-car prototype, which had neither pedals nor a steering wheel and was restricted to a top speed of 25 mph.

Level 5 _ Full Automation

System capability: The driverless car can operate on any road and in any conditions a human driver could negotiate. • Driver involvement: Entering a destination. • Example: None yet, but Waymo—formerly Google’s driverless-car project—is now using a fleet of 600 Chrysler Pacifica hybrids to develop its Level 5 tech for production.
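For quick reference, the taxonomy above can be condensed into code. This is a toy mapping; the example systems are simply those cited in the text, not an exhaustive list.

```python
# The SAE automation levels as described above, as a small enum.
from enum import IntEnum

class SaeLevel(IntEnum):
    DRIVER_ASSISTANCE = 1       # steering OR speed, never both
    PARTIAL_AUTOMATION = 2      # steering AND speed; driver monitors
    CONDITIONAL_AUTOMATION = 3  # system monitors; driver on standby
    HIGH_AUTOMATION = 4         # no oversight, but only in a defined domain
    FULL_AUTOMATION = 5         # any road and conditions a human could handle

EXAMPLES = {
    SaeLevel.DRIVER_ASSISTANCE: ["Adaptive cruise control"],
    SaeLevel.PARTIAL_AUTOMATION: ["Tesla Autopilot", "Volvo Pilot Assist",
                                  "Cadillac Super Cruise"],
    SaeLevel.CONDITIONAL_AUTOMATION: ["Audi Traffic Jam Pilot"],
    SaeLevel.HIGH_AUTOMATION: ["Google Firefly prototype"],
    SaeLevel.FULL_AUTOMATION: [],
}

def driver_must_monitor(level):
    """At Level 2 and below, the human is responsible for monitoring."""
    return level <= SaeLevel.PARTIAL_AUTOMATION

print(driver_must_monitor(SaeLevel.PARTIAL_AUTOMATION))  # -> True
```

The `driver_must_monitor` check captures the crux of the article: at Level 2, which is where Autopilot sits, the human is still the safety system.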

The only thing that should be surprising is that anyone would be surprised about this.

If AEB really did eliminate nearly all traffic accidents involving a moving car hitting a stationary object, then everybody would already know that. AEB systems certainly help reduce accidents, but anyone who thinks they will stop the car in most cases of potential collision is pretty clueless.

* * * * *

The article says:

“If your car hits the brakes every time there’s a piece of debris on the road, or it sees any type of stationary object, this could actually cause an accident, or give the driver whiplash. False positives can be a real problem for such technology. All current automatic emergency braking and adaptive cruise control systems are designed to be ‘blind’ to various fixed objects. If not, the system wouldn’t be able to do its job.”

That explains pretty well why the AEB system has to be “blind” to certain obstacles, but the real problem with false positives isn’t a danger of whiplash or that the car stopping unexpectedly could cause an accident.

The real problem is that if the system is so sensitive that it results in many false positives, the unnecessary braking incidents become so annoying to the driver that he simply shuts the system off. Or if he can’t shut it off, he chooses to drive a different car instead.

An AEB system that nobody wants to use because it’s too sensitive won’t be of any use at all.

If DonC is right about the limitations of the radar sensors connected to current AEB systems, then perhaps Autopilot can detect a moving car two cars ahead, but not a stationary one.

One thing I haven’t seen discussed on EV forums is just how much (or how little) data/resolution is available from the type of radar scanner used for AEB systems. If the image return is as ill-defined as the example linked below, then the wonder isn’t that a car equipped with AEB hit a stationary truck; the wonder is that it will ever detect the need to brake at all!