The leaders in autonomous driving seem to be converging on what has been Nissan's position (and mine) for the last few years: the human driver will be required in edge cases for some years to come, but that requirement can best be met by removing the human back-up driver from the vehicle:

A helping hand for "confused" self-driving cars

Before long we won't need someone behind the wheel, but as we've seen, the computers that will be driving us around are not always going to know what to do – like in a construction zone. When that happens, the car is going to need a little help, and one small California startup says it has the answer when the car needs to "phone a friend."

As correspondent Kris Van Cleave was taken for a ride in a self-driving car, Ben Shuckman, a remote driver a few miles away in a Silicon Valley office, announced his presence: "Welcome everybody, my name is Ben and I will be your Phantom remote operator for this drive: I'll be monitoring your vehicle remotely."...

If you're wondering why an autonomous vehicle might need somebody like Ben: as self-driving technology advances -- we know that General Motors, for example, is going to build one without a steering wheel, gas pedals or brakes -- if there was a situation where the autonomous car had to stop and didn't know what to do, a passenger couldn't do anything to help. They would need somebody to intervene remotely.

Phantom Auto doesn't build self-driving cars, but they're hoping their technology can come to the rescue of a "confused" autonomous vehicle. It uses cell phone signals and cameras already mounted to the vehicle, so a remote driver can take over in a situation where the car doesn't know what to do – the ultimate backup...
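
To make the "phone a friend" idea concrete, here's a toy sketch of the fallback flow the article describes: the car drives itself until its planner loses confidence, stops safely, and hands control to a remote operator over the existing cellular link. All names and thresholds here are my own invention for illustration; Phantom Auto's actual protocol isn't described in the article.

```python
# Hypothetical sketch of a remote-operator fallback; not any vendor's code.
from enum import Enum, auto

class Mode(Enum):
    AUTONOMOUS = auto()
    STOPPED_AWAITING_HELP = auto()
    REMOTE_CONTROLLED = auto()

class FallbackController:
    """When the planner reports low confidence (e.g. an unmapped
    construction zone), stop safely and hand control to a remote
    operator using the cameras and cell link already on the vehicle."""

    def __init__(self, confidence_floor=0.6):
        self.confidence_floor = confidence_floor  # illustrative threshold
        self.mode = Mode.AUTONOMOUS

    def step(self, planner_confidence, operator_connected):
        if self.mode is Mode.AUTONOMOUS:
            if planner_confidence < self.confidence_floor:
                # Car is "confused": come to a controlled stop and call for help.
                self.mode = Mode.STOPPED_AWAITING_HELP
        elif self.mode is Mode.STOPPED_AWAITING_HELP:
            if operator_connected:
                self.mode = Mode.REMOTE_CONTROLLED
        elif self.mode is Mode.REMOTE_CONTROLLED:
            # Operator disconnects once past the obstacle; autonomy resumes.
            if planner_confidence >= self.confidence_floor and not operator_connected:
                self.mode = Mode.AUTONOMOUS
        return self.mode
```

The point of the state machine is that the car never blends remote and autonomous control: it is either driving itself, stopped and waiting, or fully under the remote operator.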

Waymo, the self-driving company owned by Google's parent Alphabet, is developing its own assist technology. Nissan is working on a system in which the autonomous vehicle would stop and wait for a remote operator to draw it a map around an obstacle...

On a related note, NHTSA has ordered a halt to sales of the "Autopilot Buddy," a device marketed to defeat Tesla's hands-on-the-wheel warnings:

"A product intended to circumvent motor vehicle safety and driver attentiveness is unacceptable," said NHTSA Deputy Administrator Heidi King in a statement. "By preventing the safety system from warning the driver to return hands to the wheel, this product disables an important safeguard, and could put customers and other road users at risk."

NHTSA ordered the Autopilot Buddy's manufacturer, Dolder, Falco and Reese Partners LLC, of California, to cease marketing, sales, and distribution of the device in the U.S. by June 29.

Following the order, the company's website says the Autopilot Buddy "is designed for closed-track use, not for use on public streets. ... Warning: The Autopilot Buddy is not a safety device. Using this device irresponsibly may cause injury or death. . . ."

By golly, the whole thing was just a big misunderstanding. The company never expected or intended that anyone would use this on public roads, and they're shocked, shocked, that anyone might have thought that. Sometimes I think there's something to be said for bringing back public stocks, rotten fruit supplied free of charge. The problem being that some people will choose to throw rocks instead.

Guy [I have lots of experience designing/selling off-grid AE systems, some using EVs but don't own one. Local trips are by foot, bike and/or rapid transit].

The 'best' is the enemy of 'good enough'. Copper shot, not Silver bullets.

Well-designed semi-autonomous systems are not causing crashes into immobile objects at anywhere near the rate TSLA's does, for the same reason well-designed battery packs do not burst into flames after a crash (and at random other times), the way TSLA's do.

Non-TSLA semi-autonomous systems, like the non-TSLA battery packs, in the USA, are all (to date) designed and produced by competent companies.

Well-designed semi-autonomous systems are not causing crashes into immobile objects at anywhere near the rate TSLA's does, for the same reason well-designed battery packs do not burst into flames after a crash (and at random other times), the way TSLA's do.

Non-TSLA semi-autonomous systems, like the non-TSLA battery packs, in the USA, are all (to date) designed and produced by competent companies.

As you know I've been very critical of Tesla's implementation of A/P, but always based on peer-reviewed research as well as statistical data. What is your statistical data for making that claim? From the NHTSA's report on Joshua Brown's crash:

ODI’s analysis of Tesla’s AEB system finds that 1) the system is designed to avoid or mitigate rear-end collisions; 2) the system’s capabilities are in-line with industry state of the art for AEB performance through MY 2016; and 3) braking for crossing path collisions, such as that present in the Florida fatal crash, are outside the expected performance capabilities of the system.

Note that as well as surveying a dozen manufacturers for the capabilities of their AEB systems, they tested Tesla's system against a comparably-equipped Mercedes at the time. I have stated my disagreements with many of NHTSA's conclusions in that investigation, but not with their testing.

Now, it may be that other companies' AEB systems have significantly improved in the interim, but AFAICT the main reason that Tesla's are known to be having a lot of rear-end crashes when a car moves out of their lane to avoid a stopped vehicle is that they attract more media attention, and there are a lot of them on the road. I haven't seen any data that other AEB systems do any better with this issue. So, absent any hard information that shows that Tesla has fallen well behind other companies, the post belongs in the general AV rather than A/P specific topics. Should such evidence arise, it will be appropriate to move it.

Guy [I have lots of experience designing/selling off-grid AE systems, some using EVs but don't own one. Local trips are by foot, bike and/or rapid transit].

The 'best' is the enemy of 'good enough'. Copper shot, not Silver bullets.

edatoakrun wrote:... AFAICT the main reason that Tesla's are known to be having a lot of rear-end crashes when a car moves out of their lane to avoid a stopped vehicle is that they attract more media attention, and there are a lot of them on the road. I haven't seen any data that other AEB systems do any better with this issue...

In fact, Autopilot has been implicated in a wide variety of crashes in a wide variety of conditions, other than the specific ones you describe above.

And I can't find any credible reports of other Autonomous systems being blamed for crashes, though I'm sure some will occur, if they haven't already.

It is easy enough to find a LEAF passing the euro AEB safety tests by avoiding a variety of collisions, beginning at ~2 minutes into this video.

edatoakrun wrote:... AFAICT the main reason that Tesla's are known to be having a lot of rear-end crashes when a car moves out of their lane to avoid a stopped vehicle is that they attract more media attention, and there are a lot of them on the road. I haven't seen any data that other AEB systems do any better with this issue...

Actually, Autopilot has been implicated in causing a wide variety of crashes in a wide variety of conditions, other than those you describe above.

And I can't find any credible reports of other Autonomous systems being blamed for crashes, though I'm sure some will occur, if they haven't already.

The Uber Volvo that killed a pedestrian in Arizona (one of the same situations shown in the video below) wasn't credible? Although I expect that, barring a malfunction, in that case it will likely prove to be the fact that the person was walking the bike with a bag hanging off it that confused the classification.

edatoakrun wrote:It is easy enough to find a Propilot LEAF passing the euro AEB safety tests by avoiding a variety of collisions, beginning at ~2 minutes into this video.

I can't find the same successful test results from any TSLA autopilot vehicle.

Can you?

Haven't tried. Has any government or independent agency certified that this AEB system is superior? Is there even any such comparative certification process? BTW, there isn't any test shown in the video for the specific situation that applied in the Tesla crashes, i.e. a lane change by the vehicle the car is following to avoid a stopped vehicle ahead of it. All the AEB systems I'm aware of are likely to classify a stopped vehicle in that situation as a non-threat, part of the background (like a road sign on a curve). Now, if someone has a system that can routinely deal with the situation that's causing Tesla's problems, then any government has the power to require companies to incorporate that ability (and I obviously feel they should).
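
The claim above about stationary objects being treated as background can be illustrated with a toy classifier (entirely my own sketch, not any manufacturer's logic, with made-up thresholds): many radar-based AEB systems discount returns whose absolute speed is near zero, because at highway speed most stationary returns really are signs, bridges and guardrails.

```python
# Toy illustration of why a stopped vehicle revealed by a lane change can
# be classified as roadside clutter. Thresholds are invented for the sketch.
def classify_radar_target(ego_speed_mps, target_range_m, target_closing_mps):
    """Return 'threat' or 'background' for a single radar return.

    target_closing_mps is the closing rate; a stationary object closes
    at exactly the ego vehicle's speed.
    """
    target_abs_speed = ego_speed_mps - target_closing_mps  # ~0 if stationary
    if abs(target_abs_speed) < 0.5 and ego_speed_mps > 15:
        # Stationary return at highway speed: assumed to be background
        # (a sign, a bridge) -- the failure mode discussed in this thread.
        return "background"
    if target_closing_mps > 0 and target_range_m / target_closing_mps < 2.0:
        return "threat"  # time-to-collision under ~2 s
    return "background"
```

A moving lead car that brakes hard is flagged as a threat, while a fully stopped car directly ahead is filtered out, which is exactly the scenario where the lead vehicle changes lanes at the last moment.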

Guy [I have lots of experience designing/selling off-grid AE systems, some using EVs but don't own one. Local trips are by foot, bike and/or rapid transit].

The 'best' is the enemy of 'good enough'. Copper shot, not Silver bullets.

GRA wrote:Now, if someone has a system that can routinely deal with the situation that's causing Tesla's problems, then any government has the power to require companies to incorporate that ability (and I obviously feel they should).

The government can make such a requirement regardless of whether a system that can handle the case exists today or not. After all, the reason the government makes traffic laws is for the safety of those who share the roadways.

GRA wrote:Now, if someone has a system that can routinely deal with the situation that's causing Tesla's problems, then any government has the power to require companies to incorporate that ability (and I obviously feel they should).

The government can make such a requirement regardless of whether a system that can handle the case exists today or not. After all, the reason the government makes traffic laws is for the safety of those who share the roadways.

Well, sure, but if they can't be achieved then what's the point? I for one would prefer that they simply ban "semi-autonomous" driving systems entirely. I like AEB because it's a backup system as long as you're still driving the car; it's still on you to react first, and only if you don't should AEB come into play. If it prevents or reduces the severity of a crash, great, and if not, well, them's the breaks. It's when you combine it with systems that allow the car to handle steering, cruise speed and following distance, such that they encourage the driver to trust the car to do their job most of the time, but not when it's most critical, that you get into problems.
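
The layering described here, driver first, AEB as last resort, can be reduced to a one-line rule (a minimal sketch with an invented trigger threshold, not any production calibration):

```python
# Sketch of AEB as a backup layer: it fires only when a collision is
# imminent AND the driver still hasn't braked. The 1.5 s trigger is
# illustrative, not a real calibration.
def aeb_should_brake(ttc_s, driver_braking, aeb_trigger_ttc_s=1.5):
    """Intervene only when time-to-collision is short and the driver
    has not reacted; otherwise the human remains responsible."""
    return ttc_s < aeb_trigger_ttc_s and not driver_braking
```

This is the opposite of a "semi-autonomous" mode: the automation never drives, it only catches the cases where the human has already failed to act.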

Guy [I have lots of experience designing/selling off-grid AE systems, some using EVs but don't own one. Local trips are by foot, bike and/or rapid transit].

The 'best' is the enemy of 'good enough'. Copper shot, not Silver bullets.

GRA wrote:Now, if someone has a system that can routinely deal with the situation that's causing Tesla's problems, then any government has the power to require companies to incorporate that ability (and I obviously feel they should).

The government can make such a requirement regardless of whether a system that can handle the case exists today or not. After all, the reason the government makes traffic laws is for the safety of those who share the roadways.

Well, sure, but if they can't be achieved then what's the point?

It provides a path to compliance without an outright ban. There is, after all, a real demand for this technology.