A good summary of the business and legal implications of the “Enhanced Autopilot” fiasco:

Elon Musk Was Wrong About Self-Driving Teslas

He’s charging $8,000 for a feature that doesn’t exist yet while owners bemoan a current version that’s flawed. This is the story of how we got here.

A year ago, Tesla Inc. set up a hastily organized conference call between Elon Musk and reporters. Tesla had something big to announce—big enough for the chief executive to open the floor to questions. That doesn’t happen very often.

What followed wasn’t so much the unveiling of a new product as a plan for a product. Tesla’s driver-assistance platform, Autopilot, was about to begin a transformation to fully autonomous driving. Every Tesla would come with eight cameras, radar, 12 ultrasonic sensors, and an Nvidia Corp. supercomputer. Once testing and regulatory approval were complete, Musk said, the car would be able to drive entirely by itself.

“The foundation is laid,” Musk proclaimed. Tesla was so confident, in fact, that it started selling its “Full Self Driving” feature for an additional $8,000 on any new Model X or Model S. Tesla’s timeline was, as is so often the case, years ahead of what most believed possible...

What followed were months of setbacks, delays, and in-house turmoil. A year later, there’s still no sign of Full Self Driving, and even the less ambitious Enhanced Autopilot hasn’t quite reached parity with an earlier, discontinued version. The head of Tesla’s Autopilot division left in January, and six months later his successor did, too. Meanwhile, Tesla owners who paid thousands of dollars for the options filed a class action lawsuit, alleging they were tricked into buying a feature that doesn’t exist and—in some cases—an unsafe car...

I think the tenor of the Bloomberg article is the simple notion that FSD was a bridge too far. Tesla hasn't delivered. If I had paid $3k for FSD last year, I think Tesla would owe me a refund or something of equal value. I did pay $5k for EAP and I'm enjoying it. Not wishing for a refund at all. Will select the EAP option on my Model 3.

The newest Autopilot software update also adds back the vehicles’ folding side mirrors and reportedly improves Autosteer. The mirrors will stay folded at 30 mph or less if you’ve already opted to fold them. The notes explain:

“The maximum speed at which Automatic Emergency Braking is available has increased from 50 mph to 90 mph.”

The article below explains why Autopilot probably would not have prevented that crash, or any other crashes into stationary objects, even if it is operating perfectly as designed:

WHY TESLA'S AUTOPILOT CAN'T SEE A STOPPED FIRETRUCK

...Raj Rajkumar, who researches autonomous driving at Carnegie Mellon University, thinks those assumptions concern one of Tesla's key sensors. “The radars they use are apparently meant for detecting moving objects (as typically used in adaptive cruise control systems), and seem to be not very good in detecting stationary objects," he says...

The long-term solution is to combine several sensors with different abilities, backed by more computing power. Key among them is lidar. These sensors use lasers to build a precise, detailed map of the world around the car, and can easily distinguish between a hub cap and a cop car. The problem is that compared to radar, lidar is a young technology. It's still very expensive, and isn't yet robust enough to survive a life of hitting potholes and getting pelted with rain and snow. Just about everybody working on a fully self-driving system—the kind that doesn't depend on lazy, inattentive humans for support—plans to use lidar, along with radar and cameras.

Except for Elon Musk. The Tesla CEO insists he can make his cars fully autonomous—no supervision necessary—with just radars and cameras. He hasn't proven his claim just yet, and no one knows if he ever will...
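The multi-sensor approach described above is often implemented as some form of confidence fusion: each sensor votes on whether an obstacle exists, and a weighted combination decides. A minimal sketch, purely illustrative (the function, weights, and threshold are hypothetical, not any manufacturer's actual logic):

```python
# Hypothetical confidence-based sensor fusion for obstacle detection.
# Weights and threshold are made-up values for illustration only.

def fuse_detections(radar_conf, camera_conf, lidar_conf,
                    weights=(0.2, 0.3, 0.5), threshold=0.5):
    """Weighted vote across sensors; True if the fused confidence
    that an obstacle exists clears the threshold."""
    confs = (radar_conf, camera_conf, lidar_conf)
    fused = sum(w * c for w, c in zip(weights, confs))
    return fused >= threshold

# Stationary fire truck: radar is unsure (0.1), but camera (0.6)
# and lidar (0.9) both see it clearly.
print(fuse_detections(0.1, 0.6, 0.9))  # 0.02 + 0.18 + 0.45 = 0.65 -> True

# Radar and camera only (no lidar signal):
print(fuse_detections(0.1, 0.6, 0.0))  # 0.02 + 0.18 = 0.20 -> False
```

The toy numbers illustrate the article's point: with lidar's high-confidence return in the mix, the stopped truck is detected; drop lidar and the same scene falls below the decision threshold.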

CleanTechnica wrote:Later, Reddit user mikhpat added these details. “The driver of the Tesla is my dad’s friend. He said that he was behind a pickup truck with AP engaged. The pickup truck suddenly swerved into the right lane because of the firetruck parked ahead. Because the pickup truck was too high to see over, he didn’t have enough time to react. He hit the firetruck at 65 mph and the steering column was pushed 2 feet inwards toward him. Luckily, he wasn’t hurt. He fully acknowledges that he should’ve been paying more attention and isn’t blaming Tesla.”

Apparently, Tesla's autopilot system is designed to run into stopped firetrucks in this case:

CleanTechnica wrote:Wired on Thursday asked this question: “How is it possible that one of the most advanced driving systems on the planet doesn’t see a freaking fire truck, dead ahead?” Wired dug into that topic and came up with this answer: Autonomous driving systems are programmed to ignore stopped vehicles. Volvo offers a system very similar to Tesla’s Autopilot. Its driver’s manual warns people that the system can actually accelerate into a stopped vehicle if the car ahead suddenly swerves out of the way. The traffic-aware cruise control will ignore the stopped car and try to get back up to the designated speed set by the driver.

The reason is that if cars went into full “anchors aweigh” panic-braking mode every time their sensors detected a stationary object in the road ahead, they would be slowing constantly and would risk being rear-ended themselves. Software engineers deliberately design these systems to ignore stationary objects in order to eliminate false positives.

Tesla covers this explicitly in its owner’s manual. “Traffic-Aware Cruise Control cannot detect all objects and may not brake/decelerate for stationary vehicles, especially in situations when you are driving over 50 mph (80 km/h) and a vehicle you are following moves out of your driving path and a stationary vehicle or object is in front of you instead.”

Sorry, but that doesn't make a bit of sense to me, because even if emergency braking results in being rear-ended, it does two important things:
1) It alerts those behind you of impending danger.
2) It might save the occupants of the Tesla or those in or near the stopped vehicle ahead.

This is the third case I know of where a Tesla has run into a stopped emergency vehicle. It strikes me that Tesla's autopilot is not obeying "Move Over" laws and is therefore endangering emergency personnel. Fortunately no one has been killed in these accidents.

Since Tesla has not addressed this issue, it seems that the NHTSA needs to step in to protect emergency workers from this danger.
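The stationary-object filtering described in the quotes above boils down to a speed-gated rule: at highway speeds, radar targets with near-zero ground speed are discarded as probable clutter (overpasses, signs, parked cars). A minimal sketch of that logic, with hypothetical names and thresholds that are not Tesla's actual code (the 50 mph figure echoes the manual excerpt quoted earlier):

```python
# Hypothetical stationary-target filter for a radar-based cruise
# controller. Thresholds are illustrative, not any vendor's values.

def should_react_to(target_speed_mph, ego_speed_mph,
                    stationary_cutoff_mph=2.0, ignore_above_mph=50.0):
    """True if the controller reacts to a radar target.
    Effectively stationary targets are dropped at highway speeds
    to avoid constant false-positive panic braking."""
    is_stationary = target_speed_mph < stationary_cutoff_mph
    if is_stationary and ego_speed_mph > ignore_above_mph:
        return False  # filtered out: the stopped-firetruck case
    return True

print(should_react_to(0.0, 65.0))   # stopped truck at 65 mph -> False
print(should_react_to(40.0, 65.0))  # slower-moving lead car  -> True
print(should_react_to(0.0, 30.0))   # stopped car in town     -> True
```

The first call shows the failure mode in the crash reports: a truly dangerous stopped vehicle is indistinguishable, to this rule, from roadside clutter.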

Electrek wrote:The bigger difference is the improvements to existing Autopilot features, like Autosteer, due to a much more advanced neural net system to power the Autopilot’s computer vision.

One question: Since the Tegra (X1?) processor doesn't have a "deep-learning accelerator" like the one found on the upcoming Xavier processor, are the neural networks in the Tesla Autopilot all implemented on the GPU?

This will be good news for Tesla owners:

Electrek wrote:This new update is likely going to fix that. Two drivers we spoke to said it was now equal or better than Autopilot 1 for highway driving.