According to Tesla data shared by South Jordan police in a statement, the driver repeatedly engaged and disengaged Tesla's Autosteer and Traffic Aware Cruise Control on multiple occasions while traveling around suburbs south of Salt Lake City.

During this "drive cycle," the Model S registered "more than a dozen instances of her hands being off the steering wheel." On two occasions, the driver had her hands off the wheel for more than a minute each time, reengaging briefly with the steering wheel only after a visual alert from the car.

"About 1 minute and 22 seconds before the crash, she re-enabled Autosteer and Cruise Control, and then, within seconds, took her hands off the steering wheel again," the police report says. "She did not touch the steering wheel for the next 80 seconds until the crash happened."

The car was programmed by the driver to travel at 60 mph. The driver finally touched the brake pedal "a second prior to the crash."
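
For a sense of scale, here is a quick back-of-the-envelope calculation from the report's figures (60 mph, hands off the wheel for 80 seconds, brake touched about one second before impact). This is just arithmetic on the numbers in the police report, not any additional data:

```python
# Back-of-the-envelope distances from the police report's figures.
MPH_TO_FPS = 5280 / 3600  # feet per second, per mph

speed_fps = 60 * MPH_TO_FPS   # 60 mph = 88 ft/s
hands_off_s = 80              # seconds with no hands on the wheel
reaction_s = 1                # brake touched ~1 s before impact

hands_off_ft = speed_fps * hands_off_s
print(f"Hands-off distance: {hands_off_ft:.0f} ft ({hands_off_ft / 5280:.2f} miles)")
print(f"Distance covered in the final second: {speed_fps * reaction_s:.0f} ft")
```

In other words, the car covered about 1.3 miles at 88 ft/s with no hands on the wheel, and the one-second brake application left under 90 feet in which to shed speed.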

Police said the driver not only failed to abide by the guidelines of Autopilot use but also engaged the system on a street with no center median and with stop lights. ... The Utah driver was issued a traffic citation for "failure to keep proper lookout" under the South Jordan City municipal code.

Also, the story says the "driver" admitted she was distracted by her phone.

This is well beyond the state of the art for A/P now, but still interesting video, especially to someone like me who walks and bikes a lot and has to frequently avoid being killed or injured by distracted drivers.

Last edited by GRA on Sun Sep 16, 2018 4:23 pm, edited 1 time in total.

Guy [I have lots of experience designing/selling off-grid AE systems, some using EVs but don't own one. Local trips are by foot, bike and/or rapid transit].

The 'best' is the enemy of 'good enough'. Copper shot, not Silver bullets.

A report in Jalopnik Thursday cited several owners on Tesla user forums reporting that their cars' Autopilot systems no longer work since the update.

One user even said that his Model 3 displays a persistent message on the display screen that his car's automatic emergency braking, regenerative braking, and traction control aren't working, either.

While Autopilot so far is mostly a driver convenience feature, automatic emergency braking is a key safety feature available on a wide variety of modern cars.

The new software, version 8.1 2018.34.1, was supposed to make Autopilot more responsive, and some users have reported that it has made lane changes quicker and smoother.

The update has been rolling out over time. Some cars have had it for over a week, while some owners are just getting the software.

When owners having problems with the software have contacted Tesla, they report that the company says it is aware of the problem and is working on a fix. Some owners were promised the fix Wednesday; others have been told Friday, and still others within two days.

It's not clear what cars are affected adversely and if any of those cars have received an effective fix.

Green Car Reports did not receive an immediate response when we reached out to Tesla to ask about the update.

The update has also generated complaints that the Autopilot system "nags" drivers more frequently to keep their hands on the steering wheel. At the same time, it reportedly makes it easier to dismiss those warnings by touching any button on the steering wheel without actually gripping the rim.

In a July call with investors, Tesla CEO Elon Musk said a new version 9 software update would start rolling out in August. So far, a few users have seen previews of version 9, which reportedly changes the vertical center control screen in the Model S and Model X to act more like the floating horizontal screen in the Model 3.

Musk said that the release of version 9 would begin to enable some of the first fully self-driving features in Autopilot . . .

In a Twitter post on Sept. 5, Musk updated the version 9 release time frame, saying that early users may get updates in another week and that it will roll out broadly by the end of September.

Presumably a bug that can be quickly rectified.


We just got back from a 1,400-mile round trip to Cedar Point in Ohio. The trip was I-90 all the way there and back, with most of it driven using Autopilot. AP was engaged for all of my highway time and maybe 40% of the time my wife drove; she's a control freak and doesn't always turn on Autosteer, instead just using TACC.

Every time I complete a trip like this I'm just amazed at what a pleasure AP makes it to drive. On highways, it just works. I paid extra attention in construction areas, but AP handled lane shifts just fine every time.

With this trip, I've now done almost 60K fully electric miles, and the S is quickly coming up on the mileage of the LEAF (which I only use for commuting 20 mi/day).

Last edited by jlv on Wed Oct 03, 2018 7:09 am, edited 1 time in total.

Because for us, it means that we can drive from point A to point B with less effort. But for retail companies, it will change where we'll want our supermarkets and how we want them to be organised; for real estate companies, it will change what houses we choose to buy and why; and it will change the optimal locations for most service-oriented companies...

Autopilot was NOT on. The car ahead (under 100 meters away) was moving and coming to a stop, which means the radar would have identified it and been tracking it. The collision-detection alert beeped, but the driver wasn't paying attention and didn't slow down. AEB triggered only to lower the impact speed. Even the article's title doesn't mention Autopilot. The body of the article only references the existence of Tesla Autopilot as a reminder for drivers to be more vigilant, even when using it.

I wish AP could detect idiot drivers and take them to the next exit or dirt road or black hole. This would basically eliminate these accidents, people driving through their living rooms, and all the FUD from non-Tesla owners.

. . . one of the most important controversies surrounding Tesla Autopilot is tested and revealed.

We don’t know for sure what has changed and why this test is so different from those initiated in the past, but clearly, the Tesla Model S “sees” the stationary car, alerts the driver, and initiates automatic emergency braking. This is a huge step for Tesla vehicles — and for all vehicles for that matter — since the technology in almost every current vehicle was not programmed to stop in such situations. Still, there are many variables involved and this is simply one test. Drivers should not trust Tesla Autopilot or any other driver assistance system. Remaining engaged and alert, as well as following all of the automaker’s precautions, is always a must.

In addition to the video, the Euro NCAP website includes the following comments:

‘Autopilot’ on the Tesla Model S gives the driver a high level of support with the vehicle primarily in control in both braking and steering scenarios. This results in a risk of over-reliance as, in some situations, the system still needs the driver to instantly correct and override the system.

The name “Autopilot” implies a fully automated system where the driver is not required. However, the limited scenarios tested clearly indicate that is not the case, nor is such a system legally allowed. The handbook mentions that the system is intended only for use on highways and limited-access roads, but the system is not geofenced and can therefore be engaged on any road with distinct lane markings. The legally required hands-off warning requires no more than a gentle touch of the steering wheel to avoid system deactivation, rather than ensuring the driver is still in control. To avoid misuse, Tesla has implemented a so-called ‘one-strike-you-are-out’ policy, where Autopilot is not available for the remainder of a journey if the driver fails to nudge the steering wheel occasionally.

In the braking tests, the Model S shows full braking support by the system in nearly all scenarios except for the cut-in and cut-out scenarios where there is limited vehicle support. The full system support in the stationary scenario may result in over-reliance. However, in the cut-in and cut-out scenarios, the driver is required to apply the brakes in due time, which may reduce the driver’s over-reliance on the system.

In steering support, the Tesla does not allow the driver to input any steering himself and the system will provide all the steering required in the S-bend scenario. When system steering limits are reached, the vehicle will slow down to make the turn, again eliminating the need for driver input. In the absence of lane markings, Autopilot will stay engaged and will try to steer a safe path. However, with the sensors the Tesla has, this is nearly impossible to do reliably and implies to the driver that the vehicle can take all corners which, again, may result in over-reliance.

Overall, the Tesla system is primarily in control, with a risk of the driver becoming over-reliant on the system.
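
The hands-off warning and 'one-strike-you-are-out' behaviour Euro NCAP describes can be sketched as a tiny state machine. This is purely a hypothetical illustration: the class, the three-warning threshold, and the method names are all invented here and are not Tesla's actual implementation.

```python
# Hypothetical sketch of the 'one-strike-you-are-out' lockout: ignore enough
# hands-off warnings and Autosteer becomes unavailable for the rest of the
# journey. The threshold below is invented for illustration.

class AutosteerMonitor:
    MAX_IGNORED_WARNINGS = 3  # assumed escalation limit, not Tesla's real value

    def __init__(self):
        self.ignored_warnings = 0
        self.locked_out = False  # True => unavailable until the next drive cycle

    def warning_ignored(self):
        """Driver let a hands-off warning expire without touching the wheel."""
        if self.locked_out:
            return
        self.ignored_warnings += 1
        if self.ignored_warnings >= self.MAX_IGNORED_WARNINGS:
            self.locked_out = True  # out for the remainder of the journey

    def wheel_nudged(self):
        """A gentle touch resets the escalation, but cannot undo a lockout."""
        if not self.locked_out:
            self.ignored_warnings = 0

    def autosteer_available(self):
        return not self.locked_out

m = AutosteerMonitor()
for _ in range(3):
    m.warning_ignored()
print(m.autosteer_available())  # prints False: locked out for the journey
```

Note that in this sketch a wheel nudge only resets the warning counter while the system is still available; once locked out, nothing short of a new drive cycle restores Autosteer, which matches the "remainder of a journey" wording above.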

there's also a link to an NCAP video of Propilot doing many of the same tests.


"In steering support, the Tesla does not allow the driver to input any steering himself "

Yeah, no. Giving NCAP the benefit of the doubt, I'm going to chalk this up as poor wording. I can make very slight adjustments to auto-steer (hugging closer to the left or right of the lane). Too much, and auto-steer relinquishes control, so the driver has full steering authority. I'm assuming NCAP meant that the driver can't make larger course corrections and still have auto-steer stay in control.