When my car is locked in the garage, the charging port doesn't respond to touch or the pushbutton on the Tesla wall connector. I have to unlock it first to be able to plug in...Not sure if that is related to your issue...

Randy wrote:When my car is locked in the garage, the charging port doesn't respond to touch or the pushbutton on the Tesla wall connector. I have to unlock it first to be able to plug in...Not sure if that is related to your issue...

Maybe it's a bug fix?

I think it has always been the case that you can't disconnect the charger if the car's off. Only after unlocking the car can you disconnect the cable.

And not being able to open the charge port door while off/locked was an intended design choice? But plugging in a J1772 plug to start charging was not intended?

Oils4AsphaultOnly wrote:Terrific. Now let's have Tesla release all their data which Elon has claimed show that A/P-operating Teslas are safer than non-A/P cars. Professional statisticians pointed out the numerous methodological flaws behind his claims at the time he made that statement, and Consumer Reports and other auto safety organizations have asked for that data to be released. I'll be perfectly happy to acknowledge that semi-autonomous systems such as A/P have led to an overall reduction in accidents (if not a reduction in accidents that A/P is responsible for) if the data is validated by an independent entity and shown to be scientifically valid.

As I noted previously (maybe in other topics), every TACC/AEB system is currently unable to handle this sort of event reliably because they are unable to recognize real positives among the false ones, and as such fail to meet the necessary safety requirements. Since people will continue to use the systems improperly due to misunderstanding their capabilities (which leads to automation complacency), such systems are simply too ineffective to be safe for use by the general public. As has been previously mentioned, lack of understanding of system capability has been and is a major problem in automation-involved aviation accidents, even among highly-trained commercial/military pilots, never mind the much less qualified and trained general driving public. Until the level of idiot-proofing for these systems is much higher than it currently is, they don't belong in the public sphere. As an extreme example of automation complacency:

Safer is better than waiting for safest. Considering your tagline, you're a hypocrite.

No hypocrisy at all - I'm a big fan of AEB, as it provides an extra level of safety backstopping a human driver, and if/when I buy a new car it must be equipped with AEB. But AEB isn't a substitute for a human driver, which is what AVs must be. OBTW, about that NHTSA stat you quoted:

FOR MORE THAN a year, Tesla has defended its semiautonomous Autopilot as a vital, life-saving feature. CEO Elon Musk has lambasted journalists who write about crashes involving the system. “It's really incredibly irresponsible of any journalists with integrity to write an article that would lead people to believe that autonomy is less safe,” he said during a tumultuous earnings call this week. “Because people might actually turn it off, and then die.”

This wasn’t the first time Musk has made this argument about Autopilot, which keeps the car in its lane and a safe distance from other vehicles but requires constant human oversight, and has been involved in two fatal crashes in the US. “Writing an article that’s negative, you’re effectively dissuading people from using autonomous vehicles, you’re killing people,” he said on an October 2016 conference call.

Wednesday’s haranguing, however, came a few hours after the National Highway Traffic Safety Administration (NHTSA) indicated that Tesla has been misconstruing the key statistic it uses to defend its technology. Over the past year and a half, Tesla spokespeople have repeatedly said that the agency has found Autopilot to reduce crash rates by 40 percent. They repeated it most recently after the death of a Northern California man whose Model X crashed into a highway safety barrier while in Autopilot mode in March.

Now NHTSA says that’s not exactly right—and there’s no clear evidence for how safe the pseudo-self-driving feature actually is.

The remarkable stat comes from a January 2017 report that summarized NHTSA’s investigation into the death of Joshua Brown, whose Model S crashed into a truck turning across its path while in Autopilot mode. According to its data, model year 2014 through 2016 Teslas saw 1.3 airbag deployments per million miles, before Tesla made Autopilot available via an over-the-air software update. Afterward, the rate was 0.8 per million miles. “The data show that the Tesla vehicles' crash rate dropped by almost 40 percent after Autosteer installation,” the investigators concluded.
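The arithmetic behind "almost 40 percent" can be sketched directly from the two rates in that report. A minimal illustration (the function name is mine; the two rates are the airbag-deployment figures quoted above):

```python
def percent_change(before, after):
    """Relative change from `before` to `after`, as a percentage."""
    return (after - before) / before * 100.0

before_autosteer = 1.3  # airbag deployments per million miles, pre-Autosteer
after_autosteer = 0.8   # airbag deployments per million miles, post-Autosteer

drop = percent_change(before_autosteer, after_autosteer)
print(f"{drop:.1f}%")  # prints -38.5%, rounded up in the report to "almost 40 percent"
```

Note how sensitive this headline figure is to its inputs: shift either rate by 0.1 and the "almost 40 percent" becomes roughly 30 or 45 percent, which is part of why critics wanted the underlying data.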

Just a few problems. First, as reported by Reuters and confirmed to WIRED, NHTSA has reiterated that its data came from Tesla, and has not been verified by an independent party (as it noted in a footnote in the report). Second, it says its investigators did not consider whether the driver was using Autopilot at the time of each crash. (Reminder: Drivers are only supposed to use Autopilot in very specific contexts.) And third, airbag deployments are an inexact proxy for crashes. Especially considering that in the death that triggered the investigation, the airbags did not deploy.

Tesla declined to comment on NHTSA’s clarification.

The statistic has been the subject of controversy for some time. The research firm Quality Control Systems Corp. has filed a Freedom of Information Act lawsuit against NHTSA for the underlying data in that 2017 report, which it hopes to use to determine whether the 40 percent figure is valid. NHTSA has thus far denied its FOIA requests, saying it agreed to Tesla’s requests to keep the data confidential, and that its release could threaten the carmaker’s competitiveness.

Tesla’s oft-touted figure is flawed for another reason, experts say: With this data set, you can’t separate the role of Autopilot from that of automatic emergency braking, which Tesla began releasing just a few months before Autopilot. According to the Insurance Institute for Highway Safety, vehicles that can detect imminent collisions and hit the brakes on their own suffer half as many rear-end crashes as those that can’t. (More than 99 percent of cars Tesla produced in 2017 came equipped with the feature standard, a higher proportion than any other carmaker.)

Which is all to say, determining whether a new feature like Autopilot is safe, especially if you don’t have access to lots of replicable, third-party data, is super, super hard. Tesla’s beloved 40 percent figure comes with so many caveats, it’s unreliable.

The Insurance Institute for Highway Safety has tried to come at the question another way, by looking at the frequency of insurance claims. When it tried to separate Model S sedan incidents after Autopilot was released, it observed no changes in the frequency of property damage and bodily injury liability claims. That indicates that Autopilot drivers aren’t more or less likely to damage their cars or get hurt than others. But it did find a 13 percent reduction in collision claim frequency, indicating sedans with Autopilot enabled got into fewer crashes that resulted in collision claims to insurers.

Oh, but it gets more complicated. IIHS couldn’t tell which crashes actually involved the use of Autopilot, and not just sedans equipped with Autopilot. And it’s way too early for definitive answers. “Since other safety technologies are layered below Autopilot, it is difficult to tease out results for Autopilot alone at this time,” says Russ Rader, an IIHS spokesperson. “Data on insurance claims for the Model S are still thin.”

Over at MIT, researchers frustrated with the dearth of good info on Autopilot and other semiautonomous car features have launched their own lines of inquiry. Human guinea pigs are now driving sensor- and camera-laden Teslas, Volvos, and Range Rovers around the Boston area. The researchers will use the data they generate to understand how safely humans operate those vehicles.

The upshot is that Autopilot might, in fact, be saving a ton of lives. Or maybe not. We just don’t know. And Tesla hasn’t been transparent with its own numbers. “You would need a rigorous statistical analysis with clear data indicating what vehicle has it and what vehicle doesn’t and whether it’s enabled or whether it isn’t,” says David Friedman, a former NHTSA official who now directs car policy at Consumers Union. Tesla said this week that it would begin publishing quarterly Autopilot safety statistics, but did not indicate whether its data would be verified by a third party.

NHTSA, too, could be doing a better job of holding innovative but opaque carmakers like Tesla accountable for proving the safety of their new tech. “To me, they should be more transparent by asking Tesla for disengagements of the system: How often the systems disengaged, how often the humans need to take over,” Friedman says. California’s Department of Motor Vehicles requires companies testing autonomous vehicles in the state to provide annual data on disengagements, to help officials understand the limitations of the tech and its progress.

Tesla is not alone among carmakers in trying to shield sensitive info from the public. But today, humans are deeply bewildered about the semiautonomous features that have already made their way into everyday drivers’ garages. . . .

Tesla still hadn't released the data which they say supports their claim, leading this past May to:

On Wednesday, the Center for Auto Safety and Consumer Watchdog mailed a joint request to the chairman of the Federal Trade Commission, Joseph Simons, requesting that the FTC investigate how Tesla Motors has marketed its controversial "Autopilot" semiautonomous driver aid suite.

In the letter, the two organizations accuse Tesla of "deceiving and misleading consumers into believing that the Autopilot feature of its vehicles is safer and more capable than it actually is." The groups cite two known deaths and one injury as a result of drivers relying on Autopilot to control their vehicle as reason to investigate the marketing of Autopilot. They insist that the FTC examine Tesla's advertising practices surrounding the feature to determine whether Tesla can be faulted for its customers' misuses of Autopilot. . . .

Since it's Tesla-specific examples you want, here's another:

The Tesla’s automated vehicle control system was not designed to, and could not, identify the truck crossing the Tesla’s path or recognize the impending crash. Therefore, the system did not slow the car, the forward collision warning system did not provide an alert, and the automatic emergency braking did not activate.

The Tesla driver’s pattern of use of the Autopilot system indicated an over-reliance on the automation and a lack of understanding of the system limitations.

If automated vehicle control systems do not automatically restrict their own operation to conditions for which they were designed and are appropriate, the risk of driver misuse remains.

The way in which the Tesla “Autopilot” system monitored and responded to the driver’s interaction with the steering wheel was not an effective method of ensuring driver engagement.

"and lack of understanding of the system limitations". H'mm, mandatory pre-purchase and recurrent training requirements?

"The Tesla’s automated vehicle control system was not designed to, and could not, identify the truck crossing the Tesla’s path or recognize the impending crash. Therefore, the system did not slow the car, the forward collision warning system did not provide an alert, and the automatic emergency braking did not activate . . . If automated vehicle control systems do not automatically restrict their own operation to conditions for which they were designed and are appropriate, the risk of driver misuse remains." H'mm, full disclosure of autonomous system limitations (missing in this case, as no mention of the lack of ability to detect and properly classify crossing traffic had been made by Tesla or anyone else to the public prior to this crash), along with more mandatory driver training, or else (preferred) "restricting operation to only those conditions for which they are designed and are appropriate."

“While automation in highway transportation has the potential to save tens of thousands of lives, until that potential is fully realized, people still need to safely drive their vehicles,” said NTSB Chairman Robert L. Sumwalt III. “Smart people around the world are hard at work to automate driving, but systems available to consumers today, like Tesla’s ‘Autopilot’ system, are designed to assist drivers with specific tasks in limited environments. These systems require the driver to pay attention all the time and to be able to take over immediately when something goes wrong. System safeguards, that should have prevented the Tesla’s driver from using the car’s automation system on certain roadways, were lacking and the combined effects of human error and the lack of sufficient system safeguards resulted in a fatal collision that should not have happened,” said Sumwalt.

That's from the Joshua Brown NTSB investigation findings.

Do you think the NTSB isn't going to reach many of the same findings in the death of Walter Huang? I mean, supposedly he'd experienced the same problem at the same intersection before when using A/P, and yet he still chose to put his life in the hands of A/P in the same place. If that isn't an example of automation complacency, what is? Then Tesla claimed that there'd been a couple hundred thousand cases of cars using A/P successfully negotiating that very same intersection, which was really dumb of them considering legal liability, especially once amateur video appeared of people duplicating the accident conditions and showing A/P having exactly the same problem dealing with a freeway gore (in fact, in at least one video, the very same gore) where Huang died.

Understand, I'm a huge fan of AVs being deployed as quickly as is safe, but I'm not a fan of any vehicle design which puts immature systems which are less safe than humans, and also less safe and effective than existing systems (e.g. touchscreens versus physical controls) into the public's hands, and until someone provides evidence that these systems actually are considerably more safe (at least overall), no one should be put at risk by them.

Guy [I have lots of experience designing/selling off-grid AE systems, some using EVs but don't own one. Local trips are by foot, bike and/or rapid transit].

The 'best' is the enemy of 'good enough'. Copper shot, not Silver bullets.

EVDRIVER wrote:What Tesla drivers as a percentage or number that know how to use the system are not comfortable with auto climate? I know at least 30 owners and none of them have an issue. Since that's my sample please provide yours. Don't confuse reaching for a touch screen as a substitute for buttons. I rarely need to touch my screen for driving. In fact I don't need to take my eyes off the road and can use steering and voice controls for almost all driving needs. I would say ICE cars with small screens and terrible UI are worse. The LEAF is terrible compared to a Tesla. The majority of complaints come from people that seem to fiddle with things endlessly because the systems are poor in some regard. I bitched and complained about the Tesla climate control until I unlearned my bad habits.

You are talking about a self-selected sample, 'people who like that sort of thing say that's the sort of thing they like'. A more valuable metric is what % of potential buyers as a whole will simply reject the car outright because they simply don't want to put up with ACC or touchscreen controls, or spend the time to learn how to use it. <snip>

That group is also a self-selection bias.

At least with EVDriver's sample pool, those drivers started off with being used to buttons (since there weren't any other options) and have had to learn how to use the touchscreen settings.

No, because you are looking at ALL potential buyers and seeing what they say, rather than picking one subgroup or another. Most potential Tesla buyers, or car buyers in general, which is what you really want, are people who are almost certainly familiar with touchscreens via prior experience with smartphones or tablets, as well as physical buttons, and may or may not be familiar with ACC and manual HVAC controls. Only by surveying the entire group can you get useful data and eliminate the effects of personal preference.

For instance, I have a friend I sometimes cross-country ski with: we have very similar heights, weights and body types, and move at similar speeds. Yet our clothing habits are completely different, as are our metabolisms. He's the type of person who puts on one set of clothes and can be comfortable wearing them all day; the most I've ever seen him adjust is by putting on or removing a windshell, and/or swapping a ball cap for a watch cap, almost regardless of physical output. I'm totally different; I change my clothing often, typically stripping down while moving until I'm skiing in just shorts, socks, boots, gaiters, fingerless bike gloves and a ball cap or visor, but putting multiple layers back on when I stop, and adjusting whenever my exertion level changes significantly, or I go from sun to shade, or what have you. Neither of us is right, and neither of us is wrong - we're both operating to maximize our own comfort. Same goes for those who fall into the "set it and forget it" ACC groups and those who belong to the "fiddling with temp, direction, and force regularly" groups.

Guy [I have lots of experience designing/selling off-grid AE systems, some using EVs but don't own one. Local trips are by foot, bike and/or rapid transit].

The 'best' is the enemy of 'good enough'. Copper shot, not Silver bullets.

GRA wrote:You are talking about a self-selected sample, 'people who like that sort of thing say that's the sort of thing they like'. A more valuable metric is what % of potential buyers as a whole will simply reject the car outright because they simply don't want to put up with ACC or touchscreen controls, or spend the time to learn how to use it. <snip>

That group is also a self-selection bias.

At least with EVDriver's sample pool, those drivers started off with being used to buttons (since there weren't any other options) and have had to learn how to use the touchscreen settings.

No, because you are looking at ALL potential buyers and seeing what they say, rather than picking one subgroup or another. Most potential Tesla buyers, or car buyers in general, which is what you really want, are people who are almost certainly familiar with touchscreens via prior experience with smartphones or tablets, as well as physical buttons, and may or may not be familiar with ACC and manual HVAC controls. Only by surveying the entire group can you get useful data and eliminate the effects of personal preference.

For instance, I have a friend I sometimes cross-country ski with: we have very similar heights, weights and body types, and move at similar speeds. Yet our clothing habits are completely different, as are our metabolisms. He's the type of person who puts on one set of clothes and can be comfortable wearing them all day; the most I've ever seen him adjust is by putting on or removing a windshell, and/or swapping a ball cap for a watch cap, almost regardless of physical output. I'm totally different; I change my clothing often, typically stripping down while moving until I'm skiing in just shorts, socks, boots, gaiters, fingerless bike gloves and a ball cap or visor, but putting multiple layers back on when I stop, and adjusting whenever my exertion level changes significantly, or I go from sun to shade, or what have you. Neither of us is right, and neither of us is wrong - we're both operating to maximize our own comfort. Same goes for those who fall into the "set it and forget it" ACC groups and those who belong to the "fiddling with temp, direction, and force regularly" groups.

No, you are looking at a SUBSET of all potential buyers - those who reject the tech outright, because of a personal proclivity towards or against a tech. Whatever experience they've had with tablets and smartphones does NOT translate into an affinity for touch-screen controls of automotive functions. A teenager with touchscreen experience isn't going to be able to tell you squat about how good those controls are for driving, because you're not interacting with a tablet/smartphone in the same manner as automotive controls. You're not watching a video, playing a game, or surfing the web on the Tesla touchscreen. Only drivers who've had experience with the screen and regular automotive buttons can tell you which is better (some still prefer buttons). Everyone else's opinion is noise.

GRA wrote:As I noted previously (maybe in other topics), every TACC/AEB system is currently unable to handle this sort of event reliably because they are unable to recognize real positives among the false ones, and as such fail to meet the necessary safety requirements. Since people will continue to use the systems improperly due to misunderstanding their capabilities (which leads to automation complacency), such systems are simply too ineffective to be safe for use by the general public. As has been previously mentioned, lack of understanding of system capability has been and is a major problem in automation-involved aviation accidents, even among highly-trained commercial/military pilots, never mind the much less qualified and trained general driving public. Until the level of idiot-proofing for these systems is much higher than it currently is, they don't belong in the public sphere. As an extreme example of automation complacency:

Safer is better than waiting for safest. Considering your tagline, you're a hypocrite.

No hypocrisy at all - I'm a big fan of AEB, as it provides an extra level of safety backstopping a human driver, and if/when I buy a new car it must be equipped with AEB. But AEB isn't a substitute for a human driver, which is what AVs must be. OBTW, about that NHTSA stat you quoted:

self-contradict much?

As for your "citations", regardless of how many articles about how the journalists personally feel about something, the data came from an official NHTSA report.

Oils4AsphaultOnly wrote:thirdly, at 70%, the model 3 is still pulling down ~70kW (~4.5 miles of charge per minute). Most of the non-urban superchargers are near easy on/off ramps,

I don't think this is correct. The kW rate reported is an average for the charging session. If you want to know the immediate power, you have to multiply Amps * Volts.

Unless the Model 3 works differently from the S, which I doubt, the kW number displayed is the current charging rate, not an average for the charging session. However, the rated miles per hour number is an average over the charging session, which might lead to some confusion.
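The distinction the two posts are arguing over can be sketched as follows. This is just an illustration of the two different quantities; the function names and the example volts/amps/session values are mine, not from either post:

```python
def instantaneous_kw(volts, amps):
    """Immediate charging power - what a live kW readout would reflect."""
    return volts * amps / 1000.0

def session_average_mph(rated_miles_added, hours):
    """Rated miles of range added per hour, averaged over the whole session."""
    return rated_miles_added / hours

# Hypothetical numbers: 350 V at 200 A is 70 kW at this instant...
print(instantaneous_kw(350.0, 200.0))   # 70.0
# ...while the mi/h figure averages over the session, including the
# slower taper near the end, so it can tell a different story.
print(session_average_mph(90.0, 0.5))   # 180.0
```

Since the pack voltage and the current both change over a charging session, the instantaneous kW figure moves continuously, whereas a session average smooths all of that out, which is exactly the confusion described above.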

Safer is better than waiting for safest. Considering your tagline, you're a hypocrite.

No hypocrisy at all - I'm a big fan of AEB, as it provides an extra level of safety backstopping a human driver, and if/when I buy a new car it must be equipped with AEB. But AEB isn't a substitute for a human driver, which is what AVs must be. OBTW, about that NHTSA stat you quoted:

self-contradict much?

Not at all. Dumb automation isn't reliant on outside sensors or processing, and makes no claims of being able to replace human attention. AEB is so reliant, but is of such limited reliability at the moment that only the most risk-tolerant individuals would ever assume it's the primary safety system and rely on it. TACC and autosteer are of higher capability/reliability (albeit far lower than needed for safety), so induce driver disengagement to a much greater extent, as research has demonstrated (previous links to such can be found in the "Tesla's Autopilot - On the road" and/or "Automated vehicles LEAF and others" topics).

Oils4AsphaultOnly wrote:As for your "citations", regardless of how many articles about how the journalists personally feel about something, the data came from an official NHTSA report.

. . . Since model year (MY) 2010, NHTSA has conducted testing of FCW system performance as part of its New Car Assessment Program (NCAP). The tests include the rear-end collision crash modes validated by the CIB project: Lead Vehicle Stopped (LVS), Lead Vehicle Moving (LVM), and Lead Vehicle Decelerating (LVD). On November 5, 2015, the agency announced it would be adding AEB system evaluations to NCAP effective for the 2018 model year. In March 2016, NHTSA issued a joint statement with the Insurance Institute for Highway Safety (IIHS) providing information related to the commitment by 20 automobile manufacturers, representing 99 percent of the U.S. new-car market, to voluntarily make AEB “standard on virtually all light-duty cars and trucks with a gross vehicle weight of 8,500 lbs. or less no later than September 1, 2022, and on virtually all trucks with a gross vehicle weight between 8,501 lbs. and 10,000 lbs. no later than September 1, 2025.” The predicted safety benefits cited in the statement are limited to rear-end crashes:

NHTSA conducted a series of test track-based AEB performance evaluations shortly after the May crash using a 2015 Tesla Model S 85D and a 2015 Mercedes C300 4Matic peer vehicle. The vehicles were tested in the three rear-end collision crash modes (LVS, LVM, and LVD) and three different vehicle operating modes: manual driving; adaptive cruise control (ACC) systems activated; and ACC and Lane Centering Control (LCC) systems activated. This testing confirmed that the AEB systems in the Tesla and peer vehicle were able to achieve crash avoidance in a majority of the rear-end scenarios tested; that ACC generally provided enough braking to achieve crash avoidance without also requiring CIB to intervene; and that neither vehicle effectively responded to a realistic appearing artificial “target” vehicle in the SCP or LTAP scenarios.

ODI’s analysis of Tesla’s AEB system finds that 1) the system is designed to avoid or mitigate rear-end collisions; 2) the system’s capabilities are in-line with industry state of the art for AEB performance. . . .

ODI analyzed mileage and airbag deployment data supplied by Tesla [emphasis added] for all MY 2014 through 2016 Model S and 2016 Model X vehicles equipped with the Autopilot Technology Package, either installed in the vehicle when sold or through an OTA update, to calculate crash rates by miles travelled prior to and after Autopilot installation. Figure 11 shows the rates calculated by ODI for airbag deployment crashes in the subject Tesla vehicles before and after Autosteer installation. The data show that the Tesla vehicles' crash rate dropped by almost 40 percent after Autosteer installation.

Elon then claimed that Autosteer was responsible for a 40% reduction in accidents, following which (after FoI requests for the data from various consumer groups) NHTSA issued a statement:

'Effectiveness' of Tesla self-driving system was not assessed in probe: US traffic safety agency

In 2017, the National Highway Traffic Safety Administration closed a probe into a May 2016 fatal crash involving a driver using the system and cited data from the automaker that crash rates fell by 40 percent after installation of Autopilot's Autosteer function. NHTSA said Wednesday that this crash rate comparison "did not evaluate whether Autosteer was engaged."

WASHINGTON (Reuters) - A U.S. traffic safety regulator on Wednesday contradicted Tesla Inc’s claim that the agency had found that its Autopilot technology significantly reduced crashes, saying that regulators “did not assess” the system’s effectiveness in a 2017 report. . . .

The agency said on Wednesday its crash rate comparison “did not evaluate whether Autosteer was engaged” and “did not assess the effectiveness of this technology.”