The camera used to monitor driver behavior in a semi-autonomous vehicle is not analogous to the one on a computer.

They both represent a loss of privacy to those who are worried about it, employing the exact same technology to do so; only the resulting actions are different, so it seems very much on-topic.

edatoakrun wrote: Blocking the vehicle interior camera in the 3 would presumably result either in disabling some or all of the driver assist functions, or in completely shutting down the car.

TSLA has not yet announced which option it will employ to discourage such behavior.

Alternatively, they could leave it up to the owner/driver as to whether or not they want the camera activated, with possible effects on the insurance rates. Or, mandatory camera usage may simply be outlawed, if enough of the public cares. I doubt they would, but it's possible.

Guy [I have lots of experience designing/selling off-grid AE systems, some using EVs but don't own one. Local trips are by foot, bike and/or rapid transit].

The 'best' is the enemy of 'good enough'. Copper shot, not Silver bullets.

PALO ALTO, Calif.— Tesla Inc. Chief Executive Elon Musk jolted the automotive world last year when he announced the company’s new vehicles would come with a hardware upgrade that would eventually allow them to drive themselves.

He also jolted his own engineering ranks.

Members of the company’s Autopilot team hadn’t yet designed a product they believed would safely and reliably control a car without human intervention, according to people familiar with the matter.

In a meeting after the October announcement, someone asked Autopilot director Sterling Anderson how Tesla could brand the product “Full Self-Driving,” several employees recall. “This was Elon’s decision,” they said he responded. Two months later, Mr. Anderson resigned.

In the race to develop autonomous vehicles, few companies have moved faster than Tesla, an electric-car pioneer that this year surpassed General Motors Co. as the nation’s most-valuable auto maker.

Behind the scenes, the Autopilot team has clashed over deadlines and design and marketing decisions, according to more than a dozen people who worked on the project and documents reviewed by The Wall Street Journal. In recent months, the team has lost at least 10 engineers and four top managers—including Mr. Anderson’s successor, who lasted less than six months before leaving in June.

Tesla said the vehicle hardware unveiled in October will enable “full self-driving in almost all circumstances, at what we believe will be a probability of safety at least twice as good as the average human driver.” The self-driving feature is subject to software development and regulatory approval, and “it is not possible to know exactly when each element of the functionality described” will be available, Tesla noted...

Weeks before the October 2015 release of Autopilot, an engineer who had worked on safety features warned Tesla that the product wasn’t ready, according to a resignation letter circulated to other employees and reviewed by the Journal.

Autopilot’s development was based on “reckless decision making that has potentially put customer lives at risk,” the engineer, Evan Nakano, wrote.

edatoakrun wrote: WSJ has seemingly overcome the barrier blocking accurate reporting on AP and most other Tesla subjects.

The problem that I have with this article is that it's essentially trying to show that Elon Musk and Tesla Motors have been acting in bad faith, trying to push a product before it's ready and endangering the public. The current AP features, though, have been rolled out in what I'd consider a sufficiently conservative manner. Of course, not everyone agrees, but the other automakers are seeking to develop similar systems.

The biggest issue I see is that "full self driving" (FSD) appears to be far from deployment, and Tesla has been overly optimistic as to how long it's going to take. In the meantime, they're happily taking money from people who are eager to pre-pay for FSD, and they've used FSD to market their vehicles. I don't at all believe that they're acting in bad faith, but rather that they've made the all-too-common mistake among engineers of failing to fully appreciate the complex details that need to be addressed. Remember, Elon is a physicist and engineer himself - he's not simply some corporate executive blindly pushing forward.

Essentially, Elon Musk feels very confident that FSD is do-able in the near term, and other key people have understandably disagreed. Elon says that he is personally quite involved in AP efforts. I don't doubt that Elon desperately wants to achieve FSD. At this very moment, he's probably pushing his AP team to make significant personal sacrifices and "achieve the impossible". While FSD is not going to get done as quickly as hoped for, I wouldn't be too quick to count Elon out.

If I were in my 20s with no kids, I'd probably want to get a software job at Tesla and work on cutting edge stuff like AP/FSD. But for someone who's more established in life and desires a healthy work/life balance, a job under Elon's watchful eye could be a hard sell.

The problem that I have with this article is that it's essentially trying to show that Elon Musk and Tesla Motors have been acting in bad faith, trying to push a product before it's ready and endangering the public.

Go read the threads regarding AP on Tesla Motors Club and you will be left with the same impression.

This one, for example:

...This morning when approaching a bridge overpass traveling at 65mph the car very quickly decelerated to 45mph, throwing everything in the passenger seat into the floorboard BUT MORE IMPORTANTLY causing everyone behind her to start braking hard and almost caused a major accident. My wife would not have been part of it but our car would have been the cause of it...

My point in this post is to offer a bit of acknowledgement to those that have had similar experiences and say "I hear'ya" AND to ask publicly WHERE IS OUR SS. Either SilkySmooth from months ago or our SomethingSpecial mentioned over 2 weeks ago. As much as I HATE to say it, Tesla either needs to turn off TACC and EAP until they have something safer or put out an update to make it an order of magnitude safer...

abasile wrote: I don't at all believe that they're acting in bad faith, but rather that they've made the all-too-common mistake among engineers of failing to fully appreciate the complex details that need to be addressed.

Right, the FSD problem is not deterministic in nature; i.e., its solution is inherently probabilistic, requiring the use of AI.
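To make that contrast concrete, here's a toy sketch (all function names, numbers, and thresholds are made up for illustration, not anyone's actual system): a cruise controller is deterministic, while a perception-driven decision must threshold a model's confidence and live with some error rate.

```python
# Toy illustration of deterministic vs. probabilistic driving subsystems.
# Everything here is hypothetical; no real vehicle logic is implied.

def cruise_control_error(target_speed: float, actual_speed: float) -> float:
    """Deterministic: the same inputs always yield the same control error."""
    return target_speed - actual_speed

def brake_decision(p_obstacle: float, threshold: float = 0.9) -> bool:
    """Probabilistic: a perception model outputs a confidence, not a fact.
    The system must pick a threshold and accept false positives/negatives."""
    return p_obstacle >= threshold

print(cruise_control_error(65.0, 60.0))  # 5.0, every time
print(brake_decision(0.95))  # True: confident enough to brake
print(brake_decision(0.40))  # False: below threshold; could be a missed obstacle
```

The second function is where the hard part of FSD lives: no threshold choice eliminates both missed obstacles and phantom braking.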

abasile wrote: Remember, Elon is a physicist and engineer himself - he's not simply some corporate executive blindly pushing forward.

Actually, he just has two undergraduate degrees, one in physics and the other in economics.

abasile wrote: Essentially, Elon Musk feels very confident that FSD is do-able in the near term, and other key people have understandably disagreed. Elon says that he is personally quite involved in AP efforts. I don't doubt that Elon desperately wants to achieve FSD. At this very moment, he's probably pushing his AP team to make significant personal sacrifices and "achieve the impossible". While FSD is not going to get done as quickly as hoped for, I wouldn't be too quick to count Elon out.

But expecting Elon to repeat with FSD the success he has achieved at SpaceX is unrealistic; here the problem is more than just hiring the best PhDs, as he could for rocket technology/science at SpaceX.

The National Transportation Safety Board determined Tuesday that a truck driver’s failure to yield the right of way and a car driver’s inattention due to overreliance on vehicle automation are the probable cause of the fatal May 7, 2016, crash near Williston, Florida.

The NTSB also determined the operational design of the Tesla’s vehicle automation permitted the car driver’s overreliance on the automation, noting its design allowed prolonged disengagement from the driving task and enabled the driver to use it in ways inconsistent with manufacturer guidance and warnings.

As a result of its investigation the NTSB issued seven new safety recommendations and reiterated two previously issued safety recommendations.

“While automation in highway transportation has the potential to save tens of thousands of lives, until that potential is fully realized, people still need to safely drive their vehicles,” said NTSB Chairman Robert L. Sumwalt III. “Smart people around the world are hard at work to automate driving, but systems available to consumers today, like Tesla’s ‘Autopilot’ system, are designed to assist drivers with specific tasks in limited environments. These systems require the driver to pay attention all the time and to be able to take over immediately when something goes wrong. System safeguards, that should have prevented the Tesla’s driver from using the car’s automation system on certain roadways, were lacking and the combined effects of human error and the lack of sufficient system safeguards resulted in a fatal collision that should not have happened,” said Sumwalt.

Findings in the NTSB’s report include:

The Tesla’s automated vehicle control system was not designed to, and could not, identify the truck crossing the Tesla’s path or recognize the impending crash. Therefore, the system did not slow the car, the forward collision warning system did not provide an alert, and the automatic emergency braking did not activate.

The Tesla driver’s pattern of use of the Autopilot system indicated an over-reliance on the automation and a lack of understanding of the system limitations.

If automated vehicle control systems do not automatically restrict their own operation to conditions for which they were designed and are appropriate, the risk of driver misuse remains.

The way in which the Tesla “Autopilot” system monitored and responded to the driver’s interaction with the steering wheel was not an effective method of ensuring driver engagement.*

Tesla made design changes to its “Autopilot” system following the crash. The change reduced the period of time before the “Autopilot” system issues a warning/alert when the driver’s hands are off the steering wheel. The change also added a preferred road constraint to the alert timing sequence.
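The alert-timing change described above can be illustrated with a small sketch. This is hypothetical logic with made-up thresholds and road classes, not Tesla's actual implementation: the grace period before a hands-off warning depends on whether the vehicle is on a "preferred" road type.

```python
# Hypothetical hands-off-wheel alert timer with a road-type constraint.
# Road classes and time limits are illustrative only.

HANDS_OFF_LIMITS = {          # seconds of hands-off driving before alerting
    "divided_highway": 30.0,  # preferred road class: longer grace period
    "other": 10.0,            # non-preferred roads: alert much sooner
}

def should_alert(road_class: str, hands_off_seconds: float) -> bool:
    """Return True once hands-off time exceeds the limit for this road class."""
    limit = HANDS_OFF_LIMITS.get(road_class, HANDS_OFF_LIMITS["other"])
    return hands_off_seconds > limit

# Example: 15 s hands-off is tolerated on a divided highway but not elsewhere.
print(should_alert("divided_highway", 15.0))  # False
print(should_alert("other", 15.0))            # True
```

The NTSB's point, of course, is that even a well-tuned timer of this kind only measures wheel contact, which (as the abstract notes below the findings) is a poor surrogate for actual visual engagement with the road.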

Fatigue, highway design and mechanical system failures were not factors in the crash. There was no evidence indicating the truck driver was distracted by cell phone use. While evidence revealed the Tesla driver was not attentive to the driving task, investigators could not determine from available evidence the reason for his inattention.

Although the results of post-crash drug testing established that the truck driver had used marijuana before the crash, his level of impairment, if any, at the time of the crash could not be determined from the available evidence.

The NTSB issued a total of seven safety recommendations based upon its findings, with one recommendation issued to the US Department of Transportation, three to the National Highway Traffic Safety Administration, two to the manufacturers of vehicles equipped with Level 2 vehicle automation systems, and one each to the Alliance of Automobile Manufacturers and Global Automakers.

The safety recommendations address the need for: event data to be captured and available in standard formats on new vehicles equipped with automated vehicle control systems; manufacturers to incorporate system safeguards to limit the use of automated control systems to conditions for which they are designed and for there to be a method to verify those safeguards; development of applications to more effectively sense a driver’s level of engagement and alert when engagement is lacking; and it called for manufacturers to report incidents, crashes, and exposure numbers involving vehicles equipped with automated vehicle control systems.

The board reiterated two safety recommendations issued to the National Highway Traffic Safety Administration in 2013, dealing with minimum performance standards for connected vehicle technology for all highway vehicles and the need to require installation of the technology, once developed, on all newly manufactured highway vehicles.

The abstract of the NTSB’s final report, that includes the findings, probable cause and safety recommendations is available online at https://go.usa.gov/xRMFc. The final report will be publicly released in the next several days. The docket for this investigation is available at https://go.usa.gov/xNvaE.

* The abstract goes into more detail on this point:

6. Because driving is an inherently visual task and a driver may touch the steering wheel without visually assessing the roadway, traffic conditions, or vehicle control system performance, monitoring steering wheel torque provides a poor surrogate means of determining the automated vehicle driver’s degree of engagement with the driving task.

Recommendations included in the full report:

RECOMMENDATIONS

New Recommendations

As a result of its investigation, the National Transportation Safety Board makes the following new safety recommendations:

To the US Department of Transportation:

1. Define the data parameters needed to understand the automated vehicle control systems involved in a crash. The parameters must reflect the vehicle’s control status and the frequency and duration of control actions to adequately characterize driver and vehicle performance before and during a crash.

To the National Highway Traffic Safety Administration:

2. Develop a method to verify that manufacturers of vehicles equipped with Level 2 vehicle automation systems incorporate system safeguards that limit the use of automated vehicle control systems to those conditions for which they were designed.

3. Use the data parameters defined by the US Department of Transportation in response to Safety Recommendation [1] as a benchmark for new vehicles equipped with automated vehicle control systems so that they capture data that reflect the vehicle’s control status and the frequency and duration of control actions needed to adequately characterize driver and vehicle performance before and during a crash; the captured data should be readily available to, at a minimum, National Transportation Safety Board investigators and National Highway Traffic Safety Administration regulators.

4. Define a standard format for reporting automated vehicle control systems data, and require manufacturers of vehicles equipped with automated vehicle control systems to report incidents, crashes, and vehicle miles operated with such systems enabled.

To manufacturers of vehicles equipped with Level 2 vehicle automation systems:

5. Incorporate system safeguards that limit the use of automated vehicle control systems to those conditions for which they were designed.

6. Develop applications to more effectively sense the driver’s level of engagement and alert the driver when engagement is lacking while automated vehicle control systems are in use.

To the Alliance of Automobile Manufacturers and to Global Automakers:

7. Notify your members of the importance of incorporating system safeguards that limit the use of automated vehicle control systems to those conditions for which they were designed.

Reiterated Recommendations

As a result of its investigation, the National Transportation Safety Board reiterates the following safety recommendations: