"Self-driving cars are coming, and Tesla CEO Elon Musk has been pushing his engineers hard to make sure that Tesla stays on the cutting edge. Indeed, in October 2016 he promised that the latest version of the Model S and Model X—cars with Tesla's new "Hardware 2" suite of cameras and radar—would become capable of full self-driving in the future with just a software update.

But according to a new report from The Wall Street Journal, some Tesla engineers are skeptical that Tesla can keep this promise any time soon. Disagreement about deadlines—as well as "design and marketing decisions"—is causing turmoil inside the company.

"In recent months," the Journal reports, the Autopilot team "has lost at least 10 engineers and four top managers." That included the director of the Autopilot team, "who lasted less than six months before leaving in June."

Tesla told The Wall Street Journal that the company has simply been losing engineers to other technology companies hungry for workers with autonomous driving skills."

"Tesla’s Autopilot has been dogged by safety concerns

The Wall Street Journal doesn't go into a lot of detail about why the Autopilot team has been losing talent. But we do know that over the last three years, Tesla has faced both internal and external warnings about the safety risk of pushing Autopilot technology forward too quickly.

"Weeks before the October 2015 release of Autopilot, an engineer who had worked on safety features warned Tesla that the product wasn’t ready," the Journal reports. In a resignation letter, the engineer, Evan Nakano, warned about "reckless decision making that has potentially put customer lives at risk."

Another engineer raised concerns after he experienced strange driving behavior with a prototype vehicle in May 2015. The car was driving so erratically that a police officer pulled him over, suspecting drunk driving. The engineer was sober, but he warned colleagues about problems with the vehicle. Later, he says, he was "dismissed for what he was told were 'performance issues.'"

Tesla also suffered an acrimonious split last year with Mobileye, an Israeli supplier of self-driving car technology, after Tesla customer Joshua Brown died in a May 2016 crash. Brown had engaged Autopilot, taken his hands off the wheel, and ignored repeated warnings to retake it. His car crashed into a semi truck that turned in front of the vehicle.

In a September statement, Mobileye wrote that Tesla had been "pushing the envelope in terms of safety." Other carmakers had been more aggressive about disabling driver-assistance features if a driver refuses to put his hands on the wheel.

Since Brown's death, Tesla has tightened up these rules—the Model S will now pull over to the side of the road if the driver ignores three warnings to put his hands back on the wheel.

After its breakup with Mobileye, Tesla developed its own "Hardware 2" sensor package for use on the Model X and Model S. The big question is whether Tesla can keep its promise to enable full self-driving capabilities in these vehicles—and whether these vehicles will actually be safer than human drivers.

That could be challenging because Tesla is attempting to develop self-driving technology that relies only on cameras and radar. Other companies, including Waymo, have built their self-driving technology around a lidar sensor. Lidar provides high-resolution 3D information about the surrounding environment, but a single lidar sensor can cost tens of thousands of dollars.

Tesla hopes it can achieve similar levels of safety and reliability with much cheaper cameras and radar sensors. But the turmoil in the company's Autopilot division suggests that effort may not be going smoothly."

"(Reuters) - A fatal crash and vehicle fire of a Tesla Inc Model X near Mountain View, California, last week has prompted a federal field investigation, the U.S. National Transportation Safety Board said on Tuesday, sparking a big selloff in Tesla stock.

Tesla tumbled 8.2 percent, or $25 a share, to close at $279.18, the lowest close in almost a year, after news of the investigation.

Late on Tuesday, Moody’s Investors Service downgraded Tesla’s credit rating to B3 from B2. Moody’s said the ratings “reflect the significant shortfall in the production rate of the company’s Model 3 electric vehicle.” It also “faces liquidity pressures due to its large negative free cash flow and the pending maturities of convertible bonds.”

Tesla shares fell another 2.6 percent in after-hours trading.

Tesla has $230 million in convertible bonds maturing in November 2018 and $920 million in March 2019.

Moody’s said its negative outlook for Tesla “reflects the likelihood that Tesla will have to undertake a large, near-term capital raise in order to refund maturing obligations and avoid a liquidity shortfall.”

Moody’s said Tesla is targeting weekly production of 2,500 Model 3 vehicles by the end of March, and 5,000 per week by the end of June, down from the company’s year-earlier production expectations of 5,000 per week by the end of 2017 and 10,000 by the end of 2018. Tesla plans to provide an update on Model 3 production next week.

Shares of chipmaker Nvidia Corp, which supplies Uber Technologies Inc [UBER.UL], Tesla, Volkswagen AG (VOWG_p.DE) and other automakers, closed down 7.8 percent after it disclosed it suspended self-driving tests across the globe.

QUESTIONS ABOUT ACCIDENT
In last week’s accident, it was unclear if Tesla’s automated control system was driving the car. The accident involved two other cars, the NTSB and police said. Tesla vehicles have a system called Autopilot that handles some driving tasks. The 38-year-old Tesla driver died at a nearby hospital shortly after the crash.

Late Tuesday, Tesla said in a blog post it does “not yet know what happened in the moments leading up to the crash,” but added data shows that Tesla owners have driven the same stretch of highway with Autopilot engaged “roughly 85,000 times... and there has never been an accident that we know of.”

The company said it is working with authorities to recover the logs from the computer inside the vehicle to try to gain a better understanding of what happened. The statement did not address whether the crashed vehicle was in Autopilot mode.

“We have been deeply saddened by this accident, and we have offered our full cooperation to the authorities as we work to establish the facts of the incident,” Tesla said in a statement earlier.

Government scrutiny of the Palo Alto, California company is mounting. This is the second NTSB field investigation into a Tesla crash since January.

The California Highway Patrol said the electric-powered Tesla Model X crashed into a freeway divider on Friday and then was hit by a Mazda before colliding with an Audi.

The Tesla’s lithium batteries caught fire, and emergency officials consulted company engineers before determining how to extinguish the battery fire and move the vehicle safely. NTSB said the issues being examined include the post-crash fire and removing the vehicle from the scene.

The Tesla blog post said Tesla battery packs are designed to ensure that a fire spreads slowly in the rare circumstance that a battery catches fire.

In January, the NTSB and U.S. National Highway Traffic Safety Administration sent investigators to California to investigate the crash of a fire truck and a Tesla that apparently was traveling in semi-autonomous mode. The agencies have not disclosed any findings.

The NTSB can make safety recommendations but only NHTSA can order automakers to recall unsafe vehicles or fine automakers if they fail to remedy safety defects in a timely fashion. Before the agency can demand a recall, it must open a formal investigation, a step it has not yet taken.

Tesla’s Autopilot allows drivers under certain conditions to take their hands off the wheel for extended periods. Still, Tesla requires users to agree to keep their hands on the wheel “at all times” before they can use Autopilot.

The NTSB faulted Tesla in a prior fatal Autopilot crash.

In September, NTSB Chairman Robert Sumwalt said operational limitations in the Tesla Model S played a major role in a May 2016 crash in Florida that killed a driver using Autopilot. That crash raised questions about the safety of systems that can perform driving tasks for long stretches but cannot completely replace human drivers.

Tesla in September 2016 unveiled improvements to Autopilot, adding new limits on hands-off driving.

Reporting by David Shepardson; editing by Lisa Shumaker, David Gregorio and Cynthia Osterman"

The National Transportation Safety Board has opened an investigation into a recent fatal Tesla crash in Mountain View, California.
The U.S. National Transportation Safety Board (NTSB) recently announced that it would open an investigation into a Tesla crash that proved fatal. The NTSB posted a tweet about the incident that reads: “2 NTSB investigators conducting Field Investigation for fatal March 23, 2018, crash of a Tesla near Mountain View, CA. Unclear if automated control system was active at time of crash. Issues examined include: post-crash fire, steps to make vehicle safe for removal from scene.”

According to the California Highway Patrol, a driver was killed when a Tesla vehicle collided with a freeway divider, causing a fire and shutting down two lanes of the freeway. This isn’t the first time a Tesla vehicle has been involved in a crash: in January of this year, the NTSB began gathering data on a case in which a Tesla Model S sedan rear-ended a fire engine in Los Angeles. NTSB spokesman Chris O’Neil said in a statement that the board had not opened an official investigation into that crash but had begun gathering information on what happened.

The driver of the car in the Los Angeles incident said that he had the autopilot system engaged when he hit the fire engine traveling at approximately 65 miles per hour. The fire engine was reportedly on the shoulder of the road while assisting at the scene of another accident. The Culver City Firefighters Twitter account tweeted out photos of the crash, stating “Amazingly there were no injuries.” Sadly, the same cannot be said for the most recent Tesla crash in Mountain View, California, which took the life of the car’s driver."

Tesla Turmoil Sends Its Bonds on an Electric Slide
Published on Mar 28, 2018
Mar.28 -- Bloomberg Gadfly columnist Liam Denning discusses the plunge in Tesla bonds and the automaker’s financing on "What'd You Miss?" Denning's opinions are his own.

"The family of a Tesla Model X driver involved in a fatal accident on Highway 101 says he made several complaints to Tesla about the vehicle’s Autopilot technology prior to the crash in which he died.

According to KGO-ABC 7, family members of Walter Huang, the driver who died in the crash on March 23, said he took his Model X to a Tesla dealer several times. Huang reportedly complained that the car’s Autopilot option kept veering the car toward the same barrier on Highway 101, near Mountain View, into which he crashed the car last Friday.

The National Transportation Safety Board is investigating the accident to determine if the car’s Autopilot feature was on at the time of the crash.

Autopilot is an automated-driving function available in Tesla cars that includes features such as lane-centering and traffic-aware cruise control.

When reached for comment on the accident, a Tesla spokesperson said the company has been searching its service records, “And we cannot find anything suggesting that the customer ever complained to Tesla about the performance of Autopilot.”

The spokesperson added that there had been “a concern” raised about the car’s navigation not working properly, but “Autopilot’s performance is unrelated to navigation.”"

Tesla says vehicle in deadly crash was on Autopilot
Published on Apr 1, 2018
The vehicle in a fatal crash last week in California was operating on Autopilot, making it the latest accident to involve a semi-autonomous vehicle.

Fire chief: Tesla crash shows electric car fires could strain department resources
Published on Mar 27, 2018
After Friday's fiery fatal crash of a Tesla Model X on U.S. Highway 101, police and firefighters are assessing how emergency response will need to change in a world where electric cars are becoming more common.

"The US National Transportation Safety Board (NTSB) has expressed displeasure with electric carmaker Tesla for releasing information relevant to a fatal Model X crash in California last month without alerting the agency beforehand.

The NTSB began investigating the killer smash last week.

"The NTSB is unhappy because parties to an NTSB investigation are required to inform the NTSB of information releases before doing so," said Christopher T. O’Neil, chief of media relations for the agency, in a statement emailed to The Register today.

"The uncoordinated release of investigative information can affect how other parties work with us in the future so we take each unauthorized release seriously," said O'Neil. "However, this release will not hinder our investigation."

On Friday, Tesla published a blog post about the accident, which occurred on US Highway 101 near Mountain View, California, on the morning of Friday, March 23. Tesla said it is working with investigators to understand what happened but now has more information recovered from the vehicle data logs.

Though the crash investigation remains ongoing, the carmaker implied that the deceased driver, identified as Apple engineer Walter Huang, may share some of the blame because he was using the car's Autopilot system and did not have his hands on the wheel.

Drivers are supposed to keep their hands on the steering wheel even when Autopilot is engaged. Think of the technology as a super-cruise-control, rather than a self-driving brain.

"The driver had received several visual and one audible hands-on warning earlier in the drive and the driver’s hands were not detected on the wheel for six seconds prior to the collision," Tesla said.

"The driver had about five seconds and 150 meters of unobstructed view of the concrete divider with the crushed crash attenuator, but the vehicle logs show that no action was taken."
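Tesla's figures of 150 meters of unobstructed view covered in about five seconds imply the vehicle's approximate speed, which can be sanity-checked with simple arithmetic. The snippet below is illustrative only; the distance and time come from Tesla's statement, and everything else is unit conversion.

```python
# Derive the implied speed from the distance and time Tesla cited.
distance_m = 150.0   # unobstructed view of the barrier, per Tesla's blog post
time_s = 5.0         # time available before impact, per Tesla's blog post

speed_ms = distance_m / time_s    # meters per second
speed_kmh = speed_ms * 3.6        # kilometers per hour
speed_mph = speed_ms * 2.23694    # miles per hour

print(f"{speed_ms:.0f} m/s ≈ {speed_kmh:.0f} km/h ≈ {speed_mph:.0f} mph")
# → 30 m/s ≈ 108 km/h ≈ 67 mph
```

In other words, the logs suggest the car was traveling at roughly highway speed when it struck the barrier, consistent with the accident occurring on US Highway 101.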

Following a 2016 fatal Tesla Model S crash in Florida in which Autopilot played a role, Tesla modified Autopilot to make its hands-off-wheel warning harder to ignore.

Tesla made a point of defending the safety of its Autopilot system, citing a government study that found the technology reduced crash rates by 40 per cent. The carmaker also said Autopilot reduces the likelihood of being involved in a fatal accident by a factor of 3.7.
Software bug or careless driver?
Tesla's defense coincides with a report from San Francisco, California, ABC affiliate KGO-TV in which Huang's brother, Will, claimed that Walter had complained to his Tesla dealer about how his car swerved unexpectedly several times when passing the area where the accident eventually occurred.

The carmaker did not state a definitive cause for the crash, but it pinned blame for the severity of the accident on California's poorly maintained infrastructure.

"The reason this crash was so severe is because the crash attenuator, a highway safety barrier which is designed to reduce the impact into a concrete lane divider, had been crushed in a prior accident without being replaced," Tesla said. "We have never seen this level of damage to a Model X in any other crash."

O'Neil from the NTSB said the agency is investigating all aspects of the crash, including reports about the driver's stated concerns about Autopilot.

"We will work to determine the probable cause of the crash and our next update of information about our investigation will likely be when we publish a preliminary report, which generally occurs within a few weeks of completion of field work," he said.

Autopilot is one thing – you're supposed to stay in control of a vehicle while it is activated. Meanwhile, on Tesla's website, Model S and X customers have the option to buy "Full Self-Driving Capability" as a $3,000 add-on to its $5,000 Enhanced Autopilot package.

"The system is designed to be able to conduct short and long distance trips with no action required by the person in the driver's seat," it promises, subject to regulatory approval.

PS: Someone has posted video on YouTube claiming to show Tesla's Autopilot being confused by painted lane markings...

Watch a Tesla autopilot itself right up into a traffic barrier on the southbound Ryan in Chicago, because the I-90 lane line is more freshly painted than the I-94 line.: https://t.co/CBkdsjYPjw
— Thomas H. Ptacek (@tqbf) April 2, 2018 "

Video shows Tesla autopilot failing at site of fatal March crash
Published on Apr 5, 2018
A video posted on YouTube by a Tesla owner shows the car's autopilot steering the car toward a barrier before the driver turns off the software and corrects the vehicle's path. The video was filmed at the same location of a fatal March crash in which a Tesla was confirmed to be running on autopilot.

Fatal Tesla Model X Autopilot Crash: What We Can Learn
Published on Apr 4, 2018