Within days of a fatal crash involving a California motorist whose Tesla plowed into a highway barrier with its Autopilot mode enabled, a trove of relevant details that once would have taken months to assemble had already been gathered and released.

Among them: The driver’s hands had not been detected on the steering wheel for six seconds prior to the collision, and the adaptive cruise control was set to its minimum following distance. And perhaps a telltale nugget of information: The driver, Walter Huang, would have been able to see the approaching concrete divider for five full seconds and 164 yards before impact.


All that information, which established the early narrative, came not from the local police or the National Transportation Safety Board, which is investigating the March 23 crash, but from Tesla itself.

Such disclosures broke with long-standing protocols that stipulated companies working alongside investigators would let the agency release official information after meticulous review. Here was Tesla, getting ahead of the investigation.

The dispute simmered for more than a week before tensions spilled over, and the NTSB took the rare step Thursday of revoking Tesla’s status as a party to the investigation. “While we understand the demand for information that parties face during an NTSB investigation, uncoordinated releases of incomplete information do not further transportation safety or serve the public interest,” NTSB chairman Robert Sumwalt said in a written statement.

Elon Musk and Tesla countered with a full-throttle offensive, claiming the NTSB is merely after headlines and is unconcerned with safety. Tesla said it intends to lodge an official complaint with Congress regarding the differences between the company and agency. “They were trying to prevent us from telling all the facts,” the company said in a written statement. “We don’t believe this is right.”

Changes in the Autonomous Era

Much coverage of the crash has focused on the public bickering. But the hubbub has obscured an underlying issue that will almost certainly prove more important over the long term: For better and worse, automated-driving technology has transformed the nature of crash investigations.

Where investigators once had scant bits of information from cars in crashes, they now have dozens, if not hundreds, of data points sensed, collected, and stored in the moments before collisions by advanced driver-assist systems such as Autopilot and full self-driving systems. There’s a broad promise that this data can help authorities determine the precise cause of a crash better than ever before.


At the same time, it has become increasingly difficult, and in some cases impossible, for investigators to identify, retrieve, and make sense of that data without the help of manufacturers like Tesla. There’s an imbalance in expertise and knowledge, and manufacturers—who potentially have much to lose should their automated systems be at fault in a crash—hold the keys to unlocking that data.

“It’s a staggering new world, and the level of detail on this stuff is remarkable,” says Sean Kane, president of Safety Research & Strategies, a Massachusetts firm that provides research and analysis on motor-vehicle safety. “You start finding data stored in taillight modules and say, ‘Wow.’ It’s a treasure trove of information. Who’s got the upper hand in understanding it all? Manufacturers.”

Proprietary Data

The NTSB has three ongoing investigations probing crashes involving automated-driving technology. In addition to last month’s deadly Tesla crash, its investigators are scrutinizing a fatal crash involving an Uber self-driving vehicle that occurred only days earlier in Tempe, Arizona, and a fender bender that occurred in Las Vegas last November between an autonomous shuttle and a human-driven delivery truck.

Reports on those investigations are still to come, but the agency completed its first probe of a crash involving automated technology last fall, another fatal crash in which Tesla's Autopilot mode was engaged. The NTSB report highlights both the promise and the pitfalls of this newfound wealth of data.


Neither the Tesla Model S sedan nor the Freightliner truck in the crash was equipped with an event data recorder (EDR), but the NTSB's report noted that a "wide range" of operational data was nonetheless retrieved, covering both the lifetime of the vehicle and points potentially related to the events at the time of the collision. Crucially, the agency's investigators could not evaluate that data on their own.

“The vast majority, including the vehicle log files containing all of the parametric data, discussed in this report, was stored in a proprietary binary format that required the use of in-house manufacturer software tools for conversion into engineering units,” according to the Driver Assistance System Report that’s part of the NTSB’s overall report on the crash.
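The conversion the NTSB describes is the kind of decoding step familiar from any binary telemetry work. As a purely hypothetical sketch (Tesla's actual log format is proprietary and undocumented, so the record layout, field names, and scaling factors below are invented for illustration), turning raw bytes into engineering units looks something like this:

```python
import struct

# Hypothetical fixed-width record layout -- NOT Tesla's real format:
# timestamp (uint32, milliseconds), speed (uint16, 0.01 m/s per count),
# steering angle (int16, 0.1 degree per count), autopilot flag (uint8),
# followed by 3 bytes of padding. Little-endian throughout.
RECORD = struct.Struct("<IHhB3x")

def decode_record(raw: bytes) -> dict:
    """Convert one raw binary record into engineering units."""
    ts_ms, speed_raw, steer_raw, ap_flag = RECORD.unpack(raw)
    return {
        "time_s": ts_ms / 1000.0,        # milliseconds -> seconds
        "speed_mps": speed_raw * 0.01,   # counts -> meters per second
        "steering_deg": steer_raw * 0.1, # counts -> degrees
        "autopilot_on": bool(ap_flag),
    }

# Example: decode a record packed the way an in-house tool might write it.
raw = RECORD.pack(142500, 2680, -125, 1)
print(decode_record(raw))
```

Without the manufacturer's documentation of the layout and the scale factors, the same bytes really are just ones and zeros, which is exactly the dependency investigators describe.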

For decades, the NTSB has used a “party system” in many of its investigations, in which it designates relevant manufacturers and organizations as parties to help the agency quickly glean technical expertise and specialized knowledge for its investigations. Tesla was a named party throughout the 2016 Williston, Florida, crash investigation.

Had the company not agreed to help with that data retrieval, “we would have had ones and zeros with very little ability to turn that into engineering information,” said Kristin Poland, deputy director of the NTSB’s Office of Highway Safety and part of the team involved in probing the crash.

Further, she said there were challenges in capturing raw data from the car's radar and camera sensors rather than data already processed by the algorithms that run Autopilot. Here, too, Poland, who spoke shortly after the NTSB released the Williston report in September, said Tesla's engineers were vital and helpful.


It's unknown what will happen in a case like the ongoing Mountain View investigation, now that the relationship between the manufacturer and the NTSB has been formally severed. And beyond the frayed relationship in this case, there's uncertainty about what happens when it's not the NTSB investigating a high-profile crash but local authorities in Tempe, Mountain View, or elsewhere, who have far less expertise in examining data from automated systems.

“The Florida Highway Patrol was a party to this investigation, as well as Tesla, and they did not have the ability to play out the data from this car without the involvement of Tesla,” said Deborah Bruce, the NTSB’s investigator in charge of the Williston investigation. “So you play that out across the country, and you realize what we would know about crashes in the future will be less than we knew about them in the past, unless there is some standard.”


Proposed Standards

Three of the NTSB’s seven recommendations stemming from its Williston investigation encouraged the development of those standards. The agency urged the Department of Transportation to develop data parameters needed to understand a vehicle’s control status and the frequency and duration of control actions taken either by man or machine.

It urged the National Highway Traffic Safety Administration (NHTSA) to use those parameters to set benchmarks for new vehicles equipped with automated-driving systems and to ensure data captured is available, at a minimum, to NTSB crash investigators and federal regulators. And it recommended that NHTSA define a standard format for reporting incidents, crashes, and vehicle miles operated when automated systems are enabled.


“As far as data goes, that’s the heart of the watermelon,” Bruce said. “For the future, we will need to know what happened in a crash—who was in a vehicle, or what was in control of a vehicle. We are at a juncture here where we have to look forward and say, what must we know about automation in highway vehicles?”

A Hands-Off Approach

Other branches of the federal government appear content to leave that question unanswered. Seven months after the NTSB made its recommendations, neither the Department of Transportation nor NHTSA, an agency within the DOT's organizational umbrella, has acted on them.

So far, the DOT has taken a hands-off approach to issuing any related regulations regarding data collection in particular and automated vehicles overall.

In its latest federal automated-vehicle policy, Automated Driving Systems: A Vision for Safety 2.0, the DOT asks manufacturers to submit voluntary safety assessments that describe their efforts to develop safe vehicles across 12 categories, including data collection. The more recent policy, which at 27 pages is substantially shorter than the 112-page version it replaced, encourages companies to develop best practices related to data collection. But it emphasizes that “no compliance requirements or enforcement mechanism” back that up.

Meanwhile, pending legislation, Senate Bill 1885, may make it more difficult for data captured by automated vehicles to be deciphered by crash investigators. A section of the bill prohibits any federal department or federal agency from promulgating regulations regarding data ownership or access. Instead, the bill would establish a Data Access Advisory Committee to make eventual policy recommendations. But until those are codified into regulation or law, the bill would restrict access.


In December, the DOT hosted a roundtable discussion with the intent of encouraging the industry to develop its own best practices and voluntary data exchanges. Forty organizations spanning the public and private sectors, including Tesla and Uber, participated in the meeting. The group identified performance data from near misses, crashes, disengagements, and re-engagements, along with information on real-time road conditions and work-zone locations, as high priorities for these potential voluntary exchanges.

Parallels with Aviation

By exploring the potential of voluntary data exchanges, automakers and tech developers are borrowing a page from the aviation industry, which long ago established a series of similar networks that share anonymized data. These networks, including the Aviation Safety Reporting System, which collects voluntary reports from pilots, air-traffic controllers, and others, help authorities identify safety hazards. Experts believe they have played a role in the industry's long-term safety improvement, a record that has resonated with fliers. One day, a similar setup might do the same for automated vehicles.

“There is a parallel here to aviation,” Poland said. “There’s a very robust aviation-data safety system that we are familiar with, and thinking, ‘How do we put this in the highway environment?’ We think there’s some promise here. There’s an opportunity for them to come together and say all their data can’t be proprietary.”

Voluntary efforts may indeed hold potential, and there’s some precedent among automakers, which have banded together to trade information about cybersecurity intrusions with the recent formation of an Automotive Information Sharing and Analysis Center (Auto-ISAC). But whether with voluntary agreements involving companies developing automated technology or with safety-minded organizations like the NTSB, so far one prominent group has been left out of the data discussion: consumers.


Where Drivers Come In

Vehicular data should be ushering in an empowering era for consumers. Almost all new cars contain event data recorders, commonly known as EDRs, which are akin to the black boxes in aircraft. In 2015, Congress enacted a law that, among other things, affirmed that data from EDRs is owned by car owners and that it may not be accessed except under their express authorization.

Sounds good in theory, but there are shortcomings.

For crash investigators, EDRs are usually not the miracle devices they're often assumed to be when it comes to providing revealing or definitive information about the causes of collisions. Safety researcher Sean Kane told C/D that they capture limited, preselected streams of information, often at sampling rates that render the data less valuable. And data streams from different parameters are not synchronized with one another.
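To see why unsynchronized channels sampled at different rates complicate reconstruction, here is a minimal sketch (the channel names, timestamps, and values are invented for illustration, not drawn from any real EDR) that aligns two independently clocked streams on a common timeline by linear interpolation:

```python
def interpolate(samples, t):
    """Linearly interpolate a channel of (time, value) pairs at time t."""
    for (t0, v0), (t1, v1) in zip(samples, samples[1:]):
        if t0 <= t <= t1:
            return v0 + (v1 - v0) * (t - t0) / (t1 - t0)
    raise ValueError("t is outside the recorded range")

# Two channels recorded on separate, unsynchronized clocks (made-up values).
speed_mph = [(0.0, 60.0), (0.5, 58.0), (1.0, 40.0)]  # sampled every 0.5 s
brake_pct = [(0.2, 0.0), (1.2, 90.0)]                # sampled every 1.0 s

# Resampling both onto shared timestamps makes the channels comparable,
# but values between the sparse brake samples are only estimates.
for t in (0.4, 0.6, 0.8):
    print(f"t={t}s  speed={interpolate(speed_mph, t):.1f} mph"
          f"  brake={interpolate(brake_pct, t):.1f}%")
```

The coarser a channel's sampling rate, the more of its between-sample behavior is guesswork, which is one reason the data is less definitive than it sounds.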

“There’s this myth that an EDR will answer questions about who’s at fault and what the problems are with the vehicle, and that’s not the case,” he said. Furthermore, EDR data may not always be interpreted the same way by different parties, which heightens the need for someone other than manufacturers to have access when their vehicles are involved in crashes. Years ago, Kane investigated Toyota vehicles involved in alleged unintended-acceleration incidents, and he said, “I can tell you, what Toyota interprets an EDR to say and what we do are very different.”

Beyond the questionable usefulness of what’s stored on EDRs, a bigger problem might be what’s not stored on them.

The 2015 law that underscores car owners’ rights to their data only applies to EDRs, and EDRs only capture a narrow slice of the overall available data on cars, whether automated or more traditional vehicles. Data stored in other modules or control units is not necessarily owned by or accessible to car owners. Case in point: Tesla introduced EDRs on its Models S, X, and 3 earlier this year, and it has created a retrieval program for customers who want to examine that data. But Autopilot data logs are part of neither the EDR nor the retrieval program.


“It’s obvious they are capturing an extraordinary amount of data, and that information is not flowing back to the consumer or owner of the vehicle,” Kane said. “The issue is really problematic. When you are stitching together information for an investigation and take a big part of that away, you can’t always get to the bottom of the situation and get an answer without it, especially when it comes down to a car malfunctioning.”

In the cases of the Autopilot crash in Mountain View, California, and the Uber death in Tempe, Arizona, it’s too early to say how the systems performed and how the interplay of humans and machines may have contributed to the crashes; NTSB investigations are ongoing.

But early crashes in the autonomous era and their subsequent investigations have already underscored the fact that manufacturers hold heightened power while both government and independent investigators, along with car owners, find themselves in uncertain territory. Potentially valuable data may exist in abundance, but the technical and regulatory means to understand it are very much undefined frontiers.