
Google's AI System Is the 'Driver' in Self-Driving Cars, NHTSA Rules

The agency's decision to interpret "driver" in Google's self-driving cars as referring to the vehicles' Self-Driving System is an important milestone in Google's efforts to deliver fully autonomous vehicles.

The National Highway Traffic Safety Administration has granted a recent Google request that the artificial intelligence system behind its self-driving vehicles be considered the "driver" of the vehicle under federal automobile safety laws.

"NHTSA will interpret 'driver' in the context of Google's described motor vehicle design as referring to the [Self-Driving System], and not to any of the vehicle occupants," the agency said in response to a letter from Google last November asking for clarification on the topic.

"We agree with Google its [self-driving vehicles] will not have a 'driver' in the traditional sense that vehicles have had drivers during the last more than one hundred years," the NHTSA said.

The NHTSA's interpretation is an important milestone in Google's efforts to deliver fully autonomous vehicles that will be capable of driving on public roads entirely without any human interaction. Google is one of several companies that are working on delivering such vehicles over the next few years and are grappling with how to get the vehicles to comply with federal and state automobile safety requirements that were drafted with human drivers in mind.


In its letter to the NHTSA last November, and in another this January, Chris Urmson, the director of Google's self-driving car project, asked how the agency would interpret certain provisions of the Federal Motor Vehicle Safety Standards (FMVSS) in the context of self-driving vehicles.

The company's biggest question was how it could certify that its Self-Driving System (SDS) artificial intelligence technology complies with the federal motor vehicle safety standards.

Google's plans call for a vehicle that would be completely operated by the SDS technology, the NHTSA said in its lengthy response to Google's letter posted this week on its Website.

The vehicles would lack the standard controls available to human operators, such as a steering wheel, brake pedal, accelerator pedal and many other controls commonly found in current-generation automobiles. In asking for the SDS to be considered the "driver" under federal law, Google had expressed concern that providing controls that would allow a human to intervene in the vehicle's operations would in fact be detrimental to safety, the NHTSA noted.

The NHTSA's response highlighted how self-driving vehicle technologies are raising several "novel issues" for federal automobile safety regulators. The standards were developed at a time when it was assumed that all motor vehicles would have a steering wheel, accelerator and brake pedal operated by a human driver. In drafting safety rules, it was also assumed that the driver's location would almost always be in the front left of the vehicle, the NHTSA said.

The advent of self-driving vehicles from Google and others challenges those assumptions and will require modifications to auto safety rules pertaining to myriad issues, the agency noted. For instance, federal requirements for turn signals and brakes and braking systems have always assumed that a human would be in full control of the vehicle.

Not only will existing rules need to be modified; new procedures will also need to be put in place for testing the safety measures in autonomous vehicles and for ensuring compliance with federal safety standards, the NHTSA said. "As self-driving technology moves beyond what was envisioned at the time when standards were issued, NHTSA may not be able to use the same kinds of test procedures for determining compliance."

Given Google's proposed autonomous vehicle design, no human occupant would meet the federal definition of "driver," the NHTSA said. "Even if it were possible for a human occupant to determine the location of Google's steering control system, and sit immediately behind it, that human occupant would not be capable of actually driving the vehicle as described by Google." Because it is the SDS that is in complete control, "it is more reasonable to identify the 'driver' as whatever (as opposed to whoever) is doing the driving."

The NHTSA's interpretation could come as some relief for Google, which has been trying to get needed safety approvals for its self-driving vehicles in several states. California, which is where Google has been doing most of the testing to date, recently made clear it would not permit tests of fully autonomous vehicles without a licensed human driver in the vehicle capable of taking control as needed. It will be interesting to see how—or whether—the NHTSA's ruling will impact the state's stance on the topic.

Richard Windsor, an analyst at Edison Investment Research, said in a research note that the NHTSA's ruling represents a small step forward for the autonomous car industry, but that numerous other issues need to be resolved before such vehicles can be put on public roads. One of them is liability.

"Liability is the biggest problem that faces autonomous driving as sending an algorithm to prison is not a practical option. When an autonomous vehicle crashes—and they will—the question arises as to who is responsible for the crash," Windsor wrote.