January 12, 2017, Trial News

California halts Uber's driverless fleet

Diane M. Zhang

On Dec. 21, Uber announced that it would pull its driverless cars out of San Francisco after the California DMV revoked the registrations of all 16 vehicles. The experiment in the company's hometown lasted only a week: the smartphone-based ride-sharing company flouted state regulations when it rolled out its driverless service, defying state officials who warned that it did not have the necessary state permits for testing driverless cars and that its service was illegal.

Uber has pushed its robot car service over the last year, deploying test vehicles in Pittsburgh last September. The company considers its driverless vehicles a crucial element of its business: it hopes they will eventually replace its contract drivers. The program has been controversial, however, with observers citing reported instances of unsafe driving during its brief time in San Francisco—for example, the company's driverless vehicles drifted into bicycle lanes without warning, and one nearly caused a crash when it ran a red light.

Shortly after Uber rolled out its fleet of driverless cars in San Francisco, the California DMV sent a letter to Anthony Levandowski, who heads the program at Uber. The letter warned Uber that the DMV would seek injunctive relief unless the company immediately confirmed that it would halt its driverless fleet. Citing Uber’s failure to obtain the necessary state permit to test driverless technology, the DMV explained: “The permitting requirement serves the important public policy objectives of ensuring that those testing the vehicles have provided an adequate level of financial responsibility; have adequately trained qualified test drivers on the safe operation of the autonomous technology; and will notify the DMV when the vehicles have been involved in a collision and specify the instances when the technology had to be disengaged for safety reasons.”

California requires all manufacturers of driverless technology to apply to the state DMV and pay a $150 application fee before testing a vehicle in autonomous technology mode on public roads. Companies approved by the DMV must share accident data and "disengagement" information about when human drivers take over. Currently approved participants include companies such as Google and Tesla.

Uber, however, never applied for the permit and continued to operate its driverless fleet in San Francisco after receiving the letter, claiming that it did not need to comply with the state requirement because its cars are not autonomous as defined under the California regulation. Its driverless cars, the company argued, needed a human present to actively monitor the vehicle’s progress—and none of the vehicles in the fleet are currently capable of driving without active physical control or monitoring. In a statement from Levandowski, the company said, “The regulations apply to ‘autonomous vehicles.’ And autonomous vehicles are defined as cars equipped with technology that can ‘drive a vehicle without the active physical control or monitoring by a human operator.’ But the self-driving Ubers that we have in both San Francisco and Pittsburgh today are not capable of driving ‘without . . . active physical control or monitoring.’”

Uber has compared its driverless fleet to Tesla vehicles—electric cars with an autopilot feature—that are already sold to consumers. But attorney Adam Levitt of Chicago, whose practice focuses on consumer protection and products liability litigation, believes that there is a distinction between Uber’s robot cars and what Tesla makes—one that the National Highway Traffic Safety Administration (NHTSA) would recognize.

Along with many other regulatory bodies, NHTSA has adopted a taxonomy for automated vehicles that defines six levels of automation—and Tesla and Uber vehicles fall in different categories. In Level 2 vehicles, a human is ultimately responsible for monitoring the driving conditions; in Level 3 vehicles, a human driver does not need to monitor the situation but must be ready to take over if the car signals a request for driver intervention. “Think of a Level 2 vehicle as one in which you could comfortably take your hands off the wheel to make a phone call, and a Level 3 vehicle as one in which you could comfortably respond to a series of emails on your phone,” Levitt explained.

Although Tesla has an autopilot feature, a driver is expected to monitor the situation at all times—and the company has taken pains to communicate this to consumers. “For example, Tesla uses a warning system that cautions its drivers to keep their hands on the steering wheel and so is able to argue that the automation is more akin to an advanced cruise control system,” Levitt said. “The current versions of the Uber vehicles, however, more pointedly strive for a driverless experience. There is no warning system admonishing the driver to keep her hands on the wheel—rather, there is simply an electronic signal on the dashboard that ostensibly tells the driver when they should take control of the car.” The distinction between the level of human interaction required from a Tesla versus an Uber driverless vehicle reflects a similar distinction between Level 2 automation and Level 3 automation.

California Attorney General Kamala Harris followed up with a Dec. 16 letter to Uber, stating that unless the company pulled its robot cars off San Francisco roads, the state would seek legal action. On Dec. 21, less than a week after Uber introduced its driverless fleet in California, the state DMV revoked the registration of all 16 Uber vehicles, forcing the company to halt its service.

Although driverless vehicles may be imminent, Levitt emphasized that much work needs to be done. “Manufacturers will need to perfect the technology to a point where the vehicles almost entirely work without any fear of a dangerous malfunction,” he said. “However, the exact degree to which the technology is perfected—and the contours of that ‘perfection’—will directly dictate whether regulators will require a sentient or attentive driver in the vehicle at all times. We’re clearly not there yet, and speaking as a products liability attorney, for autonomous vehicle manufacturers to put revenue over public safety would be a deadly—and expensive—mistake for all involved.”