New research from Rice University and Texas Tech University has found that drivers often fail to spot hazards missed by automated vehicles, and the problem only gets worse the longer they ride in them. The study is published in the journal Human Factors.

The researchers examined the behavior of 60 licensed drivers operating an automated car in a simulator. Participants were told that due to the automation, they would not need to operate the steering wheel, brake pedal or accelerator pedal. They were instructed to monitor the roadway for vehicles that were stopped dangerously at intersections and intruding into the driver’s lane, which constituted a hazard that automated vehicles could not detect. Participants also had to distinguish between vehicles that were safely stopped and those that were dangerously stopped at intersections.

The drivers’ accuracy dropped by between 7 and 21 percent over the 40-minute simulation. Even in the first 10 minutes, the success rate was at best close to 88 percent, suggesting that all drivers missed at least some hazards.

Pat DeLucia, a professor of psychological sciences at Rice and the study’s co-author, said that one possible explanation for the results is that people get used to cars doing the driving and become complacent. Coupled with previous research that indicated people are terrible at monitoring for hazards that only happen every once in a while, and that over time their ability to respond decreases, the new study “suggests that this phenomenon of difficulty monitoring effectively over time extends to monitoring an automated car,” DeLucia said.

The bottom line: until automated driving systems are completely reliable and can respond in all situations, the driver must stay alert and be prepared to take over. This research clearly shows that is not happening, and that vigilance degrades as time passes.

To put it another way, this study confirms what every other study on the subject has found: semi-autonomy that's reasonably reliable (but well short of the level needed to provide an increase in safety) guarantees inattentive operators. I can hear my ex-girlfriend with her degree in Human Factors Engineering saying the equivalent of "Well, duh."

Guy [I have lots of experience designing/selling off-grid AE systems, some using EVs but don't own one. Local trips are by foot, bike and/or rapid transit].

The 'best' is the enemy of 'good enough'. Copper shot, not Silver bullets.

Volvo Buses and Nanyang Technological University (NTU) in Singapore have demonstrated the world’s first 12-meter autonomous electric bus. The Volvo bus will soon begin trials on the NTU campus.

The 85 passenger Volvo 7900 Electric bus is equipped with sensors and navigation controls that are managed by a comprehensive artificial intelligence (AI) system. The AI system is protected with cybersecurity measures to prevent unwanted intrusions.

The Volvo bus has undergone preliminary rounds of testing at the Centre of Excellence for Testing and Research of Autonomous Vehicles (CETRAN).

Plans are in place to test the bus on the NTU campus and to extend the route beyond the university.

The fully autonomous electric bus provides a quiet operation with zero emissions. It requires 80% less energy than an equivalent-sized diesel bus.

This is Volvo’s first autonomous fully electric bus in public transportation. . . .

Related, also GCC:

Volvo Cars to limit all its cars to 180 km/h (112 mph); geo-fenced limits possible in future; proposals coming on intoxication and distraction

The company’s Vision 2020, which aims for no one to be killed or seriously injured in a new Volvo by 2020, is one of the most ambitious safety visions in the automotive industry. Since technology alone will not get it all the way to zero, Volvo Cars is now broadening its scope to include a focus on driver behavior. . . .

Apart from limiting top speeds, the company is also investigating how a combination of smart speed control and geo-fencing technology could automatically limit speeds around schools and hospitals in future.
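The mechanism Volvo describes amounts to a geofence lookup: compare the car's GPS position against zoned speed caps and enforce the lowest one that applies, falling back to the company's proposed 180 km/h ceiling elsewhere. A minimal sketch of the idea, assuming simple circular zones; the names, zone format, and defaults here are illustrative, not Volvo's actual design:

```python
from dataclasses import dataclass
from math import radians, sin, cos, asin, sqrt

@dataclass
class Zone:
    # Hypothetical circular geofence: center (degrees) plus radius in meters,
    # with the speed cap enforced inside that circle.
    lat: float
    lon: float
    radius_m: float
    cap_kph: float

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in meters between two lat/lon points."""
    R = 6371000.0  # mean Earth radius, meters
    dlat, dlon = radians(lat2 - lat1), radians(lon2 - lon1)
    a = sin(dlat / 2) ** 2 + cos(radians(lat1)) * cos(radians(lat2)) * sin(dlon / 2) ** 2
    return 2 * R * asin(sqrt(a))

def allowed_speed_kph(lat, lon, zones, hard_limit_kph=180.0):
    """Return the lowest speed cap applicable at the given position."""
    caps = [z.cap_kph for z in zones
            if haversine_m(lat, lon, z.lat, z.lon) <= z.radius_m]
    return min(caps, default=hard_limit_kph)

# Example: a 30 km/h school zone with a 250 m radius
school = Zone(lat=57.7089, lon=11.9746, radius_m=250.0, cap_kph=30.0)
print(allowed_speed_kph(57.7089, 11.9746, [school]))  # inside the zone -> 30.0
print(allowed_speed_kph(57.7300, 11.9746, [school]))  # well outside -> 180.0
```

A production system would of course need mapped zone polygons, GPS error margins, and graceful handling of signal loss, but the core check is this simple.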

We want to start a conversation about whether car makers have the right or maybe even an obligation to install technology in cars that changes their driver’s behavior, to tackle things like speeding, intoxication or distraction. We don’t have a firm answer to this question, but believe we should take leadership in the discussion and be a pioneer.

—Håkan Samuelsson

Above certain speeds, in-car safety technology and smart infrastructure design are no longer enough to avoid severe injuries and fatalities in the event of an accident. That is why speed limits are in place in most western countries; yet speeding remains ubiquitous and one of the most common reasons for fatalities in traffic. . . .

Besides speeding, the two other major gaps are intoxication and distraction.

Driving under the influence of alcohol or drugs is illegal in large parts of the world, yet it remains a prime reason for injuries and fatalities on today’s roads.

Drivers distracted by their mobile phones or otherwise not fully engaged in driving are another major cause of traffic fatalities. In many ways, they are just as dangerous as drunk drivers.

Volvo Cars will present ideas to tackle the problem areas of intoxication and distraction at a special safety event in Gothenburg, Sweden on 20 March.

Now if we could just get Tesla to prohibit A/P use where they know it doesn't work. But we'll probably have to do that through regulation, as they seem willing to keep risking the lives of their own customers and, more importantly, the lives of others. For what, so they can gather data?

Uber Technologies is not criminally liable in a March 2018 crash in Tempe, Arizona, in which one of the company's self-driving cars struck and killed a pedestrian, prosecutors said on Tuesday.

The Yavapai County Attorney said in a letter made public that there was "no basis for criminal liability" for Uber, but that the back-up driver, Rafaela Vasquez, should be referred to the Tempe police for additional investigation. . . .

Vasquez, the Uber back-up driver, could face charges of vehicular manslaughter, according to a police report in June. Vasquez has not previously commented and could not immediately be reached on Tuesday.

Based on a video taken inside the car, records collected from online entertainment streaming service Hulu and other evidence, police said last year that Vasquez was looking down and streaming an episode of the television show "The Voice" on a phone until about the time of the crash. The driver looked up a half-second before hitting Elaine Herzberg, 49, who died from her injuries.

Police called the incident "entirely avoidable."

The Yavapai County Attorney's Office, which examined the case at the request of Maricopa County, where the accident occurred, did not explain its reasoning for finding no criminal liability on Uber's part. Yavapai sent the case back to Maricopa, calling for further expert analysis of the video to determine what the driver should have seen that night. . . .

Optimus Ride . . . plans to deploy its self-driving systems at two sites: the Brooklyn Navy Yard, a 300-acre modern industrial park with over 400 manufacturing businesses and 9,000 people working on site; and Paradise Valley Estates, a private 80-acre, nonprofit Life Plan Community located in Fairfield, California.

Optimus Ride will provide residents and workers at both sites with access to efficient and convenient self-driving mobility within defined, geofenced areas.

This news comes weeks after Optimus Ride announced a partnership with Brookfield Properties to deploy self-driving vehicles at Brookfield’s Halley Rise development, located just outside of Washington, D.C. With these new programs, Optimus Ride will be operating deployments in four US states.

Slated to launch in the second quarter of 2019, Optimus Ride’s deployment at the Brooklyn Navy Yard will be the first commercial self-driving vehicle program in the state of New York. Optimus Ride will deploy self-driving vehicles on the Brooklyn Navy Yard’s private roads, providing a loop shuttle service to connect NYC Ferry passengers to Flushing Avenue outside the Yard’s perimeter.

Paradise Valley Estates will welcome the Optimus Ride vehicle system into its private, gated community this summer. During the initial phase of the program at Paradise Valley Estates, the primary service will be to provide prospective residents with self-driving tours of the community. Additionally, residents will be able to use Optimus Ride's reservation and on-demand ride services to travel to and from friends' homes, to the community/health center, and to outdoor activities within the property.

ZF Friedrichshafen AG has acquired a 60% share of 2getthere B.V. The company offers complete automated transport systems and is headquartered in Utrecht, Netherlands, with offices in San Francisco, Dubai and Singapore. Applications range from driverless electric transport systems at airports, business and theme parks to dedicated urban transport infrastructures. . . .

2getthere was founded in 1984 and has accumulated more than 100 million kilometers of autonomous mileage with driverless passenger and cargo transport systems in several major cities worldwide, including Rotterdam, Abu Dhabi and Singapore, as well as numerous ports and airports.

2getthere’s fully electric driverless systems at business parks in Rivium (Capelle aan den IJssel) and Masdar City (Abu Dhabi) have transported more than 14 million people reliably and safely. The reliability of the systems installed by 2getthere, including vehicle controls and software architecture, exceeds 99.7%. . . .

Tesla, its owners and fans, and its CEO Elon Musk crow a lot about the capabilities of its Autopilot self-driving system.

In independent comparisons, though, other autonomous driving systems keep coming out ahead. The latest is from Navigant Research, an automotive consulting firm, and the results aren't rosy for Tesla's Autopilot. It rated second from the bottom among a group of 20 companies working to develop self-driving systems.

One reason Autopilot didn't fare well here is that Navigant's study rates self-driving programs on things beyond the systems' performance. The study includes 10 criteria: the companies' self-driving visions; their technology itself; its capability, quality, and reliability; whether the companies have any partners and the quality of those partners; sales, marketing, and distribution strategies; marketing and production strategies; the product portfolio; and the companies' staying power.

The companies are ranked within those criteria using a proprietary matrix that Navigant holds close to the vest. "Companies developing suitable business models in conjunction with the core technology remain most likely to succeed in commercializing automated driving," Navigant said in a statement introducing the study. . . .

The top self-driving companies in Navigant's study take very different approaches from Tesla and are mainly working toward driverless taxi services: Alphabet's (Google's) Waymo and GM's Cruise, along with Ford Autonomous Vehicles, which Ford is developing without much fanfare. . . .

. . .
One reason Autopilot didn't fare well here is that Navigant's study rates self-driving programs on things beyond the systems' performance. The study includes 10 criteria: the companies' self-driving visions; their technology itself; its capability, quality, and reliability; whether the companies have any partners and the quality of those partners; sales, marketing, and distribution strategies; marketing and production strategies; the product portfolio; and the companies' staying power.
. . .

It sounds to me like Navigant gave us their opinion of Tesla rather than their opinion of Autopilot.

Partly the case, but that is significant. The article goes on to say:

"As of the end of 2018, all automated driving market players have one thing in common: no company is providing commercial services without a human safety operator onboard the vehicle when carrying passengers. As work has progressed on many fronts in validating automated driving technology, engineers and developers are realizing how many years away they are from ensuring the stability of the technology," Navigant said.

"Indeed, earlier versions of Autopilot have been blamed in a number of fatal crashes around the world when their drivers weren't paying attention, and in other crashes that resulted in serious injury. In a recent conference call, Musk acknowledged that when the company's Full Self Driving Capability comes online, which he expects to happen later this year, 'it won't be reliable at first,' and drivers 'will still have to pay attention.'"

So Tesla continues to use its customers as beta testers, the price of a testing failure being that they're injured or killed. As government regulators seem unwilling to stop this, I can only hope that lawsuits will hit the company hard enough to force it to stop.

Elon predicts sometime this year. Then there are the Tesla AP software OTAs; i.e., the software engineers haven't yet learned that no AP design change should be released to manufacturing without a full simulation of all system paths. It's probably the same group writing software for the MCU (media control unit) and updating the music channels for Spotify. Tesla owners selecting classical music but getting country is not a potential injury scenario after an OTA to the MCU. For Tesla, beta testing doesn't cost engineering time, and a few deaths here and there apparently don't bother the NHTSA.