Uber, Distracted Backup Driver Cited By NTSB In Fatal Self-Driving Crash

A U.S. safety agency on Tuesday faulted Uber's inadequate attention to safety and the decisions the company made in developing its autonomous vehicles, and also cited the vehicle's distracted back-up driver, in its investigation into the first-ever death involving a self-driving vehicle. The National Transportation Safety Board said state and federal regulators need to do more to safeguard drivers, noting a "lack of federal safety standards" for automated driving systems. The March 2018 crash killed 49-year-old Elaine Herzberg as she was walking a bicycle across a street at night in Tempe, Arizona, and prompted significant safety concerns about the nascent self-driving car industry.

“The collision was the last link of a long chain of actions and decisions made by an organization that unfortunately did not make safety the top priority,” NTSB Chairman Robert Sumwalt said. The board criticized a series of decisions by Uber that it said were the result of “ineffective safety culture” at the time.

The NTSB voted 3-0 that the probable cause was the failure of the back-up safety driver to monitor the driving environment "because she was visually distracted throughout the trip by her personal cell phone." She was behind the wheel and was supposed to act in the event of an emergency. In March, prosecutors in Arizona said Uber was not criminally liable in the self-driving crash. Police, who have said the crash was "entirely avoidable" and that the operator was watching the TV program "The Voice" at the time, are still investigating.

Nat Beuse, head of safety for the autonomous vehicle efforts of ride-sharing company Uber, said the company has made significant improvements and remains "committed to improving the safety of our self-driving program."

Uber made a series of development decisions that contributed to the cause of the crash, the NTSB said. The vehicle's software did not properly identify Herzberg as a pedestrian, while the company did not adequately assess safety risks and did not address "operators' automation complacency." Uber also deactivated the Volvo XC90 test vehicle's automatic emergency braking system and precluded the use of immediate emergency braking, relying instead on the back-up driver.

Volvo found that the crash would have been avoided in 17 of 20 simulation tests had its own automatic emergency braking system remained active, the NTSB said. The board also cited the pedestrian's crossing outside a crosswalk and Arizona's insufficient oversight of autonomous vehicle testing. The NTSB urged the National Highway Traffic Safety Administration (NHTSA) to require entities testing self-driving vehicles to submit safety self-assessment reports to the agency, and to determine whether those plans include appropriate safeguards. It said states should also do more to oversee the vehicles.

The NHTSA said it would carefully review the recommendations, adding, “It’s important for the public to note that all vehicles on the road today require a fully attentive operator at all times.” The NHTSA is also probing the Uber crash.

The board said few companies actually submit the voluntary assessments, and some that do offer little useful information. NTSB board member Jennifer Homendy said the NHTSA was failing to properly regulate automated vehicles. "In my opinion they've put technology advancement here before saving lives," Homendy said.

Sumwalt is set to tell a U.S. Senate panel on Wednesday that, while Uber has made significant improvements, he has broader concerns about the industry. "We remain concerned regarding the safety culture of the numerous other developers who are conducting similar testing," he said in testimony seen by Reuters.

In the aftermath of the crash, Uber suspended all testing of self-driving vehicles. It resumed testing last December in Pennsylvania with revised software and significant new restrictions and safeguards.

Some critics have questioned the focus on a single pedestrian death at a time when the number of U.S. pedestrians killed in vehicle crashes hit a 30-year high of almost 6,300 in 2018.

The board is also investigating several crashes involving Tesla Inc's driver assistance system, Autopilot, including a fatal crash in California in March 2018. The NTSB last year revoked Tesla's party status in that investigation after the agency clashed with Tesla and the company's chief executive, Elon Musk, abruptly ended a call.

Sumwalt praised Uber’s cooperation. “I did notice that when I talked to their CEO he did not hang up on me,” he said. “It would be easy just to thumb it off. Blow it off. Say, NTSB, they’re wrong, they’re bad, and hang up on us. But Uber has not done that.”

Tesla did not immediately comment.
