NTSB: Uber self-driving SUV saw pedestrian, did not brake
The autonomous Uber SUV that struck and killed an Arizona pedestrian in March spotted the woman about six seconds before hitting her, but did not stop because the system used to automatically apply brakes in potentially dangerous situations had been disabled, according to federal investigators.
In a preliminary report on the crash, the National Transportation Safety Board said Thursday that emergency braking is not enabled while Uber’s cars are under computer control “to reduce the potential for erratic vehicle behavior.”
Instead, Uber relies on a human backup driver to intervene. The system, however, is not designed to alert the driver.
The findings, which are not final, should be a warning to all companies testing autonomous vehicles: they need to check their systems to make sure the vehicles stop automatically when necessary in the environments where they are tested, said Alain Kornhauser, faculty chairman of autonomous vehicle engineering at Princeton University.
Uber, he said, likely determined in testing that its system braked in situations it shouldn't have, possibly for overpasses, signs and trees. "It got spoofed too often," Kornhauser said. "Instead of fixing the spoofing, they turned it off."