Uber has discovered the reason why one of the test cars in its fledgling self-driving car fleet struck and killed a pedestrian earlier this year, according to The Information.
While the company believes the car’s suite of sensors spotted 49-year-old Elaine Herzberg as she crossed the road in front of the modified Volvo XC90 on March 18th, two sources tell the publication that the software was tuned in such a way that it “decided” it didn’t need to take evasive action, and possibly flagged the detection as a “false positive.”
A system would do this, according to the report, because there are many situations in which the computers powering an autonomous car might perceive something as a human or another obstacle when it isn’t, so the software uses a threshold to filter out these “false positives.”
Uber reportedly set that threshold so aggressively, though, that the system saw a person crossing the road with a bicycle and determined that immediate evasive action wasn’t necessary.
While Uber had an operator, or “safety driver,” in the car who was supposed to be able to take control in a failure like this, the employee was seen glancing down in the moments before the crash in footage released by the Tempe Police Department.
All of Uber’s self-driving testing efforts have been suspended since the accident, and the company is still working with the National Transportation Safety Board, which has yet to issue a preliminary report on the progress of its investigation. When reached for comment, a spokesperson for Uber issued the same statement to The Verge that appears in The Information’s story:
We’re actively cooperating with the NTSB in their investigation. Out of respect for that process and the trust we’ve built with NTSB, we can’t comment on the specifics of the incident. In the meantime, we have initiated a top-to-bottom safety review of our self-driving vehicles program, and we have brought on former NTSB Chair Christopher Hart to advise us on our overall safety culture. Our review is looking at everything from the safety of our system to our training processes for vehicle operators, and we hope to have more to say soon.
In the wake of the crash, signs have emerged that Uber’s self-driving program was potentially fraught with risk. For one thing, Uber had reduced the number of “safety drivers” in its test cars from two to one, according to a New York Times report. This explained why the driver who was in the car that killed Herzberg was alone.
Then in late March, Reuters discovered that Uber had reduced the number of LIDAR sensors on its test cars. (LIDAR is widely considered critical hardware for autonomous driving.)