The first pedestrian fatality involving a self-driving car occurred on March 18, 2018 in Arizona, when an Uber test vehicle struck a woman walking her bicycle across the street at night. The National Transportation Safety Board (NTSB) reported that Uber’s test vehicles had been involved in 37 crashes in the preceding 18 months; in 33 of those, another vehicle struck the Uber test car. Following the fatal incident, Uber halted all testing to improve the software and add new safety precautions, resuming in December of that year.
A report by the NTSB explained that the car had several systems for monitoring and interpreting its environment, including lidar, radar, and cameras. A human operator rode in the vehicle as a safety measure, monitoring both the vehicle’s operation and the environment; the operator was in turn monitored by systems in the car. The sensors were generally recalibrated about every six months; on the vehicle involved in the crash, they had been calibrated five days earlier.
The pedestrian who died was jaywalking at the time, and the car’s self-driving system was not prepared to handle a jaywalker. The NTSB report described the methods the self-driving car used to understand and react to objects around it, including assigning each object a predicted trajectory. The report stated that in the old software “pedestrians outside a vicinity of a crosswalk are not assigned an explicit goal.” When the system detected an object’s trajectory as intersecting the vehicle’s path, it could either modify its own path or engage hazard avoidance.
The test vehicles were equipped with a developmental system, and the primary planned method for dealing with emergency situations like the one in the fatal crash was intervention by the human operator. If the human did nothing for 60 seconds and the hazard remained, the car would attempt to avoid the collision itself and alert the human operator if it predicted a crash. This design was intended to prevent the vehicle from reacting to hazards that did not exist, which could cause extreme maneuvers. Later updates changed this aspect of the self-driving system.
Before the NTSB report was released, prosecutors in Arizona had already determined that Uber was not criminally liable for the crash. The human operator could take over control of the vehicle at any point by braking, steering, accelerating, or pressing a disengagement knob. Police reported that the crash was avoidable and that the human operator had been watching a TV program when it occurred.
The emerging self-driving car industry has been working toward commercialization. The NTSB is set to meet on November 19 to discuss the cause of the crash and safety regulations for self-driving cars going forward. Ethan Douglas, senior policy analyst for Consumer Reports, told The Associated Press that “without mandatory standards for self-driving cars, there will always be companies out there that skimp on safety.” The NTSB told Uber that the investigation revealed several safety issues. A spokeswoman for Uber’s autonomous car division said that they “deeply value the thoroughness of the NTSB’s investigation into the crash and look forward to reviewing their recommendations.” Once the NTSB issues its recommendations, it will be up to federal and state lawmakers, as well as the National Highway Traffic Safety Administration, to regulate the industry.