A woman in Arizona was tragically killed earlier this year when she was hit by a self-driving Uber vehicle. Uber was testing the self-driving car when the crash happened, as the company had also been doing in Pennsylvania. Those tests have stopped, at least for the time being, while the fatal wreck is investigated.
Officials say the crash happened because the software was not properly calibrated. The system works by trying to identify every object that appears in front of the car and then assessing the risk each one poses. If the risk is high, the vehicle can take evasive action, such as reducing speed. That never happened in Arizona, and the woman was hit at roughly 38 miles per hour.
The software allegedly did see the woman. It simply did not classify her as a significant hazard, so it opted to ignore her. The system is designed so the car does not brake for every single thing that crosses the road; slamming on the brakes for a loose piece of trash blowing across the lanes would be more dangerous than ignoring it, so the system disregards what it judges to be minor issues.
Unfortunately, due to the calibration, the software mistakenly believed that a woman with a bike, crossing the road in the dark, was a minor issue. The results were catastrophic, and they raise many questions about just how safe self-driving cars can really be.
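For readers curious how a "classify, score, then decide against a threshold" system like the one described above can fail this way, here is a minimal sketch. All names, scores, and the threshold value are illustrative assumptions for this article, not details of Uber's actual software.

```python
# Toy illustration of threshold-based hazard handling in an autonomous
# vehicle. Everything here is a simplified assumption for explanation only.

def estimate_risk(obj):
    """Assign a toy risk score based on classification and position."""
    score = 0.0
    if obj.get("in_path"):
        score += 0.5
    # A correct classification as a person or vehicle raises the score;
    # a misclassification (e.g., as debris) leaves it low.
    if obj.get("classified_as") in ("pedestrian", "cyclist", "vehicle"):
        score += 0.5
    return score

def plan_action(obj, risk_threshold=0.7):
    """Brake only when the estimated risk crosses the threshold."""
    if estimate_risk(obj) >= risk_threshold:
        return "brake"   # high risk: take evasive action
    return "ignore"      # low risk: e.g., trash blowing across the lanes

# A pedestrian correctly classified and in the car's path triggers braking:
print(plan_action({"in_path": True, "classified_as": "pedestrian"}))

# But the same person misclassified as debris falls below the threshold
# and is ignored, which mirrors the failure described in this case:
print(plan_action({"in_path": True, "classified_as": "debris"}))
```

The point of the sketch is that the braking decision is only as good as the classification and calibration feeding it: if the risk score for a real pedestrian lands below the threshold, the car never acts.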
Many believe that these autonomous vehicles are the future of transportation, so it will be very important to keep an eye on their development and the risks that development may bring. Those who suffer injuries need to know their rights.
Source: Yahoo, "Software In Fatal Uber Crash Reportedly Recognized Woman, Then Ignored Her," Ryan Grenoble, May 07, 2018