Study: Autonomous Programming Key to Crash Reduction
editor@aashto.org | June 12, 2020
A new study by the Insurance Institute for Highway Safety indicates autonomous vehicles might prevent only around a third of all crashes if automated systems drive too much like people.
“It’s likely that fully self-driving cars will eventually identify hazards better than people, but we found that this alone would not prevent the bulk of crashes,” explained Jessica Cicchino, IIHS vice president for research and a co-author of the study, in a statement.
She noted that “conventional thinking” postulates that self-driving vehicles could one day make crashes a thing of the past. But that reality is not so simple, according to a National Motor Vehicle Crash Causation Survey conducted by the National Highway Traffic Safety Administration. Of the more than 5,000 police-reported crashes analyzed by that NHTSA study, driver error was the “final failure” in the chain of events leading to more than nine out of 10 crashes.
“Building self-driving cars that drive as well as people do is a big challenge in itself,” added Alexandra Mueller, IIHS research scientist and lead author of the study. “But they’d actually need to be better than that to deliver on the promises we’ve all heard.”
As a result, the IIHS study suggests that only about a third of those crashes were the result of mistakes that automated vehicles would be expected to avoid simply because they have more accurate perception than human drivers and aren’t vulnerable to incapacitation.
To avoid the other two-thirds, they would need to be specifically programmed to prioritize safety over speed and convenience.
To that end, the IIHS research team reviewed the case files from that NHTSA study and separated the driver-related factors that contributed to the crashes into five categories:
- “Sensing and perceiving” errors included things like driver distraction, impeded visibility and failing to recognize hazards before it was too late.
- “Predicting” errors occurred when drivers misjudged a gap in traffic, incorrectly estimated how fast another vehicle was going or made an incorrect assumption about what another road user was going to do.
- “Planning and deciding” errors included driving too fast or too slow for the road conditions, driving aggressively or leaving too little following distance from the vehicle ahead.
- “Execution and performance” errors included inadequate or incorrect evasive maneuvers, overcompensation and other mistakes in controlling the vehicle.
- “Incapacitation” involved impairment due to alcohol or drug use, medical problems or falling asleep at the wheel.
The researchers also determined that some crashes were unavoidable, such as those caused by a vehicle failure like a blowout or broken axle.
For the study’s purposes, IIHS said its researchers imagined a future in which all the vehicles on the road are self-driving, and assumed those vehicles would prevent the crashes that were caused exclusively by perception errors or involved an incapacitated driver.
That’s because the cameras and sensors of fully autonomous vehicles could be expected to monitor the roadway and identify potential hazards better than a human driver can, while the vehicles themselves would be incapable of distraction or incapacitation, the organization added.
IIHS noted in its study that crashes due solely to sensing and perceiving errors accounted for 24 percent of the total, and incapacitation accounted for 10 percent.
Those crashes might be avoided if all vehicles on the road were self-driving — though it would require sensors that worked perfectly and systems that never malfunctioned. The remaining two-thirds might still occur unless autonomous vehicles are also specifically programmed to avoid other types of predicting, decision-making and performance errors.
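The arithmetic behind these figures can be checked directly from the shares reported in the article: the two categories assumed to be automatically avoided sum to roughly one third, leaving about two-thirds of crashes that would require safety-first programming. A minimal tally, using only the percentages the article cites:

```python
# Crash-cause shares from the NHTSA sample as reported in the article.
# Only the first two categories are assumed to be avoided automatically
# by perfect perception and immunity to incapacitation.
shares = {
    "sensing_and_perceiving": 0.24,
    "incapacitation": 0.10,
    "planning_and_deciding": 0.40,  # speeding, illegal maneuvers, etc.
}

avoided = shares["sensing_and_perceiving"] + shares["incapacitation"]
remaining = 1.0 - avoided

print(f"avoided by perception alone: {avoided:.0%}")  # roughly one third
print(f"still possible without safety-first programming: {remaining:.0%}")
```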
Planning and deciding errors, such as speeding and illegal maneuvers, were contributing factors in about 40 percent of crashes in the study sample.
The fact that deliberate decisions made by drivers can lead to crashes indicates that rider preferences might sometimes conflict with the safety priorities of autonomous vehicles.
For self-driving vehicles to live up to their promise of eliminating most crashes, they will have to be designed to focus on safety rather than rider preference when those two are at odds.
“Our analysis shows that it will be crucial for designers to prioritize safety over rider preferences if autonomous vehicles are to live up to their promise to be safer than human drivers,” Mueller noted.