In some ways, the 2018 accident in which an Uber autonomous vehicle left a woman dead is ancient news. Uber doesn’t even have an autonomous vehicle division now, and the autonomous vehicle world has largely moved on. In a world of 24-hour news cycles and rapidly moving technology, it may as well have happened in 1918. But, unlike the media and autonomous vehicle technology, the legal system still moves at a relatively glacial pace, which means the unfinished business of that accident can still have profound consequences for the industry.
Uber should be facing criminal charges in this case, not Rafaela Vazquez.
Read @m_c_elish‘s #MoralCrumpleZones paper to find out why this case is one of the most important cases in emerging technology, and why Vazquez must win. https://t.co/GxXejWAx20 https://t.co/smfxeM4uK1
— E.W. Niedermeyer (@Tweetermeyer) June 3, 2022
A Complicated Case
As the case wound its way through the legal system, there have been many twists and turns. First, local authorities in Maricopa County, Arizona, said they couldn’t prosecute the case due to potential conflicts of interest, because they had worked with Uber in the past. Neighboring Yavapai County investigated and, using information from Uber, determined that Uber was not at fault. With Uber absolved, the conflict of interest was moot, which allowed the case to go back to Maricopa County.
Once the case was back in Maricopa County, the now-absolved Uber gave prosecutors a lot of information, which led to the indictment of Rafaela Vasquez, the test driver who was in the vehicle at the time of the accident. The public and investigators were led to believe that Vasquez was watching reality TV at the time of the accident instead of performing her duties, and criminal charges followed.
But, Vasquez and her lawyers claim that Uber misrepresented the video footage. She says she was only listening to the TV program, which was one featuring music. And when it looked like she was viewing a smartphone, she was really viewing an Uber-supplied screen used for communications with coworkers — meaning the distraction was caused by the company’s own equipment, not her entertainment.
As often happens in criminal cases, prosecutors offered a plea deal. Not only is this cheaper than going to court, but in this case it could also be prosecutors trying to get out of what’s surely going to be a really nasty trial. Their over-reliance on what now appears to be incorrect information from Uber alone could make for a rather embarrassing situation in court, one where Vasquez is not only acquitted, but authorities are left looking terrible for not going after Uber.
Of course, they didn’t go after Uber because another county’s prosecutors said that wasn’t necessary, when in reality those other prosecutors may have been wrong. Yavapai County may have felt uncomfortable holding that hot potato, and wanted to pass it back to Phoenix as quickly as possible. This leaves Maricopa’s prosecutors in a catch-22: they can’t easily refocus the investigation on Uber, even though it’s becoming clearer that they should have.
“Moral Crumple Zones” Are A Wider Issue
Another important point that my friend Edward Niedermeyer brings up is that “moral crumple zones” are at stake here. The term, coined by researcher Madeleine Clare Elish, refers to the corporate practice of putting a test driver in the vehicle not so much to supervise it as to be the fall guy when things go wrong. It’s called a “crumple zone” because when something goes morally wrong, the driver ends up absorbing the blame for the complex system’s failure, just as a real crumple zone saves the occupants of a vehicle at the cost of the vehicle itself.
When there’s a complex system with no one person controlling all of it, you end up with at least one person who is liable for its operation but who has no real control over it. When we fault the person who had limited control instead of the designers of the system, that person serves as the designers’ crumple zone.
In this case, a complex system (not only the Uber test vehicle, but also a display for developers and test drivers to interact, inactive automatic emergency braking, training, rules, and many, many other involved factors) failed to protect human life, and Uber quickly figured out how to get prosecutors to blame the driver instead of the company.
The initial investigation did turn up some truths. Yes, the test driver had a reality TV show streaming to her phone. Yes, she was distracted by a display. But whether she was performing her duties as defined by her employer is central to whether she or the company was at fault, and the whole truth is required to know that.
If Maricopa County can get Vasquez to take a plea deal, moral crumple zones in AV testing will continue. If she goes to trial, they could become a thing of the past whether she is acquitted or not. And if she is acquitted, they’re almost certainly going away.
Alternatives Are Needed
In the long run, a fully autonomous vehicle can’t pass any of its liability to the occupants. If it’s driving itself and makes a mistake, the liability falls on whoever built the system and not on whoever is merely riding inside.
What gets complicated is what to do when the autonomous computer isn’t in full control. For something as simple as automatic braking, the liability still falls completely on the driver, because it was still their duty to stop the vehicle instead of letting it hit the back of the car in front of it while they watched Pornhub. The system was only there to catch as many mistakes as it could. But, as the amount of human control decreases and the amount of system control increases, we can’t rely on the same assumption of liability at 99% system control that we did at 1%. At some point, the liability either needs to mix or fall onto the system’s designers.
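To make the idea concrete, here’s a minimal sketch of what “mixing” liability might look like. This is purely illustrative — the thresholds, the linear interpolation, and the function itself are my own assumptions, not any real legal standard:

```python
def apportion_liability(system_control: float) -> tuple[float, float]:
    """Hypothetical split of fault between driver and system designer.

    system_control: fraction of driving control held by the automation,
    from 0.0 (a plain car) to 1.0 (fully autonomous).
    Returns (driver_share, designer_share), which sum to 1.0.
    """
    if not 0.0 <= system_control <= 1.0:
        raise ValueError("system_control must be between 0 and 1")
    # Below a low threshold (e.g. simple automatic braking), the driver
    # keeps full duty: the system only catches mistakes, it doesn't cause them.
    if system_control <= 0.1:
        return (1.0, 0.0)
    # At or near full autonomy, the designer keeps full liability:
    # an occupant with no meaningful controls can't be at fault.
    if system_control >= 0.9:
        return (0.0, 1.0)
    # In between, interpolate linearly: liability shifts toward the
    # designer as the human's real ability to intervene shrinks.
    designer = (system_control - 0.1) / 0.8
    return (1.0 - designer, designer)

# A system where the automation does about half the driving:
driver_share, designer_share = apportion_liability(0.5)
```

A linear ramp is the simplest possible choice; a real framework would also have to account for whether the human followed the operator’s policy, which is exactly what the Vasquez case turns on.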
How I Think We Should Mix It
The end result should probably look more like training liability for law enforcement. If a police officer is given a policy and training, and they stick with that policy and training, any resulting harm falls on those who came up with the policy and not on the guy who followed the rules (even if the rules were garbage).
In the case of Rafaela Vasquez, she may have been playing by Uber’s rules 100% at the time of the collision. If that’s the case, then it’s imperative that we put the liability on Uber instead of letting Uber use her as a fall girl.
In the case of test drivers who don’t follow manufacturer guidance, there’s still room for liability on the part of the driver who failed to follow policy, but only when the policy was possible to follow.
Featured image: screenshot from a federal government report detailing the investigation into the 2018 Uber crash. Public Domain.