There have been radical changes in the driving landscape over the last 10 years, from ridesharing to electric vehicles. Perhaps the most astonishing innovation, however, has been the creation of autonomous vehicles: cars that can help you drive. This is not your father's cruise control technology. This is automation of certain driving tasks such as braking, steering, and acceleration through advanced driver assistance systems (ADAS). The technology represents the gateway to greater automation of driving and could one day herald fully autonomous, self-driving cars.
The road to autonomous vehicles has not been without incidents, including injuries and fatalities from crashes. As of January 6, 2023, the California DMV reported that 546 autonomous vehicle collision reports had been filed in the state since 2014. Nationwide, from July 2021 through October 2022, the Department of Transportation reported 605 crashes involving autonomous vehicles. There have been more crashes per million miles traveled involving autonomous vehicles (9.1) than non-autonomous vehicles (4.1), but the vast majority of crashes involving autonomous vehicles have been minor, including any resulting injuries.
However, there have been at least 11 fatalities since 2016, including two accidents where the drivers, but not the car manufacturers, were charged. In 2018, a woman hired to test an Uber autonomous vehicle was behind the wheel when it hit and killed a pedestrian in Phoenix. She was charged with negligent homicide. While Uber was not held criminally culpable, it did settle with the victim's family. In 2019, a Tesla driver who had engaged Autopilot was charged with vehicular manslaughter after his car ran a red light, slammed into a Honda Civic at 74 miles per hour, and killed the Civic's two occupants. Tesla was not charged.
While so far neither Uber nor Tesla has been held criminally liable, the question remains: in a crash involving an autonomous vehicle, who is at fault, the car or the driver? This question is far more complex than it appears on the surface because it wades far out into uncharted legal waters.
For its part, Tesla has pushed back vigorously on suits alleging any type of fault on the part of its advanced driver assistance systems. Its first line of defense has been to point out that the driver is, and always will be, ultimately in control of (and implicitly responsible for) the car. This walks a fine line, however, given Tesla's justification for implementing advanced driver assistance technology in the first place: to give drivers more confidence behind the wheel while relieving them of what Tesla considers the more tedious parts of driving, namely acceleration, braking, and steering.
Part of the problem, however, is that people who use these automation systems tend to over-rely on them, to the point of treating the car as if it were fully automated. If the advanced driver assistance systems fail, a driver who is not paying attention will be unable to prevent the resulting crash or collision. The question of who is at fault, however, remains unanswered. For civil liability purposes at least, the answer probably lies somewhere in the middle.
For example, it is very possible that the car manufacturer who designed and installed the automation systems could be held at least partially liable if it failed to adequately warn the driver that the driver is always ultimately in charge of the vehicle, from the moment the ignition is turned on to the moment the car is parked and turned off again. We already see this in our cars today when we are warned not to become distracted by the in-dash computer or navigation system. Obviously, these warnings are useless if not heeded, but so far the warning alone has allowed car manufacturers to pass at least some of the risk of liability back onto the driver.
However, California also recognizes that liability may shift back to the car manufacturer depending on what the consumer and driver expect the car to be able to do. So, if the car manufacturer claims that its car will always recognize hazards in the road and stop, and the car fails to do so, the manufacturer would almost certainly be held liable for the resulting accident.
Where the driver could see the majority of the liability shift back onto them is when they expected the car to actually be able to drive itself and relied on that expectation even when they should have been paying attention to what was going on around them. This is more than just being complacent and letting the car drive. This is abdicating the responsibility to monitor the situation and react accordingly when there is an immediate threat.
In the case of the accident in Phoenix (the woman testing the Uber autonomous vehicle), data from the car showed that the human driver did not react to the person in the crosswalk until less than a second before the collision because she was not looking out of the front windshield. Whether she was over-relying on the car to see the pedestrian or was simply being complacent and not paying attention remains to be seen, and it could mean the difference between being found guilty of manslaughter and being absolved of criminal liability.
To the extent that car manufacturers are agreeing to settle for at least some amount of damages with the families of victims of autonomous car accidents, it would seem that they inherently recognize some level of liability. However, we have yet to see a civil case in which a judge or jury has imposed liability on a car manufacturer. In addition, neither of the criminal cases has been resolved, so we are still waiting to learn whether the drivers of these cars will be held criminally liable.
If you are in an accident with an autonomous vehicle, it is critical to contact a Los Angeles personal injury attorney who understands the myriad issues these cases raise, so you can preserve your rights and obtain the maximum compensatory recovery possible.