Who is liable when a self-driving car kills someone?



[Image: A Tesla car driving in Autopilot mode.]

When a self-driving car kills someone, who should be ethically and legally responsible for that death? Does the answer change if the vehicle is semi-autonomous rather than fully autonomous? Liability becomes a question of utilitarian calculation, weighing where accountability falls against its effects on overall road safety. If manufacturers are deemed liable, they will be less inclined to build autonomous vehicles; if drivers are liable, they will be less inclined to buy them. That said, recent lawsuits may shift opinion from manufacturer liability towards driver liability, particularly for semi-autonomous vehicles.

Putting responsibility on manufacturers and operators means more people will be inclined to use driverless vehicles, increasing overall road safety. Manufacturers will develop fully autonomous vehicles regardless, as each company wants to be the first to do so. Consumers can choose between traditional vehicles and autonomous ones, and shifting responsibility to manufacturers gives them another reason to choose autonomous: they will not be liable for the vehicle's mistakes. The majority of car accidents are currently caused by human error, which fully automated cars have the potential to eliminate. In this respect, manufacturer liability appears to have strong utilitarian support.

In 2017, the German national ethics commission for automated and connected driving presented twenty ethical guidelines for the creation and use of self-driving cars, two of which concerned liability for mistakes. Guidelines 10 and 11 state that when the driver cannot fully control the car in all situations, and is not required to do so, the driver is no longer accountable for the car's behaviour. Instead, the commission held, accountability shifts to the manufacturers and those operating the car's systems. While these guidelines currently apply only in Germany, they are likely to be highly influential in the self-driving car market. The thinking behind Guidelines 10 and 11 had in fact already been put into practice: Volvo's 2015 promise to accept full liability whenever its cars are in autonomous mode predates the German Ethics Code by two years. This signals to the developers of driverless cars that public opinion and moral intuition seemingly favour manufacturers being liable for damage caused by their technology.

Whether this was the right decision remains unclear, but shifting blame onto manufacturers and operators has the potential to make self-driving cars significantly safer. Those who build self-driving cars and their operating systems would do so knowing they will be liable for any mistakes, and their safety processes would have to be correspondingly more thorough. However, matters are not always this simple, owing to the distinction between fully and semi-autonomous vehicles. Tesla's recent courtroom victory in the lawsuit Justine Hsu filed in 2020 illustrates the problem of applying the ethics of fully autonomous vehicles to semi-autonomous ones. Hsu, the driver, was found at fault for overestimating the vehicle's 'Autopilot' mode and failing to follow the operating manual before the crash, which left her with nerve damage. Tesla was cleared of fault: the autonomous driving function worked as intended, as did the airbags. It was the driver who overestimated the abilities of the autonomous function, using it on roads it was not designed for.

The approximately forty-two reported deaths (as of writing) involving Tesla's Autopilot feature raise the issue of trust in both fully and semi-autonomous vehicles. Manufacturers seem to have planned for a lack of consumer trust in autonomous vehicles: rather than make the leap directly to full autonomy, they have chosen to integrate automated features gradually, acclimatising the public to life with autonomous vehicles. Incidents like Hsu's show the drawbacks of this choice. Ought manufacturers to be liable for deaths that result from it?

The question of moral and legal liability in autonomous vehicles has yet to be fully answered. Indeed, whether it even can be answered is up for debate; but if we are going to use autonomous vehicles, it will have to be, however difficult that may prove. Despite the German Ethics Code's suggested shift to manufacturer liability, cases so far have found the driver responsible for accidents. This creates a new kind of human error on the roads: overestimation of 'autopilot' functions in semi-autonomous vehicles. It seems that until we reach full vehicular automation, driver liability should be the norm in most cases. It will take time and effort to reach a conclusion. The question is, are we willing to put in that effort?




