Who Should Bear the Risk When Self-Driving Vehicles Crash?

Journal of Applied Philosophy 38 (4):630-645 (2020)

Abstract

The moral importance of liability to harm has so far been ignored in the lively debate about what self-driving vehicles should be programmed to do when an accident is inevitable. But liability matters a great deal to the just distribution of the risk of harm. While morality sometimes requires simply minimizing relevant harms, this is not so when one party is liable to harm in virtue of voluntarily engaging in an activity that foreseeably creates a risky situation, while having reasonable alternatives. On plausible assumptions, merely choosing to use a self-driving vehicle typically gives rise to a degree of liability, so that such vehicles should be programmed to shift the risk from bystanders to users, other things being equal. Insofar as vehicles cannot be programmed to take into account all the factors affecting liability, there is a pro tanto moral reason not to introduce them, or to restrict their use.

Similar books and articles

Light Trucks, Road Safety and the Environment.Nicholas Dixon - 2002 - Philosophy in the Contemporary World 9 (2):59-67.
Liability and risk.David McCarthy - 1996 - Philosophy and Public Affairs 25 (3):238-262.
Irresponsibilities, inequalities and injustice for autonomous vehicles.Hin-Yan Liu - 2017 - Ethics and Information Technology 19 (3):193-207.

Analytics

Added to PP
2020-03-03

Downloads
1,531 (#6,651)

6 months
298 (#6,855)


Author's Profile

Antti Kauppinen
University of Helsinki

References found in this work

Self-defense.Judith Jarvis Thomson - 1991 - Philosophy and Public Affairs 20 (4):283-310.
The basis of moral liability to defensive killing.Jeff McMahan - 2005 - Philosophical Issues 15 (1):386–405.
Killing in self‐defense.Jonathan Quong - 2009 - Ethics 119 (3):507-537.
