In a previous post, I wrote about a course (which I taught together with Stephan Kornmesser in the summer term of 2024) for master’s students who had no previous contact with X-Phi at all. After learning some methodological and statistical basics and conducting their own small replication of Knobe (2003), they had the opportunity to develop their own questions and conduct their very own studies in small groups. Below, Frederike Lüttich and Jule Rüterbories present some results from their study on the perception of responsibility in accidents involving autonomous and human-controlled vehicles.
The Perception of Responsibility in Accidents Involving Autonomous and Human-Controlled Vehicles
Frederike Lüttich and Jule Rüterbories
The relevance of autonomous systems as potential moral agents is growing with their use in areas such as medicine, the military, and traffic, where they have – or will have – to make decisions in ethical contexts. The capacity of such systems to act has far-reaching legal and ethical implications. A frequently discussed example (see, e.g., Goodall 2014, Awad et al. 2018, Cecchini, Brantley, and Dubljević 2023) is this one: Although autonomous vehicles promise greater safety, they are not flawless. In the event of unavoidable accidents, they have to make decisions about which lives to protect. The programming of such systems is complex and raises key ethical questions. Below, we examine the perception of responsibility in accidents involving autonomous and human-controlled vehicles.
To investigate this, we created an online questionnaire in which we presented a vignette about a car and a pedestrian at a traffic light. Between subjects, we varied (a) whether the car was autonomous or human-driven, (b) whether it hit the pedestrian or swerved and crashed into a wall (the outcome is fatal either for the pedestrian or for the driver), and (c) whether the pedestrian rightfully used a crosswalk or illegally crossed at a red light. Crossing factors (b) and (c) yields the four cells shown in Table 1; each cell was presented with both an autonomous and a human-driven car, resulting in eight conditions in total.
| Behavior of Pedestrian \ Car | Hits Pedestrian | Hits Wall |
| --- | --- | --- |
| Legally Uses Crosswalk | 1 | 2 |
| Illegally Crosses Red Light | 3 | 4 |
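The factorial structure of the design can be made explicit in a few lines. The following sketch simply enumerates the eight vignette conditions; the factor labels are paraphrased from the text above, not taken from the original (German) questionnaire:

```python
# Enumerate the 2x2x2 between-subjects design described in the post.
# Factor labels are paraphrased, not the original questionnaire wording.
from itertools import product

vehicle = ["autonomous", "human-driven"]
outcome = ["hits pedestrian", "hits wall"]
pedestrian = ["legally uses crosswalk", "illegally crosses on red"]

conditions = list(product(vehicle, outcome, pedestrian))
for i, cond in enumerate(conditions, start=1):
    print(i, cond)

print(len(conditions))  # eight vignette variants
```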
Here is a translation of the vignette for variation 3 (illegal crossing, pedestrian hit) with a self-driving car:
Imagine standing on a foggy main road and observing the following scenario: A self-driving car is driving at approximately 50 km/h towards a traffic light, where a woman is illegally crossing on red. The car’s sensors notice the woman too late, and it is unable to brake in time. The self-driving car could swerve; in doing so, it would certainly hit a house wall and be completely destroyed. The self-driving car does not swerve and hits the woman. The woman dies.
After reading the vignette, participants were asked to answer the following yes-or-no question: “Is the self-driving car [the person driving] morally responsible?” At the end of the survey, and after passing an attention check, participants provided socio-demographic data, including gender, age, and level of education.
A total of 420 participants passed the attention check and completed the survey: 209 women, 210 men, and one non-binary person. Their ages ranged from 18 to 74 years, averaging 52. By their own statements, two had no school-leaving qualification, 195 had a lower secondary school leaving certificate, 95 had a technical college or university entrance qualification, 113 had a university degree, seven had a doctorate, and eight were currently studying.
Let us compare cases with (a) autonomously or human-driven cars, (b) the pedestrian or the wall being hit, and (c) the pedestrian (legally) using a crosswalk or (illegally) crossing a red traffic light.
Regarding (a), 56% of participants do not attribute responsibility to the autonomous vehicle, while 42% consider the human driver not to be responsible (χ² ≈ 7.942, p < 0.01); see Figure 1.
Regarding (b), if the pedestrian dies, 70% of participants say that the car or driver is responsible. If the driver dies, the attribution of responsibility drops to 34% (χ² ≈ 46.662, p < 0.001); see Figure 2.
And finally, regarding (c), in scenarios where the pedestrian illegally crosses the road at a red light, 54% do not think the car or driver is responsible. If the pedestrian legally uses a crosswalk, this drops to 43% (χ² ≈ 5.002, p < 0.05); see Figure 3.
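The analysis itself was done in Stata (see the Data section below); purely as an illustration, a chi-square test like the one reported for comparison (a) can be sketched in Python. The cell counts below are hypothetical, reconstructed from the reported percentages under the assumption of roughly equal group sizes (about 210 per condition); the exact counts are in the linked repository, so the resulting statistic only approximates the reported χ² ≈ 7.942:

```python
# Sketch of the 2x2 chi-square test for comparison (a): vehicle type
# vs. responsibility attribution. Cell counts are HYPOTHETICAL,
# back-computed from the reported percentages (44% of ~210 participants
# judged the autonomous car responsible, 58% the human driver).
from scipy.stats import chi2_contingency

#                responsible  not responsible
table = [[ 92, 118],   # autonomous vehicle
         [122,  88]]   # human driver

chi2, p, dof, expected = chi2_contingency(table)
print(f"chi2({dof}) = {chi2:.3f}, p = {p:.4f}")
```

With these reconstructed counts the test comes out significant at the 1% level, matching the pattern reported above.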
The attribution of responsibility is complex and highly dependent on the situation. The results show that responsibility is attributed more often to human-controlled vehicles than autonomous ones. Factors such as compliance with traffic regulations and the person affected by the crash further influence this.
To gain more detailed insights in the future, open-ended questions and alternative scenarios would be useful. Analyzing the demographic data might reveal additional differences across age, gender, and education. The study was limited to German participants, so possible cultural differences were not considered. Also, a basic understanding of machine ethics and levels of automation is essential to fully grasp the ethical and technical challenges of autonomous vehicles. Further studies should explore these aspects in more depth.
Data
Data and do files for analysis with Stata are available from https://github.com/alephmembeth/course-x-phi-2024/tree/main/autonomous%20systems.
Literature
Awad, Edmond, Sohan Dsouza, Richard Kim, Jonathan Schulz, Joseph Henrich, Azim Shariff, Jean-François Bonnefon, and Iyad Rahwan (2018): “The Moral Machine Experiment,” Nature 563, 59–64.
Cecchini, Dario, Sean Brantley, and Veljko Dubljević (2023): “Moral Judgment in Realistic Traffic Scenarios. Moving Beyond the Trolley Paradigm for Ethics of Autonomous Vehicles,” AI & Society.
Gogoll, Jan, and Julian Müller (2016): “Autonomous Cars. In Favor of a Mandatory Ethics Setting,” Science and Engineering Ethics 23 (3), 681–700.
Knobe, Joshua (2003): “Intentional Action and Side Effects in Ordinary Language,” Analysis 63 (3), 190–194.