Picture Credit: Matwor29
A driverless car with sudden brake failure is heading towards a homeless person. The car could swerve and crash into a wall instead, killing its passenger – a businessman. What should the car do? According to a study published in Nature, your answer may depend on your culture.
The study involved an online survey distributed to 2.3 million people from 233 countries and territories. The computer programme behind the survey, named the “Moral Machine”, presented each participant with a series of generated scenarios that explored nine factors: humans vs. pets, action (swerving) vs. inaction (staying on course), passengers vs. pedestrians, more vs. fewer lives, men vs. women, young vs. elderly, legally-crossing pedestrians vs. jaywalkers, fit vs. less fit, and higher social status vs. lower social status. The car inevitably killed one group of people, and it was up to the participant whether the car would stay on course or swerve.
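The scenario-generation step described above can be sketched in code. The nine factors come from the article, but the sampling logic, function names, and data layout below are assumptions for illustration – this is not the Moral Machine's actual implementation:

```python
# Toy sketch of generating a Moral Machine-style dilemma.
# Each factor pits one side against the other; the generator assigns
# one side of each factor to the group killed by staying on course
# and the opposite side to the group killed by swerving.
import random

FACTORS = {
    "species": ("humans", "pets"),
    "relationship": ("passengers", "pedestrians"),
    "number": ("more lives", "fewer lives"),
    "gender": ("men", "women"),
    "age": ("young", "elderly"),
    "legality": ("legally-crossing pedestrians", "jaywalkers"),
    "fitness": ("fit", "less fit"),
    "status": ("higher social status", "lower social status"),
}

def generate_scenario(rng: random.Random) -> dict:
    """Randomly assign opposite sides of each factor to the two outcomes."""
    scenario = {}
    for factor, (side_a, side_b) in FACTORS.items():
        stay = rng.choice((side_a, side_b))
        swerve = side_b if stay == side_a else side_a
        scenario[factor] = {"stay on course": stay, "swerve": swerve}
    return scenario

rng = random.Random(0)  # fixed seed so the sketch is reproducible
scenario = generate_scenario(rng)
for factor, sides in scenario.items():
    print(f"{factor}: stay kills {sides['stay on course']}, "
          f"swerve kills {sides['swerve']}")
```

The participant's choice – stay on course or swerve – then reveals which group they would spare, and aggregating millions of such choices exposes the preferences the study measured. (The ninth factor, action vs. inaction, is the choice itself rather than a property of the groups, so it is not listed in the dictionary above.)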
Although these scenarios may seem like hypotheticals, autonomous cars will need to be programmed by humans with these kinds of moral rules in the near future.
One of the study's co-authors, Iyad Rahwan of the Massachusetts Institute of Technology, says: “People who think about machine ethics make it sound like you can come up with a perfect set of rules for robots, and what we show here with data is that there are no universal rules”.
The experiment produced some interesting results regarding different cultures around the world. Three principles seemed to hold everywhere: sparing humans over pets, more lives over fewer, and children over the elderly. However, people from certain regions of the world were more likely to spare certain groups over others.
The team used hierarchical clustering to identify three distinct ‘moral clusters’ of countries. These clusters roughly followed regions of the world and coincided with religion and other aspects of culture. The “Western” cluster included the USA, Brazil, most of Europe, Canada, Australia, and Russia; the “Eastern” cluster contained countries such as India, China, Japan, Malaysia, and Iran; and the “Southern” cluster comprised Mexico, most of South America, France, Morocco, Hungary, and Mongolia, among others.
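The clustering idea can be illustrated with a minimal sketch: represent each country as a vector of average preference scores and group countries whose vectors are similar. The countries and scores below are invented for illustration, and the method choices (Ward linkage, three clusters) are assumptions mirroring the study's three clusters rather than its actual pipeline:

```python
# Minimal sketch of hierarchical clustering of countries by their
# moral-preference profiles. Each row is one (hypothetical) country's
# average preference on three example dimensions, e.g. sparing the
# young, sparing more lives, sparing pedestrians. All values are
# made up for illustration.
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster

countries = ["USA", "Japan", "France", "Germany", "China", "Mexico"]
prefs = np.array([
    [0.75, 0.80, 0.60],  # USA
    [0.40, 0.78, 0.55],  # Japan
    [0.85, 0.82, 0.65],  # France
    [0.72, 0.79, 0.58],  # Germany
    [0.38, 0.76, 0.52],  # China
    [0.80, 0.81, 0.70],  # Mexico
])

# Agglomerative clustering with Ward linkage, then cut the tree
# into three clusters, echoing the study's three 'moral clusters'.
Z = linkage(prefs, method="ward")
labels = fcluster(Z, t=3, criterion="maxclust")
for country, label in zip(countries, labels):
    print(country, label)
```

On these toy numbers the algorithm pairs the countries with similar profiles (USA with Germany, Japan with China, France with Mexico), which is the same mechanism – on far richer data – that produced the Western, Eastern, and Southern clusters.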
The Eastern cluster, for instance, showed a weaker preference for sparing younger people over the elderly, perhaps because these cultures place more emphasis on respecting older generations. Another interesting finding was that France showed a strong inclination for sparing women over men. Furthermore, people from countries with higher economic inequality seemed more likely to save a business executive instead of a homeless person.
The experiment raises multiple important issues. Should different countries have different ethical settings in their autonomous vehicles? Are some moral principles innate and others learnt through culture? And should studies that gather data from humans be used to influence policy and regulations?