When a driver slams on the brakes to avoid hitting a pedestrian crossing the road illegally, she is making a moral decision that shifts risk from the pedestrian to the people in the car. Self-driving cars might soon have to make such ethical judgments on their own — but settling on a universal moral code for the vehicles could be a thorny task, suggests a survey of 2.3 million people from around the world.
The survey, the largest ever conducted on machine ethics, finds that many of the moral principles that guide a driver’s decisions vary by country. For example, in a scenario in which some combination of pedestrians and passengers will die in a collision, people from relatively prosperous countries with strong institutions were less likely to spare a pedestrian who had stepped into traffic illegally.
People rarely encounter such stark moral dilemmas, and some critics question whether the scenarios posed in the survey are relevant to the ethical and practical questions surrounding driverless cars. But the study’s authors argue that the scenarios stand in for the subtle moral decisions that drivers make every day, and that the findings reveal cultural nuances that governments and makers of self-driving cars must take into account if the vehicles are to gain public acceptance.