
Who Should Your Self-Driving Car Save in a Crash? You or Pedestrians?


More than 75 percent of participants in a recent study said that it would be more moral for a self-driving car to sacrifice one passenger rather than kill 10 pedestrians. But most said they would not buy an autonomous vehicle if it were programmed to sacrifice its occupants.

The study, published today in the journal Science, asked hundreds of people a series of questions about the ethics of autonomous vehicles. Participants did not think the car should sacrifice its passenger when only one pedestrian could be saved, but their moral approval of the sacrifice increased with the number of lives that could be saved.


"This is the classic signature of a social dilemma," the study's authors wrote, "in which everyone has a temptation to free-ride instead of adopting the behavior that would lead to the best global outcome."

The researchers also asked participants about their attitudes toward legally enforcing utilitarian sacrifices. Most believed that machines have a stronger obligation than humans do to make sacrifices for the greater good.

Finally, participants were much less likely to consider purchasing an autonomous vehicle if its safety algorithms were regulated by the government.

"Figuring out how to build ethical autonomous machines is one of the thorniest challenges in artificial intelligence today," the study's authors wrote. "As we are about to endow millions of vehicles with autonomy, a serious consideration of algorithmic morality has never been more urgent."

"For the time being, there seems to be no easy way to design algorithms that would reconcile moral values and personal self-interest," they continued.
