Scientists have developed a system that lets users of driverless cars make the moral decision about who should survive a potential crash. Previous studies found that most people think a driverless car should be utilitarian, taking actions to minimise overall harm, which might mean sacrificing its own passengers in certain situations to save the lives of pedestrians.
However, while people agreed with this in principle, they also said they would never get in a car that was prepared to kill them. “We wanted to explore what would happen if the control and the responsibility for a car’s actions were given back to the driver,” said Giuseppe Contissa at the University of Bologna in Italy.
Researchers designed a dial that switches a car’s setting along a spectrum ranging from “full altruist” to “full egoist”, with the middle setting being impartial. The ethical knob would work not only for self-driving cars, but for all areas of industry that are becoming increasingly autonomous, the ‘New Scientist’ reported.
“The dial will switch a driverless car’s setting from full altruist to full egoist. The knob tells an autonomous car the value that the driver gives to his or her life relative to the lives of others,” Contissa said. The car would use this information to decide which actions to execute, taking into account the probability that the passengers or other parties would suffer harm as a consequence of the car’s decision, researchers said.
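The decision rule described above could, in broad strokes, be sketched as a weighted expected-harm calculation. The following Python snippet is purely illustrative, not the researchers’ actual model: the knob value, the 0-to-1 encoding (0 for full egoist, 1 for full altruist), and the candidate actions are all assumptions made for the example.

```python
def choose_action(actions, knob):
    """Pick the action with the lowest weighted expected harm.

    actions: list of dicts with assumed keys 'name',
             'p_passenger_harm' and 'p_other_harm' (probabilities).
    knob:    assumed encoding in [0, 1];
             0 = full egoist (only passenger harm counts),
             1 = full altruist (only others' harm counts),
             0.5 = impartial.
    """
    def cost(a):
        # Weight the two harm probabilities by the knob setting.
        return (1 - knob) * a["p_passenger_harm"] + knob * a["p_other_harm"]
    return min(actions, key=cost)


# Hypothetical crash scenario with two possible manoeuvres.
actions = [
    {"name": "swerve", "p_passenger_harm": 0.8, "p_other_harm": 0.1},
    {"name": "brake",  "p_passenger_harm": 0.1, "p_other_harm": 0.7},
]

print(choose_action(actions, knob=0.0)["name"])  # full egoist favours braking
print(choose_action(actions, knob=1.0)["name"])  # full altruist favours swerving
```

With the knob at 0 the car protects its passengers (it brakes, at the pedestrians’ expense); at 1 it swerves, accepting greater risk to its occupants. Real systems would of course face far messier probability estimates than these invented numbers.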