Self-driving cars are supposed to be the safe, efficient, self-piloting vehicles of the future that let us put our feet up and look out the window, uninterrupted.
But these cars might be programmed to do a little too much thinking about those outside the car, and perhaps not enough about those inside.
A recent survey on self-driving cars shows that people want these smart cars to be programmed to minimize casualties in a crash, even if that means the death of the rider. At the same time, people don't actually want to ride in cars programmed in this utilitarian manner. It's a classic Catch-22.
A new study published in the journal Science shows that self-driving cars are fraught with moral dilemmas over how the vehicles should be ethically programmed.
People surveyed generally agree that a car with a single rider should swerve off the road and crash, sacrificing itself, to avoid a crowd of 10 people in the street.
However, when asked if they would actually ride in a car programmed with such idealistic morals, the survey's respondents said thanks but no thanks.
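The utilitarian programming respondents endorsed in the abstract can be pictured as a toy decision rule: count the expected casualties of each option and pick the smaller number, even when that sacrifices the car's own occupants. This sketch is purely illustrative; the function name and inputs are invented here and are not from the study.

```python
def choose_action(occupants: int, pedestrians: int) -> str:
    """Hypothetical utilitarian crash rule: minimize total casualties.

    Returns 'swerve' (car crashes, occupants die) when that costs fewer
    lives than staying on course and hitting the pedestrians.
    """
    stay_cost = pedestrians    # casualties if the car holds its course
    swerve_cost = occupants    # casualties if the car swerves and crashes
    return "swerve" if swerve_cost < stay_cost else "stay"


# The survey scenario: one rider versus a crowd of 10.
print(choose_action(occupants=1, pedestrians=10))  # swerve
```

Under this rule, the lone rider's car sacrifices itself for the crowd, which is exactly the behavior respondents praised in principle and rejected for their own car.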
"Most people want to live in a world where cars will minimize casualties," said Iyad Rahwan, a professor at MIT who co-authored the study, in an interview with Gizmodo.
"But everybody wants their own car to protect them at all costs."
This creates what scientists call a "social dilemma," in which what people ideally want for society directly contradicts their urge to act in their own self-interest. That tension could end up making roads less safe for everyone in the long run.
Balancing moral ideals against the understandable human desire not to die is a tough nut to crack.