An MIT Media Lab game delves into the moral quandaries of autonomous machine ethics and society.
Ready or not, autonomous vehicles are on the move. But before these self-driving vehicles hit the road, the public needs confidence that they are safe: not just for passengers, but for any pedestrians who might cross their path.
One interesting aspect of autonomous vehicle safety is the moral question of how humans make snap decisions about whether to hit or avoid something (or someone) when the brakes give out and their own lives may be at risk.
Scalable Corp., in partnership with the MIT Media Lab, has created “Moral Machine,” a game that explores those scenarios to help figure out the answer. As outlined in an article by Popular Science, participants are asked 13 questions, each with two options. In every scenario, a self-driving car with sudden brake failure faces a choice: continue straight and run into whatever obstacle lies ahead, or swerve and careen into whatever is in the other lane.
The game of morals gets intense quickly. Should the unmanned vehicle swerve into pedestrians who are crossing legally, and would the answer change if they were jaywalking? Should the vehicle crash into people in the crosswalk, or crash in a way that kills its own occupants? Which is the lesser evil: striking a homeless person or a pregnant woman?
The MIT Media Lab is touting the exercise as a data-collection effort to drive research on autonomous machine ethics and society. Count me out. I’m not ready to make those life-or-death choices, let alone leave them to a self-driving car.