As self-driving cars slowly make their way onto our roads, how will developers prepare these autonomous vehicles to make difficult decisions during accidents? A new MIT project illustrates just how hard this will be by mixing gaming with deep moral questions.
The Moral Machine presents you with a series of traffic scenarios in which a self-driving car must make a choice between two perilous options.
Should you avoid hitting a group of five jaywalking pedestrians by hitting a concrete divider that will kill two of your passengers? If there’s no other choice, do you drive into a group of young pedestrians or elderly pedestrians? Do you swerve to avoid a group of cute cats and dogs, or hit a doctor, a man, and an executive? With only two choices, do you hit a large group of homeless people obeying traffic laws or a small child jaywalking against the traffic light? …
Via: Mashable Tech