The Moral Machine

Who will a self-driving car kill first? As we develop self-driving cars' ability to react to crashes, we will run into many moral dilemmas. Should the car drive into a wall to avoid hitting pedestrians, hurting the driver in the process? Or should it drive through an intersection, hurting those in the crosswalk but sparing the driver? There are many dilemmas like this, all variations on the classic trolley problem.

MIT has developed a platform that lets anyone judge different scenarios a self-driving car could (theoretically) face and pick the "lesser of two evils". The website also has an option to create your own scenario. I judged a few scenarios, then looked at how my results compared with others', and was surprised by all the biases the survey made apparent and accounted for.

Check it out - what was your most ethically challenging scenario? moralmachine.mit.edu
