The Moral Machine

Who will a self-driving car kill first? As we develop self-driving cars' ability to react to crashes, we will run into many moral dilemmas. Should the car drive into a wall to avoid hitting pedestrians, hurting the driver in the process? Or should it drive through an intersection, hurting those in the crosswalk but sparing the driver? There are many moral dilemmas like this, all variations of the classic trolley problem.

MIT has developed a platform that lets anyone judge different (theoretical) scenarios a self-driving car could face and pick the "lesser of two evils". The website also has the option to create your own scenario. I judged a few scenarios and then looked at how my results compared with others', and was surprised by all the biases the survey made apparent and accounted for.

Check it out at moralmachine.mit.edu - what was your most ethically challenging scenario?

