The Moral Machine

Who will a self-driving car kill first? As we develop self-driving cars' ability to react to crashes, we will run into many moral dilemmas. Should the car drive into a wall to avoid hitting people, hurting the driver in the process? Or should it drive through an intersection, hurting those in the crosswalk but sparing the driver? There are many moral dilemmas like this, all variations of the classic trolley problem.

MIT has developed a platform that allows anyone to judge different scenarios a self-driving car could (theoretically) face and pick the "lesser of two evils". The website also has the option to create your own scenario. I judged a few scenarios and then looked at how my results compared with others', and was surprised by all the biases the survey made apparent and accounted for.

Check it out — what was your most ethically challenging scenario? moralmachine.mit.edu
