Can you help a car make moral decisions?

Who would you save, an elderly lady or a young girl? A criminal, or a doctor? Three cats, or one human?

We've talked about the issues with self-driving cars before, and among them is the fact that it is difficult to leave moral decisions up to computers. What we mean by that is that drivers often have to make split-second decisions about risk, and sometimes those decisions involve choosing who to put at risk. Driving isn't always predictable, and there are a number of things the human brain has to take into account.

Here is a fascinating website designed by MIT which is part experiment, part education. It's called the Moral Machine, and it presents you with a series of scenarios in which you must choose which group of people to put in the line of fire. You'll have limited information about them (click the descriptions to read the details) and must decide based on that alone.

The team behind the website want to learn more about why we make the decisions we do, and how this information might help program driverless cars. After making your choices you'll be able to see how you compare to other people, and what the summary of your choices says about you and your potential biases.

Don't forget - you're choosing the lesser of two evils, so the results won't conclusively tell you anything about your personality, but it is interesting to consider some of the dilemmas!

You can read more about the Moral Machine here.
