Autonomous Cars - will machines decide the outcome of an accident?

23 Aug

By Tom Boote

As cars become more autonomous, one of the largest dilemmas facing manufacturers and lawmakers is an ethical one.
In a situation where a self-driving car must decide whom to protect in the event of an accident (the driver, passengers or pedestrians), some might take issue with a machine making moral decisions on behalf of a human.
Inspired by an ongoing experiment at MIT, the simulation linked below is designed to collect human responses to such dilemmas to better understand how we expect self-driving cars to behave. The responses gathered from this experiment will help form an industry report on autonomous cars due to be published in September.
The ‘Conscious Car’ simulator allows you to programme an autonomous car’s ethical decisions by choosing who to protect in the event of a collision.
You are faced with a random selection of tough decisions, such as:
  • Do you drive into a wall, or steer into three dogs?
  • Do you protect two elderly people, or one child?
  • Does your opinion change when the driver has passengers?
Try the Conscious Car Simulator here

Frivolity aside, these are the decisions autonomous cars will have to make, yet around a third (35%) of UK drivers wouldn’t want a driverless vehicle to make decisions on their behalf in the event of a collision.
The simulation collects all answers anonymously, and the data will be used as part of an industry report on autonomous cars.
