Chris Blattman

Under what circumstances would your Google-driven car choose to kill you?

In 1967, the philosopher Philippa Foot posed what became known as “The Trolley Problem”. Suppose you are the driver of a runaway tram (or “trolley car”) and you can only steer from one narrow track on to another; five men are working on the track you are on, and there is one man on the other; anyone on the track that the tram enters is bound to be killed. Should you allow the tram to continue on its current track and plough into the five people, or do you deliberately steer the tram onto the other track, so leading to the certain death of the other man?

This gives rise to an interesting philosophical challenge. Somewhere in Mountain View, programmers are grappling with writing the algorithms that will determine the behaviour of Google’s driverless cars. These algorithms will decide what the car will do when the lives of the passengers in the car, pedestrians and other road users are at risk.

That is Owen Barder on how Google is going to have to get a little more philosophically specific than “don’t be evil”.

Will car buyers get to choose their car’s philosophy from a menu? Someone has to decide: either them, Google, the manufacturer, or the Department of Transportation.
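
To see what is actually at stake in that menu, here is a minimal, purely hypothetical Python sketch of how a “philosophy setting” might reduce to a single weighting parameter. Everything in it (the Outcome class, the numbers, the occupant_weight knob) is invented for illustration; it is not how Google’s, or anyone’s, collision-avoidance software actually works.

    # Purely illustrative: a car's "philosophy" reduced to one tunable parameter.
    from dataclasses import dataclass

    @dataclass
    class Outcome:
        """Predicted consequences of one possible manoeuvre (made-up numbers)."""
        label: str
        expected_pedestrian_deaths: float
        expected_occupant_deaths: float

    def choose_manoeuvre(outcomes, occupant_weight=1.0):
        """Pick the manoeuvre with the lowest weighted expected deaths.

        occupant_weight is the philosophy knob: 1.0 is strictly utilitarian,
        larger values privilege the people who bought the car.
        """
        def cost(o):
            return o.expected_pedestrian_deaths + occupant_weight * o.expected_occupant_deaths
        return min(outcomes, key=cost)

    # A trolley-style dilemma, restated as data (numbers invented):
    options = [
        Outcome("plough ahead", expected_pedestrian_deaths=5.0, expected_occupant_deaths=0.0),
        Outcome("swerve into the barrier", expected_pedestrian_deaths=0.0, expected_occupant_deaths=1.0),
    ]

    print(choose_manoeuvre(options).label)                        # utilitarian setting: "swerve into the barrier"
    print(choose_manoeuvre(options, occupant_weight=10.0).label)  # self-protective setting: "plough ahead"

The uncomfortable part is not the few lines of code; it is who gets to set occupant_weight, and whether buyers ever find out what it was set to.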

59 Responses

  1. So as a fat man, I’m concerned that after pooling data from the car and Glass, Google would opt to have me jump into the way.
