Adjustable ethics at the wheel of a self-driving car

23 November 2015

The age of the self-driving car may be nigh. The South Australian government has recently announced it will allow trials of self-driving cars on its roads. Numerous US states are planning to do the same, or are already doing so.

Self-driving cars could save many lives. The Australian road toll remains substantial — 1193 deaths in 2013. Many deaths are caused by driver inattention, tiredness, being under the influence of drugs or alcohol, recklessness, and so on.

Computers, as we know, don't get bored, sleepy, drunk, or angry. They don't show off to passengers. They don't send texts on a freeway. Self-driving cars are, if we believe the advocates (and manufacturers), level-headed, infallible driving, well, machines.

But driving involves decisions, and only some of them are purely mechanical. Others are ethical. How exactly will self-driving cars make those decisions?

Imagine, for example, you're in your self-driving car, travelling at speed on a highway. Suddenly an oncoming road train swerves into your lane and thunders towards you. You may just be able to swerve, but unfortunately five men are standing on the side of the road, and you will surely hit them. Should the self-driving car swerve, and probably kill five people, or stay the course and likely kill you (and maybe the road train driver)?

If you were driving, it seems likely your instincts would kick in and you would swerve, killing the five men but saving your own life. But you aren't driving. The car is driving, powered by algorithms written somewhere in California.

Those programmers will have to decide how the car's 'instincts' will work. Should it drive like a human, and protect the driver even where it harms others? Or should it be utilitarian, and sacrifice the driver for the common good where necessary?
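To make the choice concrete, here is a minimal, purely hypothetical sketch in Python of what that decision might look like if reduced to its crudest form. Nothing here reflects any manufacturer's actual software; the policy names, the probabilities and the road-train scenario are invented for illustration only.

    # Hypothetical sketch of two competing 'instincts' for a self-driving car.
    # All names and numbers are invented; this is not any real car's code.
    from dataclasses import dataclass
    from enum import Enum

    class Policy(Enum):
        PROTECT_OCCUPANT = "protect_occupant"  # drive like a human: save the person in the car
        UTILITARIAN = "utilitarian"            # minimise total expected deaths, whoever they are

    @dataclass
    class Outcome:
        action: str
        occupant_deaths: float   # expected deaths inside the car
        bystander_deaths: float  # expected deaths outside the car

    def choose(outcomes: list[Outcome], policy: Policy) -> Outcome:
        """Pick an action according to the configured ethical policy."""
        if policy is Policy.PROTECT_OCCUPANT:
            # Protect the occupant first; break ties on total harm.
            return min(outcomes, key=lambda o: (o.occupant_deaths,
                                                o.occupant_deaths + o.bystander_deaths))
        # Utilitarian: minimise total expected deaths, occupant included.
        return min(outcomes, key=lambda o: o.occupant_deaths + o.bystander_deaths)

    # The road-train scenario above, with made-up probabilities.
    scenario = [
        Outcome("stay in lane", occupant_deaths=0.9, bystander_deaths=0.0),
        Outcome("swerve", occupant_deaths=0.1, bystander_deaths=5.0),
    ]
    print(choose(scenario, Policy.PROTECT_OCCUPANT).action)  # -> swerve
    print(choose(scenario, Policy.UTILITARIAN).action)       # -> stay in lane

The point of the toy example is simply that someone has to pick the policy, and the same situation produces opposite actions depending on which one is chosen.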

There's undoubtedly something unsettling about computers deciding to sacrifice our lives for others. But maybe that's just too many viewings of 2001: A Space Odyssey speaking. Surely if we now have the power to overcome our base, selfish instinct for self-preservation and replace it with more noble values, we should do it.

But is it that simple? Self-driving cars will save many lives if they catch on. However, convincing people to buy them may be difficult. One way to really ensure self-driving cars never become popular is for it to become known that they are programmed to kill you and your loved ones if circumstances require. That isn't a feature any dealer's likely to mention alongside the