Within the next two or three decades, Google's self-driving cars will likely be commonplace, and our roads will be safer for it. Self-driving cars don't drink. They don't get tired after a long day at the office. And they'll have better reflexes, along with the ability to instantly communicate with one another to minimize accidents, leaving you free to text or tweet to your heart's content. Three states — California, Florida, and Nevada — have already deemed the vehicles legal, and it's only a matter of time before many of the other 47 follow suit.
But what happens to a self-driving car when it's time to make the morally unclear, split-second decisions that will inevitably come up on the road? It's a question Gary Marcus, a professor of psychology at NYU, tackles in a forward-looking New Yorker essay:
That moment will be significant not just because it will signal the end of one more human niche, but because it will signal the beginning of another: The era in which it will no longer be optional for machines to have ethical systems. Your car is speeding along a bridge at fifty miles per hour when an errant school bus carrying forty innocent children crosses its path. Should your car swerve, possibly risking the life of its owner (you), in order to save the children, or keep going, putting all forty kids at risk? If the decision must be made in milliseconds, the computer will have to make the call.
That question will inevitably extend beyond our daily driving, too. Earlier this week, Human Rights Watch published a similar report tackling an inherently more violent issue: Should military drones be able to decide who to kill? With the White House increasingly leaning on the drone program as a "counterinsurgency Air Force," one can reasonably assume that a growing percentage of the defense budget will be dedicated to drone R&D.
And the fact is, humans "aren't very good at codifying responses to dilemmas ourselves, particularly when the precise contours of a dilemma can't be predicted ahead of its occurrence," says Nicholas Carr at his blog. That ethical weight will fall heavily on programmers, whose code won't just have moral implications — they could face legal consequences for it as well. (Especially if your car decides to slam into the school bus after all.)
"We don't even really know what a conscience is," says Carr. "But somebody's going to have to program one nonetheless."