Within the next two or three decades, Google's self-driving cars will likely be commonplace, and our roads will be safer for it. Self-driving cars don't drink. They don't get tired after a long day at the office. And they'll have better reflexes, along with the ability to instantly communicate with one another to minimize accidents, leaving you free to text or tweet to your heart's content. Three states — California, Florida, and Nevada — have already deemed the vehicles legal, and it's only a matter of time before many of the other 47 follow suit.
But what happens to a self-driving car when it's time to make the morally unclear, split-second decisions that will inevitably come up on the road? It's a question Gary Marcus, a professor of psychology at NYU, tackles in a forward-looking New Yorker essay:
That moment will be significant not just because it will signal the end of one more human niche, but because it will signal the beginning of another: The era in which it will no longer be optional for machines to have ethical systems. Your car is speeding along a bridge at fifty miles per hour when an errant school bus carrying forty innocent children crosses its path. Should your car swerve, possibly risking the life of its owner (you), in order to save the children, or keep going, putting all forty kids at risk? If the decision must be made in milliseconds, the computer will have to make the call.
That question will inevitably extend beyond our daily driving, too. Earlier this week, Human Rights Watch published a similar report tackling an innately more violent issue: Should military drones be able to decide who to kill? With the White House increasingly leaning on the drone program as a "counterinsurgency Air Force," one can reasonably assume that a growing percentage of the defense budget will be dedicated to drone R&D.
And the fact is, humans "aren't very good at codifying responses to dilemmas ourselves, particularly when the precise contours of a dilemma can't be predicted ahead of its occurrence," says Nicholas Carr at his blog. That ethical weight will fall heavily on programmers, whose code won't just carry moral implications; it could expose them to legal consequences as well. (Especially if your car decides to slam into the school bus after all.)
"We don't even really know what a conscience is," says Carr. "But somebody's going to have to program one nonetheless."