Within the next two or three decades, Google's self-driving cars will likely be commonplace, and our roads will be safer for it. Self-driving cars don't drink. They don't get tired after a long day at the office. And they'll have better reflexes, along with the ability to instantly communicate with one another to minimize accidents, leaving you free to text or tweet to your heart's content. Three states — California, Florida, and Nevada — have already deemed the vehicles legal, and it's only a matter of time before many of the other 47 follow suit.
But what happens to a self-driving car when it's time to make the morally unclear, split-second decisions that will inevitably come up on the road? It's a question Gary Marcus, a professor of psychology at NYU, tackles in a forward-looking New Yorker essay:
That moment will be significant not just because it will signal the end of one more human niche, but because it will signal the beginning of another: The era in which it will no longer be optional for machines to have ethical systems. Your car is speeding along a bridge at fifty miles per hour when an errant school bus carrying forty innocent children crosses its path. Should your car swerve, possibly risking the life of its owner (you), in order to save the children, or keep going, putting all forty kids at risk? If the decision must be made in milliseconds, the computer will have to make the call.
That question will inevitably extend beyond our daily driving, too. Earlier this week, Human Rights Watch published a similar report tackling an inherently more violent issue: Should military drones be able to decide who to kill? With the White House increasingly leaning on the drone program as a "counterinsurgency Air Force," one can reasonably assume that a growing percentage of the defense budget will be dedicated to drone R&D.
And the fact is, humans "aren't very good at codifying responses to dilemmas ourselves, particularly when the precise contours of a dilemma can't be predicted ahead of its occurrence," says Nicholas Carr at his blog. That ethical weight will fall heavily on programmers, who won't just grapple with the moral implications of their code, but could face legal consequences as well. (Especially if your car decides to slam into the school bus after all.)
"We don't even really know what a conscience is," says Carr. "But somebody's going to have to program one nonetheless."