Can computer programmers account for morality?
As Google's self-driving cars proliferate and the U.S. drone program grows, software engineers face the daunting task of anticipating — and coding for — moral quandaries
Within the next two or three decades, Google's self-driving cars will likely be commonplace, and our roads will be safer for it. Self-driving cars don't drink. They don't get tired after a long day at the office. And they'll have better reflexes, along with the ability to instantly communicate with one another to minimize accidents, leaving you free to text or tweet to your heart's content. Three states — California, Florida, and Nevada — have already deemed the vehicles legal, and it's only a matter of time before many of the other 47 follow suit.
But what happens to a self-driving car when it's time to make the morally unclear, split-second decisions that will inevitably come up on the road? It's a question Gary Marcus, a professor of psychology at NYU, tackles in a forward-looking New Yorker essay:
That moment will be significant not just because it will signal the end of one more human niche, but because it will signal the beginning of another: The era in which it will no longer be optional for machines to have ethical systems. Your car is speeding along a bridge at fifty miles per hour when an errant school bus carrying forty innocent children crosses its path. Should your car swerve, possibly risking the life of its owner (you), in order to save the children, or keep going, putting all forty kids at risk? If the decision must be made in milliseconds, the computer will have to make the call.
That question will inevitably extend beyond our daily driving, too. Earlier this week, Human Rights Watch published a report tackling a similar, inherently more violent question: Should military drones be able to decide whom to kill? With the White House increasingly leaning on the drone program as a "counterinsurgency Air Force," one can reasonably assume that a growing percentage of the defense budget will be dedicated to drone R&D.
And the fact is, humans "aren't very good at codifying responses to dilemmas ourselves, particularly when the precise contours of a dilemma can't be predicted ahead of its occurrence," says Nicholas Carr at his blog. That ethical weight will fall heavily on programmers, whose code won't just carry moral implications but could expose them to legal consequences as well. (Especially if your car decides to slam into the school bus after all.)
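To see why, consider what even a toy version of that codification might look like. The sketch below is purely illustrative: a crude utilitarian rule with invented probabilities and made-up names, not anything Google, any automaker, or the Pentagon has actually described. But it makes the point concrete. Someone has to pick the rule, and someone has to pick the numbers.

```python
# A deliberately crude, hypothetical sketch of "codifying" the bridge dilemma.
# The rule, the names, and every number here are assumptions for illustration,
# not any real self-driving system's logic.
from dataclasses import dataclass

@dataclass
class Maneuver:
    name: str
    p_harm: float        # estimated probability the maneuver causes harm
    people_at_risk: int   # how many people that harm would affect

def least_expected_harm(options: list[Maneuver]) -> Maneuver:
    """Pick the maneuver with the lowest expected number of people harmed.

    A purely utilitarian rule: expected harm = probability * people at risk.
    Choosing this rule, and estimating these inputs, is exactly where the
    moral (and legal) weight lands on the programmer.
    """
    return min(options, key=lambda m: m.p_harm * m.people_at_risk)

if __name__ == "__main__":
    bridge_dilemma = [
        Maneuver("swerve", p_harm=0.7, people_at_risk=1),      # risks the owner
        Maneuver("keep_going", p_harm=0.5, people_at_risk=40),  # risks the bus
    ]
    print(least_expected_harm(bridge_dilemma).name)  # "swerve", under these guesses
```

Change the probabilities, or swap the utilitarian rule for one that forbids the car from ever sacrificing its owner, and the car makes a different call. Either way, the choice was made long before the bus appeared.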
"We don't even really know what a conscience is," says Carr. "But somebody's going to have to program one nonetheless."