America's drone program is widely seen as the weapon of choice for the Obama administration. In the president's first term alone, drones were used an estimated four times more than during the entirety of the Bush administration. Clearly, drones will continue to play a key role in the future of combat.
Currently, there's a human operator behind every machine, ensuring that someone can be held accountable for any misfires or civilian casualties. But what happens when technology advances to the point that humans are removed from the equation?
That's the question being posed in a new Human Rights Watch report that calls for an international ban on autonomous drones before they can be added to military arsenals worldwide. Realistically, the sophisticated software required to program a self-reliant "killer robot" that chooses its own targets is still 20 to 30 years away, but the advocacy group would rather not take chances.
"Giving machines the power to decide who lives and dies on the battlefield would take technology too far," said Steve Goose, the Arms Division director at Human Rights Watch. "Human control of robotic warfare is essential to minimizing civilian deaths and injuries."
But is human involvement really such a good thing? "History has shown that human soldiers are capable of committing the world's worst atrocities despite their supposed humanity," says Tech News Daily. Georgia Tech robotics researcher Ronald Arkin goes so far as to argue that a robot wouldn't fall victim to fatigue, and thus would be less susceptible to making boneheaded decisions or getting angry and sadistically abusing its power.
We should all fear the possibility of autonomous death machines, says Tom Malinowski at The Washington Post. Imagine Syria's Bashar al-Assad commanding robots "programmed to track and kill protest leaders or to fire automatically on any group of more than five people congregating below." He'd possess a weapon that no dictator in history has had access to: "An army that will never refuse an order, no matter how immoral." Whether machines should be allowed to serve as both jury and executioner is a question that will inevitably have to be confronted, preferably sooner rather than later.