America's drone program is widely seen as the weapon of choice for the Obama administration. In the president's first term alone, drones were used an estimated four times more than during the entirety of the Bush administration. Clearly, drones will continue to play a key role in the future of combat.
Currently, there's a human operator behind every machine, ensuring that someone can be held accountable for any misfires or civilian casualties. But what happens when technology advances to the point that humans are removed from the equation?
That's the question being posed in a new Human Rights Watch report that calls for an international ban on autonomous drones before they can be added to military arsenals worldwide. Realistically, the sophisticated software required to program a self-reliant "killer robot" that chooses its own targets is still 20 to 30 years away, but the advocacy group would rather not take chances.
"Giving machines the power to decide who lives and dies on the battlefield would take technology too far," said Steve Goose, the Arms Division director at Human Rights Watch. "Human control of robotic warfare is essential to minimizing civilian deaths and injuries."
But is human involvement really such a good thing? "History has shown that human soldiers are capable of committing the world's worst atrocities despite their supposed humanity," says Tech News Daily. Georgia Tech robotics researcher Ronald Arkin goes so far as to argue that a robot wouldn't fall victim to fatigue, and thus would be less susceptible to making boneheaded decisions or getting angry and sadistically abusing its power.
We should all fear the possibility of autonomous death machines, says Tom Malinowski at The Washington Post. Imagine Syria's Bashar Assad commanding robots "programmed to track and kill protest leaders or to fire automatically on any group of more than five people congregating below." He'd possess a weapon that no other dictator in history had access to: "An army that will never refuse an order, no matter how immoral." Clearly, whether machines should be allowed to serve as both jury and executioner is a decision that will inevitably have to be confronted, preferably sooner rather than later.