America's drone program is widely seen as the weapon of choice for the Obama administration. In the president's first term alone, drones were used an estimated four times more than during the entirety of the Bush administration. Clearly, drones will continue to play a key role in the future of combat.
Currently, there's a human operator behind every machine, ensuring that someone can be held accountable for any misfires or civilian casualties. But what happens when technology advances to the point that humans are removed from the equation?
That's the question being posed in a new Human Rights Watch report that calls for an international ban on autonomous drones before they can be added to military arsenals worldwide. Realistically, the sophisticated software required to program a self-reliant "killer robot" that chooses its own targets is still 20 to 30 years away, but the advocacy group would rather not take chances.
"Giving machines the power to decide who lives and dies on the battlefield would take technology too far," said Steve Goose, the Arms Division director at Human Rights Watch. "Human control of robotic warfare is essential to minimizing civilian deaths and injuries."
But is human involvement really such a good thing? "History has shown that human soldiers are capable of committing the world's worst atrocities despite their supposed humanity," says Tech News Daily. Georgia Tech robotics researcher Ronald Arkin goes so far as to argue that a robot wouldn't fall victim to fatigue, and thus would be less susceptible to making boneheaded decisions or getting angry and sadistically abusing its power.
We should all fear the possibility of autonomous death machines, says Tom Malinowski at The Washington Post. Imagine Syria's Bashar Assad commanding robots "programmed to track and kill protest leaders or to fire automatically on any group of more than five people congregating below." He'd possess a weapon that no other dictator in history had access to: "An army that will never refuse an order, no matter how immoral." Clearly, whether machines should be allowed to serve as both jury and executioner is a decision that will inevitably have to be confronted, preferably sooner rather than later.