Have we entered the age of AI warfare?
Israeli military used AI to create 'kill lists' of suspected Hamas militants, say local media
The Israeli military allegedly used an artificial intelligence system to identify potential Palestinian targets in Gaza based on apparent links to Hamas, according to an Israeli media investigation citing military intelligence sources.
The AI system, called Lavender, at one point identified up to 37,000 Palestinians as potential Hamas militants and targets for possible air strikes. The claim comes from the testimony of six alleged Israeli intelligence officers given to Israel-based media organisations +972 Magazine and Local Call.
According to +972 Magazine, the Israeli army gave "sweeping approval for officers to adopt Lavender’s kill lists" in the early stages of the war. There was "no requirement to thoroughly check why the machine made those choices or to examine the raw intelligence data on which they were based".
What did the commentators say?
Israel's alleged use of powerful AI systems in its war on Hamas "has entered uncharted territory for advanced warfare". It not only raises a "host of legal and moral questions" but it is also "transforming the relationship between military personnel and machines", said The Guardian.
One source told +972 Magazine that military personnel served only as a "rubber stamp" for Lavender's decisions, with about "20 seconds" devoted to each target before a bombing was authorised. This was despite knowledge that the system produced "errors" in about 10% of cases, and was "known to occasionally mark individuals who have merely a loose connection to militant groups, or no connection at all", said the magazine.
The Israeli military has strongly denied the claims. "The IDF [Israel Defense Forces] outright rejects the claim regarding any policy to kill tens of thousands of people in their homes," it said in response to the allegations.
It said that the Lavender system was "simply a database whose purpose is to cross-reference intelligence sources, to produce up-to-date layers of information on the military operatives of terrorist organisations".
The reported use of powerful AI systems such as Lavender has shifted life-or-death decision-making onto "statistical mechanisms", said The Guardian, rather than human emotion and human-led judgement. As one intelligence officer who allegedly used Lavender told the paper: "The machine did it coldly. And that made it easier."
"Technological innovation has always changed warcraft," said Andreas Kluth on Bloomberg in March. "It's been that way since the arrival of chariots, stirrups, gunpowder, nukes and nowadays drones, as Ukrainians and Russians are demonstrating every day." The most pressing "existential" question over the use of AI in warfare is now less about AI itself, and more to do with the level of human oversight. "Will the algorithm assist soldiers, officers and commanders, or replace them?"
The deployment of AI-enabled weapon systems has profound implications for the future of warfare, according to Dr Elke Schwarz, lecturer in political theory at Queen Mary, University of London. It may lead to the "objectification of human targets, leading to heightened tolerance for collateral damage" as well as weakening moral agency among operators of AI-enabled targeting systems, "diminishing their capacity for ethical decision-making" in the heat of battle.
"We don't want to get to a point where AI is used to make a decision to take a life when no human can be held responsible for that decision", said Dr Schwarz.
What next?
International efforts, such as the US-led Political Declaration on Responsible Military Use of Artificial Intelligence and Autonomy, have aimed to establish guidelines for the ethical deployment of AI in warfare, said Forbes. More than 50 countries have signed the declaration, which states that military use of AI must comply with international and humanitarian law and attempt to "minimize unintended bias and accidents".
But Israel is not a signatory to the non-binding declaration, nor are Russia, China and other major world powers. "Perhaps what is emerging about AI's role in Gaza will encourage the world to negotiate an actual treaty on such things," said Forbes.
"Autonomous weapons are an early test of humanity's ability to deal with weaponized AI, more dangerous forms of which are coming," said Paul Scharre, the director of studies at the Center for a New American Security, on Foreign Affairs. "Global cooperation is urgently needed to govern their improvement, limit their proliferation, and guard against their potential use."
Sorcha Bradley is a writer at The Week and a regular on “The Week Unwrapped” podcast. She worked at The Week magazine for a year and a half before taking up her current role with the digital team, where she mostly covers UK current affairs and politics. Before joining The Week, Sorcha worked at slow-news start-up Tortoise Media. She has also written for Sky News, The Sunday Times, the London Evening Standard and Grazia magazine, among other publications. She has a master’s in newspaper journalism from City, University of London, where she specialised in political journalism.