Tara John
The Israeli military has been using artificial intelligence to help identify bombing targets in Gaza, according to an investigation by +972 Magazine and Local Call, citing six Israeli intelligence officials involved in the alleged program – who also allege that human review of the suggested targets was cursory at best.
The officials, quoted in an extensive investigation by the online publication jointly run by Palestinians and Israelis, said that the AI-based tool was called “Lavender” and was known to have a 10% error rate.
When asked about +972 Magazine’s report, the Israel Defense Forces (IDF) did not dispute the existence of the tool but denied that AI was being used to identify suspected terrorists. In a lengthy statement, it emphasized that “information systems are merely tools for analysts in the target identification process,” and that Israel tries to “reduce harm to civilians to the extent feasible in the operational circumstances ruling at the time of the strike.”
The IDF said “analysts must conduct independent examinations, in which they verify that the identified targets meet the relevant definitions in accordance with international law and additional restrictions stipulated in the IDF directives.”
However, one official told +972 that human personnel often served only as a “rubber stamp” for the machine’s decisions, typically devoting only around 20 seconds to each target – checking only that the target was male – before authorizing a bombing.
The investigation comes amid intensifying international scrutiny of Israel’s military campaign, after targeted air strikes killed several foreign aid workers delivering food in the Palestinian enclave. Israel’s siege of Gaza has killed at least 32,916 people, according to the Gaza Ministry of Health, and has led to a spiraling humanitarian crisis in which nearly three-quarters of the population in northern Gaza are suffering from catastrophic levels of hunger, according to a United Nations-backed report.
The investigation’s author, Yuval Abraham, previously told CNN in January of his work looking into how the Israeli military has been “heavily relying on artificial intelligence to generate targets for such assassinations with very little human supervision.”
The Israeli military “does not use an artificial intelligence system that identifies terrorist operatives or tries to predict whether a person is a terrorist,” the IDF statement on Wednesday said. But its analysts use a “database whose purpose is to cross-reference intelligence sources, in order to produce up-to-date layers of information on the military operatives of terrorist organizations.”
Human officers are then responsible for verifying “that the identified targets meet the relevant definitions in accordance with international law and additional restrictions stipulated in the IDF directives,” according to the IDF statement, a process also described by +972.
Night attacks
The magazine also reported that the Israeli army “systematically attacked” targets in their homes, usually at night when entire families were present.
“The result, as the sources testified, is that thousands of Palestinians — most of them women and children or people who were not involved in the fighting — were wiped out by Israeli airstrikes, especially during the first weeks of the war, because of the AI program’s decisions,” it wrote.
The report, citing sources, said that when alleged junior militants were targeted, “the army preferred” to use so-called dumb bombs – unguided munitions which can cause large-scale damage.
[Photo caption: Palestinians inspect the damage to a residential building after an Israeli airstrike in the Maghazi refugee camp, central Gaza Strip, Friday, March 29, 2024.]
CNN reported in December that nearly half of the 29,000 air-to-surface munitions dropped on Gaza last fall were dumb bombs, which can pose a greater threat to civilians, especially in densely populated territories like Gaza.
According to the IDF statement, it does not carry out strikes where the expected collateral damage is “excessive in relation to the military advantage” and makes efforts to “reduce harm to civilians to the extent feasible in the operational circumstances.”
It added that the “IDF reviews targets before strikes and chooses the proper munition in accordance with operational and humanitarian considerations, taking into account an assessment of the relevant structural and geographical features of the target, the target’s environment, possible effects on nearby civilians, critical infrastructure in the vicinity, and more.”
Israeli officials have long argued that heavy munitions are necessary to eliminate Hamas, whose fighters killed more than 1,200 people in Israel and took hundreds of hostages on October 7, sparking the ongoing war.