The Israeli military has been using a secretive artificial intelligence program to analyze data and identify human targets for assassination in Gaza, a new report alleges. The AI, codenamed “Lavender” by the Israel Defense Forces, has reportedly generated extensive kill lists naming tens of thousands of Palestinians as possible Hamas targets, lists the IDF has treated “as if it were a human decision” despite the system’s error rate of roughly 10 percent.

The IDF’s new application of AI technology was first revealed in a joint report from Israel-based publications +972 Magazine and Local Call last week. Citing six anonymous Israeli intelligence officials, the outlets allege that the Lavender system has played “a central role in the unprecedented bombing” of Gaza, particularly during the early stages of the Israel-Hamas war in October 2023.

“Formally, the Lavender system is designed to mark all suspected operatives in the military wings of Hamas and Palestinian Islamic Jihad (PIJ), including low-ranking ones, as potential bombing targets,” +972 reported, adding that “during the first weeks of the war, the army almost completely relied on Lavender, which clocked as many as 37,000 Palestinians as suspected militants—and their homes—for possible air strikes.”


Israel has been using a secretive artificial intelligence program called 'Lavender' to analyze data and identify human targets for assassination in Gaza. (AP Photo/Fatima Shbair)

According to the intelligence officials, Lavender’s recommended targets rarely received more than a cursory review by human operatives, meaning that even low-level Hamas operatives were targeted for bombing when they might otherwise have been ignored.

“Human personnel often served only as a ‘rubber stamp’ for the machine’s decisions,” one source said, adding that many commanders spent a mere “20 seconds” reviewing targets before approving strikes—“just to make sure the Lavender-marked target is male.”

The system “made it easier,” the source continued, because “the machine did it coldly.”

Two other sources indicated that Lavender-approved operations carried a disproportionately high tolerance for collateral damage, with as many as 20 civilian deaths permitted for each junior Hamas operative. In the case of senior commanders, the IDF reportedly authorized strikes expected to kill more than 100 civilians to eliminate a single target, losses it deemed acceptable.

“We had a calculation for how many [civilians could be killed] for the brigade commander, how many [civilians] for a battalion commander, and so on,” another source said. “There were regulations, but they were just very lenient,” a different source added. “We’ve killed people with collateral damage in the high double digits, if not low triple digits. These are things that haven’t happened before.”

Such decisions were made despite Lavender’s track record of making incorrect calls roughly 10 percent of the time. As +972 reported, “the system makes what are regarded as ‘errors’ in approximately 10 percent of cases, and is known to occasionally mark individuals who have merely a loose connection to militant groups, or no connection at all.”

The IDF strikes were also frequently conducted on individual homes, most often at night, when entire families would be present. To that end, in parallel with Lavender, the IDF has employed a separate, older AI program codenamed “Where’s Daddy,” which “specifically tracks the targeted individuals and carries out bombings when they enter their family’s residences.”

According to the sources’ allegations, thousands of Palestinian women and children have fallen victim to these strikes, with the bulk of the devastation in Gaza directly attributable to the influence of AI programs.

“We were not interested in killing [Hamas] operatives only when they were in a military building or engaged in a military activity,” an intelligence officer told +972 and Local Call. “On the contrary, the IDF bombed them in homes without hesitation, as a first option. It’s much easier to bomb a family’s home. The system is built to look for them in these situations.”

A third AI, dubbed “the Gospel,” has been used to target high-rise buildings and public areas with the intention of exerting “civil pressure” on the Palestinian people, a practice that would violate international laws against collective punishment.

In response to the +972 report, the IDF strongly denied each of the allegations, stating that it “outright rejects the claim regarding any policy to kill tens of thousands of people in their homes.” Rather, the IDF insists that Lavender is “simply a database whose purpose is to cross-reference intelligence sources in order to produce up-to-date layers of information on the military operatives of terrorist organizations.”

According to the Israeli military, human operators, not AIs, are responsible for determining whether “the identified targets meet the relevant definitions in accordance with international law and additional restrictions stipulated in the IDF directives.”

“This is not a list of confirmed military operatives eligible to attack,” a statement from the IDF asserted. “For each target, IDF procedures require conducting an individual assessment of the anticipated military advantage and collateral damage expected.”


Connor Walcott is a staff writer for Valuetainment.com. Follow Connor on X and look for him on VT’s “The Unusual Suspects.”
