All the perfumes of Arabia

Israel’s human targeting software and the banality of evil

Lavender fields near Hitchin, England. Photo by DeFacto/Wikimedia Commons.

Here’s the smell of the blood still. All the perfumes of Arabia will not sweeten this little hand.

—William Shakespeare, Macbeth, Act 5, Scene 1

Lavender

For some unknown, but no doubt morbidly humorous reason—the same sick humour, perhaps, that leads the Israel Defense Forces (IDF) to refer to their periodic punitive strikes on Gaza as “mowing the grass”—the IDF have decided that “Lavender” is an appropriate name for the artificial intelligence (AI) software they use to identify “human targets” in Gaza.

Since October 7, IDF strikes have killed at least 33,545 Palestinians, including 8,400 women and over 13,000 children, injured another 76,049, and left 8,000 more missing, presumed buried under the rubble. The blitzkrieg has destroyed over 60 percent of Gaza’s housing stock, made two million people homeless, and left most of the strip uninhabitable.

Israel’s supporters deny that this bombing has been “indiscriminate” (as Joe Biden himself recently called it). They are right. It is worse. Seldom in the history of human conflict have so many bombs been so deliberately aimed at targeted individuals.

The great science fiction fear has always been of AI escaping human control and the machines taking over, as in The Matrix films. The story of Lavender suggests, on the contrary, that the real danger arises when the awesome data-crunching capacities of AI are put in the hands of human beings.

The Lavender software

On April 3 the Israeli–Palestinian magazine +972 published an explosive article by journalist and filmmaker Yuval Abraham on the IDF’s use of the Lavender software, based on interviews with six Israeli intelligence officers, all of whom have served in Gaza during the current campaign. The story was shared with the Guardian, which ran it as an exclusive the same day; it was picked up by the Washington Post (April 5) and subsequently discussed in an opinion column in the New York Times (April 10).

Though CBC Radio’s daily podcast Front Burner carried a 30-minute interview with Yuval Abraham on April 8, the Lavender revelations have gained little traction in the mainstream media. They have been crowded out in the din of swiftly moving events: the continuing political fallout from the IDF’s April 1 killing of seven World Central Kitchen aid workers, which brought the number of humanitarian personnel killed in Gaza to at least 224 and became newsworthy only because six of the victims were Westerners; and, more recently, Iran’s missile and drone attack on military bases in Israel on April 13, in retaliation for Israel’s bombing of the Iranian embassy in Damascus. This is unfortunate, because Lavender in many ways encapsulates all that is most chilling about Israel’s genocidal treatment of Palestinian civilians in Gaza.

“The Lavender software,” says Yuval Abraham, “analyzes information collected on most of the 2.3 million residents of the Gaza Strip through a system of mass surveillance, then assesses and ranks the likelihood that each particular person is active in the military wing of Hamas or PIJ [Palestinian Islamic Jihad]” on a scale of 1–100.

Mining data culled from a multiplicity of sources, it reaches its conclusions in much the same way as Amazon’s algorithm decides that given my demographic and my fondness for Bob Dylan’s Blonde on Blonde and the Rolling Stones’ Exile on Main Street, I must be a fan of the Beatles and the Beach Boys. How could Amazon’s algorithm possibly know that my idea of hell is being forced to listen to Pet Sounds and Sergeant Pepper’s Lonely Hearts Club Band for all eternity?

“Characteristics of known Hamas and PIJ operatives” are fed into the machine as training data, against which the general population is then compared. Features that can increase an individual’s rating include “being in a WhatsApp group with a known militant, changing cell phone every few months, and changing addresses frequently.”

“An individual found to have several different incriminating features will reach a high rating,” Abraham explains, “and thus automatically becomes a potential target for assassination.”
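
To make the reported mechanics concrete, here is a deliberately toy sketch of the kind of feature-weighted scoring and thresholding the officers describe. Every feature name, weight, and threshold below is hypothetical, since the actual Lavender model has never been published; the point is only to illustrate how a handful of crude proxy features can add up to a “high rating.”

```python
# A deliberately toy sketch of feature-weighted scoring with a cutoff.
# All feature names, weights, and thresholds are hypothetical; nothing
# here reflects the actual (unpublished) Lavender model.
WEIGHTS = {
    "in_whatsapp_group_with_known_militant": 40,
    "changes_cell_phone_every_few_months": 30,
    "changes_address_frequently": 20,
}

def rating(features: set[str]) -> int:
    """Sum the weights of whichever proxy features a person exhibits, capped at 100."""
    return min(100, sum(WEIGHTS.get(f, 0) for f in features))

def is_marked(features: set[str], threshold: int = 70) -> bool:
    """A rating at or above the threshold flags the person as a potential target.
    Lowering the threshold, as the officers describe, sweeps in many more people."""
    return rating(features) >= threshold

# Circumstantial evidence alone can push someone over the bar.
print(is_marked({"in_whatsapp_group_with_known_militant",
                 "changes_cell_phone_every_few_months"}))  # True
```

As the officers go on to explain, where that cutoff is set determines how many people the machine marks.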

“The system … is known to occasionally mark individuals who have merely a loose connection to militant groups, or no connection at all,” but is reckoned to be accurate nine times out of ten on the basis of a sample manual check on several hundred Lavender-generated targets carried out at the beginning of the war.

A common error occurred, however, “if the [Hamas] target gave [his phone] to his son, his older brother, or just a random man. That person will be bombed in his house with his family. This happened often,” admitted one source.

From database to kill list

Prior to October 7, IDF policy restricted the category of “human target” to “a senior military operative who, according to the rules of the military’s International Law Department, can be killed in their private home even if there are civilians around.” This was to ensure adherence to the principle of proportionality under international law, which measures the acceptability of civilian casualties (so-called collateral damage) relative to the military advantage to be gained from the strike.

Killing human targets while they are at home almost inevitably takes out other family members, including children. For that reason, the IDF’s human targets were very carefully—and always manually—vetted by intelligence officers, and they never numbered more than “a few dozen.”

But after October 7, “the army decided to designate all operatives of Hamas’ military wing as human targets, regardless of their rank or military importance. And that,” says Abraham, “changed everything.” What started as a database morphed into a kill list.

Under constant pressure from above to generate “more targets for assassination,” says officer B., the senior source interviewed by Abraham, “We attacked at a lower threshold”:

the numbers changed all the time, because it depends on where you set the bar of what a Hamas operative is. There were times when a Hamas operative was defined more broadly, and then the machine started bringing us all kinds of civil defense personnel, police officers, on whom it would be a shame to waste bombs. They help the Hamas government, but they don’t really endanger soldiers.


“Training the system based on [the latter’s] communication profiles made Lavender more likely to select civilians by mistake when its algorithms were applied to the general population,” says another of Abraham’s intelligence officers, resulting in its “including many people with a civilian communication profile as potential targets.”

On this basis the Lavender database “marked some 37,000 Palestinians as suspected ‘Hamas militants,’ most of them junior, for assassination”:

if Lavender decided an individual was a militant in Hamas, they were essentially asked to treat that as an order, with no requirement to independently check why the machine made that choice or to examine the raw intelligence data on which it is based.
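
It is worth pausing on what the two figures quoted above imply when combined. The back-of-the-envelope arithmetic below simply takes the claimed accuracy rate and the size of the kill list at face value and assumes the error rate applied uniformly; it is an illustration of scale, not a count of actual victims.

```python
# Back-of-the-envelope arithmetic using only the figures quoted above.
# It takes the claimed ~90% accuracy at face value and assumes the error
# rate applied uniformly across everyone marked; an illustration of scale only.
marked = 37_000          # Palestinians marked by Lavender, per Abraham's sources
claimed_accuracy = 0.90  # "accurate nine times out of ten", per the internal spot check

implied_misidentifications = marked * (1 - claimed_accuracy)
print(f"~{implied_misidentifications:,.0f} people wrongly marked")  # ~3,700
```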

Twenty-second verifications

While the IDF claims that “analysts must conduct independent examinations, in which they verify that the identified targets meet the relevant definitions in accordance with international law and additional restrictions stipulated in the IDF directives,” officer B. tells another story:

At 5 a.m., [the air force] would come and bomb all the houses that we had marked. We took out thousands of people. We didn’t go through them one by one—we put everything into automated systems, and as soon as one of [the marked individuals] was at home, he immediately became a target. We bombed him and his house.


“At first,” B. said, “we did checks to ensure that the machine didn’t get confused. But at some point we relied on the automatic system, and we only checked that [the target] was a man—that was enough … I would invest 20 seconds for each target at this stage, and do dozens of them every day.” Women were excluded not out of chivalry but because they do not serve in Hamas’s military wing.

Another source, defending the use of Lavender, argued that “when it comes to a junior militant, you don’t want to invest manpower and time in it … So you’re willing to take the margin of error of using artificial intelligence, risking collateral damage and civilians dying, and risking attacking by mistake, and to live with it.”

“Everything was statistical, everything was neat—it was very dry,” said B. The Israeli military “essentially treated the outputs of the AI machine ‘as if it were a human decision,’” substituting the one for the other.

Palestinians inspect the ruins of Watan Tower destroyed in Israeli airstrikes in Gaza City, on October 8, 2023. Photo by Naaman Omar/Wikimedia Commons.

Where’s Daddy?

Especially during the early stages of Israel’s bombardment, the extraordinarily high casualty rate from bombing, both in absolute terms and relative to other recent conflicts such as the war in Ukraine, was a direct result of the application of Lavender.

When combined with two other AI programs, “Gospel” (which located buildings associated with Hamas operations) and the cutely named “Where’s Daddy?” (which tracked individuals’ movements in real time), the whereabouts of those on the Lavender-generated kill list could be determined with a lethal degree of accuracy.

“We were not interested in killing [Hamas] operatives only when they were in a military building or engaged in a military activity,” said A., an intelligence officer. “On the contrary, the IDF bombed them in homes without hesitation, as a first option. It’s much easier to bomb a family’s home. The system is built to look for them in these situations.”
By adding a name from the Lavender-generated lists to the Where’s Daddy? home tracking system, A. explained, the marked person would be placed under ongoing surveillance, and could be attacked as soon as they set foot in their home, collapsing the house on everyone inside …


Eventually everyone on Lavender’s list was entered into the Where’s Daddy? tracking program.

“You put hundreds [of targets] into the system and wait to see who you can kill,” explained another source. “It’s called broad hunting: you copy-paste from the lists that the target system produces.” “Even if an attack is averted,” adds officer C., “you don’t care—you immediately move on to the next target. Because of the system, the targets never end. You have another 36,000 waiting.”

In the first 45 days of bombing, at least 6,120 of the 11,078 reported Palestinian deaths in Gaza came from just 825 families. Many entire families were wiped out in single strikes. Among several cases documented by Amnesty International early on in the war:

On 10 October, an Israeli air strike on a family home killed 12 members of the Hijazi family and four of their neighbours, in Gaza City’s al-Sahaba Street. Three children were among those killed. The Israeli military stated that they struck Hamas targets in the area but gave no further information and did not provide any evidence of the presence of military targets. Amnesty International’s research has found no evidence of military targets in the area at the time of the attack.
Amnesty International spoke to Kamal Hijazi, who lost his sister, his two brothers and their wives, five nieces and nephews, and two cousins in the attack. He said: “Our family home, a three-storey house, was bombed at 5:15 pm. It was sudden, without any warning; that is why everyone was at home.”

Collateral damage degrees and dumb bombs

The likelihood of Lavender-directed strikes producing inordinately high civilian casualties was compounded by two further factors.

First, the thresholds for acceptable collateral damage were raised early in the war. Where previously at most a few dozen senior Hamas operatives were marked as human targets, with the probability of attendant civilian casualties manually assessed and decisions made case by case, the army now decided that “for every junior Hamas operative that Lavender marked it was permissible to kill up to 15 or 20 civilians.”

Strikes were authorized on the basis of a “predetermined and fixed collateral damage degree” (as the ratio of civilians killed relative to targets was called). In the case of a battalion or brigade commander, “the army on several occasions authorized the killing of more than 100 civilians.” Abraham’s article documents several such mass killings.

Second—in sharp contrast to the high-precision weaponry Israel used to take out senior figures of the Islamic Revolutionary Guard Corps (IRGC) in its April 1 attack on the Iranian embassy in Damascus—the IDF’s preferred munitions for assassinating low-level Hamas targets identified by Lavender have been so-called “dumb bombs,” which collapse entire buildings on their occupants. The reasoning is impeccable:

“You don’t want to waste expensive bombs on unimportant people—it’s very expensive for the country and there’s a shortage [of those bombs],” said C., one of the intelligence officers. Another source said that they had personally authorized the bombing of “hundreds” of private homes of alleged junior operatives marked by Lavender, with many of these attacks killing civilians and entire families as “collateral damage.”


“In practice,” said officer A., “the principle of proportionality did not exist.”

Without regard for persons

A century ago, the great German sociologist Max Weber argued that in contrast to systems of authority in pre-modern societies, modern state bureaucracies—of which the army is an extreme example—operate “without regard for persons.” “Modern loyalty,” he says, “does not establish loyalty to a person, like the vassal’s or disciple’s faith in feudal or in patrimonial relations of authority. Modern loyalty is devoted to impersonal and functional purposes.”

“The more the bureaucracy is ‘dehumanized,’” he explains, “the more completely it succeeds in eliminating from official business love, hatred, and all purely personal, irrational and emotional elements which escape calculation,” the more efficient its operations will be. If this sounds soulless, it is.

But the functioning of this thoroughly amoral machine rests, rather paradoxically, on a specific moralization of the individual’s relation to it. In Weber’s words:

the honor of the civil servant is vested in his ability to execute conscientiously the order of the superior authorities, exactly as if the order agreed with his own convictions. This holds even if the order appears wrong to him and if, despite the civil servants’ remonstrances, the authority insists on the order … without this moral discipline and self-denial, in the highest sense, the whole apparatus would fall to pieces.


It was precisely the moral imperative of following orders in the name of duty, with the concomitant abnegation of individual responsibility, that was repeatedly appealed to by defendants (and rejected by the court) at the Nuremberg trials. Hannah Arendt, discussing the trial of Adolf Eichmann, famously described this as “the fearsome, word-and-thought-defying banality of evil.”

Zygmunt Bauman extends this line of analysis in his classic Modernity and the Holocaust—a work not much liked in Israel—in which he argues that “The light shed by the Holocaust on our knowledge of bureaucratic rationality is at its most dazzling once we realize the extent to which the very idea of the Endlösung [Final Solution] was an outcome of the bureaucratic culture.”

Lavender takes this abnegation of personal responsibility demanded by bureaucratic organization—in this case, responsibility for thousands of innocent deaths—a quantum leap further, outsourcing moral judgment to a literal machine. Automating the selection of “human targets” relieves the burden of responsibility—and of guilt.

For me, the most chilling admission in Abraham’s article came from his most senior source, officer B.

There’s something about the statistical approach that sets you to a certain norm and standard … And I have much more trust in a statistical mechanism than a soldier who lost a friend two days ago. Everyone there, including me, lost people on October 7. The machine did it coldly. And that made it easier.


There has been much comment on the left, rightly, on how integral dehumanization of Palestinians—“human animals,” according to Israeli Defense Minister Yoav Gallant—is to the ongoing genocide in Gaza. The moral of the Lavender story is that genocide dehumanizes not only its victims but its perpetrators, enablers, and defenders as well.

Derek Sayer is professor emeritus at the University of Alberta and a Fellow of the Royal Society of Canada. His most recent book, Postcards from Absurdistan: Prague at the End of History, won the 2023 Canadian Jewish Literary Award for Scholarship and was a finalist for the Association of American Publishers PROSE Award in European History.
