Death by Algorithm: Israel’s AI War in Gaza

Image from The New Arab

Remorseless killing at the initiation of artificial intelligence has been the subject of nail-biting concern for various members of the computer-digital cosmos. We are warned to be wary of such machines in war and their potential to displace human will and agency. For all that, the advent of AI-driven, automated systems in war has already become a cold-blooded reality, deployed conventionally, and with utmost lethality, by human operators.

The teasing illusion here is the idea that autonomous systems will become so algorithmically attuned and trained as to render human agency redundant in a functional sense. Provided the targeting is trained, informed, and surgical, a utopia of precision will dawn in modern warfare. Civilian death tolls will be reduced; the mortality of combatants and undesirables will, conversely, increase with dramatic effect.

The staining case study that has put paid to this idea is the pulverising campaign being waged by Israel in Gaza. A report in the magazine +972 notes that the Israel Defense Forces (IDF) has indulgently availed itself of AI to identify targets and dispatch them accordingly. The process, however, has been far from accurate or forensically educated. As Brianna Rosen of Just Security accurately posits, “Rather than limiting harm to civilians, Israel’s use of AI bolsters its ability to identify, locate, and expand target sets which likely are not fully vetted to inflict maximum damage.”

The investigation opens by recalling the bombastically titled The Human-Machine Team: How to Create Human and Artificial Intelligence That Will Revolutionize Our World, a 2021 publication available in English authored by one “Brigadier General Y.S.”, the current commander of the Israeli intelligence unit 8200.

The author advances the case for a system capable of rapidly generating thousands of potential “targets” in the exigencies of conflict. The sinister and morally arid goal of such a machine would be to resolve a “human bottleneck for both locating new targets and decision-making to approve the targets.” Doing so not only dispenses with the human need to vet, check and verify the viability of the target but also dispenses with the need to seek human approval for their termination.

The joint investigation by +972 and Local Call identifies the advanced stage of development of such a system, known to the Israeli forces as Lavender. In terms of its murderous purpose, this AI creation goes further than such lethal predecessors as “Habsora” (“The Gospel”), which identifies purportedly relevant military buildings and structures used by militants. Even that form of identification did little to keep the death rate moderate, generating what a former intelligence officer described as a “mass assassination factory.”

Six Israeli intelligence officers, all having served during the current war in Gaza, reveal how Lavender “played a central role in the unprecedented bombing of Palestinians, especially during the early stages of the war.” The effect of using the AI machine effectively subsumed the human element while giving the targeting results of the system a fictional human credibility.

Within the first weeks of the war, the IDF placed extensive, even exclusive reliance on Lavender, with as many as 37,000 Palestinians being identified as potential Hamas and Palestinian Islamic Jihad militants for possible airstrikes. This reliance signalled a shift from the previous “human target” doctrine used by the IDF regarding senior military operatives. In such cases, killing the individual in their private residence would only happen exceptionally, and only to the most senior identified individuals, all to keep in awkward step with principles of proportionality in international law. The commencement of “Operation Swords of Iron” in response to the Hamas attacks of October 7 led to the adoption of a policy by which all Hamas operatives in its military wing irrespective of rank would be designated as human targets.

Officers were given expansive latitude to accept the kill lists without demur or scrutiny, with as little as 20 seconds being given to each target before bombing authorisation was granted. Permission was also given despite awareness that targeting errors arise in “approximately 10 percent of cases”, and that the system “is known to occasionally mark individuals who have merely a loose connection to militant groups, or no connection at all.”

The Lavender system was also supplemented by using the emetically named “Where’s Daddy?”, another automated platform which tracked the targeted individuals to their family residences which would then be flattened. The result was mass slaughter, with “thousands of Palestinians – most of them women and children or people not involved in the fighting” killed by Israeli airstrikes in the initial stages of the conflict. As one of the interviewed intelligence officers stated with grim candour, killing Hamas operatives when in a military facility or while engaged in military activity was a matter of little interest. “On the contrary, the IDF bombed them in homes without hesitation, as a first option. It’s much easier to bomb a family’s home. The system is built to look for them in these situations.”

The use of the system entailed resorting to gruesome, and ultimately murderous calculi. Two of the sources interviewed claimed that the IDF “also decided during the first weeks of the war that, for every junior Hamas operative that Lavender marked, it was permissible to kill up to 15 or 20 civilians.” Were the targets Hamas officials of certain seniority, the deaths of up to 100 civilians were also authorised.

In what is becoming its default position in the face of such revelations, the IDF continues to state, as reported in the Times of Israel, that appropriate conventions are being observed in the business of killing Palestinians. It “does not use an artificial intelligence system that identifies terrorist operatives or tries to predict whether a person is a terrorist.” The process, the claim goes, is far more discerning, involving the use of a “database whose purpose is to cross-reference intelligence sources… on the military operatives of terrorist organizations.”

The UN Secretary General, António Guterres, stated how “deeply troubled” he was by reports that Israel’s bombing campaign had used “artificial intelligence as a tool in the identification of targets, particularly in densely populated residential areas, resulting in a high level of civilian casualties.” It might be far better to see these matters as cases of willing and reckless misidentification, with a conscious acceptance on the part of IDF military personnel that enormous civilian casualties are simply a matter of course. To that end, we are no longer talking about a form of advanced, scientific war waged proportionately and with precision, but a technologically advanced form of mass murder.


About Dr Binoy Kampmark
Dr. Binoy Kampmark is a senior lecturer in the School of Global, Urban and Social Studies, RMIT University. He was a Commonwealth Scholar at Selwyn College, University of Cambridge. He is a contributing editor to CounterPunch and can be followed at @bkampmark.

9 Comments

  1. Isn’t it a bit strange that a nation with such an advanced surveillance system as we have come to recognise allowed that system to malfunction on 7th October, when the software must have been well aware of what was coming?
    A truly independent, maybe AI-assisted, investigation of the events leading up to this failure would tell us who the terrorists really are.

  2. Thank you Dr. Kampmark for calling this experiment for what it is: “a technologically advanced form of mass murder.”

    We now understand why so many buildings in Gaza have been destroyed – and the people within.

    What a brave new world this is, when machines are being programmed to tell us who to kill.

    It is difficult to comprehend, let alone understand the calculated brutality of this methodical destruction.

  3. From the 1880s to the WW1 days, zionism made violence seem probable, if aims were to be achieved. Jews had been treated badly and unfairly for centuries, especially by keen christians of all mongrelly types, who still may believe in a second coming with the slaughter and elimination of all unbelievers and non-faithful to whatever ridiculous superstitious doctrine is held. Britain’s desperation in war led to approaches to Walter Rothschild by Balfour’s colleagues, no doubt for financial and propaganda purposes. The British, unbelievably, would promise to give away in pontifical and imperious splendour huge slabs of land they did not own or control, Ottoman lands. Inhabitants were not to be consulted or considered, but French allied collaborators would be. Double crossing and lying occurred in the Sykes-Picot correspondence versus the Hussein-McMahon correspondence. Immense fantastic promising went on and on, with war needs and vanity driving the dishonesty. Decades of troubles occurred, because the British actually grabbed Palestine, actually attempted to realise the impossible and resolve that which could not ever be resolved, the splitting of Palestine to satisfy competing views. Up to May 1948 there was a Palestine, but it was doomed, flawed, by a rotten world attitude, Hitler’s evil shadow, USA double crossing, British stupidity and weakness. Here we are today, in a mess.

  4. PP, agree,

    except I resist using the term Zionism because of its many broad versions. And rather than treated badly and unfairly for centuries, I would venture 2.5 millennia, and the reasons many and complex but mainly commercial / geo-strategic.

    I’d like to add however, that as the UK, France and Russia conspired, so too did the soon to be Israeli Jews foment their plans. Once the UN came into being, the conveniently xenophobic Uncle Sam joined the mob.

    Since the efforts of Rabin and Peres were put paid to, the increasingly lunatic Netanyahu psycho-babbled victim-hood and hatred into the Israeli citizens. And now the peak criminality of Netanyahu / Likud / IDF has set the master-plan to AI (garbage in garbage out) – set and forget!

    All masked by the Israeli world-wide babble machine. But it’s wearing thin, as is the ‘west’s’ divisive apologia, and may just result in the Jews again being despised, and treated badly for yet another millennium.

  5. Are you familiar with the killer robodogs that featured in a recent TV series, the War of the Worlds? Not based on H.G. Wells. They also appeared in an episode of Black Mirror. Surely they must be well known by now among fans of science fiction. Remorseless and relentless killers, with artificial intelligence programming them to find and exterminate their prey. If you are also familiar with the famous painting by William Holman Hunt called The Scapegoat, inspired by the tale in Leviticus of a scapegoat being sent into the wilderness bearing all the sins of the people of Israel, then imagine the goat being replaced by the killer robot dog. It would fit the painting perfectly.

    If I had a copy of Photoshop I could manually produce this image, though it might take some hours of work to get it to look right. Or with a copy of Photoshop with AI commands it could be made in minutes or maybe even seconds.

    I’m saying all this because if I were a cartoonist for a newspaper, I would draw that image, and on the goat’s horns I would attach a label bearing, in clear letters, the sin GENOCIDE.

    I don’t know if the paper would publish it, or if its audience would understand it. But for what it is worth that is my comment on this article’s subject. Will AI cyber warfare become the scapegoat for the sins of the people of Israel?

  6. JulianP,

    Der Spud would just ignore it because it’s not something he can use in his never-ending, reality-free, baseless attacks and rants about Labor.

  7. Julian P

    Interesting to see these balanced and thoughtful comments from the Jewish Council of Australia.
    By contrast Executive Council of Australian Jewry president Daniel Aghion KC said:

    “The Foreign Minister’s speech is not the way to treat a friend and ally of Australia, such as Israel.”

    Spud would be in a spin!

