The Rise of the Killer Robot

“As companies building the technologies in artificial intelligence and robotics that may be repurposed to develop autonomous weapons, we feel especially responsible in raising this alarm.” (Open Letter to the UN on Autonomous technology, August 2017).

Do you leave the gruesome task of killing, pulverising and maiming to robots? The US Defence Department devoted a portion of its report Unmanned Systems Safety Guide for DOD Acquisition (2007) to the possibility of designing functional unmanned weapons systems. Other defence departments, including the UK Ministry of Defence, see the removal of the human element from the drone killing mechanism as a distinct possibility.

These are the points troubling those at the International Joint Conference on Artificial Intelligence in Melbourne, which opened with a letter authored and signed by 116 figures known for their prowess in the fields of robotics and artificial intelligence. Among the penning luminaries was Elon Musk, taking time out from some of his more boyish endeavours to get serious. Serious, that is, about humanity.

Reading the words of the open note, oddly titled “An Open Letter to the United Nations Convention on Certain Conventional Weapons” (since when are conventions recipients?), is to be cast back into an aspirational idyll, thrown into archives of hope that humanity’s insatiable appetite for killing itself might be curbed:

“Once developed, lethal autonomous weapons will permit armed conflict to be fought on a scale greater than ever, and at timescales faster than humans can comprehend. These can be weapons of terror, weapons that despots and terrorists use against innocent populations, and weapons hacked to behave in undesirable ways.”

For the artificial intelligence sage Toby Walsh, a salient figure behind the note and the 2015 open letter which first urged the need to stop “killer robots”, such weapons were as revolutionary as any since the advent of nuclear weaponry. Beware of “stupid technologies” or, as he puts it, the stupid variant of artificial intelligence.

A central point to bringing robots into the old fray of battle is the notion that machines will be used to target other machines. Such is the view of John Canning of the Naval Surface Warfare Center Dahlgren Division. The people, in other words, are spared the misfortune of death – except the clever ones who wish to continue targeting each other – while “dumb” robots are themselves neutralised or destroyed by other, similarly disposed weapon systems.

Even more direct is Ronald Arkin, who insists that robots can outdo soldiers in the business of warfare in the first instance while also being “more humane in the battlefield than humans.” The idea of a humane machine would surely be a misnomer, but not for Arkin, who contends that robotic platforms may well have the “ability to better adhere to the Laws of War than most soldiers possibly can.”

Both Arkin and Canning are merely fumbling over notions already hit upon by Isaac Asimov in 1942. Robots, he outlined in a series of robot laws in the short story “Runaround”, would not injure human beings, had to obey orders given by humans except where these conflicted with the first law, and had to protect their own existence so long as doing so conflicted with neither of the first two laws. Giddy stuff indeed.

These are not points being cheered on by Musk and Co. At the beginning of an automated robotic creature is a potential human operator; and at its end, another human, with a moral and ethical dimension of such dire consequence that prohibition is the only safe choice.

The obvious point, seemingly missed by these figures, is that the nature of automated killing, the technological distance between the trigger puller and the destroyed target, is an inexorable process that continues the alienation of humans from the technology they use.

“We do not have long to act,” comes the cry. “Once this Pandora’s Box is opened, it will be hard to close.” But this box was prized open with each technological mastery, with each effort to design a more fiendishly murderous weapon. The only limit arguably in place with each discovery (chemical and bacteriological weapons; carpet bombing; nuclear weapons) was the not-so-reliable human agent ultimately behind using such weapons.

The elimination of pathos, the flesh and blood link between noble combatants, was already underway in the last days of George Armstrong Custer and the Battle of the Little Bighorn. To win the battle, the machine imperative became irresistible. It was only a matter of time before the machine absorbed the human imperative, becoming its near sci-fi substitute.

Human stupidity – in the making and misuse of technologies – is a proven fact, and will buck any legislative or regulatory trend. Some in the AI fraternity prefer to think about it in terms of what happens if the unscrupulous get hold of such things, that the line can be drawn underneath the inconceivably horrid. But even such a figure as technology investor Roger McNamee has to concede, “bad things are already happening.”

Ultimately, it still takes human agency to create the lethal machinery, to imbue the industrial killing complex with its brutish character. For that very reason, there will be those who think that it is about time machines are given their go. Let the robots, in short, sort out the mess made by human agents. But taking humans out of the business of killing would be a form of self-inflicted neutering. Killing, for all its critics, remains a true human pursuit, the sort of fun some will resent surrendering to the machine.

Dr Binoy Kampmark is a senior lecturer in the School of Global, Urban and Social Studies, RMIT University. He was a Commonwealth Scholar at Selwyn College, University of Cambridge. He is a contributing editor to CounterPunch and can be followed on Twitter at @bkampmark.

 

 


11 comments

  1. Ken

    Scary stuff

  2. Zathras

    Until he was forced to face a massed group made up only of Indian braves at Little Big Horn, General Custer’s tactic, and the secret of his “success”, was always to attack and capture the women and children to force the braves to surrender.

    I doubt that even AI-equipped robots would “think” in that way.

    Even human remote drone pilots with their limited decision making options are able to make subtle choices.

  3. wam

    Even the most basic living organism will give symptoms before killing you; a robot like a drone just kills. The AI of HAL or Schwarzenegger’s cyborg was frightening enough, but racial profiling security by AI will make 1984 preferable. Asimov wrote fiction, which is wishful thinking; Davros products are a distinct possibility.

  4. Johno

    If one of these killer robots falls off the back of a truck I could use it to defend our chook shed from foxes. Just kiddin.

  5. Miriam English

    I write a lot of science fiction stories about artificial intelligence (AI) because the topic is extremely important to humanity’s future. In all my stories AIs rescue humanity from themselves and become our guardians and protectors, but all my stories also warn of the danger of AI being misused by the military (efficient mass-killers), spooks (imposing the perfect surveillance system), or spammers (crashing the internet).

    It’s true that AIs might be more humane fighters, taking out only combatants, and sparing innocent civilians, and it is true that robots won’t rape and plunder like Blackwater’s notorious mercenaries do. Nor will they behead civilians like Daesh (ISIS) do. On the other hand perhaps they will simply exterminate everybody in the enemy territory (because only enemies live in enemy territory).

    I don’t really think this is an experiment we can safely conduct. Once one country develops killer robots then so will all the others. Do we really want them in the hands of idiots like Trump and Kim Jong-un, or some of the other psychopaths around the world?

    We should certainly develop AI, but we need to end all war lest AI be used to make war “more efficient”.

  6. silkworm

  7. Miriam English

    silkworm, I’m trying not to sound peevish here, but it would be so much better if you labelled your links and gave some idea of what they referred to. For instance, if you’d said that the YouTube link was to a (somewhat tuneless) song, “Rise Robots Rise”, sung by the robots in the children’s cartoon Gulliver’s Travels Beyond the Moon, I would have saved some of my valuable (and costly) bandwidth. I guess in future I won’t bother following your unlabelled links.

  8. Deanna Jones

    More accurately, killing is a male human pursuit; whether at the individual or state level, it is far and away a male human pursuit.

  9. Miriam English

    Deanna, that is puzzling isn’t it. Everywhere, in every culture, violence and killing are overwhelmingly male things.

    The recent fashion for movies with violent female protagonists seems to be shifting that a little, with a rise in violence perpetrated by girls. Nevertheless, it is still almost entirely a masculine preoccupation.

    The greatest threat to a woman is a man (and usually one she knows). The greatest threat to a man is another man. Male violence is bad for everybody. Thank heavens it has been declining for centuries.

    My brother and his wife have been spending the year travelling through Africa and Europe. I set up a blog for them (http://www.englishmob.net/adventure/) and they’ve been posting entries with pictures every few days. I’ve been amazed at the ancient walled cities and efforts taken to protect against violence that used to be normal. Now, here in Australia and in much of the world it is taken for granted that a person should be perfectly safe to walk alone almost anywhere. The world has changed so much!

    Violence is now considered so outrageous and detestable that I would not be at all surprised to see the end of war in the next couple of decades.

  10. king1394

    Fighting wars also uses up the scarce resources of the planet and contributes to climate change. The question needs to be: can we learn to live together before we all die together? Perhaps robotic soldiers would be easier to put back in their boxes, as they will have no investment in winning or losing.

  11. Miriam English

    It’s the ones who control the robotic soldiers who are invested in winning. We can’t put them back in boxes.
