Nuclear Bombs and Landmines, Say Hello to Robots

Robotics joins mines and nuclear fission as a third, new technology with profound moral implications for war-fighting and for downstream civilian harm.

The anti-war identity of my generation rose out of the fear of nuclear annihilation and the horror of landmines. The urgency of those causes motivated many of us to engage in meaningful dialogue across borders, and gave rise to every form of communication, demonstration and debate. On Tuesday, the Campaign to Stop Killer Robots launched with a news conference in London. Robotics joins mines and nuclear fission as a third, new technology with profound moral implications for war-fighting and for downstream civilian harm. Full disclosure: I am a member of the International Committee for Robot Arms Control, one of the organizations behind this new campaign. As a roboticist, my value to such a debate comes in helping to inform professional and public understanding of how robotics technology is changing day by day, and how new robot affordances are allowing computerized sensing and decision-making to redefine the place of humans in war's lethal course.

Robotic fighting systems hold particular appeal for departments of defense and for the executive branches of government: they distance one's own soldiers from harm; they disentangle declarations of war at the legislative level from robo-fighting, blurring the distinction between peace and war; and they appeal to the user's sense of technological superiority and infallibility. After all, if we program a robot to kill ethically, then it does so without fail, right?

Technology is no savior, because new innovation extends its reach faster and further than our comprehension of its technical and ethical limitations. Can robots really make ethical decisions? No; their senses and mental faculties are a shadow of ours. Do military acquisitions departments and industrial contractors understand the practical limitations of deployable autonomous robotics? Not by a mile. Military trends point us toward a future in which robot fighters in the wild make their own kill decisions. In Robot Futures I wrote about one such research program, already test-flown at Fort Benning, in which a drone criss-crosses the base looking for a specific face, then fires a (simulated) weapon to make the kill entirely without human intervention or approval. And that is just one small example of a much richer research field. This campaign is about such autonomous killing machines. True, autonomy itself is a troublesome concept to pin down, and several of us at ICRAC have been working to understand how coming robot autonomy relates to existing and future weapons systems. But the fact that there is a slippery slope along the ecology of smart weapons should not stop us from appreciating just how earth-shattering the ultimate realization of our apparent ambition will be: robots that are unleashed in the wild, highly lethal, and fully empowered to decide whom to kill.

In Robot Futures, I describe dystopian social futures based on trends in massive data collection, dehumanizing robot-human interactions and autonomous telepresence systems. My vignettes were limited to civilian interactions, and yet there is plenty of fodder in business and social spaces for robotic technologies to change our identities and values. In the fog of war, the possibilities reach dizzying new heights. Robots can change how human fighters behave in all the wrong ways: take drones and multiply by a hundred. Then there is P.W. Singer's valid point in Wired for War: every technology we invent for war will be reinvented by our enemies, to be used on us, a decade later. So do we really want autonomous killing machines on the other side, some of them our own robots, hacked? I hope there will be rich discussions as we try to define justifiable and measurable limits on the scope of robots in war. The technologies will change rapidly even as the discourse moves forward, and it is the job of roboticists to ensure that we can predict the robot future well enough to seriously consider its ramifications. Please visit the Campaign website to learn more.
