Robot Soldiers Will Be a Reality -- And a Threat


Much controversy has surrounded the use of remote-controlled drone aircraft or "unmanned aerial vehicles" in the war on terror. But another, still more awe-inducing possibility has emerged: taking human beings out of the decision loop altogether. Emerging brain science could take us there.

Today drone pilots operate thousands of miles away from the battlefield. They must manage vast amounts of data and video images during exceptionally intense workdays. They are scrutinized by superiors for signs of stress, and to reduce such stress the Air Force is experimenting with shift changes, less physical isolation on the job, and more opportunities for rest.

Yet even as this remarkable new form of war fighting is becoming more widely recognized, there are at least two more possible technological transitions on the horizon that have garnered far less public attention. One is using brain-machine interface technologies to give the remote pilot instantaneous control of the drone through his or her thoughts alone. The technology is not science fiction: Brain-machine interface systems are already being used to help patients with paralytic conditions interact with their environments, like controlling a cursor on a computer screen.

In a military context, a well-trained operator, instead of manipulating a joystick to control very complicated equipment, may be able to process and transmit a command far more rapidly and accurately through a veritable mind-meld with the machine.

There are enormous technical challenges to overcome. For example, how sure can we be that the system is not interpreting a fantasy as an intention? Even if such an error were rare, it could be deadly and not worth the risk.

Yet there is a way to avoid the errors of brain-machine interface that could change warfare in still more fundamental and unpredictable ways: autonomous weapons systems combining the qualities of human intelligence that neuroscience has helped us understand with burgeoning information and communications technologies.

Even now there are defensive weapons systems on U.S. naval ships that routinely operate on their own, but with human monitoring. A new automated weapons system has been deployed at the demilitarized zone between North and South Korea. This robot sentry is said to be the first that has integrated systems for surveillance, tracking, firing and voice-recognition. Reportedly it has an "automatic" mode that would allow it to fire without a human command, but that mode is not being used.

Robot warriors, proponents argue, would not be subject to the fatigue, fear and fury that often accompany the chaos of combat -- emotions can result in accidental injuries to friends or even barbaric cruelties motivated by a thirst for revenge and a sense of power. Others say the proponents of robot warriors are naive: What would inhibit dictators or nonstate actors from developing robotic programs that ignored the laws of war?

Moreover, some security analysts already worry that remote control unacceptably lowers the bar for a technologically superior force to engage in conflict. And will their adversaries, frustrated by their lack of opportunity to confront an enemy in person, be more likely to employ robotic terror attacks on soft targets in that enemy's territory? Will this be the death knell of whatever ethos of honor remains in modern military conflict?

Another technology is even more radical. Neuroscientists and philosophers are exploring the parameters of "whole brain emulation," which would involve uploading a mind from a brain into a non-biological substrate. It might be that Moore's Law (the idea that computing capacity doubles about every two years) would have to persist for decades in order for a computer to be sufficiently powerful to receive an uploaded mind. Then again, the leap might come by means of the new science of quantum computing -- machines that use quantum-mechanical phenomena instead of transistors to manage vast amounts of information. Experiments with quantum computing are already being performed at a number of universities and national laboratories in the United States and elsewhere.

Robotic warriors whose computers are based on whole brain emulation raise a stark question: Would these devices even need human minders? Perhaps, if we're not careful, these creatures could indeed inherit the Earth.

National security planners and arms-control experts have already begun to have conversations about the ethical and legal implications of neurotechnologies and robotics in armed conflict. For it is inevitable that breakthroughs will be incorporated into security and intelligence assets.

The various international agreements about weapons and warfare do not cover the convergence of neuroscience and robotic engineering. Thus new treaties will have to be negotiated, specifying the conditions under which research and deployment may proceed, the programming rules that must be in place, the verification procedures to be followed, and how human beings will remain part of the decision loop.

Given the obvious dangers to human society, fully autonomous offensive lethal weapons should never be permitted. And though the technical possibilities and operational practicalities may take decades to emerge, there is no excuse for not starting to develop new international conventions, which themselves require many years to craft and negotiate before they may be ratified by sovereign states. The next presidential administration should lead the world in taking up this complex but important task.

A version of this article appeared May 12, 2012, on page A15 in some U.S. editions of The Wall Street Journal, with the headline: Robot Soldiers Will Be a Reality -- and a Threat.
