Terminator Robots and AI Risk



Concerns about the risks posed by the development of AI have recently been expressed by many computer science researchers, entrepreneurs, and scientists, making us wonder: what are we fearing? What does this worrisome thing look like? An overwhelming number of media attempts to explain the risk have come accompanied by pictures of terminator robots. But while the terminator robot has become the prevalent visual representation of AI risk, it is in fact very far from the most likely scenarios in which that risk will manifest itself in the world. So, just as we begin to face our fear, the face of what we're told we should fear is utterly misleading. My fear is instead that, as with any representation that reveals some things and hides others, what the terminator robot reveals is simply something about our minds and their biases, while hiding the real dangers we actually face.

The terminator robot has become such a "catchy" representation, I believe, because our minds and the fears they dream up are embodied. We have evolved to fear moving atoms: tigers that could attack us, tornadoes that could ruin our shelters, waves that could drown us, human opponents that could harm us. Killer robots from the future are just a spinoff that cultural evolution has grafted onto these deeply rooted, evolved sources of fear.

There is much research showing that the way we conceive of the world and the way we act or react in it are based on embodied representations. In their book Metaphors We Live By, Lakoff and Johnson describe how we represent even very abstract concepts in relation to our own physical bodies. For example, we think of happiness as up and sadness as down when we talk about events that "lift us up" and days when we feel "down". These metaphors for representing abstractions in embodied ways are so deeply ingrained in our language that we no longer even think of them as figures of speech. Our reactions are equally influenced by embodied representations: several studies have found that when people are in the presence of drawings of eyes, they cheat less and behave more pro-socially than when they are not. Finally, the way we act in the world and the way we judge our actions to be ethical or not also depend on embodiment. Variations of the famous trolley problem (in which a person is asked whether it is morally right to sacrifice the life of one person in order to save the lives of five by using that person as a trolley-stopper) have shown that people are more willing to say it is ethical to do so when one needs to pull a lever that will cause the person to fall in front of the trolley than when one needs to push the person oneself.

All of this suggests that the reason killer robots "sell" is that we are wired to fear moving atoms, not moving bits of information. It's almost as if we need to give our fears an embodied anchor, or the threat stops feeling scary. But what price do we pay for the sensation of fear that we nurture through embodied representations? I believe the price is blindness to the real danger.

The risk from AI is very likely not going to play out as armies of robots taking over the world, but in more subtle ways: AI taking our jobs, controlling our financial markets, our power plants, our weaponized drones, our media... Evolution has not equipped us to deal with such ghostly entities, which come not in the form of steel skeletons with shiny red eyes but as menacing arrangements of zeros and ones.

In spite of our lack of biological readiness to react to such threats, we have created societies that are more and more dependent on these elusive bits of information. We no longer live solely in a world of moving atoms; we also live in a world of moving bits. We have become not just our bodies but also our Facebook pages, our Twitter accounts, our Internet searches, our emails, and so on. We no longer own just gold coins or dollar bills; we own numbers: credit card numbers, passport numbers, phone numbers. Our new digital world is quite different from the one that hosts our bodies, and it is silly to think that what is worthy of fear there will have the same characteristics as what is worthy of fear here. Just because having our emails and Internet searches stored and read by others does not feel as creepy as a pair of eyes always peering over our shoulder, that doesn't mean it isn't. And just because a silent, stealthy takeover by AI does not give us the heebie-jeebies the way roaring armies of terminators do, that doesn't mean it is not equally dangerous, or even more so.

So, even if we do not feel the fear, we need to understand it. We need to be fearfully mindful not of the terminator robots themselves, but of what they hide and misrepresent.
