Mating With a Machine? Maybe It's Not a Cop Out

Given a choice between a man who brutalizes and murders, and a robot, I would opt for the robot in a heartbeat.

So my husband and I got into a bit of a tiff this morning. We argued about the strangest thing.
It happened after I wrote a piece on Monday about Sherry Turkle's new book, Alone Together. A bunch of readers wrote in, and one of the comments got me thinking.

The writer -- who calls himself "MIIKE" -- yes, two i's there before the k -- wrote quite eloquently, and took the position that this "robophobia" of mine doesn't make a lot of sense. He noted:

In the not too distant future, robots will look like people and will think for themselves. They will have personalities and personal quirks. Why wouldn't someone fall in love with another being, flesh or metal, if the two connected on various deep and personal levels? People fall in love with their pets, why not robots? As for sex, I can easily see a robot being a far superior lover to any human.

He went on to point out that human relationships are fraught with drama and despair. There are too many "acrimonious divorces" to count. There is the "physical and psychological abuse both sexes heap upon one another." And then, Miike noted, there is "the agony of watching a mate die of old age, or, even worse, of some horrible disease."

Miike is opting for the bot: "I'm waiting in line for my hot, super smart, super beautiful, soulmate/lover/house cleaner/bodyguard/secretary. Sorry people, but no human can come close to a mate like that."

He concludes by raising the issue of a robot's soul. "Why wouldn't self-realized androids also have souls? Perhaps far more noble souls than a race that rapes, tortures, destroys the planet it lives on, and has perpetrated mass murders from the beginning of time to the present."

Miike got me thinking. There are a whole lot of wicked people in the world. People who brutalize each other. People who are cruel and self-centered and destructive and dangerous. Given the choice between a destructive and violent person and a kind and compassionate robot, why would someone choose a relationship with the former?

That's what got my husband mad. He said he thought it was sad that anyone would want a machine as a stand-in for a human, in any relationship. On principle I agree with him. EXCEPT, I said to him, what if the flesh-and-blood person was a pedophile? Or a pervert? Or a brute? Or a rapist? Or just a really really unhappy person who made everyone else miserable?

At that point, his temper flared a bit. He got testy, which he hardly ever does. He was about to sit down to meditate. (I got him meditating a few years back, and now he's addicted to it, the way I am.) His point: a person who chose a brute probably wouldn't choose a robot, but rather, the brute.

"A robot is most likely going to be a stand-in for a person who cannot deal with people," he said. "A robot is going to be a stand-in for a person who can't deal with the complexities of human relationships."

He was getting more and more upset. As nicely as I could, I pointed that fact out to him, but that actually seemed to make him more upset. Finally, I told him I thought we should just drop the whole discussion. We did.

But then a little while later, I started to write this post. In an effort to make sure I had his position correctly, I asked him very nicely if I could read a portion of this post to him. He sat down in the rocking chair in my study and I read to him. He started to get really upset again. He said he couldn't believe that I was actually considering this notion that it might be better to have people in relationships with robots rather than people.

I said it was better to see people in relationships with kind and compassionate robots than partners who were mean and brutal. I think I said that because I had a long long talk yesterday with my writer friend Peg. She's writing an amazing novel, based on a true story, about a woman who was married to a man who ended up murdering her.

My position with my husband was simple: given a choice between a man who brutalizes and murders, and a robot, I would opt for the robot in a heartbeat.

He was not buying that. He said that there are reasons that lead a woman to marry a brute, and once again he made the point that said woman would not choose the robot.

But how would he know who she'd choose?

To him, the idea of robots in relationships with humans is just a cop out -- a way to avoid the complex challenges of being human. "It is just one more way for people to avoid dealing with the great challenges of personal growth, trying to understand themselves and others and how to achieve compassion and human love."

But he was getting upset again. And since my goal in life is generally to try as much as possible to achieve PEACE, I decided it was time to let the robo talk go.

So we did. Until breakfast. When we started talking more calmly about whether most people want to change their bad behaviors. Whether people really want to try to address their flaws, and improve their communication, and get along.

Or whether most people would rather engage in (and escape into) petty dramas that keep life interesting. If you consider what is popular on TV and in the movies, and what kind of novels are commercially successful, then I think there is a case to be made that despite our great love for self-help manuals, we love mucking around in dysfunction and disaster and despair.

I know one thing: I have a very tough time finding literature out there that is truly redemptive, with characters who really transform themselves. And another thing: it's really really hard to change yourself in meaningful ways. You might say it's a full-time job that very few people apply for.

So anyway, that breakfast chat was interesting. And peaceful. Which is good. Because the last thing my husband and I need to do is argue. Ever. Especially about something like ... robots.
