Siri: humble assistant -- and close friend?
A survey of 1,000 cell phone owners commissioned by Nuance, a provider of voice recognition software, suggests that people are developing closer relationships with the virtual assistants on their smartphones. In the past decade, we've embraced software as a service. Will software as soulmate be next?
Fifty-seven percent of people surveyed said they felt a "personal connection" with their mobile assistant and wanted a virtual assistant that was not only helpful, but personable. Nearly half of respondents sought an assistant with a sense of humor, and almost a third desired "sassy" assistants. That's good news for Siri, whose sarcastic answers turned her into a celebrity and continue to differentiate her from newer virtual assistants, like Google Now, which haven't replicated her personality. Roughly two-thirds of users -- 71 percent of women and 66 percent of men -- have actually named their virtual assistants.
In addition to peppering virtual assistants with predictable questions about driving directions, the weather forecast and where to go for dinner, people are doing some soul-searching with their assistants as well, the survey found. One in five people polled have asked their assistant about the meaning of life, and 5 percent have asked their assistant for love advice. Presumably, however, some fraction of those queries were merely to show off an assistant's snappy replies. (Siri's answer to "What's the meaning of life?" has proved quite the party trick; nearly 1,000 YouTube videos show her response to the question.)
Though the survey, released to coincide with a Nuance announcement at the Consumer Electronics Show last week, is more promotional stunt than peer-reviewed science, there are several tech trends that suggest that people may indeed be developing a real bond with their virtual helpers.
The more intimate, emotional connections some people are sensing with software could stem from the fact that we're using our devices in more intimate ways. Where we once got information by jabbing at phones with our fingers, now we can converse with assistants much as we would with each other, asking questions in a normal tone of voice and in a natural way. Screaming "NEW YORK SUSHI" at a smartphone has given way to a calm, "Where can I find sushi around here?" Some of these assistants, especially Siri, have been endowed with both artificial intelligence and artificial personalities, which foster the sense that there's a caring companion on the other end of the line listening to our requests.
These chatty assistants are increasingly able to anticipate our needs -- in some cases, even more effectively than the people around us. As the New York Times' Damon Darlin observed in a 2010 story on digital devices as "objects of affection," we've become reliant on our gadgets, and all the digital goodies that go with them, as an extension of our brains.
Google positions Google Now as a kind of digital guardian angel that's always looking out for you and can serve up information even before you know you need it. It can already prompt users to leave early for a meeting when a three-car pileup backs up traffic on U.S. 101, or serve up sports scores after studying which teams you love best. Google says of its assistant:
It tells you today's weather before you start your day, how much traffic to expect before you leave for work, when the next train will arrive as you're standing on the platform, or your favorite team's score while they're playing. And the best part? All of this happens automatically. Cards appear throughout the day at the moment you need them.
These virtual assistants are getting better at making us laugh; speaking with us the way we speak to each other; and helping us out when we need them. It seems little wonder people are starting to feel for them.
History has shown humans will quickly suspend disbelief and bond with software, even when we know an algorithm, not a human, is engaging us. ELIZA, a chatbot created by MIT computer scientist Joseph Weizenbaum in the 1960s, was designed to imitate a Rogerian psychotherapist and would answer a person's musings with questions generated automatically from the preceding correspondence. (For example: Patient: "You are afraid of me." ELIZA: "Does it please you to believe I am afraid of you?") Weizenbaum, who became critical of artificial intelligence after observing how deeply users would bond with his bot, wrote in a 1966 report that "some subjects have been very hard to convince that ELIZA...is not human." He famously recounted in a paper that his secretary, who would have known ELIZA was an algorithm, became so involved in her conversation with the bot that, after only a few exchanges, she asked Weizenbaum to leave the room.
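The pattern-reflection trick behind ELIZA is strikingly simple, which makes the bonds it inspired all the more remarkable. Below is a minimal sketch of the technique in Python; the rules and responses are illustrative stand-ins, not Weizenbaum's original script:

```python
import re

# Swap first- and second-person words so an echoed fragment reads naturally.
REFLECTIONS = {"i": "you", "me": "you", "my": "your",
               "you": "I", "your": "my", "am": "are", "are": "am"}

# Each rule pairs a pattern with a question template; the captured text
# is reflected and echoed back, Rogerian-style. (Hypothetical rules.)
RULES = [
    (re.compile(r"i feel (.*)", re.I), "Why do you feel {0}?"),
    (re.compile(r"you are (.*)", re.I), "Does it please you to believe I am {0}?"),
    (re.compile(r"i am (.*)", re.I), "How long have you been {0}?"),
]

def reflect(fragment):
    return " ".join(REFLECTIONS.get(w.lower(), w) for w in fragment.split())

def respond(utterance):
    cleaned = utterance.strip().rstrip(".!?")
    for pattern, template in RULES:
        match = pattern.search(cleaned)
        if match:
            return template.format(reflect(match.group(1)))
    return "Please tell me more."  # default prompt when no rule matches

print(respond("You are afraid of me."))
# → Does it please you to believe I am afraid of you?
```

No understanding is involved: the program matches a phrase, flips the pronouns, and hands the statement back as a question, yet that was enough to convince some users they were talking to a person.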
Consider, too, that ELIZA corresponded with people only via written responses on a screen, whereas today's assistants are increasingly able to talk.
"I know from long experience that the strong emotional ties many programmers have to their computers are often formed after only short exposures to their machines," Weizenbaum wrote in his 1976 seminal work, Computer Power and Human Reason: From Judgment to Calculation.. "What I had not realized is that extremely short exposures to a relatively simple computer program could induce powerful delusional thinking in quite normal people."