Your Next Phone Will Know What You're Feeling

Your future smartphone could know from the sound of your voice that you're stressed, and automatically book you a massage. Or it might sense you're feeling sad, and cue up an ad for antidepressants.

Artificial intelligence, meet artificial intuition: Researchers at the University of Rochester have developed an algorithm that can detect human emotions through speech with greater accuracy than was previously possible and without analyzing the meaning of the speakers' words.

To create the computer program, the University of Rochester team identified 12 acoustic features of a speaker's voice, such as pitch and loudness, and used those to classify short snippets of speech as expressing one of six emotions: "anger," "sadness," "disgust," "happiness," "fear" and "neutral." The software was then trained to associate certain speech characteristics with specific positive and negative emotions, producing a new algorithm that correctly identifies human emotions with 80.5 percent accuracy, up from the 55 percent accuracy of prior programs.
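
To make the approach concrete, here is a minimal sketch in Python of how such a classifier might be wired together. It is not the Rochester team's code: it assumes each speech snippet has already been reduced to a 12-number feature vector (stand-ins for measurements like pitch and loudness), uses randomly generated placeholder data, and substitutes scikit-learn's support-vector classifier for whatever model the paper actually used.

```python
# A minimal sketch (not the Rochester team's code) of classifying speech
# snippets into one of six emotions from a handful of acoustic features.
# Assumes each snippet has already been summarized as a 12-number feature
# vector (stand-ins for measurements like pitch and loudness); scikit-learn's
# support-vector classifier stands in for whatever model the paper used.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

EMOTIONS = ["anger", "sadness", "disgust", "happiness", "fear", "neutral"]
N_FEATURES = 12  # the paper identifies 12 acoustic features

# Placeholder data: in practice X would come from real labeled recordings.
rng = np.random.default_rng(0)
X = rng.normal(size=(600, N_FEATURES))        # one row per speech snippet
y = rng.integers(0, len(EMOTIONS), size=600)  # one emotion label per snippet

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Scale the features, then fit the classifier.
model = make_pipeline(StandardScaler(), SVC())
model.fit(X_train, y_train)

print("held-out accuracy:", model.score(X_test, y_test))
print("predicted emotion:", EMOTIONS[int(model.predict(X_test[:1])[0])])
```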

University of Rochester graduate student Na Yang, who co-authored the research paper, has already developed a Windows Phone app, Listen-n-Feel, that functions as a "mobile emotion sensor": it detects a speaker's feelings and shows a smiley or frowny face depending on the speaker's tone.
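
As a rough illustration of that behavior, the display step could be as simple as mapping the predicted emotion to one of two faces. The mapping below, including which emotions count as positive, is an assumption for illustration, not the app's actual code:

```python
# Hypothetical sketch of the display step: the app's real categories and
# code are not public, so the emotion names and mapping here are illustrative.
POSITIVE = {"happiness", "neutral"}

def face_for(emotion: str) -> str:
    """Show a smiley for positive tones and a frowny face otherwise."""
    return ":)" if emotion in POSITIVE else ":("

print(face_for("happiness"))  # :)
print(face_for("sadness"))    # :(
```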

And the algorithm doesn't even need to follow what the speaker is saying -- only how.

"We actually used recordings of actors reading out the date of the month -- it really doesn't matter what they say, it's how they're saying it that we're interested in," said Wendi Heinzelman, a professor of electrical and computer engineering at the University of Rochester, according to a press release issued by the university.

The report's authors hypothesize that the technology could be used by psychologists or clinicians to monitor patients in a more natural environment, or integrated into smartphones so a gadget could "choose songs based on the user’s current emotion."

It's easy to imagine a whole host of seemingly sci-fi applications. What if Siri could whisper sweet nothings to you when you're stressed? Or serve up Prozac ads when you're sad, promos for beach getaways when it senses frustration, or Starbucks ads when you sound sleepy? If a phone can sense what you're feeling, so, more likely than not, could advertisers.

Affectiva, a company whose technology tracks emotion through facial cues, is already working with brands on market research and gauging the effectiveness of ads. In an interview with the Technology Review published in July, then-Facebook engineer Andrew Bosworth suggested the company would consider targeting ads using sounds picked up via a phone's microphone.
