Facebook just became a bit friendlier to blind people.
The social network on Tuesday launched a new feature called "automatic alternative text" (AAT), which will describe images to people who are blind or visually impaired. If your friend posts a picture from her hiking trip, for example, the app will now be smart enough to recognize visual cues from the photo and say aloud, "Image may contain: two people, smiling, sunglasses, sky, outdoor [sic], water."
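Facebook hasn't published the details of how those descriptions are assembled, but the general shape is easy to picture: an image-recognition model emits concept tags with confidence scores, low-confidence tags are dropped, and the survivors are joined into that "Image may contain" sentence. Here's a minimal sketch in Python -- the tags, scores, threshold, and function name are all illustrative assumptions, not Facebook's actual code:

```python
# Hypothetical sketch of how "automatic alternative text" might turn
# classifier output into a spoken description. The concept tags and
# confidence scores are made up; in the real system they would come
# from an image-recognition model.

def build_alt_text(tagged_concepts, threshold=0.8):
    """Keep concepts the model is confident about, most confident first,
    and join them into an 'Image may contain: ...' string."""
    confident = [tag for tag, score in
                 sorted(tagged_concepts, key=lambda pair: -pair[1])
                 if score >= threshold]
    if not confident:
        return "Image may contain: no description available"
    return "Image may contain: " + ", ".join(confident)

# Scores a classifier might assign to the hiking photo described above.
concepts = [("two people", 0.97), ("smiling", 0.91), ("sunglasses", 0.88),
            ("sky", 0.95), ("outdoor", 0.93), ("water", 0.85),
            ("bicycle", 0.30)]  # low confidence -> dropped

print(build_alt_text(concepts))
# -> Image may contain: two people, sky, outdoor, smiling, sunglasses, water
```

The interesting design choice is the confidence threshold: a caption generator that guesses wrong ("dog" for a cat) is worse for a blind user than one that says less, so a system like this would rather omit a tag than misreport one.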
It's not the most evocative description of all time, but imagine using Facebook without any visual cues whatsoever. The social network works best when words and pictures combine to create meaning. Someone might caption a photo with the words "I can't believe I did this!" -- but if you can't see the picture, you would have no idea if they were referring to eating an entire wedding cake or hitting a reindeer with their truck.
Enabling the feature is easy, though for now you'll need an iPhone or iPad. Ask Siri to "turn on VoiceOver," a built-in iOS feature that reads aloud whatever is on your screen. You can also turn on VoiceOver manually in Settings, under "General" and then "Accessibility."
Representatives for Facebook say the feature will come to other platforms in the near future. While they didn't specify which, the demo shown to The Huffington Post ran on a laptop.
This isn't frivolous. Say what you will about wasting time on Facebook, but one in seven people spends time there every day. Scrolling through pictures is an activity you've likely done countless times when you're bored -- but only because you're fortunate enough to have full use of your eyes, unlike the estimated 285 million people worldwide who are blind or visually impaired.
Matt King, a member of the accessibility team who is blind, recalled in an interview with HuffPost how agonizing it once was to navigate Facebook.
"What you might do in 15 or 20 minutes, just sitting back enjoying a cup of coffee, looking at what pictures your friends posted, an equivalent activity [for me] would've been like four hours of strenuous figuring out what to do," King said. "It's a feeling like all of technology and society is moving against you. You're shoved to the margins yet again."
Facebook has been working on this problem for years. The company launched a formal Facebook Accessibility team five years ago, and its Core Data Science team has long been developing algorithms to automatically understand content posted on the social network. Those algorithms help Facebook serve you links, pictures and statuses it thinks you'll like.
It's all tied together: The algorithms help every user discover content from their friends, but they're also the foundation of the new AAT feature.
"The goal of the company is to connect the world," Jeff Wieland, the project manager for Facebook's accessibility initiative, told HuffPost. "If we're going to achieve that goal, then we're going to develop strategies that work for accessibility. It's not going to happen automatically."
Facebook is just getting started on this. Even using AAT, people with visual impairments are missing out on a lot. If you've taken a writing class at any level, you've heard that the trick to evocative prose is "showing, not telling." Put another way, seeing someone smile communicates a bit more than hearing the words "He is smiling."
To that end, Facebook is working on making the spoken descriptions more personal.
"We'll continue to expand the number of activities described as we improve the product," Shaomei Wu, a data scientist at Facebook specializing in accessibility, told HuffPost via email. "Identifying individuals who are your friends is something we'd like to add in the future as well."
That last part presents a privacy concern acknowledged by the accessibility team.
Facebook already has the capability to recognize who is depicted in a photograph you post -- it uses data from previous pictures to match an individual to an image and suggest that you "tag" that person. But you must take a manual action to identify that person and allow Facebook to show that label to other people.
Yes, Facebook technically has the power to identify and tag people without your blessing, but it won't (for now). That means you don't need to worry about a computer program tagging you all over the place, but blind AAT users won't be able to hear about specific people in a photo unless those people have been manually tagged by other humans.
As you've perhaps experienced yourself, people don't always make the effort to tag their friends. If they do, sometimes it takes them a while: not such a big deal if you can see the photo, but it might be a pain for people who can't.
"People are concerned about the computer being able to recognize information about a photo without them being fully aware of it," King told HuffPost.
Wieland said the technology could read a manual tag, but that presents a different set of problems because of how people use Facebook. Sometimes people tag images when they want their friends to see them, not because their friends are actually in the pictures. That would "trick" the AAT feature and potentially confuse users.
So, it's all a work in progress. Automatic alt text isn't perfect, and not just because of its technical quirks. Some point out that these great achievements from tech corporations are less about altruism and more about developing new ways to mold human behavior for greater profits. You can assuredly dream up your own nightmare scenarios for Facebook's "deep learning" technology -- the very thing powering automatic alt text -- which has been described by Wired as "interconnected machines that approximate the web of neurons in the human brain." What could go wrong!
Still, one can't deny that Facebook -- the "most populous nation on Earth" -- just became a bit easier to navigate for an entire population of people left behind by the social networking most of us take for granted every day.