My friends and I were some of Facebook's earliest adopters. We signed on in 2004, at a time when only those with email addresses ending in ".edu" from a few select schools were allowed access to the social networking site. Only one picture was allowed: your user profile photo.
We were true believers back then. Evangelists. Returning home over holiday breaks, we talked incessantly about the newest fad to hit campus and pull us away from our studies. We cultivated our profiles and friends lists and handcrafted our online personas. But the gloss has come off the Gospel -- call it Facebook Fatigue -- and there are whispers of "Facebook farewells."
In the beginning, we intimately understood the meaning of "communities" in the context of this new social media platform. Most of us shared openly with both our Facebook friends and our university network, which we accepted as a limited, closed community of trustworthy individuals. When the "News Feed" feature was released, we protested: changing our relationship status to signal a break-up was painful enough, but no one wanted Facebook broadcasting the fact quite that loudly. Later, Facebook let us decide which types of information could show up in the News Feed, and we quieted down; we simply wanted control over how our information was shared. We wanted to believe.
As we prepared to graduate we closed off our profiles, keeping our friends inside but locking out the public at large. We sanitized the entire place, deleting those embarrassing photos and factoids. All along, we clung to the idea of Facebook as a walled garden in which we could selectively share a slice of our lives with people we trusted. Even as Facebook slowly opened up -- allowing us to choose to have our profiles indexed by Google, creating the "everyone" privacy setting -- we understood the rules of the game: private by default, public with effort. We had other outlets -- MySpace, blogs, and eventually Twitter -- that could serve up our information to the public in a kind of "all you can eat" style, but those weren't the services we sought.
Fast-forward a few years. Facebook has evolved to be public and commercial by default, and only semi-private with effort. Today, Facebook is a different beast, with new and exciting ideas for changing the Internet and building a profitable business. And that is Facebook's prerogative.
But during this transformation, Facebook has played fast and loose with the sensitive information its users have entrusted to the company's care, and it has obfuscated its messaging to make it difficult for users to understand what Facebook actually is -- and what it isn't.
Just this week, TechCrunch reported that a security hole made it possible for some users to view their Facebook friends' past, private chat sessions. This flaw was not the first for Facebook. In the past three months alone, Facebook has accidentally exposed users' private messages to strangers, accidentally revealed users' email addresses through a flaw in its friend request emails, and accidentally made users' private email addresses public to the world.
Facebook's security flaws are just one piece of a larger puzzle. In the words of Senators Chuck Schumer, Al Franken, Michael Bennet, and Mark Begich, the (non-accidental) changes that the social network has rolled out over the past five months also raise questions about the seriousness with which Facebook takes its responsibility to "protect the sensitive personal biographical data of its users and provide them with full control over their personal information."
Back in December, Facebook revamped its defaults, essentially leading non-savvy users -- who make up the majority of its user base -- into sharing profile data with a broader audience; Chris Conley of the ACLU of Northern California has reported that gay college students who had come out at school but not at home were outed by one of these changes.
And just two weeks ago, Facebook revealed it would begin forcing users into a "Hobson's choice": remove information from their profiles or publish it to the world. Facebook also announced that any user who visits a "pre-approved third party website" while still logged in to the social network -- perhaps without realizing it -- would now, by default, have her activity on that site associated with her profile. Here's how that might work: imagine pseudonymously reviewing a Planned Parenthood listing on Yelp, a site that hosts user reviews of local businesses -- only to discover that, without your consent, Facebook has shared your real name with Yelp, shared the review with your Facebook friends who visit Yelp, and publicly "connected" the review to your profile.
Meanwhile, even logging out of Facebook, one affirmative step toward disassociating your behavior on third party websites from your Facebook profile, is unintuitive: Facebook has hidden its logout button. Facebook's home page has room for all types of information -- a link to the user's profile, friend suggestions, event suggestions, and ways to "get connected" -- but apparently no room for a button that turns Facebook off. Earlier this year, the logout button moved from the user's home page to a spot buried within a drop-down menu, visible only after the user clicks the "account" tab.
Yet a Facebook spokesman told the Associated Press in an email that "None of these changes removed or reduced people's control over their information and several offered even greater controls." This claim epitomizes the kind of "straight talk" Facebook consistently offers its users. I certainly hope that Senator Schumer, his colleagues, and the Federal Trade Commission will continue to pay attention to these inconsistencies.
As Facebook continues to put a positive spin on dubious privacy practices, we early adopters are feeling rather queasy. We've started to throw around the term "Facebook Stalking" -- once used to describe how we would silently monitor our friends' activities on the social networking site -- in a new context: to describe the way Facebook stalks us. We find ourselves wondering: if this company wants to become a trusted identity manager for the Web, why isn't it more careful with our trust? Meanwhile, talk of "Facebook fatigue," which began when privacy settings became difficult to understand and control, is slowly morphing into whispers of Facebook farewells.
Erica Newland is a policy analyst for the Center for Democracy & Technology, a Washington, DC-based, non-profit public interest group.