Information-intensive companies such as Facebook follow a Machiavellian public relations strategy when introducing new programs. Without warning, these companies introduce "features" that invariably result in more information being shared with advertisers, wait for the negative reaction, and then announce minimal changes that leave the new feature intact. They explain away the fuss with public relations spin: "we are listening to our users," "we didn't get it right this time," "we look forward to your feedback," etc. This strategy works, time and time again.
Facebook's recent troubles illustrate this neatly. Reacting to rising criticism of recent changes that affected hundreds of millions of users -- by forcing certain profile information to be permanently public and automatically enrolling all users in a new "instant personalization" service that shares profile information with external websites -- Facebook CEO Mark Zuckerberg acknowledged in a Washington Post op-ed that his company had "missed the mark" in providing users the ability to control their privacy on the popular site. Noting, "Whenever we make a change, we try to apply the lessons we've learned along the way," Zuckerberg promised that simpler privacy controls would be forthcoming. Tweaks to Facebook's privacy settings were announced, but instant personalization remained on by default.
The most recent set of Facebook snafus is a direct descendant of prior decisions made by the social networking giant. In 2006, Facebook activated News Feed, which automatically posted users' actions on friends' pages; many objected because it made it too easy for others to track an individual's activities. A year later, Facebook launched Beacon, an advertising program that announced users' purchases at other websites on Facebook, often without explicit consent.
In all these cases, Facebook followed the same pattern: take two steps forward with an aggressive misuse of personal information, then creep back the slightest bit once criticism emerged. Each time, Facebook promised users that "we will keep listening," artfully reminding us that all it really wants to do is make "the world more open and connected."
These events represent the perfection of privacy public relations. Guided by earlier battles fought by tobacco and drug companies, information-intensive firms have learned how to use rhetoric to distract the public while successfully implementing new programs. They are the Machiavellis of privacy.
Privacy PR results in "blowforward." Typically, entities that behave transgressively experience blowback: they lose market share or some power they once had. But platforms such as Google and Facebook are so compelling that users will not defect, even as the companies change settings. Facebook installed a window onto users' profiles and, in response to the controversy, replaced it with a one-way mirror. The user is left responsible for perceiving the observation room behind the mirror and for shuttering it.
Some glibly ascribe this debate to "young people not caring about privacy." But this point is inaccurate and confuses the issue. Users have always been able to make their profiles more public. We are describing a situation where the service provider itself makes the changes, pushing users toward greater public exposure.
Further, both qualitative and quantitative research shows that Americans of all ages care about privacy. Interestingly, the youngest users of social networking services are the least trustful of them and most likely to take privacy-preserving steps, according to a new report by the Pew Internet & American Life Project.
This distrust relates to the inherent motivations of social networking services. Relying on a business model that depends on unfettered, open access to as much personal information as possible -- all in the interest of serving advertising -- social networking services design their systems to maximize sharing. Any privacy settings provided tend to be minimal, hidden, and difficult to use. Too often, new features don't have any meaningful privacy controls until users protest, and their reactions are "listened to."
Some data-intensive firms are seeking a broad cultural change that places all personal information out in the open and in the hands of companies for whatever uses they see fit. Many of us might welcome such a change, but if we fail to recognize the manipulation used to bring it about, we will all be enrolled in Facebook's utopia.