By Meg Roggensack
Consultant, Business and Human Rights
After a week of bad press and threats of user boycotts stemming from Facebook's latest privacy policies, the company's chief executive, Mark Zuckerberg, is promising a newer, better model. I appreciate Facebook's willingness to acknowledge past mistakes, but Zuckerberg's promise, as outlined in today's Washington Post, rings hollow. Here's why.
Zuckerberg is correct that people want to share and stay connected with friends and family. Facebook has fast become a central hub for personal information. But the vast majority of users don't like it when that information becomes available to people outside their circle of friends or to marketers. Facebook's ever-changing privacy controls and data storage policies have put users at risk of exposure of their personal information.
Every time Facebook "upgrades" its policies, it becomes harder for the average user to understand what has changed and how to adjust their settings to minimize invasions of privacy. Regarding the most recent modifications, Zuckerberg claims that his company was trying to provide "lots of granular controls" but concludes "we just missed the mark." In his Washington Post piece, he lays out a set of operating principles and promises a new policy without any specifics or timeframes.
Facebook "missed the mark" because the company just doesn't get it. Facebook and Mark Zuckerberg have taken the position that sharing of information and connectedness is the new social norm, and that privacy is outmoded. That approach has left Facebook trying to innovate its way around a fundamental human right that the company has a responsibility to respect -- privacy. Until Facebook takes that responsibility seriously -- starting with a full understanding of the right to privacy and building on that understanding with a proactive set of policies -- it will continue to founder.
Facebook's entire business model is based on aggregating and sharing user information. Facebook's privacy standards use "opt out" rather than "opt in" defaults, a policy that permits third-party sharing of data without informing users. Even worse, Facebook's code has errors that permit even more data to leak out. I fear that Facebook's new strategy will simply be more of the same.
Users are attracted to Facebook for its services, not its platform. If users can only gain access to Facebook's full suite of services by agreeing to an open platform, then they are no better off. Zuckerberg is championing "openness" and connectedness as an unalloyed good. That's a debatable point, as users in repressive societies know all too well. Their governments use the openness of technology platforms to surveil and censor. For example, Facebook has made pages that users "like" public by default, along with geographical data. The government of Iran might well be interested in a list of everyone living in Iran who is a "fan" of Mousavi. People's networks are also public. Repressive governments might be interested in individuals within their borders who are in contact with well-known dissidents or asylum seekers beyond their borders.
Zuckerberg is correct that users won't want a partial system with limited capabilities, but they shouldn't be forced to agree to "openness" to enjoy the full Facebook experience. I realize that navigating the law, user expectations, and -- let's face it -- the desire for financial profit can be a challenge, but it's one that must be met.
That is why Human Rights First has joined with technology companies to form the Global Network Initiative. The GNI is founded on international human rights standards and includes stakeholders from the corporate, investor, nongovernmental and academic communities. The GNI provides a safe harbor for stakeholders to come together and collectively create, implement and evaluate mechanisms to protect and advance privacy and freedom of expression. The benefits of this platform are realized when GNI company members raise issues within GNI and take advantage of the views and learning available from other members. Companies that "go it alone" and attempt to address challenges after the fact, as Facebook is doing, risk not only their reputations, but the privacy rights of their users.
We suggest that users concerned about Facebook's privacy policies send Zuckerberg this message: We agree that a more connected world has the potential to be a more open world, but only if people have the ability to protect themselves against the use of that information in ways that may be harmful to them.