At two privacy conferences -- one in New York, the other right now in Victoria, B.C. -- I've watched the growth of privacy's regulatory/industrial complex and seen its strategy in action: Scare, then sell.
Yesterday, before I spoke at the Reboot conference, the privacy commissioner for the province, Elizabeth Denham, got up to demonize the social net and its leaders. She said that Google's Eric Schmidt believes privacy is not relevant anymore, citing his jokes about changing our names at age 21. She belittled Mark Zuckerberg, too. She bragged about helping to bring Facebook to account when she was in the federal privacy office. And she gloated about the fizzle of Google Buzz. Then she boasted about adding more regulators to her office and getting more resources. Scare and spend.
At a later panel, I saw a vendor go through his PowerPoint showing the growth of so many outlets of social media. He said 500 million people were using Facebook. Then he paused... dramatically. Then he said, "Scary." Why is that scary? He didn't say. He talked about watching YouTube videos as if that could be harmful in and of itself. How? He didn't say. That's how the discussion of the social web has advanced in this industry: All you have to do is say people are using these mysterious tools, and the fear is assumed. But then he sold his service. Scare, then sell.
I spoke with the head of an association of chief privacy officers. Boy, I said, I'll bet your membership is growing. In increments of a thousand, he said. He also noted how the growth in the U.S. is in privacy officers while in Europe it's in privacy regulators.
I saw the two come together at the other conference, MediaBistro's in New York, when the head of a privacy advocacy organization issued his fearsome specters for the crowd of companies and regulators. It becomes a self-powering machine: The privacy advocate feeds the regulators arguments to be scared and regulate more, then companies think they need more privacy services, and more companies are born to provide them -- companies that set up booths here in Victoria. One handed out a slick magazine with the big cover billing: "Social Media RISKS: Four Areas You Must Examine At Your Company."
In the draft of my book Public Parts -- which I'm furiously editing now -- I had not gone after privacy's regulatory/industrial complex. I'm trying hard not to pit privacy and publicness against each other as they are not binary; one depends upon the other in a continuum of choices we all make.
But the emergence of Privacy, Inc., as an industry built on scaring people is beginning to scare me.
In my talk yesterday, I warned of unintended consequences of too much regulation enacted too quickly. I cited Germany's Verpixelungsrecht, its blurring of images in Google Street View and the precedent that sets for others taking pictures in public of public views.
I also worry that efforts to bring in a "Do Not Track" list and other demonization of ad targeting could cripple the revenue of the media and news industries even as they struggle to find sustainability; it could kill news outlets and reduce journalism.
At the final panel I attended, moderated by Denham, I saw execs from trade groups and Yahoo as well as a reasonable friend from Ottawa's privacy office talk about meaningful efforts that are being made to be more transparent about advertising, which -- lord knows -- is needed.
The ad and media industries have been damned fools, not being open enough about what they do and how they do it, and about the value that comes to them (in higher ad rates) and, much more importantly, to the public (in relevance and less noise). But Yahoo showed off a good tool to see and change how you are being targeted. The Canadian Interactive Advertising Bureau put forward a good framework for self-regulation. FutureOfPrivacy.org gave good advice about seeing past tools and disclosures and making advertising actually worthwhile for consumers.
Denham, to her credit, asked the panel to define bad regulation. They said it's taking a narrow issue and regulating it with broad strokes, doing collateral damage. She came around to the view of regulation I've learned from danah boyd: that we need to concentrate on controlling the use of data more than the gathering of it. (It's illogical, indeed impossible, to tell people what they may not know; it's logical and feasible to tell them what they may not do with what they know.)
So at the end of the day, I felt a bit better. But I fear that the reasonable and necessary moves to protect privacy -- and it does need protection -- won't be able to outrun the fear strategy. For fear is building a new industry, a very fast-growing industry.
Here's Mathew Ingram's GigaOm report on my talk with a brief chat. I hope to be able to post the talk itself soon.