Following the news, some media outlets have outlined ways individual Facebook users can better protect their data. But many cybersecurity experts say these measures aren’t particularly effective when it comes to powerful companies like Facebook, nor do they believe individual Facebook users have much power over their personal data at all.
But can everyday users do anything to protect their privacy on Facebook?
“The short answer is no,” Bradley Shear, a Maryland lawyer who specializes in social media and privacy, told HuffPost. “Facebook sells your personal information to data brokers, political consultants and corporations, so if you really want to maximize your privacy, you need to limit your Facebook use or not even have an account.”
Mark Weinstein, a cybersecurity and privacy expert, echoed Shear. “People forget or don’t understand that Facebook is a data company, and that is their true business,” he said, noting that marketers pay Facebook for data to create targeted ads. “You as a Facebook user are not the customer. You are the product they sell.”
Many companies collecting Facebook user data are likely doing nothing more sinister than targeting specific audiences with ads for products they might want to buy ― a concept which creeps people out to varying degrees.
“Advertisement, whether for political or commercial purposes, has always been about persuading people. There’s not necessarily something wrong with that. We’re counting on people to be rational consumers and citizens,” Jef Ausloos, a legal researcher at the KU Leuven Centre for IT & IP Law in Belgium, told HuffPost.
“The problem we are facing today is that the predominant internet business-model, combined with massive advancements in ad tech and data science, shifts persuasion into manipulation,” he added. “The basis of democracy and free market (i.e. informed/rational and autonomous individuals) is basically short circuited.”
Facebook has landed in hot water in the past for its handling of user data. In 2017, Dutch and French watchdogs ruled that the company had broken their countries’ data protection rules by tracking users and non-users on third-party websites without their knowledge and failing to provide people with sufficient controls over how their information is used. Courts throughout Europe have since made similar rulings about its privacy and data policies.
Back in 2011, the Federal Trade Commission charged Facebook with deceiving consumers “by telling them they could keep their information on Facebook private, and then repeatedly allowing it to be shared and made public.”
A 2016 report from the ACLU revealed that social media platforms including Facebook and Instagram (which is owned by Facebook) provided access to user data to a controversial software company that helped police monitor Black Lives Matter activists.
The company also faced criticism last year after leaked documents showed Facebook executives telling advertisers they could identify teens feeling “insecure,” “worthless” and “in need of a confidence boost.”
Facebook insisted this psychological information was not incorporated into any ad-targeting tools, and it has cut ties with the controversial developer that tracked Black Lives Matter activists. Still, many people are troubled by the possibility of such dealings.
Shear emphasized that it’s not just advertisers and corporations that can use your personal information. It’s also discoverable in litigation, as courts can permit relevant social media posts to serve as evidence, and law enforcement frequently requests data from Facebook. Insurance companies could even use Facebook data to price insurance premiums, and employers and college admissions officers already admit to making decisions based on applicants’ social media.
And while Facebook has platform policies dictating how app creators may use data obtained from the site, the revelation about Cambridge Analytica’s actions suggests there’s little accountability. Experts also complain there’s not enough transparency around what Facebook itself does with the data it collects.
“People don’t know what exactly is going on behind the scenes. That’s the big challenge here. We don’t know who exactly Facebook is selling our information to,” Shear said.
For many people, the obvious answer to these issues is simply to get off Facebook ― a move that won’t necessarily protect your personal information but can serve as leverage to prompt change within the company. Weinstein also recommends installing a privacy-focused browser like Tor, using search engines like DuckDuckGo that don’t track users, and trying social media sites like his own platform, MeWe, which emphasizes strong privacy features.
But Jennifer Grygiel, a professor of communications and social media at Syracuse University, says getting off Facebook is not necessarily the most realistic option.
“As our lives become more and more intertwined with the digital world, from sharing economy apps like Uber and Airbnb to social media like Snapchat and Facebook, the ability for the user to opt out of things is very limited,” Grygiel told HuffPost. “And at this point, a lot of users have even been on Facebook for more than 10 years. What are we going to do ― boycott it? It’s not really a reality.”
If you want to keep using Facebook, there are some best practices when it comes to privacy. As the Cambridge Analytica scheme involved a personality quiz app, it’s worth looking at the third-party apps you signed up for on Facebook.
The National Cyber Security Alliance’s executive director, Russ Schrader, recommends doing a sort of tech “spring cleaning.”
“Keep your caches up to date, go through your apps and figure out which ones you don’t use anymore, go through your location data and see who has access to it. Who have you given permission to collect your data?” Schrader said.
To manage the apps that have your data, click the little arrow at the top right of the screen on desktop and select “Settings.” Then click on “Apps” on the left side of the screen. You’ll see a list of apps you’ve authorized and can remove them by clicking the little “x” that appears when you hover over each one.
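For the technically inclined, the “Apps” settings page corresponds to the Graph API’s `/me/permissions` edge: an HTTP GET on that edge lists the permissions a user has granted, and an HTTP DELETE revokes an app’s access entirely. The sketch below only constructs the request URL; the API version string is an assumption, and a valid user access token is required.

```python
# Hedged sketch: building the Graph API URL behind the "Apps" settings page.
# GET on this URL lists granted permissions; DELETE revokes the app's access.
import urllib.parse

GRAPH_BASE = "https://graph.facebook.com"
API_VERSION = "v2.12"  # assumed version current at publication; check the docs


def permissions_url(access_token: str) -> str:
    """Return the /me/permissions URL for the given user access token."""
    query = urllib.parse.urlencode({"access_token": access_token})
    return f"{GRAPH_BASE}/{API_VERSION}/me/permissions?{query}"
```

For most users, though, the point-and-click path described above is the supported route.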
Schrader’s other recommendations include “owning your online presence” by being mindful about what you share and who you share it with, enabling two-factor authentication and creating longer, more difficult passwords.
“Your personal information is like money. You wouldn’t walk around with $20 bills hanging out of your pocket. You need to value it and protect it. You need to be thoughtful about it,” he said.
Given that Cambridge Analytica’s data collection also targeted friends of users (through a feature which Facebook removed in 2015), it may also be worth taking a look at your list of Facebook friends. Limiting the information you give Facebook can also mean “liking” fewer things, as “likes” help shape data profiles and essentially provide targeted insights to advertisers.
Facebook ― which was unable to provide a comment on its data and privacy standards ahead of publication ― has various privacy settings for users to tinker with, but multiple experts who spoke to HuffPost called these safeguards a “facade” of sorts.
“Facebook provides an effective privacy checkup tool, but it does nothing to limit the data that Facebook sees, or that Facebook decides to share with organizations willing to buy it, or even that hackers decide to target,” cybersecurity expert John Sileo told HuffPost, adding that anything you put on Facebook is “public, permanent and exploitable.”
“The data you’ve already shared on Facebook, from your profile to your posts and pictures is already lost. There is nothing you can do to protect it now,” he continued. “The only data you can protect is your future data that you choose to not share on Facebook. Most of my basic profile data is a white lie.”
Grygiel believes the real answer to the concerns about Facebook and privacy comes via legislation.
“We need these platforms to be held to higher standards, to be brought into a regulatory environment that’s effective,” they said, adding that regulators need to take a deep dive into the issue and create more accountability.
There is precedent for more stringent privacy legislation. In 2016, the European Parliament adopted the General Data Protection Regulation ― a sweeping set of data privacy rules scheduled to take effect on May 25. Among other things, the regulation limits the types of data companies can collect from EU citizens and how they can use it, requires parental consent for anyone under 16 who wants to use certain online services, and gives people enhanced rights over their personal information.
The GDPR also outlines a user’s “right to be forgotten,” which allows people to ask companies to remove certain online data about them. Companies that violate the new privacy rules could face fines of up to 4 percent of their annual global revenue or 20 million euros, whichever is greater.
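The GDPR’s upper fine tier (Article 83(5)) is the greater of 20 million euros or 4 percent of worldwide annual turnover, which is what makes the rule bite for companies of every size. A quick illustration (the revenue figures are hypothetical):

```python
# GDPR Article 83(5): the maximum fine for serious infringements is the
# greater of EUR 20 million or 4 percent of worldwide annual turnover.
def max_gdpr_fine(annual_turnover_eur: float) -> float:
    """Upper fine cap in euros for a company with the given annual turnover."""
    return max(20_000_000.0, 0.04 * annual_turnover_eur)


# A company with EUR 40 billion in revenue faces a cap of EUR 1.6 billion;
# a company with EUR 100 million in revenue still faces the EUR 20M floor.
```

The flat floor means small firms can’t treat fines as a rounding error, while the percentage scales the penalty for giants.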
“At the end of the day, the best course of action is to tell our legislators that we want our privacy ― because without greater regulation in this space, these social media companies will not protect our privacy,” Grygiel said. “Self-regulation is not working, and the regulation that governs them is very thin at best, especially in the U.S. We need the public to become active in this space.”
This week’s news could serve as a call to action. After all, Facebook’s latest scandal offers a glimpse into what can happen to poorly handled data.
“In the case of the Cambridge Analytica debacle, they sold personally identifiable data on 50 million members to supposed ‘researchers,’” Weinstein said. “The data has ended up in the most nefarious of hands, and now those members are the perfect manipulation target. This is downright scary.”