When Mel Neve founded her Facebook group Women with PTSD United four years ago, she couldn’t imagine it would swell to 17,000 members. She said it’s become a tight sisterhood, in which members share jokes and memes ― and offer each other advice about their most painful private struggles.
“We’ve laughed and cried with each other and bared our souls and done so much deep bonding,” Neve said.
She credits her peers for helping her understand how her lingering anxiety and depression are connected to her past traumas, including domestic violence, kidnapping and burglary.
When Neve heard about Facebook’s privacy violations, she wasn’t so concerned about her own data. But with users threatening to delete their accounts en masse over the revelations, she worried about what would happen to her beloved community. Would the group start losing members to the #deleteFacebook campaign?
That concern is shared by many administrators of Facebook’s wildly popular groups feature, launched in 2004, which offers support forums for many people struggling with health conditions ― from infertility to depression to cancer.
Even if Neve’s most loyal members stayed put, she feared that potential new members wouldn’t be on Facebook to discover her group, or that existing users would spend less time there and contribute fewer posts and encouraging comments. Earlier this year, CEO Mark Zuckerberg said time spent on Facebook had dipped by 5 percent even before the backlash began.
“Facebook is a safe place for us,” said Neve, a speech therapist in Concord, Michigan. “I don’t even know what other kind of forum could reach this many people worldwide.”
So far, the exodus Neve feared hasn’t happened: New data this week shows that the social media giant’s membership has reached nearly 2.2 billion ― up 13 percent from the previous year, according to the latest earnings report.
But Facebook group administrators have noticed that their members are increasingly worried about the privacy of their personal information and soul-baring posts. So while usage doesn’t currently seem to be at risk, Facebook users’ behavior might be changing.
Chrissi Kenkel, founder of the 33,000-strong Mental Health Awareness and Support group, said she’s noticed more members asking “Can my friends or family see what I’m writing?”
And Katharina Sucita, who runs the Weight Loss Support Group, which has more than 55,000 members, said members’ privacy concerns recently prompted her to make the group “closed” ― a designation that requires administrators to approve members and limits visibility to those inside the group.
“That way members would feel more comfortable sharing their personal stories and weight loss pictures without fearing their neighbor or boss might see them,” Sucita said.
It may be unrealistic to expect true privacy among 55,000 people, but users and bioethicists alike have lingering questions about Facebook’s use of data.

“How much of Facebook is truly closed?” asked Glenn Cohen, director of Harvard’s Petrie-Flom Center for Health Law, Biotechnology and Bioethics. “How visible are your posts? Could someone take a screenshot of something you said and repost it?”
These new questions may be a good thing. Our deep desire to connect can lull us into a false sense of security, according to Matthew DeCamp, assistant professor at the Johns Hopkins Berman Institute of Bioethics.
“Over time, people become more comfortable sharing, and what could happen to their data isn’t at the forefront of their minds,” he said.
Although Facebook denies sharing the content of users’ posts or comments in closed groups, some people are wondering whether other kinds of health information could be at risk.
Last month, news broke that Facebook had approached several U.S. health organizations, including Stanford Medical School and the American College of Cardiology, about sharing anonymized patient data on illnesses and prescriptions to match it with data it had collected.
Facebook responded in a statement: “This work has not progressed past the planning phase, and we have not received, shared, or analyzed anyone’s data.”
The company separately told HuffPost, “We take protecting people’s privacy very seriously. We do not sell people’s data. Nor do we allow advertisers to target people based on health conditions or conversations on Facebook, including in Groups.”
Yet the social media network is constantly gathering other kinds of health information that’s valuable to advertisers, said Kirsten Ostherr, a digital health technology researcher at Rice University and author of a recent Washington Post op-ed on the topic. Examples include products or links that you “like,” your search history (like “breast augmentation surgery”) or where you’re logging on from, such as a cancer treatment center.
“You can start to see there’s plenty of information for a company to figure out whether you would be a good target for ads about their drugs and health products,” she said. Earlier this month, Facebook announced it was launching a feature to help users control the release of their info to third parties.
Cohen is also concerned about the possibility of so-called anonymized data being used for commercial reasons. Imagine what kind of connections future data miners could make based on your social media posts, fitness trackers, voting record, purchasing history and perhaps even genetic data.
“Whatever the risks are now, if there are no policy changes as more and more data about you becomes known, the risk of re-identifying you is only going to go up,” Cohen said, noting that this could affect your ability to get a job, health insurance or life insurance.
The European Union has been leading the way in regulating how companies collect data. Under a new law that goes into effect May 25, companies must obtain consent before collecting personal data and must disclose any breaches within 72 hours.
Administrators say they hope future protections are in the works, because their groups serve a valuable function. Although mental health experts caution that online support networks are not a substitute for professional counseling, Dartmouth College researchers found they provided important peer support by creating a sense of belonging in a forum that’s free of social stigma. “Not only do you learn you’re not alone, you learn how other people are coping,” said Kelly Aschbrenner, assistant professor of psychiatry at Dartmouth’s Geisel School of Medicine.
Kathie Underwood, who lives in rural northwest Georgia and cares for her elderly mother-in-law who has Alzheimer’s disease, said she depends on the 13,000-member group Alzheimer’s Caregiver Support on Facebook.
“My mother-in-law might have a bad fall and be agitated or hallucinating, and there’s always someone who can tell me how to calm her down,” said Underwood, who is comforted by the fact that she can find a sympathetic ear, day or night, in any part of the world because of the Facebook group.
Such groups have become an even more critical lifeline for smaller communities, such as Sandra Sermone’s 200-member group for parents of children with the rare disease ADNP. All that unites them is a diagnosis: The group is restricted to parents who recently discovered through genetic sequencing that their children have the neurodevelopmental disorder, which is characterized by delayed growth and hearing, vision and digestive problems.
At ADNPKids, parents trade tips on where to find specialists, how to thicken water to prevent aspiration and how to advocate for their kids at school. They also strategize on how to get their children’s disease on researchers’ radars.
“Having a child with a rare disease is the most isolating thing in the world,” said Sermone, of Vancouver, Washington. “I can’t tell you what it’s meant to talk to other parents who are going through what I am. We found each other on Facebook.”
But Cohen notes that Facebook isn’t the only game in town, and predicts that other platforms run by members will grow in popularity over time. For example, PatientsLikeMe, a well-known patient support website, has more than 600,000 participants. And there’s no telling what other social media networks will develop their own groups.
Sermone said several U.S. foundations have approached her to start support groups, though she believes they won’t attract the same global audience.
These health groups fill such an emotional need that some members might not even care about their privacy, said Gail Weatherill, an administrator at Alzheimer’s and Dementia Caregiver Support.
“My thinking is that if these folks had to choose between giving up the group and printing their personal data on the front of The New York Times, they’d be hoping the Times uses a flattering photo of them,” she said.
CORRECTION: A previous version of this story misstated which Facebook group Kathie Underwood said she turns to for support.