Families Of San Bernardino Massacre Victims Sue Social Media Giants

The lawsuit claims that Twitter, Facebook and Google have provided "material support" to terrorist groups, including Islamic State.

LOS ANGELES ― Families of victims of the 2015 terrorist attack in San Bernardino filed a federal lawsuit this week against Twitter, Facebook and Google’s YouTube over claims the tech companies “knowingly and recklessly” provided “material support” to terrorist groups, such as the so-called Islamic State, and “enabled” them to carry out numerous attacks.

In the lawsuit, filed Wednesday in the U.S. District Court for the Southern District of California, the family members accuse the social media companies of providing the terrorist group also known as ISIS with social network accounts, which were used to spread extremist messages, promote and encourage attacks, raise funds and attract recruits. Without these social networks, the lawsuit claims, the rapid growth of ISIS over the last several years into “the most feared terrorist group in the world” would not have been possible.

The family members argue that the posts made on the social media platforms contributed to the radicalization of Syed Rizwan Farook and his wife, Tashfeen Malik, whose attack on the Inland Regional Center on Dec. 2, 2015, left 14 people dead and 22 others injured.

The families of Tin Nguyen, Nicholas Thalasinos and Sierra Clayborn, who were among those killed that day, filed the suit. The tech companies, the families say, are liable for aiding and abetting acts of international terrorism, providing support to a designated international terrorist organization and wrongful death, among other claims.

“Even if Farook and Malik had never been directly in contact with ISIS, ISIS’ use of social media directly influenced their actions on the day of the San Bernardino massacre,” the lawsuit says.

Following the shooting, FBI Director James Comey said that investigators found no evidence the couple had been explicitly directed by a terrorist organization to attack the San Bernardino center, where Farook was employed. But he said Farook and Malik had been “consuming poison on the internet” and had become “homegrown violent extremists, inspired by foreign terrorist organizations.” Investigators said that Malik, on the day of the shooting, “pledged allegiance” to ISIS leader Abu Bakr al-Baghdadi in a Facebook post. She had also expressed support for Islamic jihad in private messages on the social network in the years before the attack, federal officials told the Los Angeles Times.

Keith Altman, an attorney representing some of the family members, who filed the suit along with attorney Theida Salazar, described the goals of the lawsuit in an email to HuffPost. “Clearly, this is mostly about behavior modification” on the part of the tech companies, he wrote. The aim is “no more funerals.”

Altman said he believes the companies are clearly aware of what happens on their platforms and thinks they could be doing more to thwart use by terrorists, “because there are very specific behaviors in use by the terrorists that could be detected and prevented.”

A Facebook spokesperson told HuffPost that there is “no place on Facebook for groups that engage in terrorist activity or for content that expresses support for such activity” and that the company takes “swift action to remove this content when it’s reported to us.” The company offered its sympathies to the families of the victims of the San Bernardino attack.

A spokesperson for Twitter declined to comment. Google did not respond to a request for comment.


All three companies have strict rules prohibiting threats of violence and the promotion of terrorism, and have begun to police themselves when it comes to this kind of content. Facebook announced this week that it is hiring 3,000 more staffers to screen and combat harmful content on the social network, on top of the 4,500 who already do. YouTube has a history of removing extremist videos that depict or incite violence. And Twitter suspended a total of 646,248 accounts for promoting extremism between August 2015 and December 2016.

Similar cases against the social media companies have been filed before, some from the same attorneys who filed this one. The tech companies have argued they are not liable for the content of their users, asserting protections under the Communications Decency Act, the federal law that provides immunity to internet companies that publish user content.

One recent case against Twitter, which also claimed the company provided material support to ISIS, was ultimately dismissed last year. In its ruling, the federal court explained that a decision against the social media company for such claims could theoretically make it liable “indefinitely” for “any and all future ISIS attacks.”

“Such a standard cannot be and is not the law,” the ruling says.

Eric Goldman, a law professor and co-director of the High Tech Law Institute at Santa Clara University School of Law in Silicon Valley, explained in an email to HuffPost that if the media companies were actually found liable for providing material support to terrorists in a case like this, the consequences would be enormous ― not just financially crippling for these specific sites but potentially for all social media sites and possibly the entire internet.

Ultimately, Goldman said, the Constitution “protects the rights of everyone ― even terrorists ― to express their political viewpoints.”

“At their core, these lawsuits seek to deprive terrorist groups of the right to speak online,” Goldman said. “Not only would that outcome be unconstitutional, but it’s a bad idea. We cannot learn about and understand our adversaries if we refuse to even listen to them.”

Still, Ben Feuer, chairman of the California Appellate Law Group and a former clerk on the U.S. 9th Circuit Court of Appeals, told HuffPost that, while the Communications Decency Act does clearly grant immunity to these kinds of companies on user content, free speech arguments aren’t an exact fit in this kind of situation.

“These are private companies, and you don’t get free speech ‘rights’ in a private sandbox,” Feuer said. “There’s no free-speech-related reason these companies have to allow terrorists and neo-Nazis and whoever else on their sites.”

The companies have the discretion to remove people ― there’s no due process and the terms of use are completely controlled by each company, Feuer noted.

“Maybe at some time in the future social media will be so important to society that Facebook will be deemed a quasi-public space, like a mall or an airport, and people will get constitutional rights there. But that hasn’t happened, and I don’t see it happening in the near future,” Feuer said.
