‘Confused’ Supreme Court Justices Seem Wary Of Opening Lawsuit Floodgates On Website Operators

“Let’s say I retweet an ISIS video,” Amy Coney Barrett wondered aloud Tuesday during an argument that focused on a law known as "Section 230."
People wait in line outside the U.S. Supreme Court on Tuesday to hear oral arguments in two cases that test Section 230, the law that provides tech companies a legal shield over what their users post online.

A huddle of unelected, Luddite law nerds might fundamentally change the nature of the internet based on a pair of cases they heard this week. But at least a few seemed to realize that doing so could have profound implications they may not fully understand.

After several Supreme Court justices acknowledged Tuesday that they were “confused” by the arguments on display and the law they were being asked to interpret, one of them laid her cards out on the table.

“We’re a court. We really don’t know about these things,” Justice Elena Kagan said. “You know, these are not like the nine greatest experts on the internet.”

That’s putting it mildly. Over two cases and five hours Tuesday and Wednesday, the court spent much of its time trying to define the debate, often resorting to complicated metaphors — everything from rice pilaf recipes to “adult” sections in bookstores — to illustrate the questions at hand.

Several justices from across the ideological spectrum appeared hesitant to touch Section 230 of the Communications Decency Act at all — the legal protection for website operators that’s widely thought to have shaped the modern internet — especially given the potential impact on the free flow of ideas on the web, not to mention the massive market shifts that could come with new rules.

Both cases before the court this week concern website operators’ liability for terrorist acts. The first case of the two, Gonzalez v. Google, asks whether Section 230 covers YouTube’s recommendation algorithm, which the family of Nohemi Gonzalez claims indirectly led to her death in the Islamic State group-linked terrorist attack in Paris in 2015. The second, Twitter v. Taamneh, concerns the Anti-Terrorism Act and whether websites can be considered to have aided and abetted in the death of Nawras Alassaf during a 2017 attack by the group, also known as ISIS, in Istanbul.

On Tuesday, questioning Gonzalez family attorney Eric Schnapper, Justice Brett Kavanaugh repeatedly sought answers on whether opening the floodgates to lawsuits based on website recommendations would “crash the digital economy,” as some advocates have said. When Schnapper dithered, Kavanaugh forced the question.

“What would the difference be in liability? In damages?” he demanded. “Like, how would the money, at the end of the day, differ if you are successful?”


Schnapper tried to minimize the potential effects. “Most recommendations just aren’t actionable” as lawsuits even without Section 230, he said. But Justice Amy Coney Barrett quickly added her own hypothetical to the heap.

“Let’s say I retweet an ISIS video,” the court’s youngest justice, at 51, said. “On your theory, am I aiding and abetting? And does the statute protect me, or does my putting the thumbs-up on it create new content?”

Schnapper protested, arguing that he wouldn’t read the word “user” that broadly, but Barrett insisted: Was that behavior protected by Section 230? No, Schnapper said: “If you start down that road ... every time I send a defamatory email, I’m protected, as long as I’m quoting somebody else.” Chief Justice John Roberts called on Justice Ketanji Brown Jackson.

“I guess I’m thoroughly confused,” Jackson said, launching into a discussion of the distinctions between Section 230’s lawsuit protections and the eventual liability websites could face if the court changes the law’s reading.

‘Terrorist Activities — Or Pilaf’

Section 230 primarily protects website operators from liability for content that users post, such as potentially defamatory or libelous Twitter posts. The confusion at the heart of Gonzalez v. Google lies in questions over recommendation algorithms. Do the law’s protections apply if YouTube’s “up next” algorithm points users toward terrorist recruitment activity? Does it matter that YouTube didn’t single out terrorist propaganda to promote with its algorithm, but rather generally recommends everything to everyone according to their interests? (According to Google’s lawyer, the site would be protected either way.)

As arguments in the cases proceeded, justices and lawyers alike leaned on a surprising variety of metaphors and hypotheticals to try to wrap their heads around key questions.

Roberts compared the situation in Gonzalez to a bookstore in which a customer asks for books about the legendary baseball player Roger Maris. The bookseller points the customer to a table with Roger Maris books. Is that a “recommendation”? What about a catalog of books for sale that a bookseller creates on her own? The latter case, if the bookstore were a website, would lose Section 230 protections, Schnapper argued.

At various points Tuesday, both sides even argued over the significance of YouTube video thumbnails. Schnapper called them a “joint creation” of YouTube and the initial content creator, while Google’s attorney, Lisa Blatt, said thumbnails were a “screenshot of the information being provided by another” and “embedded third-party speech.”

Justice Clarence Thomas was the only member of the court to speak out about Section 230 before this week. He’d urged the court to “question” the “sweeping” protections that lower courts have interpreted Section 230 as offering. But in the first minutes of Tuesday’s arguments, he seemed to lend credence to the idea that so-called neutral algorithms, such as those that appear to treat terrorism results roughly the same as cooking videos, are protected under the law.

“Say you get interested in rice pilaf from Uzbekistan,” Thomas said. “You don’t want pilaf from some other place, say, Louisiana.” Search algorithms would learn that you preferred Uzbek recipes, he said.

“I don’t see how that is any different from what is happening in this case. Are we talking about the neutral application of an algorithm that works generically for pilaf, and it also works in a similar way for ISIS videos? Or is there something different?”

Later, Roberts appeared to agree. It seemed significant to the chief that algorithms treat different topics neutrally — “because then they don’t have a focused algorithm with respect to terrorist activities — or pilaf or something.”

The Supreme Court heard arguments about Section 230 Tuesday and Wednesday. Here, part of the group is shown as the justices sit for a new portrait in October.

Blatt offered a similar formulation during her time before the justices, saying it was only logical that searching for the word “football” would lead to different sports in different countries.

The Google attorney argued separately that without Section 230, there would be endless lawsuits and “death by 1,000 cuts” for the internet as we know it — even if websites use benign sorting methods for material.

“A website could put something alphabetical in terms of reviews, and every Young, Williams and Zimmerman … could say, well, that was negligent,” she said.

‘I Think You Would Win’

Regardless of the legal humility on display at the court, the justices may very well still act on Section 230. No clear consensus emerged during this week’s hearings, and as a political issue, Section 230 and questions about web operators’ liability have a tendency to scramble the typical partisan divide. Both Joe Biden and Donald Trump have called for eliminating Section 230 protections, for example — even if they’ve done so for completely unrelated reasons. A lawyer for the Biden administration argued for an hour on Tuesday, largely aligned with the Gonzalez petitioners, though on Wednesday, the administration argued for fairly limited liability for web operators under anti-terrorism laws.

A few justices even seemed sympathetic to the Gonzalez petitioners’ argument. Jackson, for her part, expressed frustration with the buildup of judicial interpretation around a relatively short law that, she said, appeared initially intended to simply “not disincentivize these platforms from blocking and screening offensive conduct.”

The law as written has two parts, she noted: one that generally shields websites from liability based on content submitted by users, and another that shields websites from liability if they attempt to moderate users’ content, but unintentionally miss some illegal or actionable material. In that case, Jackson added, “I think you would win. Unless your recommendations argument really is just the same thing as saying, ‘They’re hosting ISIS videos on their website.’”

“I think we do have to be drawing that distinction,” Schnapper agreed.

Still, Twitter v. Taamneh seemed to highlight several justices’ skepticism of applying broad liability to websites. That case, which was not based on Section 230, instead addressed websites’ liability under anti-terrorism laws directly. If social media recommendation algorithms were opened to lawsuits as a result of the Gonzalez ruling, could the platforms behind them be considered to have “aided and abetted” deadly ISIS terror attacks?

Relatives of victims of the deadly Islamic State group attack at the Istanbul nightclub Reina leave flowers and pray at the site during a commemoration ceremony on Dec. 31, 2017.

Twitter’s lawyer, Seth Waxman, argued that the social media site ought to be off the hook, “absent specific knowledge of particular accounts or posts that were used to plan, commit, or proximately support the act of international terrorism that injured the plaintiff.”

Schnapper, who represented these plaintiffs as well, argued for a much broader version of liability, contending that specific awareness of an impending attack on Twitter’s part wasn’t necessary, and that instead, “affirmatively recommending” terroristic content was enough.

“It’s recruiting and fundraising,” he said, adding later that simply selling a cellphone to Osama bin Laden, regardless of whether he used it to advance a specific terror attack, would be grounds for liability.

Thomas, amplifying the concerns of several others, asked Schnapper if his framework wouldn’t lead to endless lawsuits after every terrorist attack.

“If we’re not pinpointing cause and effect or proximate cause for specific things, and you’re focused on infrastructure, or just the availability of these platforms, then it would seem that every terrorist act that uses this platform would also mean that Twitter is an aider and abettor in those instances,” he said.

“I think, as you phrased it, the answer is yes,” Schnapper said, before adding the caveat that remoteness in both space and time between the assistance and a terror attack could decrease liability.

Justice Samuel Alito poked at that idea, wondering if local businesses that had been warned by police about a gang member would be liable for selling cellphone service — or even takeout — to the person.

The answer would depend, Schnapper said: Gun dealers might face liability, but a Chinese restaurant probably wouldn’t.

Waxman, given some time for a rebuttal, compared that hypothetical to Walmart, the nation’s largest gun seller.

“Nobody would say that they are aiding and abetting particular crimes that happen to be committed by someone that bought a gun at Walmart.”
