Facebook Exec: Algorithms Should Be 'Broadly' Regulated With More Transparency

“We can’t make all of these decisions and provide all these societal solutions on our own," said Nick Clegg, vice president of global affairs.

A top Facebook executive said Sunday that the social media giant “broadly” supports regulators accessing its algorithms as well as “greater transparency” amid mounting concerns about the company’s effects on children and its spread of harmful misinformation.

Nick Clegg, vice president of global affairs, said that while he believes Facebook has done its best in filtering out harmful content and that its paused program “Instagram Kids” would have been part of a solution, he supports more oversight and regulation.

“We need greater transparency,” he told CNN’s Dana Bash. “The systems that we have in place ... should be held to account, if necessary, by regulation so that people can match what our systems say they’re supposed to do from what actually happens.”

One suggestion he shared would be amending Section 230 of the federal Communications Decency Act. This law, passed in 1996, protects online service providers from liability for content published by their users, as well as from liability for moderating that content.

“My suggestion would be to make that protection, which is afforded to online companies like Facebook, contingent on them applying the systems and their policies as they’re supposed to,” he said. “And if they fail to do that, they would then have that liability protection removed. That seems, to us at least, perhaps a sensible change to consider to Section 230.”

Though Clegg said he supports more oversight, he defended the company’s current handling of user content, telling ABC News’ George Stephanopoulos in a separate interview Sunday that the algorithms Facebook and Instagram use to rank content in users’ newsfeeds act as a necessary “spam filter.”

“If you were to just sort of, across the board, remove the algorithm, the first thing that would happen is people would see more, not less, hate speech. More, not less, misinformation. More, not less, hurtful content,” he said.

Clegg said that Facebook also plans to introduce new tools for parents, including an optional parental supervision control. There will also be automatic reminders for teens to take a break after using the app for long periods of time, and a “nudge” to look away from content that “may not be good for their well-being,” he told Stephanopoulos.

Clegg said there’s only so much companies can do on their own, however, and that more needs to be done by lawmakers.

“We can’t make all of these decisions, and provide all these societal solutions on our own. That does mean, or does require lawmakers to act as well,” he said.

Sen. Amy Klobuchar (D-Minn.), who reacted to Clegg’s comments in a later interview on CNN, said she appreciated his willingness to talk about the matter, “but the time for action is now.”

“One, we need privacy legislation. We’re one of the few countries that doesn’t have a federal privacy policy that fits the sophistication of these tech companies,” she told Bash. “If you want to share all your private data, you have got to opt in and make an actual decision to do that.”

Klobuchar additionally advocated for expanding the Children’s Online Privacy Protection Act so that it includes protections for children older than 13. The current act, which covers only those under the age of 13, requires websites to safeguard children, including by not collecting their personal information. Klobuchar has also called for stronger antitrust laws that would prevent companies like Facebook from getting too powerful.

“I like competition. I believe in capitalism,” she said. “Breaking up Facebook will be up to, by the way, the attorney general’s office or something like that. We’re not going to do that by law. But what I think has been lacking here is any serious review of these mergers, not just with Facebook, with all of these tech companies, with pharmaceutical companies.”