Facebook is offering few answers about a live video that shows the aftermath of the fatal police shooting of Philando Castile in a suburb of St. Paul, Minnesota.
Diamond Reynolds, Castile’s girlfriend, broadcast the graphic footage with a smartphone after an officer shot Castile during a traffic stop. In the video, he can be seen bleeding to death while police officers shout orders at Reynolds, who narrates the situation as it unfolds.
Reynolds’ video disappeared from Facebook for about an hour after the stream concluded, a Facebook spokeswoman told The Huffington Post, making it impossible for users to access or share the footage in the moments following the shooting. The video was eventually reinstated with a graphic content warning added by the social network’s community operations team.
The way the traumatic encounter unfolded on the world’s largest social network says a lot about Facebook’s difficult position in modern media.
Facebook isn’t a news organization, technically. It’s a tech enterprise. But its products ostensibly exist to help users tell stories. So, when something happens with those stories, where does Facebook’s responsibility to the public begin and end?
Two weird things happened with Reynolds’ video: It was temporarily deleted, and it reappeared with a content warning. Facebook doesn’t have much to say about either of these things, claiming a “technical glitch” caused the video to go down.
“We’re very sorry that the video was temporarily inaccessible. It was down due to a technical glitch and restored as soon as we were able to investigate,” the spokeswoman told HuffPost, echoing a previous statement to TechCrunch.
Facebook would not elaborate when pressed further. Asked if the glitch might have been related to an automated video moderation system, the spokeswoman said only, “Sorry, we don’t have any further details to share.”
It’s a big deal when news outlets delete material from their websites. Removing content raises natural questions about censorship, accountability and business concerns ― as BuzzFeed learned last year, when it was criticized for deleting three posts under pressure from advertisers. Glitch or not, it’s natural for consumers to wonder why the video temporarily disappeared from Facebook, and the company’s response offers few helpful details.
This lack of detail is especially concerning because it’s unclear where Facebook stands on this type of graphic content. The company could decide to ban all footage of weapons and death, but it hasn’t. So its overall editorial stance is ambiguous, especially in the era of live video.
According to Facebook’s community guidelines:
Sometimes, [content posted to Facebook involves] violence and graphic images of public interest or concern, such as human rights abuses or acts of terrorism. In many instances, when people share this type of content, they are condemning it or raising awareness about it. We remove graphic images when they are shared for sadistic pleasure or to celebrate or glorify violence.
We also ask that people warn their audience about what they are about to see if it includes graphic violence.
As for the graphic content warning, adding such a label amounts to an editorial decision ― the sort you might see before war footage on the evening news. Someone at Facebook decided it was appropriate to warn users who might be unsettled by the video. That’s unquestionably the right call, and it’s one that HuffPost also made in its article.
Yet Facebook, reeling from a controversy over how its employees handle “trending” news, has shied away from being seen as a media enterprise. In fact, the company has taken a number of steps specifically geared toward de-emphasizing its role as a news platform.
That isn’t going to work.
Emily Bell, the founding director of the Tow Center for Digital Journalism at Columbia University, told HuffPost that it’s time for Facebook to be more transparent about how it delivers news to its 1.65 billion users.
“They’ve always said ‘we’re just here as a platform,’ but once you introduce something like livestreaming for everybody, then clearly that’s not possible,” Bell said. “I don’t know what steps exactly went on within Facebook, except the video went down, and then it went up again with a graphic content warning. That’s definitely editorial intervention.”
Facebook could mitigate these concerns by hiring an editor who interfaces with the company’s users, Bell suggests.
“There is a role in explaining to the public the context and the process by which certain things are published and certain things are deleted on platforms,” she said. “They need somebody who is willing to engage in those kinds of debates, and somebody who’s able to make the process transparent, and who actually can go and ask difficult questions inside the company.”
The reason boils down to trust. Facebook is where millions of us get news today, and it’s the platform that directly hosted the shooting footage from Minnesota. But its processes are opaque, with few clarifying details provided to the media upon request.
If Facebook delivers the news, doesn’t it have a responsibility to the public that consumes that news? Should it not explain its editorial choices, whether they amount to adding a graphic content label to a video or tweaking an algorithm to prioritize some stories over others?
“Most people feel that it would be beneficial if they were a little bit more transparent about the steps and thinking that goes on behind their decisions,” Bell said. “If they do have sentient views about what the right thing to do is with the content they have, as opposed to ‘hey, we just put it all up there,’ that’s a real sort of maturity of the role the platform plays. But it’s very complicated as well.”
UPDATE: 8:45 p.m. ― In a post on Facebook Thursday night, Facebook CEO Mark Zuckerberg said his heart “goes out to the Castile family” and added that Reynolds’ video “reminds us why coming together to build a more open and connected world is so important ― and how far we still have to go.”
He did not comment on why the video briefly disappeared from his site.