The real-time broadcast of the violence was seen fewer than 200 times, Facebook deputy general counsel Chris Sonderby said in a statement. But it had begun gaining traction by the time the first user alerted Facebook to it ― 12 minutes after the 17-minute livestream ended.
The original video was played about 4,000 times before being removed from the platform. Facebook said Saturday that it had removed 1.5 million videos of the shooting, 1.2 million of which were blocked before they finished uploading.
But a version of the video continued to spread on the internet because an 8chan user shared a link on the message board, which describes itself as “the darkest reaches of the internet” and is seen as a breeding ground for hate.
Facebook said it removed the video from its platform immediately after being contacted by New Zealand police.
“We removed the personal accounts of the named suspect from Facebook and Instagram, and are actively identifying and removing any imposter accounts that surface,” Sonderby said, noting that the original Facebook Live video has been hashed, meaning visually similar uploads can be detected and removed automatically.
Because the platform had trouble detecting screen recordings and other visual variants of the shooting footage, Facebook said it is also using audio-matching technology to help catch new uploads.
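The matching technique Facebook describes compares compact fingerprints of content rather than the files themselves. Facebook has not disclosed its actual algorithms; the following is a minimal toy sketch of one well-known approach, the "average hash," which reduces an image to a short bit string so that near-identical copies land within a small Hamming distance of the original. All the sample data below is hypothetical.

```python
# Toy sketch of perceptual ("visual similarity") hashing, the general class of
# technique Facebook describes. Real systems are far more sophisticated; this
# is only an illustration of the idea.

def average_hash(pixels):
    """pixels: 2D list of grayscale values (0-255), e.g. a downscaled frame.
    Returns a bit string: 1 where a pixel is brighter than the mean."""
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    return ''.join('1' if p > mean else '0' for p in flat)

def hamming(a, b):
    """Number of differing bits; small distance suggests a re-upload."""
    return sum(x != y for x, y in zip(a, b))

# Hypothetical tiny 2x2 "frames": a re-encoded copy keeps the same hash
# despite compression noise, while an unrelated image does not.
original  = [[10, 200], [30, 220]]
reencoded = [[12, 198], [28, 225]]   # same scene, slight noise
unrelated = [[250, 5], [250, 240]]

h0 = average_hash(original)
print(hamming(h0, average_hash(reencoded)))  # 0: flagged as a match
print(hamming(h0, average_hash(unrelated)))  # 3: clearly different
```

The same fingerprint-and-compare idea underlies audio matching: a recording of a screen playing the video changes every pixel, but the soundtrack's acoustic fingerprint can still match, which is why adding an audio channel helps catch variants that visual hashing misses.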
“We designated both shootings as terror attacks, meaning that any praise, support and representation of the events violates our Community Standards and is not permitted on Facebook,” Sonderby said.
The massacre unfolded at two mosques in Christchurch on Friday, killing at least 50 people. The video showed only the attack on the first mosque.
The alleged gunman appears to have espoused hatred for Muslim immigrants and embraced white supremacy in a 74-page manifesto posted on 8chan and Twitter before the attack.