In the first 24 hours after the deadly mass shooting in New Zealand, Facebook says it removed 1.5 million videos of the attack, of which 1.2 million were blocked at upload.
The company made the announcement in a tweet following up on a previous statement that it had been alerted by authorities and had removed the Facebook and Instagram accounts of the alleged shooter. Facebook spokeswoman Mia Garlick says the company is also "removing all edited versions of the video that do not show graphic content."
In the first 24 hours we removed 1.5 million videos of the attack globally, of which over 1.2 million were blocked at upload …
— Facebook Newsroom (@fbnewsroom) March 17, 2019
We've reached out to Facebook for additional comment, and we'll update this post if we hear back.
The terrorist attack appears to have been designed to go viral, with the alleged shooter releasing a manifesto that referenced numerous figures, such as YouTuber Felix Kjellberg and Candace Owens, as well as white supremacist conspiracy theories. He also posted a 17-minute video to Facebook, Instagram, Twitter, and YouTube, which spread widely even as all of those companies worked to prevent it.
The attack has prompted social networking sites to respond to such content: Facebook, Twitter, and YouTube have been working to remove the videos. Reddit banned a subreddit called r/watchpeopledie, while Valve began removing tributes to the alleged shooter that had been posted on user profiles.
But Facebook's removal of more than a million copies (and edited versions) of the video speaks to the enormous challenge it faces in moderating the site. In its drive for rapid growth, its efforts to expand its ability to monitor and remove offensive, illegal, or disruptive content have gone under-resourced, allowing bad actors to use the platform to spread their messages quickly. There have been other high-profile examples of murders or terrorist attacks on the platform. Indeed, as Facebook has worked to address the problem, it has turned to outside contractors, some of whom have been radicalized and traumatized by the very act of removing such content.
After the attack, numerous world leaders have called out Facebook for its role in spreading this type of content. According to Reuters, New Zealand Prime Minister Jacinda Ardern said she wants to talk to the company about its live broadcasting, while British Labour leader Jeremy Corbyn said such platforms must act, and raised the question of regulation.
Updated March 17th, 2019, 11:17 a.m. ET: Updated to clarify that Facebook removed a total of 1.5 million videos, with 1.2 million blocked at upload.