Facebook says the video of the Christchurch shooting was viewed fewer than 200 times during its live broadcast, and about 4,000 times in total before it was removed. The company also says it received no user reports during the live broadcast; the first report arrived 12 minutes after the video ended.
In the 24 hours after the initial broadcast, people tried to upload the video 1.5 million times. The disparity highlights the particular challenge facing the social network when an attack like the one in Christchurch goes viral. Footage of the attack was published across multiple social networks, and links to the live broadcast and a manifesto had been posted to 8chan before the attack began.
Despite Facebook's claims, there are reasons to doubt these numbers. The platform's viewership metrics are currently the subject of a lawsuit accusing the social network of inflating its figures to attract advertisers.
The platform's account of when the first user report arrived has also been challenged by Right Wing Watch researcher Jared Holt, who says he reported the video shortly after finding it through its 8chan thread. Facebook's account also indicates that it removed the video after being contacted by the New Zealand Police, rather than in response to a user report, although it says it did so "within minutes" of being contacted.
Since then, the challenge for Facebook has been preventing the video from being uploaded again. The company says it generated a hash of the video to block unedited copies, and is using audio recognition to identify versions that were re-recorded off a screen. It blocked 1.2 million upload attempts in the first 24 hours, but that means roughly 300,000 copies still slipped through its filters.
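Facebook has not published details of its matching system, but the basic idea of hash-based blocking can be illustrated with a minimal sketch. The function names here are hypothetical; note that a plain cryptographic hash only catches byte-identical copies, which is exactly why edited or screen-recorded versions require fuzzier techniques like audio matching.

```python
import hashlib

def file_hash(data: bytes) -> str:
    """Return the SHA-256 digest of an upload's raw bytes."""
    return hashlib.sha256(data).hexdigest()

# Hashes of known copies of the banned video (illustrative placeholder data).
blocked_hashes = {file_hash(b"original-video-bytes")}

def should_block(upload: bytes) -> bool:
    """Block an upload only if it is byte-for-byte identical to a known copy."""
    return file_hash(upload) in blocked_hashes

print(should_block(b"original-video-bytes"))    # exact copy -> True
print(should_block(b"re-encoded video bytes"))  # altered copy -> False
```

Because even a one-byte change (re-encoding, cropping, screen recording) produces a completely different digest, exact hashing alone cannot stop the hundreds of thousands of modified versions described above.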
Facebook is not the only social network working to stop the spread of the attack footage. YouTube, Twitter, Reddit, and even Steam have been working to remove the content and tributes to the attacker.