YouTube faced an "unprecedented volume" of uploads containing footage of last week's mass shootings in New Zealand as the platform scrambled to remove them, the company's chief product officer told The Washington Post.
The Friday killings at two mosques in Christchurch were recorded and spread across social media worldwide, apparently as part of a plan designed to disseminate the images online. As the footage circulated across the internet, it was repeatedly re-uploaded. Yesterday, Facebook said it had removed 1.5 million videos of the attack in the first 24 hours after the shooting.
While YouTube did not say precisely how many videos were ultimately deleted, the company faced an avalanche of uploads after the shooting, and moderators worked through the night to remove tens of thousands of videos containing the footage, chief product officer Neal Mohan told the Post. Some uploads were reportedly altered to evade detection, with users modifying the footage slightly to keep automated tools from flagging it.
Copies were uploaded as fast as one per second, and the platform eventually disabled some search features to limit their visibility. YouTube also bypassed some human review steps to speed up removals, Mohan told the Post. (The service had said on Friday that it was sending potentially newsworthy videos containing clips of the footage to humans for review.)
Social media companies are facing new questions about platform moderation after the shooting, as the footage spread not only across the largest services but also into darker corners of the internet. While the services described the incident as unprecedented, the spread of the video has prompted lawmakers to call on the companies to do more to police their platforms.