Why can’t YouTube automatically catch re-uploads of disturbing footage?

After a man used Facebook to livestream his attack on two mosques in New Zealand last night, the video quickly spread to YouTube. Moderators scrambled to remove the disturbing footage, but new uploads of the video kept appearing. Many observers asked: since YouTube has a tool that automatically identifies copyrighted content, why can't it identify this video and remove it automatically?

Exact re-uploads of the video are blocked by YouTube, but videos that contain clips of the footage are sent to human moderators for review, The Verge has learned. Part of the reason is to ensure that news videos using portions of the footage in their segments are not removed in the process.

YouTube's safety team considers it a balancing act, according to sources familiar with its thinking. For major news events, such as yesterday's shooting, the YouTube team uses a system that is similar to its copyright tool, Content ID, but not exactly the same. It searches for re-uploaded versions of the original video by matching metadata and visual similarity. If an upload is an unedited re-upload, it is removed. If it has been edited, the tool flags it to a team of human moderators, both full-time YouTube employees and contractors, who determine whether the video violates company policies.
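The triage described above, automatic removal for exact copies, human review for edited ones, can be sketched roughly as follows. All names and thresholds here are hypothetical illustrations; YouTube's actual matching system is far more sophisticated and is not public.

```python
# Hypothetical sketch of the re-upload triage flow described above.
# Fingerprints are modeled as lists of per-frame hashes; "compare" and
# the thresholds are invented for illustration.

def compare(a, b):
    # Placeholder similarity: fraction of frame hashes that match.
    matches = sum(1 for x, y in zip(a, b) if x == y)
    return matches / max(len(a), len(b))

def triage_upload(upload_fp, reference_fp,
                  exact_threshold=0.99, partial_threshold=0.6):
    """Return "auto_remove", "human_review", or "allow"."""
    similarity = compare(upload_fp, reference_fp)
    if similarity >= exact_threshold:
        # Unedited re-upload: removed automatically.
        return "auto_remove"
    if similarity >= partial_threshold:
        # Edited or partial use (e.g. a news segment): flagged
        # for human moderators to judge against policy.
        return "human_review"
    return "allow"
```

An exact copy (`triage_upload([1, 2, 3, 4], [1, 2, 3, 4])`) is removed outright, while a partially matching upload goes to human review, mirroring the two paths the sources describe.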

YouTube also has a system for immediately removing child pornography and terrorism-related content, which works by fingerprinting the footage using a hash system. But that system does not apply in cases like this one, because of the footage's potential news value. YouTube believes that removing newsworthy videos would be similarly harmful. YouTube prohibits footage intended to "shock or disgust viewers," which can include the aftermath of an attack. If used for news purposes, however, YouTube says the footage is allowed, though it may be age-restricted to protect younger viewers.
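A toy illustration of hash-based fingerprinting, the technique behind the instant-removal system mentioned above. This sketch uses a cryptographic hash, which only catches byte-identical files; real systems rely on perceptual fingerprints that survive re-encoding and cropping, and the details below are assumptions for illustration, not YouTube's implementation.

```python
# Toy hash-based blocklist: known-bad footage is fingerprinted once,
# and any byte-identical upload is caught instantly.

import hashlib

BLOCKED_HASHES = set()

def fingerprint(data: bytes) -> str:
    # SHA-256 digest of the raw bytes serves as the fingerprint.
    return hashlib.sha256(data).hexdigest()

def block(data: bytes) -> None:
    BLOCKED_HASHES.add(fingerprint(data))

def is_blocked(data: bytes) -> bool:
    return fingerprint(data) in BLOCKED_HASHES

block(b"original footage bytes")
# An exact copy is caught instantly...
print(is_blocked(b"original footage bytes"))   # True
# ...but a single changed byte (as re-encoding would cause) slips
# through, which is why exact hashing alone cannot catch edited clips.
print(is_blocked(b"original footage bytes!"))  # False
```

This limitation is exactly why edited versions of the video have to fall back to the similarity matching and human review described earlier.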

The other problem is that YouTube's content identification system is not designed to handle breaking news events. Rasty Turek, CEO of Pex, a video analytics platform that is also building a tool to identify re-uploaded or stolen content, told The Verge that the problem lies in how the product is implemented. Turek, who studies YouTube's content identification software closely, points out that the software takes 30 seconds to even register that something has been re-uploaded before flagging it for manual review. A YouTube spokesperson could not tell The Verge whether that number was accurate.

"They never tried to have something like this for this type of event," Turek said. "They may come up with something after this kind of situation, but that will take months to implement even if [CEO] Susan Wojcicki orders it today."

YouTube's content identification tool takes "a couple of minutes, or even hours," to register content, Turek said, which is usually not a problem for copyright issues but poses real problems when applied to breaking-news situations.

Turek says the pressure to do more, and to do it faster, is growing. "The pressure never used to be that high," he said. "There is no harm to society when copyrighted material or leaks are not removed immediately. Here, however, there is harm to society."

The next big hurdle, one both YouTube and Turek can agree on, is catching live broadcasts as they happen. According to Turek, that is nearly impossible, because the content of a live stream is constantly changing.

That's why live streaming is considered a high-risk area for YouTube. People who violate the rules in live broadcasts, which are sometimes caught using Content ID once the stream ends, lose their streaming privileges, because it is an area YouTube cannot monitor as well. YouTube teams are working on it, according to the company, but they acknowledge that it is very difficult, and Turek agrees.

"No one can identify this live," Turek said. "You can blame YouTube for many things, but nobody on this planet can fix live broadcasting at this time."

For now, YouTube is focused on scanning every video that appears with metadata and imagery similar to the shooter's original live stream, and on determining what is newsworthy and what violates its rules. That may be all it can handle right now, but for critics like Turek, it is not enough.

"This must be a priority for leadership," he said. "Once they decide this is a priority and give the team adequate resources, the team will solve it. There is no doubt about it."
