YouTube won’t stop recommending videos of children, despite pedophilia problem

YouTube will not stop recommending videos of young children, despite ongoing concerns that its recommendation algorithm is serving these videos to predators.

A new report from The New York Times found that, despite evidence from independent researchers that YouTube's algorithm helps videos of children spread among predatory circles, YouTube's teams do not want to disable recommendations entirely because doing so would hurt creators by reducing the traffic their videos generate. Instead, the company "will limit recommendations on videos that it deems as putting children at risk," the Times writes.

Limiting recommendations is YouTube's latest attempt to contain its pedophilia problem. The company rolled out major changes in February after it was first alerted that predators were using the comments sections on videos featuring children to engage in sexually exploitative conversations. YouTube's safety team decided to disable comments on most videos featuring children. It is not clear from The New York Times' article whether comments were disabled on a video used as an example in the story. The Verge has reached out to YouTube for more details.

YouTube's terms of service state that children under the age of 13 cannot have their own accounts, but many of these innocuous videos are uploaded by older family members. Children are also central to an entire genre on YouTube known as "family vlogging." Creators such as The Ace Family (16.4 million subscribers), Tydus and Cor (2.8 million subscribers), Daily Bumps (4.6 million subscribers), and Roman Atwood Vlogs (15.2 million subscribers) put their children front and center. After a viewer watches these videos, YouTube's algorithm recommends more family vlogging content and videos geared toward children.

YouTube is trying to balance the success of these creators, and of the platform in general, against critical moderation concerns. Family vloggers were frustrated by YouTube's decision to disable comments earlier this year. A couple of family vloggers told The Verge that they understood YouTube needed to find a solution to the problem, but many said they worried the change would mean the end of their careers.

YouTube's statement to the Times reiterates this double-edged sword: the company has to ensure that children are protected from bad actors, but it also wants to reassure its broad base of creators that they can continue to operate. When YouTube decided to remove comments, a company spokesperson told The Verge that the company understood the frustration, but added, "we also know that this is the right thing to do to protect the YouTube community." Changing the algorithm also means that YouTube takes a hit in an area the company deeply values: watch time. YouTube has been moving away from recommendations based purely on engagement, but watch time is still crucial to the company.
