For years, YouTube ignored its employees' requests to address and remove toxic videos in pursuit of higher engagement, Bloomberg reports. According to more than 20 current and former YouTube employees, staff offered proposals to stop the spread of videos containing conspiracy theories and other disturbing content, but leadership was reportedly more interested in boosting engagement than in heeding those warnings. One proposal would have let content that came "close to the line" of violating policies stay on the platform, but kept it out of the recommendations tab. YouTube rejected that suggestion in 2016, a former engineer said, and instead kept recommending videos no matter how controversial they were. According to employees, the internal goal was to reach one billion hours of watch time per day.
"I can say with great confidence that they were deeply mistaken," said engineer Bloomberg . (YouTube implemented a policy in January of 2019 which is similar to what he initially suggested.)
Employees outside the moderation team were also discouraged from searching for toxic videos on YouTube, since lawyers said the company would bear greater liability if there was proof that employees knew those videos existed.
At least five senior employees have left YouTube over the company's unwillingness to address the problem. As one former employee described it, YouTube CEO Susan Wojcicki "never put her fingers on the scale," and her view was simply to "run the company" rather than deal with the avalanche of misinformation and dangerous content. A YouTube spokesperson said the company began taking action in late 2016 and started demonetizing channels that promoted harmful content in 2017. However, as recently as the end of that year, fewer than 20 employees worked on its "trust and safety" team. You can read Bloomberg's full report here for more anecdotes about how employees struggled to keep controversial videos from going viral.
In 2018, YouTube tried to keep fake news and conspiracies from spreading on its platform with information boxes, and this year it began pulling ads from potentially harmful content. Still, even if YouTube can keep controversial videos from spreading, it will eventually have to confront the central problem of content moderation, since toxic content remains rampant on the site.