YouTube will not stop recommending videos of young children, despite ongoing concerns that predators are being presented with these videos through the company’s recommendation algorithm.
A new report from The New York Times found that, despite evidence from independent researchers that YouTube’s algorithm helps videos of children spread among predatory circles, YouTube’s teams don’t want to turn off recommendations because it would hurt creators by reducing traffic driven to their videos. Instead, the company will “limit recommendations on videos that it deems as putting children at risk,” the Times writes.
Limiting recommendations is the latest attempt on YouTube’s part to control its pedophilia problem. The company instituted major changes in February when it was first alerted that predators were using the comments section on videos starring children to engage in sexually exploitative conversations. YouTube’s safety team decided to close the comment sections on most videos that star minors. It’s unclear from The New York Times article whether comments were turned off on the videos cited in the story. The Verge has reached out to YouTube for further details.
YouTube’s terms of service state that children under the age of 13 aren’t allowed to have their own accounts, but many of these innocuous videos are uploaded by older family members. Many children are also key components of an entire genre on YouTube known as “family vlogging.” Creators like The Ace Family (16.4 million subscribers), Tydus and Cor (2.8 million subscribers), Daily Bumps (4.6 million subscribers), and Roman Atwood Vlogs (15.2 million subscribers) put their kids front and center. After viewers watch these videos, YouTube’s algorithm recommends more family vlogging content and videos geared around children.
YouTube is trying to balance the success of these creators, and its platform as a whole, with critical moderation concerns. Family vloggers were left frustrated by YouTube’s decision to close comments earlier this year. Several family vloggers told The Verge that they understood YouTube needed to find a solution to this problem, but many said they were left feeling like the change would be the end of their careers.
YouTube’s statement to the Times reiterates this double-edged sword: YouTube needs to ensure that its creators are protected from bad actors, but it also wants to promise its extensive creator base that they can continue to operate. When YouTube decided to remove comments, a company spokesperson told The Verge that it understood the frustration but added, “we also know that this is the right thing to do to protect the YouTube community.” Still, changing the algorithm also means YouTube is taking a hit in an area the company deeply values: watch time. YouTube is moving further away from “engagement”-based recommendations, but watch time is still crucial to the company.