Disturbing YouTube content

If you’ve ever spent time with a relative under the age of six who watches YouTube, you’ve probably seen dozens of toy unwrapping videos and heard the Finger Family counting song one too many times. Channels targeting kids, like “Ryan ToysReview” or “Seven Awesome Kids,” rack up millions upon millions of views, profiting from parents who plop an iPad in front of their children for hours and let autoplay queue up video after video of animated cartoons and family-friendly pranks.

While most content for children on YouTube is innocent, several prominent figures and major companies have recently called attention to a darker side of YouTube Kids. In early 2017, YouTube news commentator Philip DeFranco discussed what has come to be called “Elsa-gate,” bringing to light hundreds of channels that game the YouTube algorithm by posting videos of cartoon characters engaging in disturbing behavior: thousands of videos show violent and sexual content featuring animated or live-action versions of characters like Mickey Mouse, Spider-Man and Elsa from Frozen.

These videos are tagged under children’s categories, and because of YouTube’s autoplay feature, parents who put on an innocent toy unwrapping video may have unknowingly exposed their young children to wildly inappropriate content. Some of these videos have millions of views, and, more alarmingly, most are monetized. Nobody is quite sure where these videos come from, but YouTube has since claimed to be working on the problem, deleting over 150,000 videos last November.

This phenomenon has received some attention: technology and media expert James Bridle gave a TED Talk last April arguing that although YouTube has made improvements to its vetting process, the effort hasn’t been enough. And as disturbing content continues to be monetized and shown to kids months after YouTube tightened its safeguards, I have to agree.

YouTube has never done a very good job of determining what is and isn’t appropriate for advertising, drawing backlash for quietly demonetizing smaller channels and LGBTQ content while actively promoting controversial figures like Logan Paul. Few outside the company know exactly how YouTube vets videos, but it should be developing AI that can verify that family-friendly content is indeed family-friendly and hiring more human moderators to judge what is appropriate to show the public. As the largest platform for video creators, YouTube isn’t going to disappear anytime soon, making it all the more important for the company to improve its systems of monetization and advertising.

While the added safeguards and moderators were a step in the right direction, YouTube needs to be conscious of what’s on the line: whether it’s smaller channels being demonetized for talking about mental health, or the children on the other side of the screen watching Peppa Pig abduct and murder someone.