On the surface, it seems like a great idea. YouTube is home to all kinds of quality video content, so a dedicated, age-appropriate version for children makes sense. However, even with a host of parental controls, YouTube Kids is still plagued with content problems.
Whether it’s conspiracy videos or outright disturbing clips, YouTube Kids is flooded with terrible content. But that’s not surprising. With 10 hours of video uploaded every second, trying to police YouTube content is like trying to slow the water coming out of a fire hydrant. A spokesperson said last year, “We’ve taken a series of tough actions to better protect families on YouTube and YouTube Kids, including getting content down faster through technology, hiring more people to help us tackle content that breaks our rules and cutting the revenue streams to content that misleads families.” But those actions haven’t always been enough.
In response to criticism, YouTube has tried to deter users from posting offensive videos. In 2017, it altered its policy on which types of content would be eligible for ad money. Those changes barred clips showing “family entertainment characters engaged in violent, sexual, vile or otherwise inappropriate behavior, even if done for comedic or satirical purposes” from earning money. YouTube also improved its parental controls. Last April, the company added an option to show only content that had been approved by “partners and the YouTube Kids team.” Then in September, it expanded that feature so that kids using the app would see only what their parents approved.