YouTube’s Family Friendly Content Has Some Fatal Flaws

Kristin MacLaughlin

You might think YouTube's family-friendly videos are safe for your kids, but we found some fatal flaws.

Many brands use YouTube to target advertising to parents on children's videos, and recently they've discovered the platform has some fatal flaws. After finding their ads on inappropriate videos, corporate sponsors such as Adidas, eBay, Amazon, Mars (M&Ms, Snickers), and Mondelez (Oreos, Cadbury) pushed YouTube to take action, threatening to freeze their ad accounts indefinitely.

Mars, one of the most vocal sponsors, said, “We are shocked and appalled to see that our adverts have appeared alongside such exploitative and inappropriate content. It is in stark contrast to who we are and what we believe. We have taken the decision to immediately suspend all our online advertising on YouTube and Google globally.”

After intense pressure from advertisers, YouTube finally took action: it recently removed more than 270 accounts and over 150,000 videos from the platform, and disabled commenting on more than 625,000 videos targeted by child predators.

YouTube Kids Isn’t Kid-Safe

On-demand video viewing is intoxicating to almost everyone involved, including parents, children, content creators, and advertisers. The challenge comes when a platform such as YouTube has no process in place to ensure that posted videos actually come from a reputable content producer.

For example, if your child is a fan of Peppa Pig, take the time to make sure the videos your child is watching are from a trusted source and not a counterfeit cartoon. Unfortunately, video producers can counterfeit, copy, pirate, and change story lines to create videos they hope will go viral, and some of these revised story lines are extremely graphic and inappropriate. Countless producers create counterfeit Peppa Pig videos that show characters engaged in extreme violence, including wielding knives and guns, Peppa eating her father, and drinking bleach.

A recent New York Times article identified various other animated videos that slipped past the filter and show monstrous characters that would traumatize most adults and preschoolers alike, all of which were easily viewed on the “kid-safe” YouTube Kids app.

Make no mistake, exposing children to these types of videos is child abuse, and as parents we must be very wary of any platform's ability to police its user-generated content. Please don't be fooled by any platform that uses the term “kid-safe.”

How Prolific is Children’s Programming on YouTube?

Kids' programming is the most popular category on YouTube, with four of the top five U.S. channels featuring kids' content, according to Tubular Labs. Additionally, over 1 billion hours of video are watched on YouTube every day. So make no mistake, your kids most likely are on YouTube!

Content creators love it too, since YouTube gives its video producers 55% of the ad revenue generated. With that kind of money to be made, it's no wonder YouTube offers a continuous stream of videos for as long as your child will watch.

What Parents Need to Understand About YouTube Filters

The base problem with YouTube's filter is similar to an issue Facebook was criticized for earlier in the year. The power of Facebook and YouTube is that they provide a platform for users to post and view content, but that content is often not appropriate for children.

While I have no doubt these technology companies take inappropriate-content issues seriously (especially when advertisers take notice), it is virtually impossible for them to put safeguards in place to screen all content, even using a combination of artificial intelligence (AI) and human moderators.

YouTube receives over 300 hours of newly submitted video every minute, and although its moderation policies are not public record, most reporters who have covered YouTube have assumed that videos are only reviewed after they have been flagged by a user as inappropriate.

YouTube also uses algorithms to suggest videos based on a child's interests, so if your child is a fan of Elsa from the Disney movie Frozen or of Spider-Man, the algorithm looks for keywords in the video description like “Elsa” or “Spider-Man.” Within that mix of videos, though, it's possible that Elsa and Spidey appear in counterfeit videos doing activities that are not G-rated.

Steps Parents Can Take for a Safer YouTube Experience

Setting limits on screen time and the type of content is a constant priority for parents, and let's face it, some days that is an easier task than others. Below are a few steps that make imposing limits a bit easier.


Enable Restricted Mode

Restricted Mode hides videos that may contain inappropriate content, as flagged by users and YouTube's algorithms. Older children may find this mode too restrictive, so you will need to evaluate based on your child's age.

Step 1: Open the YouTube app and tap the user icon near the top right

Step 2: Tap the settings icon in the menu

Step 3: Toggle the switch next to Restricted Mode

Pause Watch History & Search History

If you want to disable the endless loop of suggested videos based on your watch and search history, consider pausing these two options.

Step 1: Open the YouTube app and tap the user icon near the top right

Step 2: Tap the settings icon in the menu

Step 3: Scroll down to Privacy

Step 4: Toggle the switches next to Pause Watch History and Pause Search History

Create Playlists

Rather than letting YouTube auto-generate a playlist for your kids, choose the videos you feel are appropriate yourself.

Step 1: Open the YouTube app

Step 2: Search for a video you think your child will like and, if you have not seen it, watch it to make sure it is appropriate

Step 3: Tap the menu option under the video

Step 4: Tap “Add to playlist”

Step 5: If you are not adding to an existing playlist, tap “Create a playlist” and name your playlist