
It was a typical night in Staci Burns’ house outside Fort Wayne, Indiana. She was cooking dinner while son Isaac, 3, watched videos on the YouTube Kids app on an iPad. Suddenly he cried out, “Mommy, the monster scares me!”

When Burns walked over, Isaac was watching a video featuring crude renderings of the characters from “PAW Patrol,” a Nickelodeon show popular among preschoolers, screaming in a car. The vehicle hurtled into a light pole and burst into flames.

The 10-minute clip, “PAW Patrol Babies Pretend to Die Suicide by Annabelle Hypnotized,” was a nightmarish imitation of an animated series in which a boy and a pack of rescue dogs protect their community from troubles like runaway kittens and rock slides.

Parents and children have flocked to Google-owned YouTube Kids since it was introduced in early 2015. The app’s more than 11 million weekly viewers are drawn in by its seemingly infinite supply of clips, including those from popular shows by Disney and Nickelodeon, and the knowledge that the app is supposed to contain only child-friendly content.

But the app contains dark corners, as videos that are disturbing for children slip past its filters, either by mistake or because bad actors have found ways to fool the YouTube Kids algorithms.

In recent months, parents like Burns have complained that their children have been shown videos with well-known characters in violent or lewd situations and other clips with disturbing imagery, sometimes set to nursery rhymes. Many have taken to Facebook to warn others and to share video screenshots showing moments ranging from a claymation Spider-Man urinating on Elsa of “Frozen” to Nick Jr. characters in a strip club.

“My poor little innocent boy, he’s the sweetest thing,” said Burns, a nurse who credits the app with helping Isaac to learn colors and letters before other boys his age. “And then there are these horrible, horrible, evil people out there that just get their kicks off of making stuff like this to torment children.”

Malik Ducard, YouTube’s global head of family and learning content, said the inappropriate videos were “the extreme needle in the haystack,” but that “making the app family-friendly is of the utmost importance to us.”

While the offending videos are a tiny fraction of YouTube Kids’ universe, they are another example of the potential for abuse on digital-media platforms that rely on computer algorithms, rather than humans, to police the content that appears in front of people — in this case, very young people.

And they show, at a time when Congress is closely scrutinizing technology giants, how rules that govern at least some of the content on children’s television fail to extend to the digital world.

When videos are uploaded to YouTube, algorithms determine whether they are appropriate for YouTube Kids. The videos are continually monitored after that, Ducard said, a process that is “multilayered and uses a lot of machine learning.” Several parents said they expected the app to be safer because it asked during setup whether their child was in preschool or older.

Ducard said that while YouTube Kids might highlight some content, like Halloween videos in October, “it isn’t a curated experience.” Instead, “parents are in the driver’s seat,” he said, pointing to the ability to block channels, set usage timers and disable search results.

Most videos flagged by parents were uploaded to YouTube in recent months by anonymous users with names like Kids Channel TV and Super Moon TV. The videos’ titles and descriptions feature popular character names and terms like “education” and “learn colors.”

They are independently animated, presumably to avoid copyright violations and detection. Some clips uploaded as recently as August have millions of views on the main YouTube site and run automatically placed ads, suggesting they are financially lucrative for the makers and YouTube, which shares in ad revenue. It is not clear how many of the views came on YouTube Kids.

One video on YouTube Kids from the account Subin TV shows “PAW Patrol” characters in a strip club. One of them then visits a doctor and asks for her cartoon legs to be replaced with long, provocative human legs in stiletto-heeled shoes.

The account that posted the video seen by Burns’ son is named Super Ares TV and has a Facebook page called PAW Patrol Awesome TV. Questions sent there were mostly ignored, though the account did reply: “That’s a Cute character and video is a funny story, take it easy, that’s it.”

The Super Ares TV account seems to be linked to a number of other channels targeting children with cartoon imitations, based on their similar channel fonts, animation style and Greek mythology-inspired names, from Super Hermes TV and Super Apollo TV to Super Hera TV.

A Super Zeus TV account included a link to a shopping site registered in Ho Chi Minh City, Vietnam.

A call to the phone number listed in the site’s registration records was answered by a man who declined to identify himself. He said his partners were responsible for the videos. He said he would forward email requests for comment to them. Those emails went unanswered.