On YouTube Kids, Startling Videos Slip Past Filters

The app has more than 11 million weekly viewers. But some disturbing knockoff videos have reached children, upsetting parents.
It was a typical night in Staci Burns’s house outside Fort Wayne, Ind. She was cooking dinner while her 3-year-old son, Isaac, watched videos on the YouTube Kids app on an iPad. Suddenly he cried out, “Mommy, the monster scares me!”
When Ms. Burns walked over, Isaac was watching a video featuring crude renderings of the characters from “PAW Patrol,” a Nickelodeon show that is popular among preschoolers, screaming in a car. The vehicle hurtled into a light pole and burst into flames.
The 10-minute clip, “PAW Patrol Babies Pretend to Die Suicide by Annabelle Hypnotized,” was a nightmarish imitation of an animated series in which a boy and a pack of rescue dogs protect their community from troubles like runaway kittens and rock slides. In the video Isaac watched, some characters died and one walked off a roof after being hypnotized by a likeness of a doll possessed by a demon.
“My initial response was anger,” said Ms. Burns, a nurse, who credits the app with helping Isaac to learn colors and letters before other boys his age. “My poor little innocent boy, he’s the sweetest thing, and then there are these horrible, evil people out there that just get their kicks off of making stuff like this to torment children.”
Parents and children have flocked to Google-owned YouTube Kids since it was introduced in early 2015. The app’s more than 11 million weekly viewers are drawn in by its seemingly infinite supply of clips, including those from popular shows by Disney and Nickelodeon, and the knowledge that the app is supposed to contain only child-friendly content that has been automatically filtered from the main YouTube site.
But the app contains dark corners, too, as videos that are disturbing for children slip past its filters, either by mistake or because bad actors have found ways to fool the YouTube Kids algorithms.
In recent months, parents like Ms. Burns have complained that their children have been shown videos with well-known characters in violent or lewd situations and other clips with disturbing imagery, sometimes set to nursery rhymes. Many have taken to Facebook to warn others and share video screenshots showing moments ranging from a Claymation Spider-Man urinating on Elsa of “Frozen” to Nick Jr. characters in a strip club.
Malik Ducard, YouTube’s global head of family and learning content, said that the inappropriate videos were “the extreme needle in the haystack,” but that “making the app family friendly is of the utmost importance to us.”
While the offending videos are a tiny fraction of YouTube Kids’ universe, they are another example of the potential for abuse on digital media platforms that rely on computer algorithms, rather than humans, to police the content that appears in front of people — in this case, very young people.
And they show, at a time when Congress is closely scrutinizing technology giants, how rules that govern at least some of the content on children’s television fail to extend to the digital world.
When videos are uploaded to YouTube, algorithms determine whether they are appropriate for YouTube Kids. The videos are continually monitored after that, Mr. Ducard said, a process that is “multilayered and uses a lot of machine learning.” Several parents said they expected the app to be safer because it asked during setup whether their child was in preschool or older.
Mr. Ducard said that while YouTube Kids may highlight some content, like Halloween videos in October, “it isn’t a curated experience.” Instead, “parents are in the driver’s seat,” he said, pointing to the ability to block channels, set usage timers and disable search results.
Parents are also encouraged to report inappropriate videos, which someone at YouTube then manually reviews, he said. He noted that in the past 30 days, “less than .005 percent” of the millions of videos viewed in the app were removed for being inappropriate.
“We strive,” he added, “to make that fraction even lower.”
Holly Hart of Gray, Tenn., said she was recently reading while her 3-year-old daughter was in the room when she noticed that Disney Junior characters in the video her daughter was watching started “turning into monsters and trying to feed each other to alligators.” An image previewing a recommended video showed the characters in a provocative pose.
“It was an eye-opener for me,” said Ms. Hart, who had downloaded the app because it was being used at the local elementary school.
Not all of the inappropriate videos feature cartoons. Alisa Clark Wilcken of Vernal, Utah, said her 4-year-old son had recently seen a video of a family playing roughly with a young girl, including a scene in which her forehead is shaved, causing her to wail and appear to bleed.
Most of the videos flagged by parents were uploaded to YouTube in recent months by anonymous users with names like Kids Channel TV and Super Moon TV. The videos’ titles and descriptions feature popular character names and terms like “education” and “learn colors.”
They are independently animated, presumably to avoid copyright violations and detection. Some clips uploaded as recently as August have millions of views on the main YouTube site and run automatically placed ads, suggesting they are financially lucrative for the makers as well as YouTube, which shares in ad revenue. It is not clear how many of those views came on YouTube Kids.
One video on YouTube Kids from the account Subin TV shows the “PAW Patrol” characters in a strip club. One of them then visits a doctor and asks for her cartoon legs to be replaced with long, provocative human legs in stilettos. The account’s description says, “Video created with the purpose of learning and development of children!”
The account that posted the video seen by Ms. Burns’s son is named Super Ares TV and has a Facebook page called PAW Patrol Awesome TV. Questions sent there were mostly ignored, though the account did reply: “That’s a Cute character and video is a funny story, take it easy, that’s it.”
The Super Ares TV account seems to be linked to a number of other channels targeting children with cartoon imitations, based on their similar channel fonts, animation style and Greek mythology-inspired names, from Super Hermes TV and Super Apollo TV to Super Hera TV.
A Super Zeus TV account included a link to a shopping site called SuperKidsShop.com, which is registered in Ho Chi Minh City, Vietnam. A call to the phone number listed in that site’s registration records was answered by a man who declined to identify himself. He said that his partners were responsible for the videos and that a team of about 100 people worked on them. He said he would forward email requests for comment to them. Those emails went unanswered.
Dr. Michael Rich, a pediatrics professor at Harvard Medical School and the director of the Center on Media and Child Health, said such videos brought up a host of issues for children.