Self-harm clips hidden in kids' cartoons


A mom in Ocala, Florida, found videos on YouTube Kids that gave children instructions for suicide.

She said a man known as Filthy Frank had been edited into the video and was teaching kids how to slit their wrists and commit suicide. Though YouTube has deleted channels and removed videos, Hess points out that it is still easy to find a plethora of "horrifying" content aimed at children on YouTube Kids. In the spliced-in clip, the man tells viewers: "Sideways for attention. Longways for results."

Hess became involved in seeking out disturbing content in children's clips on social media after witnessing an increase in child suicides in her emergency room in recent years.

"I think our kids are facing a whole new world with social media and internet access," she said. "We also need to fight to have the developers of social media platforms held responsible when they do not assure that age restrictions are followed and when they do not remove inappropriate and/or risky material when reported."

In July 2018, Hess first saw a video of a man holding out his arm and instructing kids on how to kill themselves. "I immediately turned off the video," she said.

Fortnite has proven extremely popular with children and teens, but its videos are reportedly being hijacked by the Momo Challenge.

As for YouTube Kids, the company said: "We work to ensure the videos in YouTube Kids are family-friendly and take feedback very seriously."


"We rely on both user flagging and smart detection technology to flag this content for our reviewers," Faville added.

"We remove flagged videos that violate our policies."

Does your kid spend time, even a little, on YouTube watching seemingly innocent cartoon videos?

According to numerous reports, children have been watching Peppa Pig and Fortnite videos that then feature Momo instructing viewers to self-harm or perform other unsafe stunts. As The Post's Elizabeth Dwoskin reported last month, YouTube announced that it was rebuilding its recommendation algorithm to prevent it from promoting videos that include conspiracy theories and other bogus information, though the videos would remain on the site.

Nadine Kaslow, a former president of the American Psychological Association, told The Washington Post that taking down the videos won't be enough. Vulnerable children, perhaps too young to understand suicide, may develop nightmares or try harming themselves out of curiosity, she warned.

"I had to stop, but I could have kept going," Hess said.

"But no system is flawless and inappropriate videos can slip through, so we're constantly working to improve our safeguards and offer more features to help parents create the right experience for their families," the website's description says.

If you have thoughts of suicide, confidential help is available for free at the National Suicide Prevention Lifeline.
