https://gizmodo.com/youtube-bans-ai-reanimated-dead-kids-true-crime-videos-1851150159

YouTube: No More AI Zombie Children

Grifters on YouTube will have to find new ways to make money off of the suffering of real murder victims.

By Thomas Germain

Photo: Dilok Klaisataporn / Shutterstock.com (Shutterstock)

When Tim Berners-Lee invented the World Wide Web back in 1989, he probably didn't imagine that his new system would be the repository for all of humanity's worst impulses, but here we are. The latest example from the internet horror show comes from YouTube, where the company was forced to update its policies to say that no, you're not allowed to make AI videos of dead kids for your true crime content.

YouTube described the change in a post on its Help Center. "On January 16, we'll begin striking content that realistically simulates deceased minors or victims of deadly or well-documented major violent events describing their death or violence experienced," the company wrote.

The update comes in response to a disturbing genre of videos that generated millions of views on social media with the simulated voices of real child murder victims describing their own gruesome deaths, as reported by The Verge on Monday.

"Grandma locked me in an oven at 230 degrees when I was just 21 months old," an animated baby said in one viral TikTok video, before identifying itself as Rody Marie Floyd, a real murder victim. "Please follow me so more people know my true story." Similar videos drew widespread attention on YouTube.

TikTok already has policies that address this class of internet obscenity. The platform requires labels on AI-created videos and prohibits deepfakes of people under 18 or of any adult who isn't a public figure.

The videos aren't just disturbing for viewers; they're painful for survivors. Denise Fergus, whose son James Bulger was abducted and killed in 1993, called the AI videos featuring her child "disgusting" in an interview with the Mirror. "It is bringing a dead child back to life," Fergus said. "It is beyond sick."

Time and again we've seen that there's nothing so depraved that someone won't try to monetize it online. You could blame the people making this creep show content, but it's also a logical consequence of a system that incentivizes creators to hijack our attention.