TikTok has come under fire and been urged to apologize to the family of James Bulger over the widespread dissemination of “harrowing” AI-generated videos in which the two-year-old appears to describe the events of his own murder. The disturbing videos feature an animated version of the toddler narrating the moments leading up to his death.
James Bulger was abducted from a shopping center in Merseyside on February 12, 1993, by two 10-year-old boys, Jon Venables and Robert Thompson. The young boys then subjected James to torture before brutally killing him.
One of the videos shows a photo of James being manipulated to make it appear as though he is speaking, while a generated voice explains the horrifying circumstances surrounding his death. Despite TikTok’s claim that it had removed such content for violating its guidelines, the videos continue to surface.
The issue came to light when James’s mother, Denise Fergus, expressed her dismay, labeling the videos “beyond sick,” and demanding their immediate removal.
Kym Darby, the chairwoman of the James Bulger Memorial Trust, revealed that over 100 clips were discovered on TikTok, and some are still slipping through their monitoring systems. Darby condemned the platform, stating, “It’s absolutely shocking. Denise doesn’t mind people talking about James and what happened because it raises awareness, especially for the younger generation, but it’s overstepped the line. TikTok owes her a personal apology and some assurance they will all be taken down.”
In response to the controversy, TikTok asserted that such disturbing content has no place on its platform and emphasized that “synthetic” media featuring the likeness of young individuals is prohibited and will be removed.
The UK government also weighed in on the issue, condemning the “harrowing” videos and stating that the forthcoming Online Safety Bill would hold social media platforms legally accountable for upholding their terms of service. A government spokesperson said that clips like these serve no purpose other than causing distress and upset, and that the Online Safety Bill would ensure swift action is taken against platforms that fail to enforce their own rules. Under the new legislation, media regulator Ofcom would have the power to levy fines of up to £18 million (CAD $30,638,866) or 10% of global annual revenue for non-compliance.