Click here to follow Screen Digital on YouTube and stay updated with the latest from the world of cinema.
Elon Musk’s new AI video tool, Grok Imagine, is under fire for churning out explicit fake videos of celebrities like Taylor Swift, even when users never prompt for sexual content. According to a BBC report, an online abuse researcher said the feature makes “a deliberate choice” to create NSFW clips of the singer and other stars, including Scarlett Johansson. The outrage exploded after The Verge tested Grok’s “spicy” mode.
Grok Imagine rolled out this week for Apple users and went viral within minutes. According to Musk, the launch racked up around 34 million images in just 48 hours. Available through the $30 SuperGrok subscription, the tool lets users create still images from text prompts, which can then be turned into videos using four options: “Custom,” “Normal,” “Fun,” and “Spicy.”
While testing the feature, The Verge’s Jess Weatherbed prompted, “Taylor Swift celebrating Coachella with the boys,” expecting harmless party shots. Grok delivered an image of the singer in a dress, standing behind a group of guys. But when she tried generating an animated video, the AI produced explicit content instead. She stressed she never asked for nudity, only clicked the “spicy” setting: “It was shocking how fast I was met with it. I never told it to remove her clothing — all I did was select ‘spicy.’”
The real problem with this tool is that in countries like the UK, the law now requires platforms hosting explicit content to carry out proper age verification, and Grok didn’t. All it asked for was a date of birth. No proof. No checks. That’s a direct violation. Professor Clare McGlynn stressed this wasn’t some random glitch. She pointed out Musk’s team could have removed the feature entirely, particularly after the January 2024 incident when sexual deepfakes of the Cruel Summer singer went viral online, racking up millions of views before X finally stepped in and temporarily blocked searches for her name.
And it’s not just Taylor Swift. Deadline and Gizmodo tested Grok with other stars: Scarlett Johansson, Sydney Sweeney, Jenna Ortega, Nicole Kidman, Kristen Bell, Timothée Chalamet, and even Nicolas Cage could all be shown undressing or posing suggestively in “spicy” mode. Some attempts were blocked with a “video moderated” message, but others went through. Johansson has long warned about deepfake abuse and has previously threatened legal action against AI companies, while Kristen Bell has spoken about her own likeness being used in fake videos.