Elon Musk’s AI accused of making explicit videos of Taylor Swift, Sydney Sweeney without being prompted after users enable NSFW ‘Spicy Mode’

Elon Musk’s Grok Imagine sparks outrage after generating explicit deepfakes of stars like Taylor Swift and Scarlett Johansson in “Spicy Mode”.


Elon Musk’s new AI video tool, Grok Imagine, is under fire for churning out explicit fake videos of celebrities like Taylor Swift, even without being prompted for sexual content. According to a BBC report, an online abuse researcher flagged the new feature as “a deliberate choice” to create NSFW clips of the singer and other stars, including Scarlett Johansson. The outrage exploded after The Verge tested Grok’s “spicy” mode.

“This is not misogyny by accident, it is by design,” said Clare McGlynn, a law professor working to make pornographic deepfakes illegal. The update has been slammed online, with many warning about the dangerous territory these models could lead users into if left unchecked.


Elon Musk’s AI accused of creating explicit videos with ‘Spicy Mode’

Grok Imagine rolled out this week for Apple users and went viral within minutes; according to Musk, users generated around 34 million images in the first 48 hours after launch. Available through the $30 SuperGrok subscription, the tool lets users create still images from text prompts, which can then be turned into videos using four options: “Custom,” “Normal,” “Fun,” and “Spicy.”

While testing the feature, The Verge’s Jess Weatherbed prompted, “Taylor Swift celebrating Coachella with the boys,” expecting harmless party shots. Grok delivered an image of the singer in a dress, standing behind a group of guys. But when she used it to generate an animated video, the AI produced explicit content instead. She stressed that she never asked for nudity: “It was shocking how fast I was met with it. I never told it to remove her clothing — all I did was select ‘spicy.’”


The real problem with this tool is that in countries like the UK, the law now requires platforms hosting explicit content to carry out proper age verification, and Grok didn’t. All it asked for was a date of birth. No proof. No checks. That’s a direct violation. Professor Clare McGlynn stressed this wasn’t some random glitch, pointing out that Musk’s team could have removed the feature entirely, particularly after the January 2024 incident when sexual deepfakes of the Cruel Summer singer went viral online, racking up millions of views before X finally stepped in and temporarily blocked searches for her name.


And it’s not just Taylor Swift. Deadline and Gizmodo tested Grok with other stars: Scarlett Johansson, Sydney Sweeney, Jenna Ortega, Nicole Kidman, Kristen Bell, Timothée Chalamet, and even Nicolas Cage could all be shown undressing or posing suggestively in “spicy” mode. Some attempts were blocked with a “video moderated” message, but others went through. Johansson has long warned about deepfake abuse and has previously threatened legal action against AI companies, while Kristen Bell has spoken about her experience of being used in fake videos.

