Grok’s latest feature for its generative AI video tool, branded as “spicy,” has raised significant concerns about its potential for misuse. Unlike competitors such as Google’s Veo and OpenAI’s Sora, which enforce restrictions to prevent the generation of adult content and celebrity deepfakes, Grok Imagine appears to facilitate both without hesitation. On my very first attempt with the tool, it presented me with explicit topless portrayals of Taylor Swift almost instantly, without my having specifically requested nudity.
The Imagine feature available on iOS allows users to create images from text prompts and quickly convert them into video clips using four settings: “Custom,” “Normal,” “Fun,” and “Spicy.” Notably, while many image-generating applications shy away from rendering recognizable celebrity likenesses, I requested a scene of “Taylor Swift celebrating Coachella with the boys.” The tool responded with more than 30 generated images, several of which featured Swift in suggestive outfits.
Using the generated images, I selected one of Swift wearing a silver skirt and halter top and proceeded to create a video. After choosing the “spicy” option and entering my birth year—which I was not required to verify during app installation, despite stringent age restrictions in the UK—the tool produced an unsettling video of Swift ripping off her clothes and dancing provocatively in front of an indifferent virtual audience.
While Swift’s representation in the generated images was somewhat lacking in realism, with a noticeable uncanny valley effect, her likeness was still identifiable. The image-generating tool refused to produce nude images on request; prompts seeking nude pictures of Swift or of any person returned empty responses. Selecting the “spicy” preset did not guarantee nudity either; some clips showed her gesturing suggestively with her clothing instead. Many, however, depicted her removing most of her attire.
Interestingly, while the image generator will produce photorealistic pictures of children, it appears to block inappropriate animations of them. The “spicy” setting remains selectable for these images, but in my tests it merely added generic movement to the visual without crossing any lines.
Given the company’s prior troubles with unauthorized Taylor Swift deepfakes and existing regulations such as the Take It Down Act, one would expect greater caution. xAI’s acceptable use policy prohibits “depicting likenesses of persons in a pornographic manner.” Yet Grok Imagine offers no meaningful safeguard against generating celebrity likenesses, even as it actively promotes a preset that produces partial nudity in its videos. The age check appeared only once during the process and was trivially circumvented, requiring no verification of my actual age.
If this capability is accessible to me, then anyone with an iPhone and a $30 SuperGrok subscription can replicate it. Since its launch, Grok Imagine has already generated more than 34 million images, according to xAI CEO Elon Musk, who noted that usage is growing rapidly.