Grok AI allegedly made “a deliberate choice” to create pornographic deepfake videos of pop star Taylor Swift without prompting, according to a report by The Verge.

Grok was said to have created the videos in a new “spicy” mode, producing clips described as “fully uncensored topless videos.”

Grok’s Imagine feature on iOS devices allows users to select from four modes: Custom, Normal, Fun, and Spicy.

The AI generator was asked to create a video of “Taylor Swift celebrating Coachella with the boys.” Grok generated over 30 images, many of which were said to feature Swift in revealing clothing.

After the user selected an image of Swift in a silver skirt and halter top, Grok was asked to generate a “spicy” video. The resulting clip showed Swift tearing off her clothes and dancing in only a thong, according to The Verge.

TAYLOR SWIFT performs during her Eras Tour in Europe. (credit: Gareth Cattermole/Getty Images/TNS)

While the AI tool asked users to confirm their birth year to ensure they were over 18, this information was not collected when they registered for the app.

The report found that while Swift’s likeness wasn’t perfect, it was recognizably her.

Misogyny 'by design'

Despite producing a near-naked video of Swift unprompted, Grok reportedly refused requests to generate naked or near-naked photos of people via its text-to-image generator.

Clare McGlynn, a law professor who helped draft legislation that would make pornographic deepfakes illegal, told BBC News: “This is not misogyny by accident, it is by design.”

"That this content is produced without prompting demonstrates the misogynistic bias of much AI technology," the Durham University professor stressed. "Platforms like X could have prevented this if they had chosen to, but they have made a deliberate choice not to.”