Musk’s Grok Chatbot Made Sexual Images of Minors, Teens Allege in Lawsuit
Rolling Stone

In early December, a Tennessee teenager allegedly received a message from an anonymous Instagram user warning that sexually explicit deepfake images of her had been uploaded to a Discord server.
One image was purportedly created from a photograph taken at her school’s homecoming last September. Another, allegedly depicting her topless, appeared to have been generated from her yearbook portrait taken last June.
When the teenager, who is now a legal adult, eventually obtained a link to the Discord server, she allegedly found images and videos of at least 18 other girls who were minors at the time, many of whom she recognized from her school.
The new allegations are detailed in a 44-page complaint filed Monday in federal court in San Jose, Calif., by three Tennessee-based plaintiffs identified as Jane Does. The anonymous plaintiffs are suing Elon Musk’s artificial intelligence company, xAI, over its generative AI model, Grok. The proposed class-action lawsuit alleges xAI recklessly designed Grok to enable such abuse, and then, amid a public outcry, restricted the technology to paid subscribers and third-party companies rather than fix the problem.
“xAI — and its founder Elon Musk — saw a business opportunity: an opportunity to profit off the sexual predation of real people, including children,” the complaint obtained by Rolling Stone reads. “Limiting the image- and video-generation features to paid subscribers … does not prevent the creation of AI-generated CSAM (child sexual abuse material), it merely ensures that xAI will profit from all such content.”
Rolling Stone reached out to xAI for comment Monday but did not immediately receive a response.
“No one should have to live with the fear that these survivors now carry with them, but I am inspired by their strength and clarity of purpose in bringing this lawsuit on behalf of themselves and other minors in the Class,” Vanessa Baehr-Jones of Baehr-Jones Law, one of the firms representing the plaintiffs, told Rolling Stone in a statement.
“These are children whose school photographs and family pictures were turned into child sexual abuse material by a billion-dollar company’s AI tool and then traded among predators. Elon Musk and xAI deliberately designed Grok to produce sexually explicit content for financial gain, with no regard for the children and adults who would be harmed by it,” said Annika K. Martin of Lieff Cabraser, whose firm is also representing plaintiffs. “Without xAI, this harmful, illegal content could never, and would never, have existed. The lives of these girls have been shattered by the devastating loss of privacy and the deep sense of violation that no child should ever have to experience. We intend to hold xAI accountable for every child they harmed in this way.”
Grok’s image-generation feature surged in popularity in late December after Musk announced that users of X could use Grok to edit images posted to the platform with a single click. Although Grok was not supposed to generate fully nude images, users allegedly circumvented the restriction by asking the system to alter real photos so subjects appeared in “transparent bikinis, then in bikinis made of dental floss, placed in sexualized positions, and made to bend over so their genitals were visible,” the lawsuit states.
The image-generation feature was restricted to paid users on Jan. 9, with additional technical limits added on Jan. 14. In a Jan. 16 post on X, Musk said he was unaware of any naked images of minors generated by Grok. “Literally zero,” he wrote. “When asked to generate images, it will refuse to produce anything illegal, as the operating principle for Grok is to obey the laws of any given country or state. There may be times when adversarial hacking of Grok prompts does something unexpected. If that happens, we fix the bug immediately.”
The Center for Countering Digital Hate, meanwhile, reports that it reviewed a random sample of 200,000 images from the roughly 4.6 million images Grok produced between Dec. 29, 2025, and Jan. 8, 2026. Based on that sample, the group estimated that Grok generated about three million sexualized images during that period, including roughly 23,000 that allegedly depicted children. One Grok-generated image, the group said, showed six young girls wearing micro bikinis and remained publicly available on X as of Jan. 15.
The second plaintiff in the lawsuit said she was informed by law enforcement on Feb. 12, 2026, that she had also been targeted. According to the complaint, the person who ran the Discord server used Instagram photos of her wearing a blue bikini at the beach last October to generate images of her without clothing. The alleged perpetrator was arrested in December, the lawsuit says.
The three plaintiffs say they have suffered severe emotional distress. Their lawsuit seeks to hold xAI liable for the creation and distribution of the alleged child sexual abuse material and demands a jury trial.