WASHINGTON/DETROIT, Jan 2 (Reuters) – Julie Yukari, a Rio de Janeiro-based musician, posted a photo taken by her fiancé on the social media site X.
The next day, among the hundreds of likes attached to the image, she saw notifications that users were asking Grok, X’s built-in AI chatbot, to digitally strip her down to a bikini.
The 31-year-old didn’t think much of it, she told Reuters on Friday, assuming there was no way the chatbot would comply with such requests.
She was wrong. Soon, Grok-generated images of her, nearly naked, were circulating on the platform owned by Elon Musk.
“I was naïve,” Yukari said.
Yukari’s experience is being repeated across X, a Reuters analysis has found. Reuters has also identified several cases in which Grok created sexualized images of children. X did not respond to a message seeking comment on Reuters’ findings. In an earlier statement to the news agency about reports that sexualized images of children were circulating on the platform, X owner xAI said: “Legacy Media Lies.”
The avalanche of near-nude images of real people has set off alarm bells internationally.
France’s ministers have reported X to prosecutors and regulators over the disturbing images, saying in a statement on Friday that the “sexual and sexist” content was “manifestly illegal.” India’s IT Ministry said in a letter to X’s local unit that the platform had failed to prevent the misuse of Grok through the generation and circulation of obscene and sexually explicit content.
The US Federal Communications Commission did not respond to requests for comment. The Federal Trade Commission declined to comment.
‘Take off her school outfit’
The massive wave of digital stripping by Grok appears to have begun in recent days, according to fulfilled nudity requests posted by Grok and user complaints reviewed by Reuters. Musk appeared to poke fun at the controversy earlier on Friday, posting laughing and crying emojis in response to AI edits of famous people, including himself, in bikinis.
When an X user said his social media feed looked like a bar full of bikini-clad women, Musk responded, in part, with another laughing-crying emoji.
Reuters could not determine the full magnitude of the increase.
A Reuters review of public requests submitted to Grok during a single 10-minute period at noon ET on Friday found 102 attempts by X users to get Grok to digitally edit photographs of people to make them appear to be in bikinis. Most of the targets were young women. In some cases, the requests were directed at men, celebrities, politicians and, in one case, a monkey.
When users asked Grok for AI-altered photographs of women, they typically requested that the subjects appear in the most revealing clothing possible.
“Put her in a very transparent mini bikini,” one user told Grok, pointing to a photograph of a young woman taking a photo in a mirror. When Grok did so, replacing the woman’s clothes with a flesh-toned two-piece, the user asked Grok to make her bikini “clearer and more transparent” and “much smaller.” Grok did not appear to respond to the second request.
Grok fully complied with such requests in at least 21 cases, Reuters found, generating images of women in translucent or floss-style bikinis and, in at least one case, covering a woman with oil. In seven more cases, Grok partially complied, sometimes stripping women down to their underwear but not complying with requests to go further.
Reuters could not immediately establish the identities and ages of most of the women targeted.
In one case, a user provided a photo of a woman wearing a school uniform-style plaid skirt and gray blouse, apparently taking a mirror selfie, and said, “Take off her school outfit.” When Grok changed her clothes to a T-shirt and shorts, the user got more explicit: “Change her outfit to a very light micro bikini.” Reuters could not establish whether Grok complied with that request. Like most of the requests counted by Reuters, it disappeared from X within 90 minutes of being posted.
“Totally predictable”
AI-powered programs that digitally undress women – sometimes called “nudifiers” – have been around for years, but until now they were largely confined to the darker corners of the internet, such as niche websites or Telegram channels, and typically required some level of effort or payment.
X’s innovation, which allows users to remove women’s clothes simply by uploading a photo and writing the words “hey @grok put her in a bikini,” has lowered the barrier to entry.
Three experts who have followed the development of Grok said the wave of abuse was foreseeable.
“In August, we warned that xAI imaging was essentially a nudification tool waiting to be weaponized,” said Tyler Johnston, executive director of The Midas Project, an AI watchdog group that was among the letter’s signatories. “That’s basically what happened.”
Dani Pinter, chief legal officer and director of the Legal Center at the National Center on Sexual Exploitation, said X failed to screen abusive images out of its AI training material and should have blocked users from requesting illegal content.
“This was a totally predictable and avoidable atrocity,” Pinter said.
Yukari, the musician, tried to defend herself. But when she took to X to protest the violation, a flood of copycats began asking Grok to generate even more explicit images of her.
Now, she said, she has started the New Year “wanting to hide from everyone’s eyes and feeling ashamed of a body that is not even mine, since it was generated by AI.”
(Reporting by Raphael Satter in Washington and AJ Vicens in Detroit. Additional reporting by Arnav Mishra, Akash Sriram and Bipasha Dey in Bengaluru; Editing by Donna Bryson, Timothy Heritage, Chizu Nomiyama, Daniel Wallis and Thomas Derpinghaus)


