Grok, xAI’s chatbot, flooded X with sexually explicit images of real people created without their consent, and newly published data sheds light on the scale of the problem: over 11 days, Grok generated around 3 million sexually explicit images, including approximately 23,000 depicting minors. The Grok app, which reportedly continues to generate such images, remains available in Apple’s App Store and Google’s Play Store.

That works out to roughly 190 sexually explicit images per minute over the 11-day period, with an explicit image involving a minor appearing about once every 41 seconds. The figures come from the Center for Countering Digital Hate (CCDH), a UK-based nonprofit, which analyzed a random sample of 20,000 Grok images collected between December 29 and January 9 and extrapolated to the 4.6 million images Grok generated during that period. The study defined sexualized images as containing “photorealistic images of a person in sexual poses, angles, or situations; a person in underwear, swimwear, or similarly revealing clothing; or images depicting bodily fluids.”
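
Those rates follow directly from the reported totals. As a rough check, here is a minimal Python sketch reproducing the arithmetic; the 11-day window and the 3 million and 23,000 figures are CCDH’s extrapolated totals as reported above:

    # Reproduce the reported rates from CCDH's extrapolated totals.
    DAYS = 11
    EXPLICIT_IMAGES = 3_000_000  # extrapolated sexually explicit images
    MINOR_IMAGES = 23_000        # extrapolated images depicting minors

    minutes = DAYS * 24 * 60     # 15,840 minutes in the window
    seconds = minutes * 60       # 950,400 seconds in the window

    per_minute = EXPLICIT_IMAGES / minutes    # ~189 -> "about 190 per minute"
    secs_per_minor = seconds / MINOR_IMAGES   # ~41.3 -> "about every 41 seconds"

    print(f"{per_minute:.0f} explicit images per minute")
    print(f"one image of a minor every {secs_per_minor:.0f} seconds")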
Because CCDH’s study did not capture the underlying image requests, the assessment cannot distinguish between sexualized edits of real people’s photographs, created without their consent, and images generated purely from text prompts.

On January 9, xAI limited the ability to edit existing Grok images to paid users only, a change that put the feature behind a paywall rather than removing it. Five days later, X restricted Grok’s ability to digitally undress real people. That restriction, however, applies only on X; the separate Grok app continues to generate these images.