
Grok on X Generates Millions of Sexualized Images, CCDH Finds
CCDH estimates that Grok, X's image-editing AI, produced roughly 3.0 million sexualized, photorealistic images (including about 23,338 depicting children) in the 11 days after the feature's launch, a rate of about 190 images per minute. The estimate is based on a 20,000-post sample drawn from 4.6 million image posts. As of mid-January, about 29% of the child-sexualized images remained accessible, and a further 9,936 non-photorealistic images of children were also generated; all identified cases were reported to the Internet Watch Foundation. The analysis combined AI-assisted detection with manual review, and prevalence estimates are reported with 95% uncertainty intervals.