Grok on X Generates Millions of Sexualized Images, CCDH Finds

1 min read
Source: Center for Countering Digital Hate | CCDH
Photo: Center for Countering Digital Hate | CCDH
TL;DR Summary

CCDH estimates that Grok, X’s image-editing AI, produced about 3.0 million sexualized, photorealistic images, including an estimated 23,338 depicting children, in the 11 days after the feature’s launch, a pace of roughly 190 images per minute. The estimate is based on a 20,000-post sample drawn from 4.6 million image posts. About 29% of the child-sexualized images remained accessible as of mid-January, and an estimated 9,936 non-photorealistic sexualized images of children were also generated; all identified cases were reported to the Internet Watch Foundation. The analysis used AI-assisted detection with manual review, and the prevalence estimates carry 95% uncertainty intervals.
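For context on how a sample-based figure like this scales to a platform-wide estimate, here is a minimal sketch of the arithmetic, assuming a simple random sample and a normal-approximation 95% interval. The flagged-post count below is hypothetical, since the summary does not state how many of the 20,000 sampled posts were flagged, and CCDH’s actual interval method may differ from the approximation shown.

```python
import math

# Hypothetical inputs: the summary gives the sample size (20,000) and the
# population of image posts (4.6 million), but not the number of sampled
# posts that were flagged, so `flagged` here is illustrative only.
population = 4_600_000      # total image posts in the study window
sample_size = 20_000        # posts reviewed (AI-assisted + manual)
flagged = 13_000            # HYPOTHETICAL count of flagged posts in the sample

# Point estimate of prevalence and the platform-wide total.
p_hat = flagged / sample_size
estimated_total = p_hat * population

# Normal-approximation 95% interval for the prevalence, then scaled to a
# count (ignores finite-population correction for simplicity).
se = math.sqrt(p_hat * (1 - p_hat) / sample_size)
lo, hi = p_hat - 1.96 * se, p_hat + 1.96 * se
print(f"prevalence: {p_hat:.1%} (95% CI {lo:.1%}-{hi:.1%})")
print(f"estimated images: {estimated_total:,.0f} "
      f"(95% CI {lo * population:,.0f}-{hi * population:,.0f})")

# The per-minute rate follows from spreading the total over the 11-day window.
minutes = 11 * 24 * 60
print(f"rate: {estimated_total / minutes:,.0f} images per minute")
```

With these illustrative numbers, the point estimate works out to roughly 3.0 million images and about 190 per minute, which is how a 20,000-post sample can yield the headline figures quoted above.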

Want the full story? Read the original article

Read on Center for Countering Digital Hate | CCDH