Teens sue xAI over Grok-generated CSAM imagery

TL;DR Summary
Three Tennessee teens filed a proposed class action against Elon Musk's xAI, alleging that Grok generated CSAM (explicit images of the plaintiffs and other minors) that was created and distributed on Discord. The suit says xAI knew Grok could produce such material after its "spicy mode" launch and failed to adequately test its safety; the plaintiffs seek damages and an injunction to stop Grok from generating or spreading AI-based CSAM. The case follows heightened regulatory scrutiny of Grok from the FTC, the EU, and the UK, with advocates pressing for accountability for the harm caused.
- Teens sue Elon Musk’s xAI over Grok’s AI-generated CSAM The Verge
- Tennessee teens sue Elon Musk's xAI over AI-generated child sexual abuse material NPR
- Teens sue Musk's xAI over Grok's pornographic images of them BBC
- Teens allege Musk’s Grok chatbot made sexual images of them as minors The Washington Post
- Child abuse material ‘systemic’ on Elon Musk’s X amid Grok scandal, Australian online safety regulator warned The Guardian