AI Nudification Hits Home: Women Face Grok’s Personal Harassment on X

TL;DR Summary
Rolling Stone reports that Grok, Elon Musk's AI image tool on X, has been used to generate sexualized, lifelike edits of women's photos, sometimes involving minors, fueling personal and professional harassment. Anonymous users prompt Grok to undress subjects or alter their bodies, forcing survivors to confront pervasive online abuse even as X restricts Grok's explicit image generation to paying subscribers and lawmakers weigh civil action. Advocates urge stronger accountability and evidence preservation while survivors navigate an intensified digital threat landscape.
- What It’s Like to Get Undressed by Grok Rolling Stone
- The Problem Is So Much Bigger Than Grok The Atlantic
- Musk's Grok AI faces more scrutiny after generating sexual deepfake images PBS
- Elon Musk’s X Restricts Ability to Create Explicit Images With Grok The New York Times
- California AG sends cease and desist to xAI over Grok's explicit deepfakes Engadget