FBI Raises Alarm on AI-Generated Deepfakes in Sextortion Scams

The FBI has issued a warning about the increasing use of AI-generated deepfakes in sextortion schemes, in which scammers threaten to share compromising images of a victim unless the victim provides payment or explicit photos. The use of AI to generate fake videos that appear to show real people engaged in sexually explicit activity has grown in recent months. Scammers often harvest victims' photos from social media or elsewhere and use them to create "sexually themed images that appear true-to-life in likeness to a victim, then circulate them on social media, public forums, or pornographic websites." The FBI urged people to take precautions to prevent their images from being used in deepfakes and to report any sextortion threats to the appropriate authorities.
- FBI warns of increasing use of AI-generated deepfakes in sextortion schemes (Ars Technica)
- Cases of 'sextortion' using artificial intelligence or 'AI' on the rise, FBI says (KPRC 2 Click2Houston)
- Deepfake technologies complicate fight against online extortion schemes (WAVY TV 10)