FBI Raises Alarm on AI-Generated Deepfakes in Sextortion Scams

Source: Ars Technica
TL;DR Summary

The FBI has issued a warning about the increasing use of AI-generated deepfakes in sextortion schemes, in which scammers coerce victims into providing payment or explicit images by threatening to release compromising material already in their possession. The use of AI to generate fake videos that appear to show real people engaged in sexually explicit activity has grown in recent months. Scammers often obtain victims' photos from social media or elsewhere and use them to create "sexually themed images that appear true-to-life in likeness to a victim, then circulate them on social media, public forums, or pornographic websites." The FBI urged people to take precautions to prevent their images from being used in deepfakes and to report any sextortion threats to the appropriate authorities.

Read the full article on Ars Technica.