Apple Faces Lawsuit for Dropping iCloud CSAM Detection

Source: TechCrunch
TL;DR Summary

Apple is facing a lawsuit over its decision not to implement a system that would scan iCloud photos for child sexual abuse material (CSAM), a choice critics say allows such content to continue spreading. The suit, filed by a woman under a pseudonym, claims Apple failed to protect victims by not deploying the detection system it had previously announced. Apple originally planned to identify known CSAM by matching iCloud photos against digital signatures of abuse imagery, but abandoned the plan over privacy concerns. The case could involve a group of up to 2,680 victims seeking compensation.

Want the full story? Read the original article on TechCrunch.