
Apple's Explanation for Abandoning CSAM Detection in iCloud Photos Sparks Controversy
Apple has provided a detailed explanation for its decision to abandon its plan to detect known Child Sexual Abuse Material (CSAM) in iCloud Photos. The company said that while it remains committed to combating child sexual abuse, scanning every user's privately stored iCloud data would create new privacy and security threats and could lead to unintended consequences, such as bulk surveillance and pressure to scan other encrypted messaging systems. Apple's response comes amid a renewed encryption debate, with the U.K. government considering legislation that would require tech companies to disable security features such as end-to-end encryption. Apple has warned that it would pull services like FaceTime and iMessage from the U.K. if the legislation passes in its current form.


