Tag: CSAM

All articles tagged with #csam

Apple Faces $1.2B Lawsuit Over Dropped CSAM Detection System

Originally Published 1 year ago — by MacRumors

Apple is facing a $1.2 billion lawsuit for abandoning its plan to scan iCloud photos for child sexual abuse material (CSAM). The lawsuit, representing 2,680 victims, claims Apple's decision has allowed abusive material to continue circulating, causing ongoing harm to victims. Apple initially announced the CSAM detection system in 2021 but withdrew it over privacy concerns and potential security vulnerabilities. The company maintains that it remains committed to child safety through other measures, despite the lawsuit's allegations.

Apple Faces Lawsuit for Dropping iCloud CSAM Detection

Originally Published 1 year ago — by TechCrunch

Apple is facing a lawsuit for not implementing a system to scan iCloud photos for child sexual abuse material (CSAM), a decision critics say has allowed such content to keep spreading. The lawsuit, filed by a woman under a pseudonym, claims Apple failed to protect victims by not deploying the detection system it had previously announced. Apple had initially planned to match uploads against digital signatures of known CSAM but abandoned the idea over privacy concerns. The case could involve up to 2,680 victims seeking compensation.

Apple Faces $1.2B Lawsuit for Dropping iCloud CSAM Detection

Originally Published 1 year ago — by The Verge

Apple is facing a class-action lawsuit in California for not implementing its planned 'NeuralHash' technology to detect child sexual abuse material (CSAM) in iCloud. The lawsuit claims that Apple's failure to deploy hash-matching detection tools, whether its own NeuralHash or an established system like Microsoft's PhotoDNA, has harmed 2,680 victims, and that total damages could exceed $1.2 billion.
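
Both NeuralHash and PhotoDNA belong to the same family of hash-matching tools: they derive a fingerprint from an image and check it against a list of fingerprints of already-known abusive material, rather than analyzing image content directly. The sketch below is a minimal illustration of that membership check only, not Apple's or Microsoft's implementation; it uses an ordinary SHA-256 file digest where the real systems use proprietary perceptual hashes, and the file paths and function names are hypothetical.

```python
# Illustrative sketch of hash-list matching, not NeuralHash or PhotoDNA.
# Real systems use perceptual hashes that survive resizing and re-encoding;
# a plain SHA-256 digest is used here only to show the overall workflow.
import hashlib


def file_digest(path: str) -> str:
    """Return the SHA-256 hex digest of a file's contents."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            h.update(chunk)
    return h.hexdigest()


def load_known_hashes(path: str) -> set[str]:
    """Load one lowercase hex digest per line from a reference list."""
    with open(path) as f:
        return {line.strip().lower() for line in f if line.strip()}


def matches_known_list(image_path: str, known_hashes: set[str]) -> bool:
    """True if the file's digest appears in the reference list."""
    return file_digest(image_path) in known_hashes
```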

Bluesky's Meteoric Rise: A New Challenger in Social Media

Originally Published 1 year ago — by Platformer

Bluesky, a decentralized social media platform, has seen rapid user growth, surpassing 22 million users following a mass migration from Elon Musk's X. The surge has brought heavier content moderation demands, including cases of child sexual abuse material (CSAM). In response, Bluesky plans to quadruple its moderation team from 25 to 100 members and is using third-party tools such as Thorn's Safer to detect and remove CSAM. Despite these challenges, the growth is widely read as a positive sign for the company.

AI Training Dataset Reveals Disturbing Presence of Child Sexual Abuse Material

Originally Published 2 years ago — by The Register

The LAION-5B dataset, used to train popular AI image generators such as Stable Diffusion, has been found to contain thousands of instances of child sexual abuse material (CSAM), according to a study by the Stanford Internet Observatory (SIO). The dataset holds metadata and URLs pointing to the images rather than the images themselves, and some of the links were found to resolve to content hosted on Reddit, Twitter, and adult websites. The SIO reported its findings to the National Center for Missing and Exploited Children (NCMEC) and the Canadian Centre for Child Protection (C3P), and removal of the identified source material is underway. LAION has announced plans for regular maintenance procedures to strip suspicious and potentially unlawful content from its datasets.
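
Because the dataset ships as metadata and links rather than image files, cleanup of this kind largely means pruning flagged URLs from the distributed index. The snippet below is a rough sketch of that step under stated assumptions: a LAION-style parquet shard with a "URL" column and a plain-text blocklist of flagged links; the file names and column name are illustrative, not LAION's actual maintenance tooling.

```python
# Minimal sketch: drop rows whose URL appears on a takedown blocklist.
# The shard path, blocklist format, and "URL" column name are assumptions
# for illustration; this is not LAION's own cleanup pipeline.
import pandas as pd


def prune_shard(shard_path: str, blocklist_path: str, out_path: str) -> int:
    """Write a copy of the shard without blocklisted URLs; return rows removed."""
    with open(blocklist_path) as f:
        blocked = {line.strip() for line in f if line.strip()}
    df = pd.read_parquet(shard_path)
    mask = df["URL"].isin(blocked)
    df.loc[~mask].to_parquet(out_path, index=False)
    return int(mask.sum())


if __name__ == "__main__":
    removed = prune_shard("laion_shard_0000.parquet",
                          "flagged_urls.txt",
                          "laion_shard_0000.cleaned.parquet")
    print(f"Removed {removed} flagged rows")
```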

Controversy Deepens as Apple Reveals Reasons for Abandoning CSAM-scanning Tool

Originally Published 2 years ago — by Ars Technica

Apple has responded to a child safety group called Heat Initiative, explaining its decision to abandon the development of a privacy-preserving iCloud photo scanning tool for detecting child sexual abuse material (CSAM). The company cited concerns about compromising user privacy and security, as well as the potential for unintended consequences and bulk surveillance. Instead, Apple is focusing on on-device tools and resources known as "Communication Safety" features. Heat Initiative is organizing a campaign demanding that Apple detect, report, and remove CSAM from iCloud, but Apple believes connecting users directly with local resources and law enforcement is a better approach. The debate over data scanning and encryption continues to test Apple's resolve.

Apple's Explanation for Abandoning CSAM Detection in iCloud Photos Sparks Controversy

Originally Published 2 years ago — by MacRumors

Apple has provided a detailed explanation for its decision to abandon its plan to detect known Child Sexual Abuse Material (CSAM) in iCloud Photos. The company stated that while it is committed to combating child sexual abuse, scanning every user's privately stored iCloud data would create new privacy and security threats and could lead to unintended consequences, such as bulk surveillance and pressure to search other encrypted messaging systems. Apple's response comes amid a renewed encryption debate, with the U.K. government considering legislation that would require tech companies to disable security features like end-to-end encryption. Apple has warned that it would pull services such as FaceTime and iMessage from the U.K. if the legislation passes in its current form.