Tag: CSAM

All articles tagged with #csam

States Push Back on Grok and xAI Over Nonconsensual AI Imagery
technology · 1 month ago

More than three dozen state attorneys general have urged xAI to strengthen safeguards after Grok helped generate a flood of nonconsensual sexual imagery, including content involving minors. Regulators cite the tool's rapid, large-scale output (millions of deepfake images over an 11-day period) and are calling for content removal, user protections, and reporting mechanisms. Investigations or discussions are underway in several states (including AZ, CA, FL, and MO), alongside ongoing talks about age-verification requirements for platforms like X and Grok. The push signals a broad, state-led regulatory response to AI-generated CSAM and related abuses.

Grok’s AI-generated content tests payment rails and CSAM rules
technology · 1 month ago

The Verge reports that a Center for Countering Digital Hate sample found 101 sexualized images among 20,000 Grok-generated images (Dec 29–Jan 8), extrapolating to about 23,000 such images over 11 days, roughly one every 41 seconds. The findings are pushing payment processors to rethink how they police CSAM as paid access and app stores intersect with Grok despite its guardrails; the piece also notes ongoing lawsuits and state laws targeting AI-generated sexual content.
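
The rate claim follows directly from the reported figures. Below is a minimal back-of-envelope sketch in plain Python; the implied total Grok output is an inference from the sample rate, not a number from the report:

```python
# Sanity check on the CCDH figures cited above. Assumes the 20,000-image
# sample is representative; the implied total output is inferred, not reported.

SAMPLE_SIZE = 20_000         # images in the CCDH sample
FLAGGED_IN_SAMPLE = 101      # sexualized images found in that sample
EXTRAPOLATED_TOTAL = 23_000  # CCDH's 11-day extrapolation
WINDOW_DAYS = 11             # Dec 29 - Jan 8

flag_rate = FLAGGED_IN_SAMPLE / SAMPLE_SIZE              # ~0.5% of sampled images
window_seconds = WINDOW_DAYS * 24 * 60 * 60              # 950,400 seconds
seconds_per_image = window_seconds / EXTRAPOLATED_TOTAL  # ~41 seconds
implied_total = EXTRAPOLATED_TOTAL / flag_rate           # ~4.6M images (inferred)

print(f"flag rate: {flag_rate:.3%}")
print(f"one flagged image every ~{seconds_per_image:.0f} seconds")
print(f"implied total output: ~{implied_total / 1e6:.1f}M images")
```

The inferred total of roughly 4.6 million images is consistent with the "millions of deepfake images" figure cited by regulators in the entry above.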

Mara Wilson Breaks Silence on Child Exploitation and AI Risks
entertainment · 1 month ago

Former Matilda star Mara Wilson details how she was sexually exploited and how her image was misused online during her childhood, describing how the gaze of the media and the public intensified the sexualization of young actors. She recounts years of harassment, Photoshopped images, and coercive attention, and in recent essays warns about AI-fueled CSAM risks, urging stronger laws and safeguards to protect child performers.

California probes xAI over AI-generated CSAM on X platform
technology · 1 month ago

California AG Rob Bonta opened a probe into Elon Musk's xAI to determine whether it violated state law after reports that the Grok tool was used to generate nude images of minors on X; he said creating CSAM is a crime and signaled possible civil violations, while xAI has not commented. X said it added safeguards to prevent CSAM edits, but some users could still produce bikini images, underscoring the ongoing regulatory ambiguity around AI-generated sexual content.

Apple Faces $1.2B Lawsuit Over Dropped CSAM Detection System
technology · 1 year ago

Apple is facing a $1.2 billion lawsuit over its abandoned plan to scan iCloud photos for child sexual abuse material (CSAM). The suit, filed on behalf of 2,680 victims, claims the decision has allowed abusive material to continue circulating, compounding victims' harm. Apple announced the CSAM detection system in 2021 but withdrew it over privacy concerns and potential security vulnerabilities; the company maintains it remains committed to child safety through other measures.

Apple Faces Lawsuit for Dropping iCloud CSAM Detection
technology · 1 year ago

Apple is facing a lawsuit for not implementing a system to scan iCloud photos for child sexual abuse material (CSAM), a decision criticized for potentially allowing the spread of such content. The lawsuit, filed by a woman under a pseudonym, claims Apple failed to protect victims by not deploying a previously announced detection system. Apple had initially planned to use digital signatures to identify CSAM but abandoned the idea due to privacy concerns. The case could involve up to 2,680 victims seeking compensation.

Bluesky's Meteoric Rise: A New Challenger in Social Media
technology · 1 year ago

Bluesky, a decentralized social media platform, is experiencing rapid user growth, surpassing 22 million users following a mass migration from Elon Musk's X. This surge has led to an increase in content moderation challenges, including handling cases of child sexual abuse material (CSAM). In response, Bluesky plans to quadruple its moderation team from 25 to 100 members. The platform is also utilizing third-party tools like Thorn's Safer to detect and remove CSAM. Despite these challenges, Bluesky's growth is seen as a positive sign for the company.

AI Training Dataset Reveals Disturbing Presence of Child Sexual Abuse Material
technology · 2 years ago

The LAION-5B dataset, used to train popular AI image generators like Stable Diffusion, has been found to contain thousands of instances of child sexual abuse material (CSAM), according to a study by the Stanford Internet Observatory (SIO). The dataset includes metadata and URLs pointing to the images, some of which were found hosted on websites like Reddit, Twitter, and adult websites. The SIO reported the findings to the National Center for Missing and Exploited Children (NCMEC) and the Canadian Centre for Child Protection (C3P), and the removal of the identified source material is underway. LAION has announced plans for regular maintenance procedures to remove suspicious and potentially unlawful content from its datasets.

Controversy Deepens as Apple Reveals Reasons for Abandoning CSAM-scanning Tool
technology · 2 years ago

Apple has responded to a child safety group called Heat Initiative, explaining its decision to abandon the development of a privacy-preserving iCloud photo scanning tool for detecting child sexual abuse material (CSAM). The company cited concerns about compromising user privacy and security, as well as the potential for unintended consequences and bulk surveillance. Instead, Apple is focusing on on-device tools and resources known as "Communication Safety" features. Heat Initiative is organizing a campaign demanding that Apple detect, report, and remove CSAM from iCloud, but Apple believes connecting users directly with local resources and law enforcement is a better approach. The debate over data scanning and encryption continues to test Apple's resolve.

Apple's Explanation for Abandoning CSAM Detection in iCloud Photos Sparks Controversy
technology · 2 years ago

Apple has provided a detailed explanation for its decision to abandon its plan to detect known Child Sexual Abuse Material (CSAM) in iCloud Photos. The company stated that while it is committed to combating child sexual abuse, scanning every user's privately stored iCloud data would create new privacy threats and could lead to unintended consequences, such as bulk surveillance and pressure to scan other encrypted messaging systems. Apple's response comes amid a renewed encryption debate, with the U.K. government considering legislation that would require tech companies to disable security features like end-to-end encryption; Apple has warned it will pull services such as FaceTime and iMessage from the U.K. if the legislation passes in its current form.