U.S. lawmakers from both parties are scrutinizing AI chatbots, particularly their impact on minors. The debate covers safety concerns, potential lawsuits, and possible legislation such as the AI LEAD Act, which would hold AI companies accountable for harms caused by their products.
Whistleblowers allege that Meta has restricted research into the potential negative impacts of its VR products on kids and teens, claiming that the company's legal teams have vetoed youth-safety studies since 2022. Meta denies the allegations, citing approved studies and safety-focused product updates, and the issue is set to be discussed at a Senate hearing. Separately, a lawsuit has been filed against Meta's WhatsApp over privacy concerns.
Teenagers in Washington, D.C. say they feel unsafe under the federal takeover of local policing, which President Trump initiated to combat crime. The move has stoked fear and distrust among youth, especially in minority communities, and has heightened concerns about policing tactics and the root causes of violence.
California is considering a bill (AB 56) that would require social media platforms to display warning labels about potential mental health risks to kids and teens. Introduced by Assemblymember Rebecca Bauer-Kahan with the backing of Attorney General Rob Bonta, the bill aims to address the mental health crisis among young people by mandating a "black box warning" for all users. The initiative follows a proposal by US Surgeon General Dr. Vivek Murthy and is supported by nearly 40 states. The bill is part of broader efforts, including lawsuits against TikTok and Meta, to regulate social media's impact on youth.
A YouGov poll reveals that 77% of Germans support a ban on social media for under-16s, similar to a new Australian law. The survey indicates widespread concern over the harmful effects of social media on youth, with 82% believing it negatively impacts children and teenagers. Australia's law, effective in a year, will fine platforms like TikTok and Facebook for non-compliance, though enforcement methods are still being developed. Critics worry the ban may push teens to unregulated online spaces.
The chief executives of Meta, TikTok, Snap, Discord, and X (formerly Twitter) are set to testify before the Senate Judiciary Committee about the risks their products pose to young people, following a gruesome video posted on YouTube. Lawmakers are pushing for more accountability, citing whistleblowers, lawsuits, and new state legislation. The hearing will focus on youth safety efforts, with Meta CEO Mark Zuckerberg expected to face particular scrutiny. Newly released communications suggest that Zuckerberg ignored warnings from senior company officials about underinvestment in user safety, despite growing concerns about the impact of Facebook and Instagram on teen mental health.
New York City has launched a program called "Subway Surfing Kills - Ride Inside, Stay Alive" to raise awareness among youth about the dangers of subway surfing. Developed by students for students, the initiative includes public service announcements, digital signage, student-created graphics, physical palm cards, school swag, social media posts, and anti-surfing messages on MetroCards. Google, Meta, and TikTok are providing space on their platforms to amplify the campaign, and the NYPD is contributing by deploying officers to stations and conducting home visits. The campaign comes after five teens lost their lives while subway surfing this year.
Baltimore Mayor Brandon Scott plans to enforce a new curfew for young people after a violent weekend at the Inner Harbor. Under the curfew ordinance, children under 14 must be inside by 9 p.m., while teens aged 14 to 16 must be inside by 10 p.m. on school nights and 11 p.m. on non-school nights. The city will open youth connection centers to offer wraparound services to young people and their families. Community action group We Our Us is also engaging with teens over the summer and runs a "stop the beef" hotline to mediate arguments.