tvOS 26.2 introduces two new features for Apple TV 4K users: the ability to create user profiles without requiring an Apple Account, and enhanced content restrictions for child profiles, making profile setup more convenient and parental controls more effective.
YouTube has expanded its AI-based age verification system to more users, applying account restrictions, including content blocking and privacy reminders, to those it deems under 18 as part of its effort to enforce age-appropriate content policies and digital wellbeing tools.
Instagram is introducing new safety measures for teen users, including default PG-13 content settings, stricter content filters, and enhanced parental controls, to protect them from harmful content and interactions; the rollout begins in select countries and expands globally next year.
YouTube is rapidly expanding its AI-based age verification system, applying restrictions to accounts it suspects belong to users under 18, including blocking age-restricted videos, limiting recommendations, and disabling personalized ads. The rollout has accelerated recently, with many users reporting restrictions and prompts to verify their age through various methods.
YouTube is implementing AI-based age verification, initially in the U.S., that requires users to confirm their age with a government ID or credit card to access certain content, with the aim of protecting minors and tailoring user experiences.
GOG's 48-hour 'Freedom to Buy' campaign, which offered controversial games for free to raise awareness about content restrictions, saw its titles claimed by over one million users. Due to high traffic, GOG temporarily extended access, emphasizing that DRM-free titles cannot be revoked and remain permanently accessible to users.
Steam is tightening its rules on adult games, especially those with sexual content, amid recent removals and vague new guidelines that rely on payment processor standards, raising concerns about fairness and clarity in enforcement.
Meta, the parent company of Facebook and Instagram, announced that it will start blocking sensitive and "age-inappropriate" content, including topics such as self-harm, eating disorders, and mental illness, from teenagers' feeds and Stories, even when the content was posted by accounts they follow. The move comes amid ongoing scrutiny over how Meta's products affect young people, with critics calling for legislation to reduce the harm to children exposed to inappropriate content. Meta CEO Mark Zuckerberg has responded to the backlash, saying the company is committed to building safe experiences for kids online.
Meta Platforms, the parent company of Instagram and Facebook, will implement automatic restrictions on teen accounts to protect them from content related to self-harm, graphic violence, and eating disorders. The move comes amid legal action from multiple states alleging that the company has not done enough to safeguard young users.
YouTube will remove videos related to eating disorders that show or discuss "imitable behavior," as well as content featuring weight-based bullying. The platform will also restrict young viewers from accessing eating disorder content, requiring viewers to be signed in and over 18 to watch certain recovery-focused videos. Eating disorder resource panels will appear below relevant videos in several countries. Other platforms, including Pinterest and TikTok, have also taken measures to address eating disorder and weight-loss content.