The parents of Murray Dowey, a 16-year-old Scottish teen who died by suicide after being sextorted on Instagram by criminals posing as a young girl, are suing Meta in what is believed to be the first such case in the UK. They allege that the platform's design flaws and safety failures, including its failure to deploy features that could have prevented the blackmail, contributed to his death. Meta says it supports law enforcement and has implemented safety measures, but the family is seeking accountability for the platform's role in the tragedy amid rising sextortion cases globally.
Instagram, owned by Meta, is introducing new tools to protect young users from sextortion and other forms of "image abuse" by making it harder for criminals to contact teens. A "nudity protection" feature will automatically blur detected nude images in direct messages; it will be on by default for users under 18 and will display warnings, offer options to block senders and report chats, and let recipients control whether to view a blurred image. Anyone attempting to forward a nude image will receive a prompt encouraging them to reconsider, and users will be able to unsend their own intimate pictures. The platform will also push messages reminding users to be cautious when sending sensitive photos and point them to support resources and safety tips.

Meta is additionally developing technology to identify accounts potentially engaged in sextortion, hiding teens from such accounts, hiding those accounts' message requests, and restricting their interactions with teen accounts, while working to investigate and disrupt those behind the scams. The company is also collaborating with other tech companies through the Lantern program to share signals and combat sextortion scams across the internet. The features will roll out in the U.S. in the coming weeks and globally in the coming months.

The move comes amid increasing efforts to regulate social media, broader concerns about online child safety, criticism that Meta's use of encryption technology has reportedly favored predators, and a rise in sextortion cases, including a recent high-profile case involving the alleged sextortion of an Australian teen boy who died by suicide. Experts have broadly welcomed the changes, though some question whether blurring images is enough and suggest additional measures such as parental notification systems.
Steve Waithe, a former Northeastern University track coach, has been sentenced to five years in federal prison for running a sextortion scheme. He pleaded guilty to cyberstalking, wire fraud, conspiracy to commit computer fraud, and computer fraud, having exploited his position coaching college athletes to carry out a sextortion campaign that left a trail of emotional devastation. Many of his victims were women he knew from childhood, college, and his coaching career. The judge rejected his lawyer's request for a 27-month sentence, emphasizing the significant impact on the victims' lives.
Apple and Google have removed Wizz, a popular Tinder-style social app aimed at teens, from their app stores over concerns about its alleged use in financial sextortion scams against minors. The app allowed users as young as 13 to set up accounts and connect with strangers in "age-gated" groups, and research groups had criticized its inadequate protection of minors. The National Center on Sexual Exploitation praised the removal, emphasizing the importance of online safety. Wizz's parent company, Voodoo, has not directly addressed the sextortion claims, attributing the removal to a technical issue, and says it is working with Apple and Google to resolve the matter soon.