Meta is updating its approach to handling manipulated media on its platforms, including Facebook, Instagram, and Threads, based on feedback from the Oversight Board. The changes involve adding labels that provide context to AI-generated videos, audio, and photos, as well as to media manipulated to show a person doing something they did not do. The company plans to start labeling AI-generated content in May 2024 and will stop removing content solely on the basis of its manipulated video policy in July 2024. These decisions were informed by extensive public opinion surveys, consultations with global experts, and feedback from civil society organizations and academics.
A video circulating on X claimed to show Taylor Swift holding a "Trump Won, Democrats Cheated" flag at the 2024 Grammy Awards, but the post was labeled as manipulated media. The original video, from Variety, showed Swift walking the red carpet without holding any flag. Swift has previously expressed opposition to former President Trump and Republican Senator Marsha Blackburn, making the video's claim inconsistent with her known political positions.
Deepfake videos and images falsely depicting Taylor Swift supporting Trump and engaging in election denialism have been circulating on various social media platforms, including X, Instagram, Facebook, YouTube, and TikTok. Despite some posts being labeled as manipulated media, many others have not been, raising concerns about the platforms' ability to control the spread of malicious inauthentic media. The manipulated media appears to originate from a pro-Trump X account with over 1 million followers, and the issue highlights the ongoing struggle of social media platforms to effectively moderate disinformation, including AI-generated content.
Meta's Oversight Board has criticized Facebook's policies after a fake video of President Joe Biden was allowed to remain on the platform, despite being manipulated to falsely depict him inappropriately touching his granddaughter. The board called for a revision of the "incoherent" policies, urging Meta to cover all forms of manipulated media, including audio fakes, and to specify the harms the policy aims to prevent. Rather than recommending removal, the board advocated labeling significantly altered content, as Meta faces pressure to update its policies ahead of upcoming elections.
The Oversight Board, an external advisory group for Meta, has called for a rewrite of the company's rules against faked videos, criticizing the current policy as "incoherent." The decision came after the board reviewed a doctored video of President Biden and found that Meta's manipulated media policy failed to address potential harms. The board urged Meta to quickly update its rules, including adding labels to altered videos and covering manipulated audio, especially in the lead-up to global elections. Meta is reviewing the board's guidance and will issue a public response within 60 days.
Meta's Oversight Board criticized the company's "incoherent" and "confusing" policies on manipulated media after an altered video of President Biden spread on Facebook, calling for clearer guidelines and expanded policies to address potential harms, including voter suppression. The video, which showed Biden appearing to touch his granddaughter inappropriately, was not removed by Meta despite criticism. The board recommended extending the manipulated media policy to cover altered audio and videos that show people doing things they didn't do, and suggested attaching a label to manipulated media, rather than removing it, when it doesn't violate other rules.
As the 2024 presidential election approaches, Meta is being urged to revamp its policy on manipulated videos to address fake or distorted clips that could mislead voters and interfere with elections. The Oversight Board called on Meta to crack down on all doctored content, including faked audio, and to clearly define the harms its policy aims to prevent, including election interference. Concerns are rising about the potential impact of AI-generated deepfakes on the election, with experts calling for regulations and standards to curb the spread of misleading content on social media platforms.
Meta's Oversight Board has urged the company to update its manipulated media policy after a misleadingly edited video of President Joe Biden was allowed to stay on Facebook. The board criticized the current policy as "incoherent" and called for new rules that cover audio and video content regardless of how it was created. It recommended applying labels to posts with manipulated media instead of removing them, citing concerns about the potential for viral election misinformation. Meta is reviewing the guidance and plans to respond publicly within 60 days, with possible policy changes as new AI tools evolve.