A manipulated video appearing to show US President Joe Biden inappropriately touching his adult granddaughter’s chest has sparked a debate over Meta’s policy on deepfakes and manipulated content [1]. The video, which circulated on social media in May 2023 [1] [2], was created by altering genuine footage of Biden voting in the US midterm elections in October 2022 [1]. Despite being fake, the video was not removed from Facebook because it did not violate Meta’s Manipulated Media policy [1]. However, the Oversight Board, the independent body that reviews Meta’s content moderation decisions, criticized the policy as “incoherent and confusing.”


The board recommended that Meta label manipulated media that does not otherwise violate its policies, including manipulated audio and edited videos showing people doing things they never actually did. It also urged Meta to recalibrate the policy to focus on preventing real-world harms and to enforce it on that basis [2]. Meta is currently reviewing the board’s guidance and will respond publicly to its recommendations within 60 days [2].

The increasing accessibility of AI and editing tools has made it easier for users to create realistic-seeming video and audio clips [2], raising concerns about the spread of manipulated content. The board is also monitoring how Meta handles content related to election integrity and has urged the company to develop a framework for evaluating false and misleading claims about elections [2]. With elections approaching in 2024 [1], the board stressed that Meta should revisit its policy promptly.


The impact of deepfakes and manipulated content on social media platforms is a growing concern, and Meta’s current policy on manipulated media has been criticized as unclear and inconsistent. The board’s recommendations underscore the importance of labeling manipulated media even when it does not violate Meta’s policies, and of enforcing the policy in ways that prevent real-world harm [2]. As AI and editing tools continue to advance, Meta will need to adapt its policies to curb the spread of manipulated content; with elections approaching [1], it must also develop a framework for evaluating false and misleading claims in order to protect election integrity.


[1] https://www.infosecurity-magazine.com/news/meta-oversight-board-policy-change/
[2] https://www.cbsnews.com/news/meta-oversight-board-says-manipulated-video-of-biden-can-stay-on-facebook/