The Oversight Board has shared the details of a case involving a Facebook video of President Joe Biden, which could have significant implications for Meta’s “manipulated media” policy.
At the center of the case is a video of Biden from last fall, when he joined his granddaughter as she voted in person for the first time. After voting, Biden placed an “I voted” sticker on her shirt. A Facebook user later shared an edited version of the encounter, making it appear as if he repeatedly touched her chest. The video’s caption called him a “sick pedophile” and said those who voted for him were “mentally unwell.”
The board also raised the broader issue of manipulated media and elections. “Although this case involves President Biden, it touches on the much broader issue of how manipulated media might impact elections in every corner of the world,” Thomas Hughes, director of the Oversight Board Administration, said in a statement. “It’s important that we look at what challenges and best practices Meta should adopt when it comes to authenticating video content at scale.”
According to the Oversight Board, a Facebook user reported the video, but Meta ultimately left the clip up, saying it didn’t break its rules. As the board notes, the company’s manipulated media policy prohibits misleading videos created with artificial intelligence, but doesn’t apply to deceptive edits made with more conventional techniques. “The Board selected this case to assess whether Meta’s policies adequately cover altered videos that could mislead people into believing politicians have taken actions, outside of speech, that they have not,” the Oversight Board said in a statement announcing the case.
The case also underscores the often glacial pace of the Oversight Board and the limits of its ability to effect change at Meta. The Biden clip at the center of the case was originally filmed last October, and edited versions have circulated on social media since at least January (the version in this case was first posted in May). It will likely take several more weeks, if not months, for the board to decide whether the Facebook video should be removed or left up. Meta will then have two months to respond to the board’s policy recommendations, though it could take many more weeks or months for the company to fully implement any suggestions it chooses to adopt. That means any meaningful policy change may land much closer to the 2024 election than to the 2022 midterm election that kickstarted the case in the first place.