Oversight board urges Meta to revise its policy on manipulated media
An altered video of President Biden doesn't violate Meta's policy on manipulated media and can remain on Facebook because it wasn't created using artificial intelligence, the board overseeing the company's content moderation said Monday.
Why it matters: Even as it upheld the video, the board criticized Meta's manipulated media policy as "incoherent, lacking in persuasive justification" and recommended revisions.
How it works: Under its current policy, Meta disallows content showing a subject saying words they did not say only if it was created using AI or machine learning techniques, such as deepfakes.
- The board recommended that Meta revise its manipulated media policy to include audio and visual content showing people doing or saying things they did not do regardless of how it was created.
- It said the revised policy should state the harms the company is attempting to prevent, listing as examples misinformation and interference with the right to vote.
- It said Meta could restrict manipulated media in other ways besides removing it, such as with labels to inform users that the content has been heavily altered.
Details: The media that was manipulated originally showed Biden voting in the 2022 midterms.
- In the original footage, Biden exchanged "I Voted" stickers with his adult granddaughter, who had voted for the first time. He placed the sticker above her chest with her consent and kissed her on the cheek afterwards.
- In a post six months later, however, the footage was altered to loop at the moment Biden's hand made contact with his granddaughter's chest, making it appear that he touched her inappropriately.
- The altered video was accompanied by part of the song "Simon Says" by Pharoahe Monch, and the post's caption stated that Biden was a "sick pedophile" and that those who voted for him were "mentally unwell."
- Other posts containing the same altered video went viral last month.
What they're saying: The Oversight Board, which makes content moderation decisions for Facebook and Instagram, agreed that the altered video did not violate Meta's current manipulated media policy — but only, it said, because the policy is too narrow.
- The board said the current policy focuses too much on how content is created and not enough on the potential harms of altered content, such as damage to electoral processes.
The big picture: The board's policy recommendation comes as AI injects new misinformation challenges into this year's elections.
- For example, AI was used to manipulate Biden's voice in fake robocalls to voters in New Hampshire last month. The calls told recipients there was no need to vote in the state's presidential primary.
- The Republican National Committee also produced its first 100% AI-generated video last year after Biden announced he would run for re-election.
Go deeper ... 2024: The year AI gets real