With major elections looming, Meta’s policy on deepfake content is in urgent need of updating, an oversight body said on Monday, in a decision about a manipulated video of US President Joe Biden.
A video of Biden voting with his adult granddaughter, manipulated to make it falsely appear that he inappropriately touched her chest, went viral last year.
It was reported to Meta and later the company’s oversight board as hate speech.
The tech giant’s oversight board, which independently reviews Meta’s content moderation decisions, said the platform was technically correct to leave the video online.
But it also insisted that the company’s rules on manipulated content were no longer fit for purpose.
The board’s warning came amid fears of rampant misuse of artificial intelligence-powered applications for disinformation on social media platforms in a pivotal election year, with huge portions of the global population heading to the polls not only in the United States but worldwide.
The board said that Meta’s policy in its current form was “incoherent, lacking in persuasive justification and inappropriately focused on how content has been created” rather than on the “specific harms it aims to prevent (for example, to electoral processes).”
Meta in a response said it was “reviewing the Oversight Board’s guidance and will respond publicly to their recommendations within 60 days in accordance with the bylaws.”
According to the board, in the Biden case, the rules were not violated “because the video was not manipulated using artificial intelligence nor did it depict Biden saying something he did not.”
But the board insisted that “non-AI-altered content is prevalent and not necessarily any less misleading.”
For example, most smartphones have simple-to-use features for editing content into disinformation, sometimes referred to as “cheap fakes,” it noted.
The board also underlined that altered audio content, unlike video, fell outside the policy’s current scope, even though deepfake audio can be highly effective at deceiving users.
One US robocall impersonating Biden has already urged New Hampshire residents not to cast ballots in the Democratic primary, prompting state authorities to launch a probe into possible voter suppression.
The oversight board urged Meta to reconsider the manipulated media policy “quickly, given the number of elections in 2024.”