Meta requires political advertisers to mark when deepfakes are used


Meta will require political advertisers to flag when they have used AI or digital manipulation in adverts on Facebook and Instagram.

The social media company already has policies on using deepfakes in place, but says this goes a step further.

From January, adverts related to politics, elections or social issues will have to declare any digitally altered image or video.

The worldwide policy will be moderated by a mix of human and AI fact checkers.

In an announcement, Meta said this would include changing what somebody has said in a video, altering images or footage of real events, and depicting real-looking people who do not exist.

Users will be notified when adverts have been marked as being digitally changed. Meta told the BBC that it would add this information to the ad but didn’t go into detail on how it would be presented.

Advertisers do not have to declare when small changes have been made, such as cropping or colour correction, “unless such changes are consequential or material to the claim, assertion, or issue raised in the ad”.

Meta already has policies for all users – not just advertisers – about using deepfakes in videos.

Deepfakes are removed if they “would likely mislead an average person to believe a subject of the video said words that they did not say”.

The new rules require adverts relating to politics, elections or social issues to disclose any kind of digital alteration, whether done by a human or AI, before the ad goes live on Facebook or Instagram.

Meta’s other social media platform, Threads, follows the same policies as Instagram.

Meta says that if advertisers do not declare this when they upload adverts, "we will reject the ad and repeated failure to disclose may result in penalties against the advertiser."

Google recently announced a similar policy on its platforms. TikTok does not allow any political advertising.


General elections are expected in 2024 in some of the world’s biggest democracies, including India, Indonesia, the US and the UK.

Russia, South Africa and the EU also have elections scheduled for next year.

Deepfakes – where AI is used to change what someone says or does in a video – are a growing concern in politics.

In March, an AI-generated image falsely showing former US President Donald Trump being arrested was shared on social media.

The same month, a deepfake video circulated of Ukrainian President Volodymyr Zelensky appearing to talk of surrendering to Russia.

However, in July, false claims that a video of US President Joe Biden was a deepfake were debunked, and the video was shown to be authentic.
