UK to ban deepfake AI ‘nudification’ apps

Liv McMahon, Technology reporter


The UK government says it will ban so-called “nudification” apps as part of efforts to tackle misogyny online.

New laws – announced on Thursday as part of a wider strategy to halve violence against women and girls – will make it illegal to create and supply AI tools that let users edit images to make it appear someone’s clothing has been removed.

The new offences would build on existing rules around sexually explicit deepfakes and intimate image abuse, the government said.

“Women and girls deserve to be safe online as well as offline,” said Technology Secretary Liz Kendall.

“We will not stand by while technology is weaponised to abuse, humiliate and exploit them through the creation of non-consensual sexually explicit deepfakes.”

Creating deepfake explicit images of someone without their consent is already a criminal offence under the Online Safety Act.

Ms Kendall said the new offence – which makes it illegal to create or distribute nudifying apps – would mean “those who profit from them or enable their use will feel the full force of the law”.

Nudification or “de-clothing” apps use generative AI to realistically make it look like a person has been stripped of their clothing in an image or video.

Experts have issued warnings about the rise of such apps and the potential for fake nude imagery to inflict serious harm on victims – particularly when used to create child sexual abuse material (CSAM).

In April, the Children’s Commissioner for England Dame Rachel de Souza called for a total ban on nudification apps.

“The act of making such an image is rightly illegal – the technology enabling it should also be,” she said in a report.

The government said on Thursday it would “join forces with tech companies” to develop methods to combat intimate image abuse.

This would include continuing its work with UK safety tech firm SafeToNet, it said.

The UK company has developed AI software it says can identify and block sexual content, and can also block a device’s camera when it detects sexual content being captured.

Such tech builds on existing filters implemented by platforms such as Meta to detect and flag potential nudity in imagery, often with the aim of stopping children taking or sharing intimate images of themselves.

‘No reason to exist’

Plans to ban nudifying apps come after previous calls from child protection charities for the government to crack down on the tech.

The Internet Watch Foundation (IWF) – whose Report Remove service allows under-18s to confidentially report explicit images of themselves online – said 19% of confirmed reporters indicated that some or all of their imagery had been manipulated.

Its chief executive Kerry Smith welcomed the measures.

“We are also glad to see concrete steps to ban these so-called nudification apps which have no reason to exist as a product,” she said.

“Apps like this put real children at even greater risk of harm, and we see the imagery produced being harvested in some of the darkest corners of the internet.”

However, while children’s charity the NSPCC welcomed the news, its director of strategy Dr Maria Neophytou said it was “disappointed” not to see similar “ambition” to introduce mandatory device-level protections.

The charity is among organisations calling on the government to make tech firms find easier ways to identify and prevent the spread of CSAM on their services, such as in private messages.

The government said on Thursday it would make it “impossible” for children to take, share or view a nude image on their phones.

It is also seeking to outlaw AI tools designed to create or distribute CSAM.
