Instagram trying nudity blurring to stop ‘sextortion’


Instagram will start testing new tools within weeks to fight “sextortion”, a form of blackmail involving intimate pictures sent online.

The tools include “nudity protection”, which blurs naked images in direct messages. It will be turned on by default for under-18s.

Pop-ups directing potential victims to support will also be trialled.

Governments around the world have warned of the increasing threat to young people from sextortion.

It often involves victims being sent a nude picture, before being invited to send their own in return – only to then be threatened that the image will be shared publicly unless they give in to the blackmailer’s demands.

On Wednesday, two Nigerian men pleaded guilty to sexually extorting teenage boys and young men in the US, including one who took his own life.

The offence has also recently been linked to the suicide of a teenager in Australia.

There are also harrowing cases of paedophiles using sextortion to coerce and abuse children.

‘Horrific crime’

Much of the sextortion Instagram sees on its platform is the work of sophisticated criminal gangs seeking money. It is a “horrific” crime, the platform says.

Its nudity protection system, first revealed in January, uses artificial intelligence (AI) running entirely on the user’s device to detect nude images in direct messages and give users a choice whether to view them.

It is designed “not only to protect people from seeing unwanted nudity in their DMs, but also to protect them from scammers who may send nude images to trick people into sending their own images in return,” Instagram said.

But Instagram stresses the system, which is opt-in for adults, does not report nude images to the company automatically, though users will be reminded they can block and report accounts if they wish.

When the system detects naked images being sent, the user will be directed to safety tips, including reminders that recipients may screenshot or forward images without the sender’s knowledge.

The platform now says it will also deploy measures that detect signs a user may be a potential sextortionist and make it harder for them to interact with others, including:

  • Sending message requests from possible sextortion accounts straight to the hidden folder.
  • Showing notices encouraging people already messaging potential sextortion accounts to report any threats to share their private images.
  • Hiding the “Message” button when a teenager views a potential sextortion account, even if they’re already connected.

For people who may have interacted with an account removed for sextortion, Instagram is testing pop-up messages that will direct them to expert advice and support.

Instagram also says it will share more data to fight sextortion and child abuse with other tech firms through an initiative called Lantern.

The announcement comes on the same day WhatsApp, another Meta platform, reduces its minimum age in the UK and Europe from 16 to 13.

According to the platform, the minimum age is already 13 in the majority of countries around the world.

But Smartphone Free Childhood, a campaign group which believes smartphones expose children to harmful content, called the decision to reduce the minimum age “tone deaf”.

WhatsApp argues users, including teenagers, have extensive controls over who they interact with.
