Digital fingerprints of a million images of child sexual abuse have been created, the Internet Watch Foundation has said.
The UK charity, which is responsible for finding and removing such material online, said the fingerprints, known as hashes, would help companies and police find copies of the images.
It is hoped that by doing this, the reuse of the images can be prevented.
The images are from the government’s Child Abuse Image Database.
The database contains some of the most extreme content that appears online – what is known as category A and B material.
The hashes are an identifying code produced by an algorithm and act as a fingerprint for each image or video.
Many tech companies search for child abuse material on their systems by comparing the hashes of images against lists compiled by organisations such as the IWF.
The system has its limitations, however. Altering an image can change its hash value, allowing it to escape detection – although the IWF insists the technology it uses produces hashes that survive resizing, cropping and colour changes.
Encrypted images cannot be identified using lists of hashes.
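The matching process described above can be illustrated with a minimal sketch. This example uses an exact cryptographic hash (SHA-256) for simplicity; the perceptual hashing the IWF describes, which tolerates resizing and cropping, works differently, and all data here is hypothetical:

```python
import hashlib

def fingerprint(image_bytes: bytes) -> str:
    """Return a hex digest acting as a 'fingerprint' of the raw image bytes.

    Note: SHA-256 is an exact hash, so any change to the bytes produces a
    completely different digest. Perceptual hashes, by contrast, are designed
    so that visually similar images produce matching or near-matching values.
    """
    return hashlib.sha256(image_bytes).hexdigest()

# A hash list shared by an organisation (illustrative placeholder data)
known_hashes = {fingerprint(b"known-image-bytes")}

def is_known(image_bytes: bytes) -> bool:
    """Check an uploaded file's hash against the shared hash list."""
    return fingerprint(image_bytes) in known_hashes

print(is_known(b"known-image-bytes"))            # exact copy matches: True
print(is_known(b"known-image-bytes" + b"\x00"))  # one altered byte breaks the match: False
```

This also shows why exact hashing alone is fragile, as the article notes: a single changed byte defeats the match, which is why detection systems favour perceptual hashes that are robust to common image edits.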
Before the hashes are created, a human assessor determines which category the criminal material falls under according to UK law.
The charity also produces accompanying metadata describing the exact nature of the abuse inflicted on the child, which it hopes will also speed up enforcement action.
In a quote provided by the IWF, an image analyst said: “I have three children 11 and under. The job has changed the way I think about them and the internet.
“It has surprised me how much material there is of very young children. Some of them are five, six, or seven years old.”
Susie Hargreaves, chief executive of the IWF, said in a statement that the nature of the material is such that its analysts are allowed to work only four-hour shifts, taking regular breaks, with access to the best counselling and support.
The charity says it has helped remove an unprecedented amount of material.
In 2021, it says it took action to remove 252,000 web pages which it confirmed contained images or videos of children suffering sexual abuse – more than ever before.