How facial recognition is identifying the dead in Ukraine
Last month a controversial facial recognition company, Clearview AI, announced it had given its technology to the Ukrainian government.
The BBC has been given evidence of how it is being used – in more than a thousand cases – to identify both the living and the dead.
This story contains graphic descriptions that may be upsetting to some readers.
A man lies motionless on the floor, his head tilted down. His body is naked, apart from a pair of Calvin Klein boxers. His eyes are ringed with what look like bruises.
The body was found in Kharkiv, eastern Ukraine – in the wreckage of war. The BBC has seen pictures taken at the scene, but does not know the circumstances around his death. There is clear evidence of head trauma. He also had a tattoo on his left shoulder.
Ukrainian authorities didn’t know who the man was, so they decided to turn to a cutting-edge method: facial recognition using artificial intelligence.
Clearview is perhaps the most famous, and controversial, facial recognition system in the world.
The company has scraped billions of photos from social media sites such as Facebook and Twitter to create an enormous database – what its CEO and founder Hoan Ton-That calls “a search engine for faces”.
“It kind of works like Google. But instead of putting in a string of words or text, the user puts in a photo of a face,” explains Mr Ton-That.
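The “search engine for faces” idea can be illustrated with a toy sketch. This is not Clearview’s actual system; it is a generic, simplified version of how such tools typically work: each photo is reduced to a numerical “embedding” vector by a neural network, and a query face is matched by ranking the database vectors by similarity. All file names and numbers below are invented for illustration.

```python
import numpy as np

# Invented stand-ins for face embeddings. In a real system, a neural
# network would compute a vector like this from each photo.
database = {
    "photo_a.jpg": np.array([0.9, 0.1, 0.3]),
    "photo_b.jpg": np.array([0.2, 0.8, 0.5]),
    "photo_c.jpg": np.array([0.85, 0.15, 0.35]),
}

def cosine_similarity(a, b):
    """Similarity of two vectors: 1.0 means identical direction."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def search(query, db, top_k=2):
    """Rank database photos by similarity to the query embedding."""
    scores = [(name, cosine_similarity(query, vec)) for name, vec in db.items()]
    return sorted(scores, key=lambda s: s[1], reverse=True)[:top_k]

# Embedding of the uploaded face; closest matches come back first.
query = np.array([0.88, 0.12, 0.32])
print(search(query, database))
```

Note that the output is a ranked list of candidates with similarity scores, not a yes/no answer – a point that matters later in this story, when critics question how match results are acted on.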
The company has faced a string of legal challenges. Facebook, YouTube, Google and Twitter have sent cease-and-desist letters to Clearview, asking it to stop using pictures from their sites. The UK’s Information Commissioner’s Office even fined the company for failing to inform people it was collecting photos of them.
Now, its use by the Ukrainian government has raised questions over the implications of infusing this powerful technology into an active war.
Clearview is used extensively – though divisively – by law enforcement in America. Mr Ton-That says 3,200 government agencies have either bought or trialled the technology.
After Vladimir Putin’s invasion of Ukraine, Clearview’s founder saw another application for the technology.
“We saw images of people who were prisoners of war and fleeing situations, and you know, it got us thinking that this could potentially be a technology that could be useful for identification, and also verification,” he says.
He quickly offered the Ukrainian government the technology – an offer that was accepted.
Back in Kharkiv, authorities photographed the dead man’s face – his head held up, his sunken eyes directed towards the camera – and ran the image through Clearview’s database. The search returned several pictures of someone who looked very similar to the dead man.
One picture had been taken on what looks like a hot day. The man was shirtless. He had a tattoo on his left shoulder.
The design matched. They had a name.
Using facial recognition to identify the dead is not new, and Clearview isn’t the only platform being used to do it in Ukraine.
“We’ve been using this stuff for years now,” says Aric Toler, research director at Bellingcat, an organisation that specialises in investigative journalism.
In 2019, Bellingcat used facial recognition technology to help identify a Russian man who had filmed the torture and killing of a prisoner in Syria. This is not facial recognition’s first war.
But its use in Ukraine is more wide-ranging than in any previous conflict. Mr Toler says that he uses the facial recognition platform FindClone in Russia, and that it’s been particularly helpful for identifying dead Russian soldiers.
As with Clearview, FindClone searches through publicly available internet images, including Russian social media pages.
Even people who do not have social accounts can be found.
“They might not have a social media profile but their wives or girlfriends might… sometimes they do have profiles and they live in a small town with a big military base. Or they may have a lot of friends who are currently in their unit”, Mr Toler explains, describing FindClone’s use as an investigative tool.
This last point is fundamental in understanding the power of facial recognition technology.
It means that even if a person has never had a social media profile, and thinks they’ve wiped the internet clean of their image – they can still be found. By appearing in a photo uploaded by a friend or simply by being in the background of a random picture on the internet, they are in the database.
Even military or security personnel, who have barely any presence on the internet, can still be traced.
A question of accuracy
Critics, however, point out that facial recognition technology is by no means always correct – and that in a time of war, errors could have potentially disastrous consequences.
Clearview isn’t just being used to identify dead bodies in Ukraine. The company also confirmed it was being used by the Ukrainian government at checkpoints to help identify enemy suspects.
Clearview showed the BBC an email, from a Ukrainian agency, confirming that the system was being used to identify the living.
“The system gave us the opportunity to quickly confirm the accuracy of the data of detained suspects,” reads the email, from a Ukrainian official who did not want to be named.
“During the use of Clearview AI, more than 1,000 search queries were performed to conduct the appropriate verification and identification,” the email reads.
This worries some analysts.
Conor Healy is a facial recognition expert at IPVM, an organisation that reviews security technology.
“It’s important for the Ukrainian forces to recognise that this is not a 100% accurate way of determining whether somebody is your friend or your foe,” Mr Healy says.
“It shouldn’t be a life or death technology where you either pass or fail, where you could get imprisoned or, god forbid, even killed. That’s not how this should be used at all.”
Others have issued more dire warnings. Albert Fox Cahn, of the watchdog group Surveillance Technology Oversight Project, has called it “a human rights catastrophe in the making”.
“When facial recognition makes mistakes in peacetime, people are wrongly arrested. When facial recognition makes mistakes in a war zone, innocent people get shot,” he told Forbes.
The BBC contacted the Ukrainian government for comment on its use of Clearview, but did not receive a response.
Mr Ton-That has defended the accuracy of Clearview’s technology, saying tests had found it to be more than 99% accurate.
Much depends, though, on the quality of the image, the position of the head, and whether the face is covered – by a mask, for example.
Then there is the issue of privacy, which has been problematic for Clearview in the US and Europe. The company pulls publicly available pictures from firms like Facebook and Instagram to build its database.
But it didn’t ask the social media companies – or anyone else, in fact – whether it could scrape these pictures. If you are reading this, you are almost certainly in the database, even though you probably never gave Clearview permission to use your image.
Mr Ton-That accepts there is still debate around the legality of facial recognition technology, but believes Clearview operates within the law – saying the technology has been “misunderstood”.
Facial recognition technology, though, clearly has dystopian applications. In November last year the BBC reported that plans were being drawn up in China to use facial recognition tech to target journalists.
Mr Ton-That says Clearview wouldn’t allow these kinds of searches, even though its technology could in principle be used this way. He says Clearview does not work with authoritarian governments and that the company would not work with Russia.
There are, however, applications for Clearview’s tech in a military context. Last year, for example, the company signed a contract with the Pentagon to explore putting its technology into augmented-reality glasses. It is one of several companies developing facial recognition AI under military contracts.
Privacy advocates have another worry too. Facial recognition technology might be useful to the Ukrainian authorities in a time of war. But will they simply hand the technology back to Clearview in a time of peace?
“There are any number of examples of technologies that are introduced in wartime and that persist into peacetime,” says Mr Healy.
“I hope that that’s not the approach they take.”