Cardiff Beyoncé concert: Face recognition use criticised


Thousands of fans attending the Beyoncé concert in Cardiff could be scanned by live face recognition cameras deployed in the area by South Wales Police.

The cameras will help identify people wanted for “priority offences”.

Police also used the cameras at the Coronation, but the technology has been criticised by human rights campaigners.

The surveillance camera watchdog, Fraser Sampson, said more work needed to be done to check for bias in the use of the technology.

He also warned that rules governing the technology could be weakened by a planned new law, and that without them the UK risks China-style surveillance.

European lawmakers recently backed an effective ban on live facial recognition cameras in public spaces.

About 60,000 fans are expected in Cardiff city centre for Beyoncé’s Principality Stadium concert on Wednesday.

A live facial recognition camera works by using artificial intelligence to compare faces in its video feed with those on a "watch list".

The watchlist could be made up of people who are wanted for crimes, for example.

South Wales Police said that if a person is not on a watch list, their biometric data will not be stored and will be deleted immediately.

The CCTV footage is recorded and kept for up to 31 days.
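The matching step described above – comparing a face against a watch list and discarding non-matches – can be sketched in code. This is a minimal illustration only, assuming faces are reduced to numeric "embeddings" compared by cosine similarity; the function names, threshold, and data are hypothetical and do not describe South Wales Police's actual system.

```python
import math

def cosine_similarity(a, b):
    """Similarity between two face embeddings (1.0 = identical direction)."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

def check_against_watchlist(face_embedding, watchlist, threshold=0.9):
    """Return the best watch-list match above the threshold, or None.

    A None result models the stated policy: data for people not on a
    watch list is not stored, only the match check is performed.
    """
    best_name, best_score = None, threshold
    for name, listed_embedding in watchlist.items():
        score = cosine_similarity(face_embedding, listed_embedding)
        if score > best_score:
            best_name, best_score = name, score
    return best_name

# Illustrative watch list with one made-up entry.
watchlist = {"wanted_person": [0.9, 0.1, 0.4]}
print(check_against_watchlist([0.88, 0.12, 0.41], watchlist))  # close match
print(check_against_watchlist([0.1, 0.9, 0.2], watchlist))     # no match
```

In a real deployment the embedding would come from a trained face recognition model, and – as the article notes – any alert would be reviewed by officers rather than acted on automatically.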


The force also says the decision to stop someone is made not by the technology but by officers, who check whether the person flagged in the live street footage matches the wanted person.

South Wales Police say they’ll use the facial recognition cameras at the Beyoncé concert “to support policing in the identification of persons wanted for priority offences… to support law enforcement… and to ensure the safeguarding of children and vulnerable persons”.


“Facial Recognition is not a condition of entry and it will not be on the stadium footprint,” a spokesperson said.

The BBC approached other forces to ask if the cameras would be operating in other cities where Beyoncé is performing.

Police Scotland said it does not use the technology, Northumbria Police declined to comment and the Metropolitan Police said it would publicise any plans in advance.

The technology was also used at the Coronation, where it scanned 68,000 faces against a watchlist of more than 10,000 faces, The Guardian reported.

On Wednesday, the Metropolitan Police gave more detail about its use to MPs on the Commons Home Affairs Committee.


Ruled unlawful

South Wales Police has previously been successfully challenged over its use of facial recognition cameras in an important court ruling.

Cardiff resident Ed Bridges – whose face was scanned in the city centre and then again at an anti-arms protest – won an Appeal Court victory against the force. In 2020, judges ruled that its use of the technology since 2017 had been unlawful.

But after the case, the force continued to trial the cameras, at a speedway event and at a Wales versus Italy rugby match – trials which scanned 108,540 faces and led to two arrests.


This data, along with data from the Metropolitan Police, was fed into a review by the National Physical Laboratory last month, which reported a "substantial improvement" in accuracy, with a false-match rate of one in 6,000 people who pass the camera.

Police also argue the review showed the technology could be operated in a way that did not discriminate on grounds of race or gender.
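The scale of that false-match rate at a single large event can be worked through from figures in the article – about 60,000 expected fans and one false match per 6,000 people passing a camera. The calculation below is an illustration only, assuming each attendee passes a camera exactly once.

```python
# Rough scale of the false-match rate, using figures from the article:
# ~60,000 fans expected, one false match per 6,000 camera passes.
# Assumes, for illustration only, each attendee passes a camera once.
attendees = 60_000
passes_per_false_match = 6_000

expected_false_matches = attendees / passes_per_false_match
print(expected_false_matches)  # 10.0
```

On that simple assumption, roughly ten attendees could be falsely flagged, each of whom would then be checked by an officer before any stop was made.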

Katy Watts, a lawyer for the human rights group Liberty who acted for Mr Bridges, has criticised the latest deployment of the technology.

“What the police won’t admit is that facial recognition doesn’t actually make people safer… it entrenches patterns of discrimination in policing,” she said.

“They’re violating the privacy of thousands of people to make only potentially one or two very minor arrests.”

Ms Watts added she was uncertain whether the deployment adequately incorporated the safeguards – on where the cameras are used and who ends up on the watch list – that the court judgement required.

“Seeing the police continue to violate our freedoms by rolling it out again is deeply concerning, and the only safe thing to do is ban the technology altogether,” Ms Watts added.


A report published on Tuesday revealed that Policing Minister Chris Philp “expressed his desire to embed facial recognition technology in policing and is considering what more the government can do to support the police on this”.

The Biometrics and Surveillance Camera Commissioner Professor Fraser Sampson said “the intent now from the Home Office [is] to embed facial recognition across policing and law enforcement”, but he added that widespread use was still some way off.

In Prof Sampson’s view, bias remains a concern: “It’s a long way off saying we’ve addressed issues of bias, either technically, or in terms of public perception of it,” he said.

He said more work was needed to build public confidence as the tech was rolled out UK-wide.

Prof Sampson nonetheless believes there is a role for the technology in policing, but says its use has to be proportionate.

He also criticised plans in the Data Protection and Digital Information Bill – currently working its way through Parliament – which would remove the surveillance camera code that provides guidance to police.


Scotland has its own commissioner and a code of practice covering the technology.

Scotland’s commissioner has also expressed concerns about the impact of the bill.

Professor Sampson said that the government had its “foot on the gas”, but that there were insufficient guard-rails to stop the UK entering a world of dystopian surveillance.

“If you have at one end, a policy that is driving the police and law enforcement to use it forward, and you have the technology, which is on a daily basis, increasing their ability to do that, but at the other end, you have no buffers, you have no stop point and you have no sort of guidance along the way, then yes, I don’t think there could be any other outcome because it will just proliferate in an unguided and uncontrolled way,” he said.

The UK government said it was committed to empowering the police to use new technologies like facial recognition in a fair and proportionate way.

“Facial recognition plays a crucial role in helping the police tackle serious offences including murder, knife crime, rape, child sexual exploitation and terrorism,” a spokesperson said.

The government argues the new bill will simplify regulation, making it easier to understand.

