By James Clayton, BBC Newsnight
Sara needed some chocolate – she had had one of those days – so she wandered into a Home Bargains store.
“Within less than a minute, I’m approached by a store worker who comes up to me and says, ‘You’re a thief, you need to leave the store’.”
Sara – who wants to remain anonymous – was wrongly accused after being flagged by a facial-recognition system called Facewatch.
She says after her bag was searched she was led out of the shop, and told she was banned from all stores using the technology.
“I was just crying and crying the entire journey home… I thought, ‘Oh, will my life be the same? I’m going to be looked at as a shoplifter when I’ve never stolen’.”
Facewatch later wrote to Sara and acknowledged it had made an error.
Facewatch is used in numerous stores in the UK – including Budgens, Sports Direct and Costcutter – to identify shoplifters.
The company declined to comment on Sara’s case to the BBC, but did say its technology helped to prevent crime and protect frontline workers. Home Bargains, too, declined to comment.
It’s not just retailers who are turning to the technology.
On a humid day in Bethnal Green, in east London, we joined the police as they positioned a modified white van on the high street.
Cameras attached to its roof captured thousands of images of people’s faces.
If they matched people on a police watchlist, officers would speak to them and potentially arrest them.
Unflattering references to the technology liken the process to a supermarket checkout – where your face becomes a bar code.
On the day we were filming, the Metropolitan Police said they made six arrests with the assistance of the tech.
That included two people who breached the terms of their sexual-harm prevention orders, a man wanted for grievous bodily harm and a person wanted for the assault of a police officer.
Lindsey Chiswick, director of intelligence for the Met, told the BBC the tech’s speed was extremely helpful.
“It takes less than a second for the technology to create a biometric image of a person’s face, assess it against the bespoke watchlist and automatically delete it when there is no match.”
The BBC spoke to several people approached by the police who confirmed that they had been correctly identified by the system – 192 arrests have been made so far this year as a result of it.
But civil liberty groups are worried that its accuracy is yet to be fully established, and point to cases such as Shaun Thompson’s.
Mr Thompson, who works for youth-advocacy group Streetfathers, didn’t think much of it when he walked by a white van near London Bridge in February.
Within a few seconds, though, he was approached by police and told he was a wanted man.
“That’s when I got a nudge on the shoulder, saying at that time I’m wanted”.
He was asked to give fingerprints and held for 20 minutes. He says he was let go only after handing over a copy of his passport.
But it was a case of mistaken identity.
“It felt intrusive… I was treated guilty until proven innocent,” he says.
The BBC understands the mistake might have been due to a family resemblance. The Metropolitan Police declined to comment.
‘Digital line-up’
Silkie Carlo, director of Big Brother Watch, has filmed the police on numerous facial-recognition deployments. She was there the night Shaun Thompson was picked up by police.
“My experience, observing live facial recognition for many years, [is that] most members of the public don’t really know what live facial recognition is,” she says.
She says that anyone whose face is scanned is effectively part of a digital police line-up.
“If they trigger a match alert, then the police will come in, possibly detain them and question them and ask them to prove their innocence.”
The use of facial recognition by the police is ramping up.
Between 2020 and 2022 the Metropolitan Police used live facial recognition nine times. The following year the figure was 23.
Already in 2024 it has been used 67 times, so the direction of travel is clear.
Champions say that misidentifications are rare.
The Metropolitan Police say that around one in every 33,000 people who walk past its cameras is misidentified.
But the error rate is much higher once someone is actually flagged: one in 40 alerts so far this year has been a false positive.
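The two figures are not contradictory: one is measured against everyone who walks past, the other only against people who trigger an alert. A back-of-envelope check shows how they fit together – the crowd size below is a hypothetical number chosen purely for illustration, not a Met figure.

```python
# Sanity-check the two quoted error rates:
#   ~1 in 33,000 passers-by is misidentified
#   ~1 in 40 alerts is a false positive
# The crowd size is an assumed, illustrative figure.

passers_by = 330_000                       # hypothetical faces scanned
false_positives = passers_by / 33_000      # -> 10 wrongful matches

# If those 10 false positives make up 1 in 40 alerts,
# the total number of alerts must have been:
total_alerts = false_positives * 40        # -> 400 alerts

# So the rates are consistent if roughly 1 in 825 passers-by
# triggers any alert at all (33,000 / 40 = 825).
print(f"false positives: {false_positives:.0f}")
print(f"total alerts:    {total_alerts:.0f}")
print(f"alert rate:      1 in {passers_by / total_alerts:.0f}")
```

In other words, a very low per-passerby error rate can still mean that a noticeable share of the people actually stopped were stopped wrongly.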
Michael Birtwhistle, head of research at the Ada Lovelace Institute research group, believes the technology is so new that the laws have not yet caught up.
“I think it absolutely is a Wild West at the moment. That’s what creates this legal uncertainty as to whether current uses are unlawful or not,” he says.
In Bethnal Green, although some people the BBC spoke to were worried about the use of the tech, a majority were supportive – if it helped to tackle crime.
That leads to another question about the technology: will it help in the long run?
As white vans parked on busy high streets become a familiar sight, will those who know they are wanted by police simply get wise to the cameras and avoid them? Will shoplifters hide their faces?
Ms Carlo says society needs to guard against facial recognition becoming normalised.
“Once the police can say this is OK, this is something that we can do routinely, why not put it into the fixed-camera networks?”
This is the dystopian future that civil-liberty campaigners are most afraid of – a China-style mass-surveillance state.
Advocates dismiss such dire predictions as overblown.
And it is also clear there are plenty among the public who are willing to put up with having their faces scanned – if it means safer streets.