Wednesday, December 18, 2024

What is facial recognition technology and why is its use by police controversial?


The Metropolitan Police has been expanding its use of live facial recognition (LFR) technology, which it says has helped put away offenders who pose “significant risk” to communities in London.

The force said this week that more than 500 arrests have been made this year using the cameras, including 50 individuals suspected of serious violence against women and girls, and 50 sex offenders in breach of their court conditions.

While some welcome the technology as an effective means of getting dangerous criminals off the streets, campaign groups have warned of a risk of misidentification, and what could essentially become a “dystopian surveillance state” where everyone is “treated like suspects”.

Yahoo News looks at how LFR is used and why it is controversial.

LFR is a real-time operation involving a live camera feed (or multiple feeds) used to compare the faces of people passing by against a predetermined watchlist, the College of Policing says.

It is used to locate people of interest, such as missing people, or those with outstanding warrants.

The College of Policing says the first time LFR was used for policing in England and Wales was during the 2017 UEFA Champions League final in Cardiff between Real Madrid and Juventus.

However, Leicestershire Police scanned the faces of 90,000 attendees at the Download music festival in 2015, cross-checking them against a list of wanted criminals in Europe who target similar events.

Live facial recognition is used by some police forces in England and Wales, but has not been deployed in Scotland or Northern Ireland.

Only the Metropolitan Police and South Wales Police have a permanent capability for live facial recognition, but other forces have used it for certain events.

It is often used at large events or in busy areas, typically using mobile LFR vans, which have clear signs on the outside telling people about the operation, along with additional signage in the area.

Each face picked up by the cameras is mapped by software, taking measurements of facial features, such as the distance between the eyes and the length of the jawline, to create a unique set of biometric data.

This dataset is then compared to the watchlist, and if an image is sufficiently similar to someone in the police’s database, an alert is sent to an engagement team, who will then speak to the person of interest.
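The matching step described above can be sketched in a few lines of code. This is a simplified, hypothetical illustration, not the software any police force actually runs: it assumes each face has already been reduced to a small vector of measurements (an "embedding"), and that a match means the similarity score exceeds a fixed threshold. All names and numbers are invented for the example.

```python
import math

def cosine_similarity(a, b):
    # Compare two facial-measurement vectors; 1.0 means identical direction.
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

def check_against_watchlist(live_embedding, watchlist, threshold=0.9):
    """Return the best watchlist match above the threshold, or None."""
    best_id, best_score = None, threshold
    for person_id, stored_embedding in watchlist.items():
        score = cosine_similarity(live_embedding, stored_embedding)
        if score > best_score:
            best_id, best_score = person_id, score
    return best_id  # None means the face is not on the watchlist

# Illustrative watchlist: IDs and vectors are entirely made up.
watchlist = {
    "warrant-0042": [0.61, 0.33, 0.72, 0.15],
    "missing-0007": [0.12, 0.88, 0.40, 0.25],
}
passerby = [0.60, 0.35, 0.70, 0.16]  # very close to warrant-0042
print(check_against_watchlist(passerby, watchlist))  # -> warrant-0042
```

In this toy version, a face that matches nothing on the list returns `None`, which corresponds to the Met's claim that non-matching biometric data is discarded; a score above the threshold corresponds to the alert sent to the engagement team.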

Police forces can also use retrospective facial recognition, which involves taking images or video from regular CCTV and running them against a database at a later point in an attempt to identify a person as part of an investigation.

In the US, experts have been debating whether this technique could be used to identify the man suspected of fatally shooting UnitedHealthcare chief executive Brian Thompson in New York City.

Forces that do use this technology, such as the Metropolitan Police, argue it enables “a more precise and intelligence-led approach to tackling crime”.

The Met’s director of performance, Lindsey Chiswick, said earlier this week: “This technology is helping us protect our communities from harm.

“It is a powerful tool that supports officers to identify and focus on people who present the highest risk that may otherwise have gone undetected.

“From targeting sex offenders to apprehending those responsible for violent crimes, Live Facial Recognition is helping us deliver justice more effectively while making our streets safer.”

The Met says it has “implemented robust safeguards” to address people’s concerns about this technology, insisting that if a person walks past an LFR camera and is not on a police watchlist, their biometric data is immediately and permanently deleted.

Civil liberties campaign group Big Brother Watch argues LFR technology risks “turning us all into walking ID cards”, describing its use as “an enormous expansion of the surveillance state”.

In August, the group’s director, Silkie Carlo, told the Guardian: “These are the surveillance tactics of China and Russia and Starmer seems ignorant of the civil liberties implications.”

Not only do privacy campaigners argue that ordinary law-abiding citizens shouldn’t be subject to such monitoring, they also claim there is a risk of people being misidentified and treated as suspects on faulty grounds.

Big Brother Watch is currently assisting a legal claim against the Met Police by anti-knife crime campaigner Shaun Thompson, 38, who was wrongly flagged as a person of interest on the Met’s facial recognition database outside London Bridge station in February.


Shoppers walk past a Met Police LFR van in Oxford Street. (Getty Images)

He was held for almost 30 minutes by officers who repeatedly demanded scans of his fingerprints and threatened him with arrest, despite him providing multiple identity documents to show he was not the man they were looking for.

Thompson describes the technology as “stop and search on steroids” and says it “doesn’t make communities any safer”.

While facial recognition technology is more accurate now than it was a decade ago, research conducted by the MIT Media Lab in 2018 found the error rate was 11.8% to 19.2% higher when matching darker-skinned faces, prompting concerns about racial discrimination.

The annual report of the Biometrics and Surveillance Camera Commissioner found that images of innocent people who have been arrested and subsequently released are still being held on the Police National Database.

These images can be used for facial recognition checks of potential suspects, the Guardian reports.

Claims by the Met that the use of LFR technology in the borough of Lewisham has widespread public support have been challenged, with a community impact assessment obtained by Computer Weekly showing there had been minimal consultation with residents.

As rioting broke out in several towns and cities following the Southport stabbings, Sir Keir Starmer pledged a “wider deployment of facial recognition technology” to help police “tackle violent disorder”.

This prompted concerns among 27 campaign groups, including Amnesty International, Big Brother Watch and Privacy International, who sent a letter to the prime minister urging him to reconsider.

The letter says: “Whilst we urge you to take robust action to stop the violence, protect our communities and bring those responsible for this criminal behaviour to justice, we have serious concerns regarding the use of facial recognition surveillance and urge you to drop any plans to expand police use of live facial recognition surveillance in particular.

“Live facial recognition (LFR) cameras subject thousands of passers-by to unwarranted biometric identity checks and invert the democratic principle of the presumption of innocence. LFR continues to have issues with accuracy, bias and discrimination.”


A person is detained by Metropolitan Police officers in Croydon after being identified by live facial recognition technology. (Getty Images)

Meanwhile, Police Scotland’s proposal to use live facial recognition technology has been criticised, with the Scottish government challenged over whether legal advice had been sought.

Chief Constable Jo Farrell vowed to use artificial intelligence and live facial recognition technology as part of a six-year strategy unveiled last month, in which she compared the technology to medical breakthroughs in detecting cancer.

Scottish Lib Dem justice spokesperson Liam McArthur has submitted 40 parliamentary questions on the proposed policy. These include asking the Scottish government how it responds to “reports that the use of live facial recognition by South Wales Police has produced 2,833 false alerts, compared with only 72 resultant arrests”.

McArthur said: “I am concerned that decisions that dramatically reframe the relationship between the police and the public are being treated as an inevitable consequence of the march of technology. There needs to be a compelling need, an appropriate legal basis and a proper public debate before the police can consider moving forward with measures like this. That simply has not happened.”

A Scottish government spokesperson said: “The lawful, effective and proportionate use of any technology with facial recognition capability is an operational matter for Police Scotland, who must abide by the Scottish Biometrics Commissioner’s statutory code of practice.”
