
Apr 23, 2026 · 7 min read

The UK Just Ruled That Police Can Scan 3 Million Faces a Year With AI—The Man It Wrongly Identified Plans to Appeal

London's High Court ruled live facial recognition lawful under human rights law. The Metropolitan Police scanned over 3 million faces in 2025. Independent testing found 80% of false positives were Black people.

[Image: Surveillance cameras on a pole overlooking a busy London street with pedestrians walking below]

The Ruling

On April 22, the UK High Court dismissed a legal challenge to the Metropolitan Police's use of live facial recognition, ruling that the technology does not violate the European Convention on Human Rights. Lord Justice Holgate and Mrs Justice Farbey found that the Met's policy includes "clear, precise and effective safeguards" that make its deployment legal.

The ruling clears the way for a planned expansion from 10 to 50 facial recognition vans across England and Wales, a plan the UK government had already approved before the case was decided.

The Man Who Brought the Case

Shaun Thompson, an anti-knife-crime campaigner and youth worker, was walking near Croydon when a live facial recognition camera flagged him as a criminal suspect. Officers stopped him, and he faced potential arrest on the strength of the system's match; he had to produce bank cards and a passport to prove his identity. The system had confused him with another person.

"No one should be treated like a criminal due to a computer error," Thompson said after the ruling. He has announced he will appeal.

Thompson and Silkie Carlo, director of the civil liberties organization Big Brother Watch, argued that the Met's facial recognition policy gave officers excessive discretion and violated rights to privacy, expression, and assembly under the European Convention on Human Rights. The court disagreed, finding that crime-hotspot restrictions, senior oversight, and immediate deletion of non-matching faces were sufficient safeguards.

The Numbers Behind the Cameras

The Metropolitan Police's own data reveals the scale of the surveillance:

  • 3,147,436 faces scanned in the latest review period
  • 2,100+ arrests attributed to live facial recognition since 2024
  • 100+ sex offenders identified and arrested
  • 24% of arrests involved violent crimes against women and girls
  • 2,077 false alerts out of 3,147,436 scans (roughly 0.07% of faces scanned)

The Met describes this as a success story: over two thousand arrests, including violent offenders, from a system that processes more than three million faces with a sub-one-percent error rate. But the false positive rate tells only half the story.
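A quick sanity check on the Met's own bulleted totals shows how rare false alerts are per face scanned (illustrative arithmetic only; the figures come from the article, not from any official methodology):

```python
# Back-of-the-envelope check of the Met's published figures.
faces_scanned = 3_147_436
false_alerts = 2_077

# False positives as a share of all faces scanned
fp_rate = false_alerts / faces_scanned
print(f"False alerts per face scanned: {fp_rate:.4%}")  # ≈ 0.0660%

# Roughly one false alert for every N people who walk past a van
print(f"About 1 false alert per {faces_scanned // false_alerts:,} scans")
```

A rate that looks negligible per scan still produces thousands of wrongly flagged people at this volume, which is the tension the rest of the article explores.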

The Bias the Court Dismissed

Independent testing by the National Physical Laboratory found the system "more likely to incorrectly include some demographic groups" in its matches. The data is stark: 80% of false positives from the Met's system involved Black people, despite Black residents making up roughly 13% of London's population.

The court acknowledged the bias data but ruled it did not invalidate the system's legality, stating they were "not able to accept, on the thin submissions advanced before us, that concerns about discrimination infect the legality of the policy."

Big Brother Watch's Silkie Carlo called the judgment "disappointing" and said the ruling "gives the green light to a surveillance infrastructure that disproportionately impacts communities of color."

How Live Facial Recognition Works

The Metropolitan Police deploys vans equipped with cameras in high-traffic areas. The cameras capture the face of every person who passes within range and compare it in real time against a watchlist of criminal suspects. If the system detects a match, nearby officers are alerted to approach the individual.

Images of people who do not match the watchlist are deleted immediately, according to the Met. But the scan itself is not optional: everyone who walks past the camera has their face processed by the algorithm, whether they consent or not.

The technology differs from retrospective facial recognition, which searches existing databases of photos. Live facial recognition operates in the moment, scanning crowds as they move through public spaces. It is the difference between searching a filing cabinet and checking every person who walks through a door.

Why the Expansion Matters

The UK government has approved expanding from 10 facial recognition vans to 50, covering police forces across England and Wales. Some forces had paused their own rollouts over racial bias concerns, but the High Court ruling gives them legal cover to proceed.

At the current scanning rate, 50 vans could process tens of millions of faces per year. The Met's system alone scanned over 3 million in a single review period. Scaling that across the country would create one of the most extensive real time biometric surveillance networks in any democracy.
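The projection above follows from simple scaling, assuming per-van throughput stays constant across the expanded fleet (the article does not specify how long a review period is):

```python
# Rough scaling of the article's figures. Assumption: each of the 50
# planned vans scans at the same rate as the current fleet's average.
scans_current = 3_147_436  # faces scanned in the latest review period
vans_current = 10
vans_planned = 50

scans_per_van = scans_current / vans_current
projected = scans_per_van * vans_planned
print(f"Projected scans per review period with 50 vans: {projected:,.0f}")
# ≈ 15.7 million per review period
```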

The UK already has more CCTV cameras per capita than almost any country outside China. Adding live facial recognition to that infrastructure transforms passive recording into active identification. Every camera becomes a checkpoint.

The Global Pattern

The UK ruling fits a growing trend. The National Cyber Security Centre estimates that commercial facial recognition technology is now accessible to over 100 countries, up from 80 in 2023. While the EU's AI Act restricts real time biometric identification in public spaces, the UK is no longer bound by EU rules and has chosen a permissive approach.

In the United States, cities like San Francisco and Portland banned government facial recognition, while others have embraced it. The patchwork mirrors what is happening with state privacy laws: the protection you get depends on where you stand, literally.

For journalists, activists, and anyone whose work involves operating under surveillance pressure, the ruling is a signal that legal challenges to biometric surveillance face an uphill battle in the UK. The appeal, when it comes, will test whether any court is willing to set a harder boundary.

What Happens Next

Thompson's planned appeal will go to the Court of Appeal, the second highest court in England and Wales. If it fails there, the only remaining option is the UK Supreme Court.

Meanwhile, the vans will continue to deploy. Every person who walks past one will have their face scanned, compared against a database, and, if the system works correctly, immediately discarded. If it makes an error, which it does 2,077 times for every 3 million scans, the consequences fall disproportionately on one community. The court ruled that is legal. Thompson disagrees.
