The Watching Cut
live facial recognition
In January 2026, the Home Office published a policing white paper announcing forty more live facial recognition vans, on top of ten already operational. By the end of the rollout: fifty vans across every force in England and Wales. More than £140 million invested in police technology, including £26 million for a national facial recognition system, £11.6 million for live facial recognition capabilities, and £115 million over three years for a new National Centre for AI in Policing called Police.AI.^[1]
The Home Secretary called it “as revolutionary to modern policing as fingerprinting was a century ago.”
The comparison does not hold. Fingerprints follow arrest. Live facial recognition scans everyone in a public space by default, without cause, comparing faces against watchlists in real time. The Metropolitan Police conducted 203 deployments between September 2024 and September 2025, generating 2,077 alerts and 962 arrests. Ten confirmed false positive identifications were made public. The actual error rate across all alerts is not published.^[2]
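The published figures permit only a lower bound on that error rate. A minimal sketch of the arithmetic, using the numbers from footnote 2 (and assuming, as the Met's disclosures imply, that only confirmed cases were made public):

```python
# Published Met Police figures, September 2024 – September 2025 (footnote 2).
alerts = 2077                    # total live facial recognition alerts
arrests = 962                    # arrests following those alerts
confirmed_false_positives = 10   # false identifications made public

# This is a lower bound only: alerts that were wrong but never confirmed
# or disclosed are not included, and the true rate is unpublished.
fp_rate_lower_bound = confirmed_false_positives / alerts
print(f"Confirmed false positive rate: {fp_rate_lower_bound:.2%} of alerts")
```

Roughly half a percent of alerts, at minimum, identified the wrong person. What fraction of the remaining alerts were also wrong is exactly the figure the public does not have.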
The software chosen for the UK rollout — supplied by Corsight AI, an Israeli company subcontracted through UK firm Digital Barriers — was previously deployed in Gaza by Israeli military intelligence Unit 8200. It was used at checkpoints to scan Palestinian civilians and in drone footage analysis. Israeli intelligence officials later expressed doubts about its accuracy. Reporting by the New York Times documented hundreds of wrongful detentions, including the case of Palestinian poet Mosab Abu Toha, who was detained, beaten, and interrogated for two days before being released without explanation.^[3]
A national consultation on the legal framework for police use of live facial recognition opened alongside the white paper. The Home Office acknowledged that the current patchwork of legislation does not give police confidence to deploy at scale, nor the public confidence it will be used responsibly. New legislation has not yet been introduced.^[4]
The companies building these systems — and the civil servants procuring them — share a cultural assumption: that surveillance, properly managed, makes people safer. It is not a conspiracy. It is a worldview. One that has never been tested against the alternative view — that a population which knows it is being watched in public spaces changes its behaviour in ways that are not captured by any arrest statistic.
The tool optimises for what it can count. It cannot count what people chose not to do, say, or become.
Question: When the state can identify every face in a crowd, what happens to the crowd?
Footnotes
^[1] Home Office, "White paper sets out reforms to policing," 26 January 2026. https://www.gov.uk/government/news/white-paper-sets-out-reforms-to-policing
^[2] Metropolitan Police live facial recognition statistics (September 2024–September 2025), as reported by The Register, 28 January 2026. https://www.theregister.com/2026/01/28/tech_in_policing_white_paper/
^[3] Al Jazeera, "UK police to use AI facial recognition tech linked to Israel's war on Gaza," 28 January 2026. https://www.aljazeera.com/news/2026/1/28/uk-police-to-use-ai-facial-recognition-tech-linked-to-israels-war-on-gaza
^[4] Home Office consultation on live facial recognition legal framework, as reported by Computer Weekly, January 2026. https://www.computerweekly.com/news/366638196/Home-Office-announces-sweeping-police-technology-plans
“Morgan Hale is independent verification without the editorial filter. Every cut is evidenced. Every question is open. Because it matters.”

