AI Facial Recognition Mistakes Software Engineer for Burglar 100 Miles Away
When the Algorithm Gets It Spectacularly Wrong
If you ever needed proof that artificial intelligence is not quite ready to play detective, meet Alvi Choudhury. The 26-year-old software engineer was working from home at his parents' house in Southampton back in January when police turned up at his door, arrested him, and held him in custody for close to 10 hours. His alleged crime? A burglary at the Milton Keynes Buddhist Vihara, roughly 100 miles away, where £3,000 and jewellery had been stolen in December 2025.
The problem? Choudhury did not do it. Not even close. And the person captured on CCTV reportedly looked about 10 years younger than him, with lighter skin, a bigger nose, no facial hair, and different facial features entirely. You might say the resemblance was, at best, a stretch.
How Did This Happen?
The culprit behind the wrongful arrest was Thames Valley Police's automated facial recognition system, built by German firm Cognitec. The system runs approximately 25,000 searches every month, comparing images against a database of around 19 million police mugshots. It flagged Choudhury as a match for the burglary suspect.
Here is the particularly galling detail: Choudhury's mugshot was only in the system because of a previous wrongful encounter with the police. Back in 2021, while at university in Portsmouth, he was attacked and then somehow ended up being the one arrested. He was released with no further action, but his photograph stayed on file. So a system built on flawed data produced a flawed result. There is a certain grim poetry to it.
Thames Valley Police maintained that the arrest was lawful, arguing that the facial recognition technology "provided the intelligence but did not determine the arrest" and that officers made their own visual assessment before proceeding. Choudhury, understandably, finds that explanation less than convincing.
The Numbers That Should Worry Everyone
This is not just one unfortunate mix-up. Home Office-commissioned research published in December 2025 revealed some deeply uncomfortable statistics about the technology's accuracy across different demographics:
- 0.04% false match rate for white faces
- 4% false match rate for Asian faces
- 5.5% false match rate for Black faces
To put that in perspective, the system is 100 times more likely to wrongly identify an Asian person than a white person. Black women are almost 250 times more likely to be misidentified than white men. These are not rounding errors. These are systemic failures baked into the algorithm itself, which reportedly uses technology dating back to 2020 that the Home Office is now in the process of replacing.
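The arithmetic behind the "100 times" figure is easy to check yourself. A minimal sketch, using the false match rates from the list above (the 250× figure for Black women versus white men comes from a gendered breakdown not reproduced here):

```python
# Reported false match rates as decimal fractions (0.04%, 4%, 5.5%)
rates = {"white": 0.0004, "asian": 0.04, "black": 0.055}

# Compare each group's rate to the white baseline
baseline = rates["white"]
for group, rate in rates.items():
    print(f"{group}: {rate / baseline:.0f}x the baseline false-match rate")
```

Running this shows Asian faces at 100× and Black faces at roughly 138× the white false-match rate, consistent with the article's headline comparison.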
Fighting Back
Choudhury has not taken this quietly. He appeared on ITV's Good Morning Britain to share his story publicly, describing the facial recognition system's output as riddled with "horrific errors." He is now pursuing legal action, seeking damages from both Thames Valley Police and Hampshire Constabulary, with specialist solicitor Iain Gould representing him.
He is not alone in pushing back either. The Equality and Human Rights Commission has agreed to support related legal cases, and Essex Police separately suspended their live facial recognition deployment in March 2026 after a Cambridge University study found racial bias in their system.
The Bigger Picture
This case highlights a fundamental tension in modern policing. Facial recognition technology promises efficiency, but when it gets things wrong, it gets them wrong in ways that disproportionately affect people of colour. An innocent man lost 10 hours of his life sitting in a police cell because an algorithm decided his face was close enough. That is not a minor inconvenience. That is a serious failure of a system that society is increasingly being asked to trust.
Until the technology can demonstrate genuine accuracy across all demographics, cases like Choudhury's will keep happening. And frankly, "the computer said so" should never be good enough grounds to put someone in handcuffs.