UK law enforcement has proudly been using facial recognition tech for a few years now. As is the case with any new law enforcement tech advancement, it's being deployed as broadly as possible with as little oversight as agencies can get away with.
As of 2015, UK law enforcement had 18 million faces stashed away in its databases. Presumably, the database did not contain 18 million criminals and their mugshots. Concerns were raised but waved away with promises to put policies in place at some point in the future and with grandiose claims of 100% reliability. The latter, naturally, came from the police inspector who headed the facial recognition department. Caveat: the system had only been tested on a limited set of "clear images."
What works well in theory and/or with limited datasets doesn't work especially well in practice. Here's how things went down when the facial recognition program was deployed in the wild.
The controversial trial of facial recognition equipment at Notting Hill Carnival resulted in roughly 35 false matches and an erroneous arrest, highlighting questions about police use of the technology.
The system only produced a single accurate match during the course of Carnival, but the individual had already been processed by the justice system and was erroneously included on the suspect database.
Yeah, that's going to keep UK citizens from being menaced by terrorists, drug dealers, and whatever else was cited to keep the facial recognition program from being derailed by concerned legislators and citizens. And, while the tech was busy failing to do its job, a few thousand photos of people engaged in nothing more than being criminally underdressed were added to the pot of randomly-drawn faces for the next round of facial recognition roulette.
Supposedly, this was a trial run. The false positives were apparently drawn from a watchlist of suspects wanted on rioting-related charges. Fortunately, those who were approached by officers as the result of bogus tech tip-offs had their identification documents on them. Nothing in the law requires you to carry them wherever you go, but if the law's going to use tech as faulty as this, it may as well be a criminal offense to leave home without them. You're going to get rung up -- at least temporarily -- if you can't prove you aren't who the software says you are.
Undeterred by this resounding lack of success, the Metropolitan police are planning to test the software again. This will give another set of UK citizens the chance to be wrongfully arrested at some point in the near future. Until the bugs are worked out -- which means violating the rights and freedoms of UK citizens during the beta testing phase -- UK law enforcement facial recognition tech will still be remembered as the thing that caught that shoplifter that one time.