Sir David Davis writes in The Guardian about the threat of live facial recognition to individual privacy
The pernicious growth of the surveillance state continues with the imminent rollout of live facial recognition cameras across Britain. That is why this week I signed a statement, alongside a cross-party group of MPs, experts and civil liberties campaigners, calling for a halt to facial recognition trials so that parliament has the chance to debate the technology properly. There is a point at which crime-fighting measures cease to challenge the guilty and become a threat to the innocent. That point has come with unfettered facial recognition.
Over the past few years six forces, including the Metropolitan police, have trialled facial recognition, with spectacularly poor results. After surveilling hundreds of thousands of people, the Met has made a mere handful of arrests using this technology in the past four years. Indeed, 81% of the “suspects” flagged by the cameras were innocent bystanders who were not on the police’s watchlists. Two of the trials even had an error rate of 100%.
Similarly, trials by South Wales police had an error rate of 90%. Clearly the technology does not yet work. As it stands, it is more likely to deliver a miscarriage of justice than solve a crime.
Imagine taking part in a demonstration on climate change or Brexit, or enjoying the Notting Hill carnival. You could be surrounded by cameras sucking in your personal data, then pulled out of the crowd and accused of a crime you did not commit. You would rightly feel failed by an intrusive justice system.
The consequences can be even more malign. Experts, including the London Policing Ethics Panel, argue that facial recognition could have a racial and gender bias. That is certainly what the American experience with this technology implies. The technology relies on sifting through the biometric data of thousands of people on criminal databases. But those datasets do not contain enough information on racial minorities or women, so the software misidentifies people from these groups more often.
Many of these groups already have a deep mistrust of the police. Being wrongly targeted by a racially biased algorithm will not help this.
And it is not just the state that is involved. An investigation by Big Brother Watch found that privately owned sites – including shopping centres, property developers, museums and casinos – have been using facial recognition, too. A trial in Manchester’s Trafford Centre scanned more than 15 million faces before ultimately being stopped in its tracks by the surveillance camera commissioner.
Just last month the Financial Times revealed that the King’s Cross estate in London was trialling facial recognition. It later emerged the Met provided much of the biometric data to train the algorithms. With the UK already the most surveilled nation in the world, facial recognition has the potential to become an epidemic of intrusiveness.
Sadly, the high court in Wales did not grasp the conflict with civil liberties, recently ruling that a facial recognition trial by South Wales police was lawful. The court drew repeated comparisons between facial recognition and the police’s use of fingerprints and DNA. But they are entirely different. Officers take fingerprints or DNA samples when they have a reasonable suspicion of a crime. They do not put a DNA or fingerprint scanner on the roadside and collect our data en masse.
When the police do scan your fingerprints or take a DNA sample, they are not simultaneously scooping up the fingerprint data of millions of other innocent people.
The court cited vague common law principles and legislation to back up its ruling. But when these laws were written, they did not have facial recognition or biometric data in mind – the technology did not exist.
The court’s approach leaves the door open to even more disturbing interferences. Technology is being developed to analyse the way we walk, our heartbeat, our microbial trace, even the scent we leave behind.
Police forces jump at new technologies, and private organisations are often just as keen. Once they get their hands on these new developments, day-to-day individual privacy will be well and truly dead. That is why parliament must have the chance to debate a new legal framework before the technology goes any further.
The respected biometrics commissioner has said we need an “informed public debate to help guide our lawmakers”. Police forces must be clear on when they can use the technology; what data they can store; how long they can store it for; and how it is used in investigations and trials.
MPs must have the opportunity to consider the accuracy of facial recognition and its potential biases. We need a new legal framework to control these emerging technologies: one that helps police forces tackle crime, but also protects individual privacy and Britain’s traditions of civil liberties and the rule of law.