The Metropolitan Police today revealed it will start using ‘Big Brother’ facial recognition cameras on the streets of London within the next month.
Scotland Yard says trials have allowed them to secure a 70 per cent success rate at picking up suspects who walk past cameras – but privacy campaigners say it is a ‘breath-taking assault on rights’.
Detectives will draw up 2,500-strong watchlists of those suspected of the most serious crimes including murders, gun and knife crime, drug dealing and child abuse.
Then cameras will be set up in busy areas such as in the West End, at major shopping centres, near sports and music events or high crime areas, for stints of five to six hours with officers in the area poised to grab people.
Suspects will be asked to identify themselves – and arrested if they are confirmed as wanted men or women. The cameras will also be used to trace missing people and vulnerable children.
Campaign group Big Brother Watch claims the Met’s accuracy figures are bogus, citing a recent independent report which found the technology was right in only one in five cases – and pointing to a 2018 trial at Westfield in Stratford where no one was arrested.
Critics have also said that signs put up to inform the public that cameras are in operation could tip off criminals and allow them to flee.
The Met’s new facial recognition system, known as NeoFace, pictured at a press conference today. It compares the faces in the police database on the right of the screen with those passing the cameras – an officer gets an alert if there is a match. If someone is not wanted their face is automatically blurred
Campaigners say the use of the technology (file image) is a step too far towards a police state – but the Met is bringing it in permanently saying it is a crucial tool in stopping crime
A man was fined £90 in Romford by officers for disorderly behaviour after he tried to cover his face last year during a trial – but the Met says the trials are over and future policing will involve using these cameras
The Met claims that only one in 1,000 innocent people generates a ‘false alert’, but privacy campaigners remain unhappy about the controversial technology and argue that people have a right not to be filmed.
Met uses Japanese facial recognition technology – and insists only 1 in 1,000 innocent people are ‘pinged’
The Metropolitan Police uses facial recognition technology called NeoFace, developed by Japanese IT firm NEC, which matches faces up to a so-called watch list of offenders wanted by the police and courts for existing offences.
Cameras scan faces in their view, measuring the structure of each face and creating a digital version that is searched against the watch list. If a match is detected, an officer on the scene is alerted and can see both the camera image and the watch list image before deciding whether to stop the individual.
Silkie Carlo, director of Big Brother Watch, said: ‘This decision represents an enormous expansion of the surveillance state and a serious threat to civil liberties in the UK.
‘It flies in the face of the independent review showing the Met’s use of facial recognition was likely unlawful, risked harming public rights and was 81% inaccurate.
‘This is a breath-taking assault on our rights and we will challenge it, including by urgently considering next steps in our ongoing legal claim against the Met and the Home Secretary. This move instantly stains the new Government’s human rights record and we urge an immediate reconsideration’.
Big Brother Watch and Green Baroness Jenny Jones, who sits in the House of Lords, are pursuing a crowdfunded legal challenge against the Metropolitan Police and Home Secretary over facial recognition surveillance.
Protesters have also donned masks and picketed events where the cameras were tested, including during the crunch South Wales football derby between Cardiff and Swansea this month.
There was a high-profile storm last year when officers fined a pedestrian in Romford £90 for disorderly behaviour after he tried to cover his face when he saw a controversial facial recognition camera on the street.
He was not wanted for any crime but did not want to be pictured, which he said was his right.
But Scotland Yard, which is trying to tackle record knife crime, gang violence and a high murder rate over the past year, insists the technology is a ‘vital tool’ to get more criminals off the streets.
Met Assistant Commissioner Nick Ephgrave, said today: ‘Facial recognition technology will be particularly useful in helping us tackle knife and gun crime, child sexual exploitation, as well as other serious crimes, and to protect the most vulnerable people.
A No Facial Recognition banner at the stadium during the Sky Bet Championship match between Cardiff City and Swansea City
Police insist people can decline to be scanned without arousing suspicion and that the move is necessary to crack down on spiralling violent crime
Q&A: How police will be using facial recognition technology to catch suspects in London
Why are the police using facial recognition technology?
The Metropolitan Police hopes live facial recognition technology will help reduce crime, especially violent incidents, and could be used as a tactic to deter people from offending. They claim trialling the system in real life conditions will enable them to gather accurate data and learn as much as possible.
Are faces stored in a database?
The Metropolitan Police said it will only keep faces matching the watch list for up to 30 days – all other data is deleted immediately.
Can you refuse to be scanned?
People can refuse to be scanned without being viewed as suspicious, although the Metropolitan Police said ‘there must be additional information available to support such a view’.
How accurate is the technology?
Trials in London and Wales have had mixed results so far. Last May, the Metropolitan Police released figures showing it had identified 102 false positives – cases where someone was incorrectly matched to a photo – with only two correct matches. South Wales Police said its trial results improved after changes to the algorithm used to identify people.
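The Met’s own trial figures above can be turned into a simple accuracy measure. This is an illustrative calculation using only the numbers reported in this article (102 false positives, two correct matches); it is not an official police statistic.

```python
# Figures from the Met's May trial data reported above:
# 102 false positives and 2 correct matches among the alerts raised.
true_matches = 2
false_positives = 102
total_alerts = true_matches + false_positives

# Precision: the fraction of alerts that pointed at the right person.
precision = true_matches / total_alerts
print(f"{precision:.1%} of alerts were correct")  # roughly 1.9%
```

On those figures, fewer than two in every hundred alerts identified the right person – the kind of arithmetic behind campaigners’ inaccuracy claims.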
‘The public rightly expect us to test and to use emerging technology to tackle crime and stop violent criminals. Bearing down on serious violence is our number one priority and we believe we should explore the value of new technology to help us do that.
‘Locating people who are wanted by the police is not new. Every day police officers are briefed with images of suspects to look out for, resulting in positive identifications and arrests every day.
‘Live facial recognition is about modernising this practice through technology to improve effectiveness and bring more offenders to justice.’
Civil rights groups have raised concerns over the technology, and in July last year, the data watchdog warned police forces testing the scanners that privacy and data protection issues must be addressed.
At the time, Information Commissioner Elizabeth Denham said: ‘We understand the purpose is to catch criminals but these trials also represent the widespread processing of biometric data of thousands of people as they go about their daily lives.
‘And that is a potential threat to privacy that should concern us all.’
She has also called for a legal code of practice to be established before the technology is deployed.
But Mr Ephgrave said the Met is ‘in the business of policing by consent’ and thinks the force is effectively balancing the right to privacy with crime prevention.
He said: ‘Everything we do in policing is a balance between common law powers to investigate and prevent crime, and Article 8 rights to privacy.
‘It’s not just in respect of live facial recognition, it’s in respect of covert operations, stop and search – there’s any number of examples where we have to balance individuals’ right to privacy against our duty to prevent and deter crime.’
HOW DOES FACIAL RECOGNITION TECHNOLOGY WORK?
Facial recognition software works by matching real-time images to a previous photograph of a person.
Each face has approximately 80 unique nodal points across the eyes, nose, cheeks and mouth which distinguish one person from another.
A digital video camera measures the distance between various points on the human face, such as the width of the nose, depth of the eye sockets, distance between the eyes and shape of the jawline.
A different smart surveillance system (pictured) that can scan two billion faces within seconds has been revealed in China. The system connects to millions of CCTV cameras and uses artificial intelligence to pick out targets. The country's military is working on applying a similar AI-driven version to track people nationwide
This produces a unique numerical code that can then be linked with a matching code gleaned from a previous photograph.
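The matching process described above can be sketched in very simplified form. Everything in this sketch – the measurements, the threshold, the function names – is invented for illustration and bears no relation to NEC’s actual, proprietary NeoFace algorithm.

```python
import math

def face_code(measurements):
    """Turn facial measurements (e.g. nose width, eye spacing, in mm)
    into a simple numerical code - here, just a tuple of numbers."""
    return tuple(measurements)

def distance(code_a, code_b):
    """Euclidean distance between two face codes: smaller means more alike."""
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(code_a, code_b)))

def check_watchlist(live_code, watchlist, threshold=5.0):
    """Alert if the live face is close enough to any watch-list entry;
    otherwise report no match (the face would be blurred)."""
    for name, stored_code in watchlist.items():
        if distance(live_code, stored_code) < threshold:
            return name  # possible match - an officer decides whether to stop
    return None

watchlist = {"suspect A": face_code((38.0, 62.5, 121.0))}
passerby = face_code((52.0, 70.0, 110.0))
print(check_watchlist(passerby, watchlist))  # no match -> None
```

Real systems compare far richer codes than three numbers, but the principle is the same: a camera image is reduced to a numerical code and compared against stored codes, with a threshold deciding when an alert is raised.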
A facial recognition system used by officials in China connects to millions of CCTV cameras and uses artificial intelligence to pick out targets.
Experts believe that facial recognition technology will soon overtake fingerprint technology as the most effective way to identify people.