Live facial recognition surveillance should not be used by police in Islington, councillors have said.
Islington Council passed a motion at a full council meeting last week calling for an immediate halt to the use of the technology by the Metropolitan Police.
The technology was described in the motion as “intrusive and unreliable”, and condemned for having the potential to “exacerbate racist outcomes in policing”.
The council claims that more than 3,000 people have been wrongly recognised by live facial recognition, something the Metropolitan Police has denied.
The authority added that the technology has an error rate of up to 35% when identifying black women.
Cllr Jenny Kay said that the technology had been deployed by the police in Islington at least twice – once in April last year, and again in September.
Big Brother Watch, a UK civil liberties campaign group, has described the motion as “extremely welcome”.
Mark Johnson, advocacy manager for the group, said: “This Orwellian technology is deeply intrusive and has well-documented issues with accuracy and bias.
“Both the Met Commissioner and the Home Secretary should take note that communities in areas where this technology has been used are rejecting live facial recognition.”
The group added that Islington is the third council in London to call for a stop to the use of live facial recognition technology, following similar motions in Haringey and Newham.
The Metropolitan Police has said that the technology has led to the arrest of 270 people, and claimed this meant “communities are safer”.
A spokesperson for the Met said: “Every deployment considers the local impact, and we have undertaken a significant amount of community engagement and considered, in partnership with, for example, the London Policing Ethics Panel, the wider impact and effectiveness of the technology.
“We are open and transparent in our use of live facial recognition and the Met’s website publishes the results of every deployment.”
The force added that in 2022 it commissioned the National Physical Laboratory to conduct independent scientific performance testing of the facial recognition algorithm it uses.
The force claims this testing showed where the system threshold needs to be set to ensure the system performs consistently across race and gender.