Police departments around the world have been experimenting with facial recognition technologies at events with massive crowds, in an attempt to prevent violence and enhance law enforcement. London's Metropolitan Police will use facial recognition software to scan the faces of tens of thousands of revellers at this year's Notting Hill Carnival.
The planned deployment of the technology was described as a pilot project intended to look for suspected troublemakers to keep those attending safe. The police stated that “the technology involves the use of overt cameras which scan the faces of those passing by and flag up potential matches against a database of custody images. The database will be populated with images of individuals who are forbidden from attending carnival, as well as individuals wanted by police.”
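The matching step the police describe — comparing each scanned face against a database of custody images and flagging potential hits — can be sketched in simplified form. This is purely illustrative: real systems rely on trained face-embedding models and calibrated thresholds, and every name and vector below is a made-up stand-in, not part of the Met's actual system.

```python
import numpy as np

def flag_matches(probe, watchlist, threshold=0.9):
    """Return names of watchlist entries whose embedding is
    cosine-similar to the probe face above the threshold.
    (Illustrative sketch; embeddings here are toy vectors.)"""
    flagged = []
    p = probe / np.linalg.norm(probe)  # normalise the probe embedding
    for name, emb in watchlist.items():
        e = emb / np.linalg.norm(emb)
        if float(p @ e) >= threshold:  # cosine similarity check
            flagged.append(name)
    return flagged

# Hypothetical custody-image embeddings (random stand-ins).
watchlist = {
    "banned_person": np.array([1.0, 0.0, 0.0]),
    "wanted_person": np.array([0.0, 1.0, 0.0]),
}
probe = np.array([0.95, 0.05, 0.0])
print(flag_matches(probe, watchlist))  # → ['banned_person']
```

The threshold is the crux in practice: set it too low and innocent passers-by are flagged (the misidentification risk critics raise below), set it too high and genuine matches are missed.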
The Notting Hill Carnival is the biggest annual public order test for the Met, attracting crowds of up to 1 million people. At last year's carnival, 45 officers were assaulted and eight were spat at, requiring them to take medication as a precaution against infection. There were also 454 arrests, the highest number in a decade, according to theguardian.com.
The Met trialled the system last year, but it failed to pick out any suspects. Facial recognition technology is improving rapidly and the force believes it has the potential to provide a powerful new tool to law enforcement. Only images that come up as a match with a wanted offender will be retained by police, the Met said.
However, critics say real-time biometric tracking has no basis in law, and civil liberties groups argue that deploying it at Britain's main annual African-Caribbean celebration singles the event out and would be institutionally racist and discriminatory.
Martha Spurrier, the director of Liberty, said: “This intrusive biometric surveillance has no place at the Notting Hill Carnival. There is no basis in law for facial recognition, no transparency around its use and we’ve had no public or parliamentary debate about whether this technology could ever be lawful in a democracy. There is also serious doubt about its accuracy – with research showing some cameras are significantly more likely to misidentify black people and women.”