When thousands of football fans enter Cardiff's Principality Stadium on June 3 to watch the UEFA Champions League final, few will be aware that their faces have already been scanned, processed, and compared against a police database of some 500,000 "persons of interest".
Despite significant criticism of the technology from fans, the police will pilot a new facial recognition surveillance system at the event.
According to a report by Motherboard, the system will be deployed on the day of the game in Cardiff's main train station and around the Principality Stadium, situated in the heart of Cardiff's central retail district. The security operation builds on the Metropolitan Police's previous use of Automated Facial Recognition (AFR) technology during the 2016 Notting Hill Carnival in London.
The UK Government's surveillance camera commissioner, Tony Porter, explained that incidents like the recent attack on the Borussia Dortmund team bus before a Champions League quarter-final match are examples of why law enforcement organizations are looking at AFR. However, he stressed that police must use the technology in a measured way that complies with the surveillance camera code of practice.
“I have seen the use of AFR increase over the past few years and a recent report by the National Institute of Standards and Technology (NIST) indicated that facial recognition is a difficult challenge. Getting the best, most accurate results for each intended application requires good algorithms, a dedicated design effort, a multidisciplinary team of experts, limited-size image databases, and field tests to properly calibrate and optimize the technology.”
Questions about the effectiveness of AFR persist. A recent report from NIST's Face In Video Evaluation program details the limitations of AFR in identifying "non-cooperative" subjects, meaning subjects who are not facing the camera or whose faces are obscured. The report finds that accurate facial recognition can only be achieved in controlled environments with high-quality cameras, as a subject's face can easily be obscured for any number of reasons.
The accuracy of facial recognition software has also recently been publicly criticized in the US during a House Committee on Oversight and Government Reform hearing. The committee revealed findings by the Government Accountability Office that algorithms used by the Federal Bureau of Investigation were inaccurate 14 percent of the time and were more likely to misidentify black people.
The report was also damning of the FBI's unregulated and disproportionate use of facial recognition technology, which mirrors the recent controversy caused by findings that UK police forces had unlawfully retained photos of millions of innocent people.