Face Recognition for Police Use Criticized

Face recognition is one of the most contentious areas of technology research, raising issues of both privacy and race. Now critics claim the software is not ready for use by law enforcement. Recently, Amazon employees rallied against police use of Rekognition, the company's face recognition technology. Face scans at the Orlando Airport, once optional for U.S. citizens, are now mandatory for all international travelers. And CBP has moved to institute face recognition at the Mexican border.

In an op-ed published on TechCrunch.com, Brian Brackeen, CEO of the face recognition and AI startup Kairos, warns that the technology will be used to harm citizens if given to governments or police.

Brackeen declined a recent request by bodycam maker Axon for a partnership with Kairos to explore face recognition.

“Using commercial facial recognition in law enforcement is irresponsible and dangerous,” Brackeen writes. “As the Black chief executive of a software company developing facial recognition services, I have a personal connection to the technology both culturally, and socially.”

A study by MIT computer scientist Joy Buolamwini found that face recognition is routinely less accurate on darker-skinned faces than on lighter-skinned ones. The danger, Brackeen reasons, is that as law enforcement relies more heavily on face recognition, this racial disparity in accuracy will produce disproportionate consequences for people of color.

“The more images of people of color it sees, the more likely it is to properly identify them,” he writes. “The problem is, existing software has not been exposed to enough images of people of color to be confidently relied upon to identify them. And misidentification could lead to wrongful conviction, or far worse.”

According to gizmodo.com, law enforcement agencies in the U.S. have increasingly relied on face recognition, celebrating the technology as a public safety service. In settings where identifying yourself is tied to physical safety, any inaccuracy or anomaly could trigger secondary searches and additional interactions with law enforcement. If non-white faces are already more heavily scrutinized in high-security spaces, face recognition could only add to that scrutiny.