The Federal Bureau of Investigation (FBI) has run over 390,000 facial recognition searches since 2011. The Department of Homeland Security has also said it could use facial recognition at the border and for travelers, although it dropped plans this month to seek permission to use the technology to scan travelers coming in and out of the country.
Top facial recognition systems misidentify people of color at higher rates than white people, according to a US federal study from the National Institute of Standards and Technology (NIST). In a particular type of database search known as “one-to-one” matching, many facial recognition algorithms falsely identified African-American and Asian faces 10 to 100 times more often than white faces.
In one-to-many matching (the type of search done by law enforcement investigators that compares an image to a database of other faces), African American women were falsely identified most frequently. Joy Buolamwini, founder of the Algorithmic Justice League and also a researcher at the Massachusetts Institute of Technology (MIT), called the report “a comprehensive rebuttal” of those saying artificial intelligence (AI) bias was no longer an issue.
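The two search modes described above can be sketched in code. The following is a minimal illustration, not any vendor's actual system: it assumes faces have already been converted to numeric embedding vectors (the function names and the 0.8 threshold are hypothetical choices for this example). One-to-one matching verifies a probe against a single claimed identity; one-to-many matching searches a whole gallery, which is why a false match there can implicate an innocent person in a database.

```python
import numpy as np

def cosine_similarity(a, b):
    """Similarity between two face embedding vectors (1.0 = identical direction)."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def one_to_one_match(probe, claimed, threshold=0.8):
    """Verification: does the probe face match the one claimed identity?"""
    return cosine_similarity(probe, claimed) >= threshold

def one_to_many_search(probe, gallery, threshold=0.8):
    """Identification: return indices of all gallery faces scoring above threshold.
    Every extra candidate returned here is a potential false accusation."""
    return [i for i, g in enumerate(gallery)
            if cosine_similarity(probe, g) >= threshold]
```

A biased model shifts these similarity scores systematically for some demographic groups, so a fixed threshold that works well for one group produces far more false matches for another.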
The study comes at a time of growing discontent over the technology across the US, with critics warning it can lead to unjust harassment or arrests, according to eandt.theiet.org. High misidentification rates can lead to false accusations and even security risks, such as granting imposters access, according to the report.
The study assessed 189 software algorithms from 99 developers, representing a majority of the industry. NIST reviewed algorithms from tech giants like Intel and Microsoft, although notably the study does not cover the algorithm Amazon uses in its Rekognition system, which has been marketed to police departments. In a statement, NIST computer scientist Patrick Grother said the findings were an “encouraging sign that more diverse training data may produce more equitable outcomes, should it be possible for developers to use such data.” Overall, he said, the report should help policymakers and developers think “about the limitations and appropriate use of these algorithms.”