New AI Technology Helps Police Catch Criminals



The New York Police Department has developed new pattern-detecting software in-house to help catch criminals.

The NYPD has been using new software, developed in-house, that recognizes behavioral patterns by comparing new cases against a database of thousands of reported thefts, larcenies, and robberies. The software, known as “Patternizr”, is a set of machine-learning algorithms trained on 10 years of police data. Patternizr takes attributes of a crime, such as the time, method of entry, and type of force used, and attempts to spot patterns that can help identify suspects.

Identifying crime patterns is a critical part of police investigation work. Traditional methods of identifying patterns involve a lot of legwork and manpower, taking up valuable time for analysts and detectives. Patternizr supplies detectives with a short list of potential suspects in far less time, freeing them to focus on catching criminals.

With the help of Patternizr, the NYPD was able to catch a syringe-wielding thief who attempted to rob a Home Depot. Campussafetymagazine.com reports that the algorithm picked up on data showing that, a few weeks earlier, a different Home Depot had been robbed by a man wielding a syringe, allowing officers to link the two cases. “Because Patternizr picked up those key details in the algorithm, it brought back complaints from other precincts that I wouldn’t have known,” said Rebecca Shutt, a Bronx crime analyst who worked on the Home Depot case.

Although Patternizr seems like a great tool for finding criminals, concerns about potential AI bias have been raised: people worry that a machine alone is not enough to identify the right person. In the best case, an innocent person may be inconvenienced by questioning due to AI bias; in the worst case, an innocent person may be blamed for something he or she didn’t do. For this reason, the software does not capture race, gender, or the specific location of the crime. The NYPD says it shares the public’s concern about possible bias, as reported by Searchbusinessanalytics.com, but has gone on record saying that it has run several fairness tests and found no indication of racial bias in Patternizr.

Traditional pattern-finding methods are still in use for more serious crimes such as rape and homicide.

Patternizr has been in use by the NYPD since 2016, but its existence was only recently disclosed to the public.