New Crime Prediction System


A new crime prediction technology has been creating controversy. Brett Goldstein, an ex-cop, founded CivicScape, a technology company that sells crime-predicting software to police departments. Nine cities are either using the software or in the process of implementing it, including four of the US’s 35 largest cities by population. Departments pay from $30,000 a year in cities with fewer than 100,000 people to $155,000 a year in cities with populations that exceed 1 million. Goldstein wanted to check in on the two clients who were furthest along: the police departments in the New Jersey towns of Camden and Linden.

The criminal justice system produces reams of data, and new computing methods promise to turn any pool of numbers into something useful. Today, almost every major police department is using or has used some form of commercial software that makes predictions about crime, whether to determine which blocks warrant heightened police presence or even which people are most likely to be involved. Technology is transforming the craft of policing.

Not everyone is rubbing their hands in anticipation. Many police officers still see so-called predictive policing software as mumbo jumbo. Critics outside of law enforcement argue that it’s actively destructive. The historical information these programs use to predict patterns of crime isn’t a neutral recounting of objective fact; it’s a reflection of socioeconomic disparities and the aggressive policing of black neighborhoods. Computer scientists have held up predictive policing as a poster child of how automated decision making can be misused.
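
To see why critics worry, consider the feedback loop they describe: a model trained on arrest records sends officers where arrests were recorded, which generates more records in those same places. The simulation below is a minimal, hypothetical sketch of that dynamic, not anything drawn from CivicScape’s code; the neighborhoods, counts, and rates are invented.

```python
import random

# Two neighborhoods with the SAME true crime rate, but "A" starts with
# more recorded incidents because it was patrolled more heavily in the past.
TRUE_CRIME_RATE = 0.5          # identical in both places, by construction
recorded = {"A": 60, "B": 30}  # historical arrest counts, skewed by old patrols

random.seed(0)
for day in range(1000):
    # "Predictive" step: patrol the neighborhood with more recorded crime.
    patrolled = max(recorded, key=recorded.get)
    # Crime occurs at the same rate in both places, but only the patrolled
    # neighborhood generates new records.
    if random.random() < TRUE_CRIME_RATE:
        recorded[patrolled] += 1

print(recorded)  # the initial skew compounds; "B" never catches up
```

Even though both neighborhoods were built to be identical, the model’s own deployments manufacture the data that appears to confirm its predictions.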

“Systems that manufacture unexplained ‘threat’ assessments have no valid place in constitutional policing,” wrote a coalition of civil rights and technology associations in a statement last summer. A numbing progression of police shootings in the past several years serves as a reminder of what’s at stake when police officers see certain communities as disproportionately threatening. Over the course of eight days in late June, juries failed to convict officers who killed black men in Minnesota, Ohio, and Wisconsin. In each case, the officer’s defense relied on his perception of danger. The worst-case scenario with predictive policing software is that it sends officers into target areas already primed for danger, leading them to turn violent in what would otherwise be routine encounters.

In March, Goldstein’s company published its code on GitHub, a website where computer programmers post and critique one another’s work. It was an unprecedented move, given the usual secrecy of the industry, and it caused an immediate stir among people who follow police technology.

The Camden County Police Department’s Real-Time Tactical Operation Intelligence Center (RT-TOIC) is a great test of how you feel about technology in law enforcement. The RT-TOIC is a windowless room from which the department runs its technological initiatives. Camden integrated CivicScape into the RT-TOIC three months ago. The company’s maps are always running, changing every hour to reflect updated data. When targets change, analysts switch their screens to the surveillance cameras pointed at those blocks. Officers translate what’s happening in the RT-TOIC to the cops on the street; the guys in patrol cars don’t know whether an order is derived from some newfangled math, the judgment of a superior officer, or a mixture. The ambiguity is deliberate, said Kerry Yerico, the department’s director of criminal intelligence and analysis.
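
The hourly refresh Yerico’s analysts watch can be pictured as a simple loop: pull the latest incident data, re-score every block, and surface the top-ranked ones. The sketch below is only a guess at that general shape, under invented function names and toy scoring, not CivicScape’s actual pipeline.

```python
import time
from collections import Counter

def fetch_recent_incidents():
    """Stand-in for a feed from the department's records system;
    returns (block_id, incident_type) tuples. Not a real API."""
    return [("block-12", "burglary"), ("block-12", "assault"), ("block-7", "theft")]

def score_blocks(incidents):
    # Toy scoring: rank blocks by raw incident count in the latest window.
    # A production model would add time decay, incident weights, and covariates.
    return Counter(block for block, _ in incidents)

while True:
    scores = score_blocks(fetch_recent_incidents())
    hot_blocks = [block for block, _ in scores.most_common(3)]
    print("flagged blocks this hour:", hot_blocks)  # analysts would act on these
    time.sleep(3600)  # refresh hourly, matching the cadence described above
```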

Unlike a conventional algorithm, in which a human has consciously told the system how much weight to give each factor, neural networks find their own paths through the data and can’t effectively explain to humans what they’ve done. This has the potential to make CivicScape even less transparent than other predictive policing products, which use different types of algorithms.
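
The difference is easiest to see side by side. In this hypothetical sketch (the features and data are invented; neither model is CivicScape’s), the linear model exposes one readable weight per factor, while the neural network’s learned parameters are matrices that correspond to no human-nameable factor.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.neural_network import MLPClassifier

# Invented features per city block: [past burglaries, past assaults,
# vacant buildings]; label: whether a violent incident followed.
rng = np.random.default_rng(0)
X = rng.poisson(2.0, size=(200, 3)).astype(float)
y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(size=200) > 3).astype(int)

# A linear model's reasoning is inspectable: one weight per factor,
# so an auditor can say exactly what pushed a block's score up.
linear = LogisticRegression().fit(X, y)
print("per-factor weights:", linear.coef_[0])

# A neural network spreads its "reasoning" across layers of weights
# that map onto no single human-readable factor.
net = MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000).fit(X, y)
print("first hidden-layer weight matrix:", net.coefs_[0].shape)  # (3, 16)
```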

Scott Thompson, Camden’s police chief, said he hasn’t heard any criticism about transparency. For its part, CivicScape said its openness comes from inviting discussion about the types of data its models use. The company decided against using arrests for marijuana possession at all, for instance, given widespread research showing racial disparities in these arrests.
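
Mechanically, that kind of exclusion is a preprocessing step applied before any training happens. This snippet is a hypothetical illustration of the idea (the record fields and charge strings are invented); the hard part is the judgment about which categories belong on the list, not the code.

```python
# Hypothetical preprocessing step: drop marijuana-possession arrests
# before the data ever reaches a model. Field names are invented.
EXCLUDED_CHARGES = {"marijuana possession"}

def filter_training_records(records):
    """Keep only records whose charge is not on the exclusion list."""
    return [r for r in records if r["charge"] not in EXCLUDED_CHARGES]

records = [
    {"charge": "burglary", "block": "block-12"},
    {"charge": "marijuana possession", "block": "block-7"},  # excluded
]
print(filter_training_records(records))  # only the burglary record survives
```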

Kristian Lum and William Isaac, researchers who have written their own statistical models for the Human Rights Data Analysis Group demonstrating how bias works in predictive policing, have examined the code. Both described CivicScape’s move as positive but withheld praise until they see how the company follows through.