NYPD Announces Use of New Software to Track Crime Patterns
The largest police force in the country is using new pattern-recognition software to track criminal behavior. The New York City Police Department (NYPD) revealed this month that it has been using a system called Patternizr since 2016 to detect patterns in criminal behavior and use those patterns to identify potential suspects.
The Patternizr algorithm works by scanning the entire NYPD database, identifying certain aspects of a crime (such as method of entry, weapons used, and location), and assigning each complaint a similarity score against other crimes. A human data analyst then groups related complaints together and presents those findings to detectives to assist in their investigations.
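The details of Patternizr's actual model have not been made public, but the workflow described above can be illustrated with a minimal sketch: score a new complaint against past complaints on a few shared attributes, then rank the results for an analyst to review. All field names, weights, and the distance cutoff below are hypothetical choices for illustration, not the NYPD's.

```python
from dataclasses import dataclass
from math import hypot

@dataclass
class Complaint:
    """A simplified crime complaint record (hypothetical schema)."""
    method_of_entry: str
    weapon: str
    location: tuple  # (x, y) map coordinates, arbitrary units

def similarity(a: Complaint, b: Complaint) -> float:
    """Toy similarity score in [0, 1]: categorical matches plus proximity."""
    score = 0.0
    if a.method_of_entry == b.method_of_entry:
        score += 0.4
    if a.weapon == b.weapon:
        score += 0.4
    # Nearby incidents contribute up to 0.2; beyond 10 units, nothing.
    dist = hypot(a.location[0] - b.location[0], a.location[1] - b.location[1])
    score += 0.2 * max(0.0, 1.0 - dist / 10.0)
    return score

def rank_against(new_case: Complaint, database: list) -> list:
    """Rank past complaints by similarity to a new case, highest first."""
    return sorted(database, key=lambda c: similarity(new_case, c), reverse=True)
```

In this sketch, the top-ranked complaints would be handed to a human analyst rather than acted on automatically, mirroring the analyst-in-the-loop process the article describes.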
According to Brian Charles of Governing, “On average, more than 600 complaints per week are run through Patternizr. The program is not designed to track certain crimes, including rapes and homicides. In the short term, the department is using the technology to track petty larcenies.”
For human officers, scanning through hundreds of cases in search of similarities is time-consuming and laborious, and officers may only have access to information from their own precinct. Patternizr, by contrast, can collect data from all 77 NYPD precincts and sift through it in record time.
Devore Kaye, an NYPD spokesperson, told TechTarget that the department has hired approximately 100 new analysts to perform crime analysis who are trained to use Patternizr as part of their daily routine.
“Analytics will continue to play an increasingly important role in law enforcement to help ensure public safety,” Kaye said. “However, it's only one important component of good policymaking. Any effective and fair law enforcement policy should be transparent to the public, incorporate feedback from impacted stakeholders and be frequently evaluated for impact.”
Groups such as the American Civil Liberties Union have already expressed concerns that the technology may reinforce human biases.
“The institution of policing in America is systemically biased against communities of color,” New York Civil Liberties Union legal director Christopher Dunn told Fast Company. “Any predictive policing platform runs the risks of perpetuating disparities because of the over-policing of communities of color that will inform their inputs. To ensure fairness, the NYPD should be transparent about the technologies it deploys and allow independent researchers to audit these systems before they are tested on New Yorkers.”
However, the NYPD maintains that the program was designed to exclude race and gender from the algorithm's inputs. Based on internal testing, the NYPD believes the software is no more likely to generate links to crimes committed by persons of a specific race than a random sampling of police reports.
Debra Piehl, the NYPD's senior crime analyst, told TechTarget that the software requires human oversight to be most effective. “It still allows the analysts that work for me to apply their own thinking and analysis,” Piehl said. “The science doesn't overwhelm the art.”