Sounding the Alarm on Predictive Policing

[Commentary] “Predictive policing” sounds good on paper. After all, what could go wrong with a data-based approach to law enforcement? It turns out: plenty. That’s why Free Press joined a broad coalition of civil rights, privacy and technology groups in sounding the alarm about how predictive policing reinforces racial bias. The Leadership Conference on Civil and Human Rights mobilized the coalition, which counts the ACLU, the Brennan Center for Justice, Color Of Change and the NAACP among the 17 signers. The statement released last Wednesday notes that “the data driving predictive enforcement activities — such as the location and timing of previously reported crimes, or patterns of community- and officer-initiated 911 calls — is profoundly limited and biased.” Indeed, a damning report from the tech consulting group Upturn, which surveyed the nation’s 50 largest police forces, confirms this view. Upturn found “little evidence” that predictive policing works — and “significant reason to fear that [it] may reinforce disproportionate and discriminatory policing practices.”

While the idea of using data to direct police resources sounds like an effort to remove human bias from the equation, that isn't how it works in practice. Predictive policing embeds existing police bias in an algorithm that then carries the appearance of neutrality. Police have long responded to low-income communities, and communities of color in particular, far more heavily than to wealthy white communities, so the reported crimes and 911-call records that feed these systems over-represent those neighborhoods. The software then directs more officers to the same places, where they generate more records, which in turn reinforce the next round of predictions.
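To see why that loop is self-reinforcing, here is a minimal toy simulation (purely illustrative, not drawn from the coalition statement or the Upturn report; the neighborhoods, rates, and allocation rule are all assumptions): two neighborhoods offend at exactly the same rate, but patrols follow past records and new records follow patrols.

```python
import random

random.seed(1)

TRUE_RATE = 0.05   # identical underlying offense rate in both neighborhoods
PATROLS = 100      # patrol units the department allocates each day
DAYS = 1000

# Biased starting data: neighborhood "A" was historically over-policed,
# so it enters the system with more recorded incidents even though the
# two neighborhoods offend at exactly the same rate.
recorded = {"A": 600, "B": 400}

for _ in range(DAYS):
    total = sum(recorded.values())
    # "Predictive" step: allocate today's patrols in proportion to
    # where past records are densest.
    allocation = {hood: round(PATROLS * n / total) for hood, n in recorded.items()}
    for hood, patrols_here in allocation.items():
        # Recording step: an incident enters the database only when a
        # patrol is present to observe it, so new records scale with
        # patrol presence, not with any real difference in crime.
        new_records = sum(random.random() < TRUE_RATE for _ in range(patrols_here))
        recorded[hood] += new_records

share_a = recorded["A"] / sum(recorded.values())
print(f"A's share of recorded crime after {DAYS} days: {share_a:.2f}")
```

Even with identical offending, the historically inflated numbers for "A" keep attracting patrols, and the extra patrols keep inflating the numbers: the recorded disparity never washes out. The output looks like neutral evidence but largely echoes past deployment decisions.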

