Judges on a federal appeals court took aim yesterday at predictive policing, the practice of using algorithmic analysis to forecast crime and direct law enforcement resources. The Fourth Circuit ruled that Richmond police violated the Fourth Amendment when they stopped and searched the defendant, Billy Curry, simply because he was walking near the scene of a shooting. In dissent, Judge J. Harvie Wilkinson called the court’s decision a “gut-punch to predictive policing.” Several other judges responded by highlighting the dangers and failings of the practice. Chief Judge Roger Gregory questioned whether predictive policing is “a high-tech version of racial profiling.” Judge James A. Wynn highlighted the “devastating effects of over-policing on minority communities” and explained that predictive policing “results in the citizens of those communities being accorded fewer constitutional protections than citizens of other communities.” Judge Stephanie D. Thacker warned that “any computer program or algorithm is only as good as the data that goes into it” and that predictive policing “has been shown to be, at best, of questionable effectiveness, and at worst, deeply flawed and infused with racial bias.” EPIC has long highlighted the risks of algorithms in the criminal justice system and recently obtained a 2014 Justice Department report detailing the dangers of predictive policing.