EPIC v. DOJ (Criminal Justice Algorithms)
- NYC Establishes Algorithm Accountability Task Force: New York City has passed the first bill to examine the discriminatory impacts of "automated decision systems." A task force will develop recommendations for how to make the city's algorithms fairer and more transparent. James Vacca, the bill's sponsor, said "If we're going to be governed by machines and algorithms and data, well, they better be transparent." EPIC supports algorithmic transparency and opposes systemic bias in "risk assessment" tools used in the criminal justice system. EPIC has filed Freedom of Information lawsuits to obtain information about "predictive policing" and "future crime prediction" algorithms. EPIC President Marc Rotenberg has called for laws that mandate algorithmic transparency and prohibit automated decision-making that results in discrimination. (Dec. 21, 2017)
- EPIC Sues Justice Department Over "Risk Assessment" Techniques: EPIC has filed a FOIA lawsuit against the Department of Justice for information about the use of "risk assessment" tools in the criminal justice system. These proprietary techniques are used to set bail, determine criminal sentences, and even contribute to determinations about guilt or innocence. Many criminal justice experts oppose their use. EPIC has pursued several FOIA cases to promote "algorithmic transparency." The EPIC cases include passenger risk assessment, "future crime" prediction, and proprietary forensic analysis. The Supreme Court is now considering whether to take a case on the use of a secretive technique to predict possible recidivism. (Mar. 7, 2017)
"Evidence-based assessment tools," or "risk assessments," are algorithms that "try to predict recidivism -- repeat offending or breaking the rules of probation or parole -- using statistical probabilities based on factors such as age, employment history and prior criminal record." Today, federal and state officials across the country use evidence-based risk assessment tools to make decisions at all stages of criminal justice process. These techniques are controversial: the reliability, fairness, and constitutional legitimacy of "evidence-based" tools are vigorously contested.
Nonetheless, risk assessments are increasingly used to make sentencing and other significant decisions in the criminal justice system. Because many of these tools are the product of private companies, risk assessment has become a competitive industry. Transparency of these techniques is of the utmost importance: it is necessary to secure fair outcomes, preserve the rights of individuals, and maintain accountability across the criminal justice system.
Commercial risk assessment tools are already in use in criminal cases across the country. The Correctional Offender Management Profiling for Alternative Sanctions (COMPAS) and the Level of Service Inventory Revised (LSI-R) purport to assess individuals' risk levels and criminogenic needs based on a wide range of personal factors. COMPAS, for example, considers factors such as social isolation, criminal associations, and criminal personality, while LSI-R uses factors including leisure, accommodations, and attitudes or orientation. The federal Post-Conviction Risk Assessment (PCRA) likewise uses information such as criminal history, education, employment, and social networks to reach a "final conclusion regarding risk level and criminogenic needs."
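The actual scoring methods of COMPAS, LSI-R, and PCRA are proprietary and not public. The general technique these paragraphs describe, combining weighted personal factors into a statistical probability of recidivism and then bucketing it into a risk level, can nevertheless be sketched as a simple logistic model. The factor names, weights, and thresholds below are illustrative assumptions only, not the formulas any real tool uses:

```python
import math

# Hypothetical factor weights; the weights used by commercial tools
# such as COMPAS and LSI-R are proprietary and unknown.
WEIGHTS = {
    "prior_arrests": 0.30,     # count of prior arrests
    "age_under_25": 0.80,      # 1 if defendant is under 25, else 0
    "unemployed": 0.40,        # 1 if unemployed, else 0
    "social_isolation": 0.25,  # 1 if socially isolated, else 0
}
INTERCEPT = -2.0  # baseline log-odds, also an illustrative assumption

def risk_score(factors):
    """Return a 0-1 recidivism probability via logistic regression,
    an illustrative stand-in for a proprietary assessment tool."""
    z = INTERCEPT + sum(WEIGHTS[name] * value for name, value in factors.items())
    return 1 / (1 + math.exp(-z))

def risk_level(probability):
    """Bucket the probability into the low/medium/high bands that
    risk-assessment reports typically present to decision-makers."""
    if probability < 0.33:
        return "low"
    return "medium" if probability < 0.66 else "high"

profile = {"prior_arrests": 2, "age_under_25": 1,
           "unemployed": 1, "social_isolation": 0}
score = risk_score(profile)  # roughly 0.45 under these assumed weights
```

The sketch also illustrates why transparency advocates seek the underlying records: without access to the weights, thresholds, and validation studies, a defendant scored "medium" by such a system has no way to contest how that label was produced.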
The DOJ has said that it aims "to build a systemwide framework (arrest through final disposition and discharge)" of evidence-based decision-making. Yet even the DOJ has expressed reservations about the use of criminal justice algorithms. The department's Criminal Division called assessments based on sociological and personal information rather than prior bad acts "dangerous" and constitutionally suspect, citing the disparate impacts of risk assessments and the erosion of consistent sentencing. Former U.S. Attorney General Eric Holder has said that "basing sentencing decisions on static factors and immutable characteristics . . . may exacerbate unwarranted and unjust disparities that are already far too common in our criminal justice system and in our society."
EPIC's Open Government project seeks to ensure that the public is fully informed about the activities of government. The public cannot assess the fairness and reliability of the criminal justice algorithms used by the DOJ without access to relevant departmental records. EPIC therefore has a significant interest in obtaining DOJ documents concerning "evidence-based" practices in sentencing, including policies, guidelines, source code, and validation studies.
EPIC has also promoted Algorithmic Transparency for many years and has litigated several related FOIA cases:
- EPIC v. CBP, in which EPIC successfully sued U.S. Customs and Border Protection for documents relating to its use of secret analytic tools to assign "risk assessments" to travelers.
- EPIC v. DHS, a suit for records concerning a DHS program that assesses "physiological and behavioral signals" to determine the probability that an individual might commit a crime.
In 2016, EPIC obtained extensive records from Missouri and Wisconsin concerning both states' use of criminal justice algorithms.
- EPIC's FOIA Request (June 15, 2016)
- DOJ FOIA Acknowledgement (Aug. 9, 2016)
- DOJ FOIA Interim Response Letter (Aug. 16, 2017)
- FOIA Production 1 (Oct. 31, 2017)
- FOIA Production 2 (Oct. 31, 2017)
- FOIA Production 3 (Oct. 31, 2017)
U.S. District Court for the District of Columbia
- EPIC Complaint (Mar. 7, 2017)
- DOJ Answer (May 19, 2017)
- DOJ Motion for Summary Judgment (Feb. 15, 2018)
- EPIC Opposition and Cross-Motion for Summary Judgment (Mar. 16, 2018)
- Shayna Posses, Justices Seek Solicitor General's Take On Reoffense-Risk Tool, Law360 (Mar. 6, 2017)
- Sarah Kramer, One State is Replacing Bail Hearings With . . . An Algorithm, Futurism (Mar. 6, 2017)
- Ephrat Livni, In the US, some criminal court judges now use algorithms to guide decisions on bail, Quartz (Feb. 28, 2017)
- Julia Angwin & Jeff Larson, Racial Bias in Criminal Risk Scores Is Mathematically Inevitable, ProPublica (Feb. 17, 2017)
- Laurel Eckhouse, Big data may be reinforcing racial bias in the criminal justice system, Wash. Post (Feb. 10, 2017)