EPIC Letter to Attorney General Garland Re: Title VI Compliance and Predictive Algorithms
Dear Attorney General Garland:
Pursuant to Executive Order 14074, “Advancing Effective, Accountable Policing and Criminal Justice Practices to Enhance Public Trust and Public Safety,” the Attorney General must commission a study on the use of biometric information and predictive algorithms in law enforcement by November 21, 2022.[1]
The Electronic Privacy Information Center (EPIC) urges you to ensure the study (1) accurately assesses and discloses the predictive technologies in law enforcement that the DOJ has funded since 2009 and (2) assesses agency compliance with Title VI, which prohibits federal funding of programs and activities that discriminate based on “race, color, or national origin.”[2] This study should be coordinated with and support the Department of Justice’s ongoing Title VI review.[3] The study should also recommend specific criteria to identify contractors and vendors that would be ineligible to receive federal funds based on a demonstrated history of inaccuracy and/or bias in their products.
In April 2021, several members of Congress sent a letter to the DOJ asking for a full list of police departments spending federal grant money on predictive policing tools, citing concerns of bias and noting that “multiple audits of [predictive policing] systems have found no evidence they are effective at preventing crime.”[4] In response, the DOJ admitted it had “no specific records” containing that information.
Facial recognition and predictive algorithms used in law enforcement contexts disproportionately harm people of color. Studies analyzing commercially available facial recognition systems found that they misidentified women and people of color far more frequently than White men.[5] Indeed, facial recognition algorithms may misidentify Black women in up to 35% of cases.[6] A National Institute of Standards and Technology study of “a majority” of the facial recognition industry found higher rates of false positives for Asian people, Black people, and Native American groups.[7] In many cases, facial recognition technology is also disproportionately deployed in poor and minority communities, heightening the risk of misidentification and oversurveillance of vulnerable populations.[8] As a result, Black men have been wrongfully arrested after being misidentified by facial recognition systems.[9]
Predictive policing algorithms also have a disparate impact on people of color because they are built on “dirty data” that recreates historically biased law enforcement practices.[10] These algorithms are prone to overstating the likelihood of crime occurring among poor and minority populations that are already overpoliced. Some crimes, like sexual assault and drug use, may be under-reported, causing algorithms to understate the likelihood of crime occurring in less-policed neighborhoods.[11] Predictive algorithms are also vulnerable to “juked stats,” the result of law enforcement agencies deliberately discouraging reporting in certain locations to appear more effective.[12] Most significantly, algorithms based on historical crime data will recreate trends in overpolicing.
One of the most common predictive policing tools, PredPol, routinely identifies “hotspots” in historically overpoliced minority communities.[13] Several studies have demonstrated that PredPol overstates the likelihood of crime occurring in minority communities and understates the likelihood of crime in White neighborhoods.[14] Many law enforcement agencies across the country, including the Los Angeles Police Department, have stopped using PredPol in recent years.[15] Other crime prediction software has caused similar harm. The Chicago Police Department used an algorithm to create a watch list of individuals most likely to commit violent crimes, but a Chicago Sun-Times investigation found that 85 percent of individuals on the list were Black men, some with no record of prior violence.[16]
Given the proven disparate impact and the secrecy inherent in the use of these systems, the lack of a clear accounting of federal funding streams for these technologies is unacceptable and prevents meaningful compliance with civil rights laws like Title VI. We urge you, as part of your study, to disclose any records cataloging the downstream effects of predictive policing algorithms. If no such records exist, we ask you to create and publish an accounting of the harms caused by predictive policing tools and to investigate why such records were not kept.
Based on publicly available information, the Department of Justice has awarded over $57,000,000 in grants to local police departments through Smart Policing Programs. Since 2021, the DOJ has granted $3,943,002 to police departments to improve “data-driven” policing, “smart” policing, and related activities.[17] The Pasco County Sheriff’s predictive policing tool, which was created with federal funds, was used to “monitor and harass families across the country,” even when police acknowledged that the subjects of this harassment had not violated the law but were merely predicted to do so in the future.[18]
Considering the dangerous effects these technologies are having every day and the lack of reliable information about their design and use, EPIC urges you to implement the following additional recommendations:
- Halt any current grants or funding opportunities that allow for the procurement or development of predictive technologies or biometric tools for law enforcement use until the study is complete and its recommendations implemented;
- Require any technologies supported by federal funds to be affirmatively and empirically shown to be nondiscriminatory; and
- Develop additional oversight of bad actor jurisdictions, including jurisdictions that have been under consent decrees for civil rights violations within the last 10 years.
Please do not hesitate to reach out to Ben Winters, EPIC Counsel, at [email protected] if you have any questions or would like to work with EPIC to ensure meaningful compliance with our recommendations.
[1] Exec. Order No. 14,074, Advancing Effective, Accountable Policing and Criminal Justice Practices to Enhance Public Trust and Public Safety § 10(d), available at https://www.whitehouse.gov/briefing-room/presidential-actions/2022/05/25/executive-order-on-advancing-effective-accountable-policing-and-criminal-justice-practices-to-enhance-public-trust-and-public-safety/.
[2] 42 U.S.C. §§ 2000d–2000d-7.
[3] Vanita Gupta, U.S. Dep’t of Just., Associate Attorney General Vanita Gupta Releases Memo on Implementation and Administrative Enforcement of Title VI and Safe Streets Act, Justice Blogs (June 23, 2022), https://www.justice.gov/opa/blog/associate-attorney-general-vanita-gupta-releases-memo-implementation-and-administrative.
[4] Dell Cameron, Justice Department Admits: We Don’t Even Know How Many Predictive Policing Tools We’ve Funded, Gizmodo (Mar. 17, 2022), https://gizmodo.com/justice-department-kept-few-records-on-predictive-polic-1848660323; Letter from 8 Members of Congress to Att’y Gen. Merrick Garland (Apr. 15, 2021) (emphasis added), available at https://drive.google.com/file/d/1l56rBOiDA7k-vQScVfTu6eEMck1VAiLb/view.
[5] Joy Buolamwini & Timnit Gebru, Gender Shades: Intersectional Accuracy Disparities in Commercial Gender Classification, 81 Procs. of Machine Learning Res. 77–91 (2018), available at http://proceedings.mlr.press/v81/buolamwini18a.html; Inioluwa Deborah Raji & Joy Buolamwini, Actionable Auditing: Investigating the Impact of Publicly Naming Biased Performance Results of Commercial AI Products, AIES ’19 (Jan. 2019), available at https://dl.acm.org/doi/abs/10.1145/3306618.3314244; Nat’l Inst. of Standards & Tech., NIST Study Evaluates Effects of Race, Age, Sex on Face Recognition Software (Dec. 19, 2019), https://www.nist.gov/news-events/news/2019/12/nist-study-evaluates-effects-race-age-sex-face-recognition-software.
[6] Id.
[7] Id.
[8] See, e.g., Amnesty Int’l, Inside the NYPD’s Surveillance Machine (Feb. 15, 2022), https://banthescan.amnesty.org/decode/ (showing that facial recognition-compatible public and private cameras were deployed more in New York City neighborhoods with higher non-White populations).
[9] Kashmir Hill, Another Arrest, and Jail Time, Due to a Bad Facial Recognition Match, N.Y. Times (Jan. 6, 2021), https://www.nytimes.com/2020/12/29/technology/facial-recognition-misidentify-jail.html (reviewing three cases since 2019 of Black men being wrongfully arrested based on false facial recognition matches).
[10] Rashida Richardson, Jason Schultz, & Kate Crawford, Dirty Data, Bad Predictions: How Civil Rights Violations Impact Police Data, Predictive Policing Systems, and Justice, 94 N.Y.U. L. Rev. 192–233 (2019), https://www.nyulawreview.org/wp-content/uploads/2019/04/NYULawReview-94-Richardson-Schultz-Crawford.pdf.
[11] Annie Gilbertson, Data-Informed Predictive Policing Was Heralded as Less Biased. Is It?, The Markup (Aug. 20, 2020), https://themarkup.org/ask-the-markup/2020/08/20/does-predictive-police-technology-contribute-to-bias.
[12] Rashida Richardson, Jason Schultz, & Kate Crawford, supra note 10.
[13] Gilbertson, supra note 11.
[14] Id.; Kristian Lum & William Isaac, To Predict and Serve?, 13 Significance 14–19 (2016).
[15] Gilbertson, supra note 11.
[16] Mick Dumke & Frank Main, A Look Inside the Watch List Chicago Police Fought to Keep Secret, Chicago Sun-Times (May 18, 2017), https://chicago.suntimes.com/2017/5/18/18386116/a-look-inside-the-watch-list-chicago-police-fought-to-keep-secret.
[17] Funding & Awards, Bureau of Just. Assistance (2022), https://bja.ojp.gov/funding.
[18] Kathleen McGrory & Neil Bedi, Targeted, Tampa Bay Times (Sept. 3, 2020), https://projects.tampabay.com/projects/2020/investigations/police-pasco-sheriff-targeted/intelligence-led-policing/.