Pursuant to Executive Order 14074, “Advancing Effective, Accountable Policing and Criminal Justice Practices to Enhance Public Trust and Public Safety,” the Attorney General must commission a study about biometric information and predictive algorithms in law enforcement by November 21, 2022.
The Electronic Privacy Information Center (EPIC) urges you to ensure the study (1) accurately assesses and discloses the predictive technologies in law enforcement that the DOJ has funded since 2009 and (2) assesses agency compliance with Title VI, which prohibits federal funding of programs and activities that discriminate based on “race, color, or national origin.” This study should be coordinated with and support the Department of Justice’s ongoing Title VI review. The study should also recommend specific criteria to identify contractors and vendors that would be ineligible to receive federal funds based on a demonstrated history of inaccuracy and/or bias in their products.
In April 2021, several members of Congress sent a letter to the DOJ asking for a full list of police departments spending federal grant money on predictive policing tools, citing concerns of bias and noting that “multiple audits of [predictive policing] systems have found no evidence they are effective at preventing crime.” In response, the DOJ admitted there were “no specific records” with that information.
Facial recognition and predictive algorithms used in law enforcement contexts disproportionately harm people of color. Studies analyzing commercially available facial recognition systems found that they misidentified women and people of color far more frequently than White men. Indeed, facial recognition algorithms may misidentify Black women in up to 35% of cases. A National Institute of Standards and Technology study of “a majority” of the facial recognition industry found higher rates of false positives for Asian people, Black people, and Native American groups. In many cases, facial recognition technology is also disproportionately deployed in poor and minority communities, heightening the risk of misidentification and oversurveillance of vulnerable populations. As a result, Black men have been wrongfully arrested after being misidentified by facial recognition systems.
Predictive policing algorithms also have a disparate impact on people of color because they are built on “dirty data” that recreates historically biased law enforcement practices. These algorithms are prone to overstating the likelihood of crime occurring among poor and minority populations that are already overpoliced. Some crimes, like sexual assault and drug use, may be under-reported, causing algorithms to understate the likelihood of crime occurring in less-policed neighborhoods. Predictive algorithms are also vulnerable to “juked stats,” the result of law enforcement agencies deliberately discouraging reporting in certain locations to appear more effective. Most significantly, algorithms based on historical crime data will recreate trends in overpolicing.
One of the most common predictive policing tools, PredPol, routinely identifies “hotspots” in historically overpoliced minority communities. Several studies have demonstrated that PredPol overstates the likelihood of crime occurring in minority communities and understates the likelihood of crime in White neighborhoods. Many law enforcement agencies across the country, including the Los Angeles Police Department, have stopped using PredPol in recent years. Other crime prediction software has caused similar harm. The Chicago Police Department used an algorithm to create a watch list of individuals most likely to commit violent crimes, but a Chicago Sun-Times investigation found that 85 percent of individuals on the list were Black men, some with no record of prior violence.
Given the proven disparate impact and secrecy inherent in the use of these systems, the lack of a clear accounting of federal funding streams for these technologies is unacceptable and prevents meaningful compliance with civil rights laws like Title VI. We urge you, as part of your study, to disclose any records cataloging the downstream effects of predictive policing algorithms. If no such records exist, we ask you to create and publish an accounting of the harms caused by predictive policing tools and to investigate why such records have not previously been kept.
Based on publicly available information, the Department of Justice has awarded over $57,000,000 in grants to local police departments through Smart Policing Programs. Since 2021, the DOJ has granted $3,943,002 to police departments to improve “data-driven” policing, “smart” policing, and related activities. The Pasco County Sheriff’s predictive policing tool, which was created with federal funds, was used to “monitor and harass families across the country,” even when police acknowledged that the subjects of this harassment had not violated the law but were merely predicted to do so in the future.
Considering the dangerous effects these technologies are having every day and the lack of reliable information about their design and use, EPIC urges you to implement the following additional recommendations:
Halt any current grants or funding opportunities that allow for the procurement or development of predictive technologies or biometric tools for law enforcement use until the study is complete and its recommendations implemented;
Require any technologies supported by federal funds to be affirmatively and empirically shown to be nondiscriminatory; and
Develop additional oversight of bad actor jurisdictions, including jurisdictions that have been under consent decrees for civil rights violations within the last 10 years.
Please don’t hesitate to reach out to Ben Winters, EPIC Counsel, at email@example.com if you have any questions or would like to work with EPIC to ensure meaningful compliance with our recommendations.
 Vanita Gupta, U.S. Dep’t of Just., Associate Attorney General Vanita Gupta Releases Memo on Implementation and Administrative Enforcement of Title VI and Safe Streets Act, Justice Blogs (June 23, 2022), https://www.justice.gov/opa/blog/associate-attorney-general-vanita-gupta-releases-memo-implementation-and-administrative.
 Joy Buolamwini & Timnit Gebru, Gender Shades: Intersectional Accuracy Disparities in Commercial Gender Classification, 81 Procs. of Machine Learning Res. 77–91 (2018), available at http://proceedings.mlr.press/v81/buolamwini18a.html; Inioluwa Deborah Raji & Joy Buolamwini, Actionable Auditing: Investigating the Impact of Publicly Naming Biased Performance Results of Commercial AI Products, AIES ’19 (Jan. 2019), available at https://dl.acm.org/doi/abs/10.1145/3306618.3314244; Nat. Inst. of Standards & Tech., NIST Study Evaluates Effects of Race, Age, Sex on Face Recognition Software (Dec. 19, 2019), https://www.nist.gov/news-events/news/2019/12/nist-study-evaluates-effects-race-age-sex-face-recognition-software.
See, e.g., Amnesty Intl., Inside the NYPD’s Surveillance Machine (Feb. 15, 2022), https://banthescan.amnesty.org/decode/ (showing that facial recognition-compatible public and private cameras were deployed more in New York City neighborhoods with higher non-White populations).