Overview of EPIC’s Comments to DOJ and DHS on the use of facial recognition, other technologies using biometric information, and predictive algorithms.

March 8, 2024 | Maria Villegas Bravo, EPIC Law Fellow

EPIC submitted comments in response to DOJ and DHS' Request for Written Submissions on Section 13(e) of Executive Order 14074, urging DOJ and DHS to center vulnerable communities as the agencies craft new guidance for law enforcement on certain advanced technologies. The proposed guidance will cover the use of facial recognition, predictive policing technologies, social media surveillance tools, and DNA analysis. DOJ and DHS have a long and well-documented pattern of misuse and abuse[1] of surveillance tools, and artificial intelligence is now accelerating the extensive harms to civil rights engendered by suspect policing practices. EPIC strongly urges law enforcement agencies to immediately end the use of certain technologies entirely, most notably facial recognition technology, but in the interim we reiterate our framework of ten principles to preserve privacy, civil rights, and civil liberties. These technologies are unreliable, enshrine discriminatory policing practices, and stand to undermine the most fundamental rights guaranteed by the Bill of Rights. While EPIC focused on four specific types of technologies, the ten principles apply across the board. This blog post is a snapshot of EPIC's extensive comments on the matter.

Facial Recognition Technology

Facial recognition technology is a dangerous tool of oppression. While current iterations of the tools often employ inaccurate and biased algorithms, facial recognition technology will always be a tangible threat to privacy and civil rights due to its unique ability to track individuals in public and enable comprehensive surveillance. Despite widespread adoption by the federal government, there is little regulation or oversight of its use.

First, facial recognition is often plagued by alarming racial and gender disparities, even in products designed by market leaders. When high-quality photos are used for both the probe image and the images in the dataset, women and Black men are still consistently misidentified at higher rates than white men. Law enforcement agents, however, often run the technology on low-quality and inappropriate probe images, which exacerbates these inaccuracies. Instead of high-quality, well-lit mugshots, police officers have used celebrity look-alike photos, forensic sketches of a suspect, and photoshopped images to find a match in the system. Even if the probe image and the dataset contained high-quality, representative images, human review of the algorithm's outputs may still be flawed, leading to further misidentifications.
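To make the misidentification mechanism concrete, here is a minimal, purely illustrative sketch (our own, not drawn from EPIC's comments) of how a gallery search typically works: the system ranks every enrolled face by similarity to the probe and returns the closest candidates, no matter how poor the probe is.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical gallery: one 128-dimensional face embedding per enrolled person.
gallery = rng.normal(size=(10_000, 128))
gallery /= np.linalg.norm(gallery, axis=1, keepdims=True)

def search(probe: np.ndarray, top_k: int = 5):
    """Rank every gallery entry by cosine similarity to the probe."""
    probe = probe / np.linalg.norm(probe)
    scores = gallery @ probe                  # similarity against all 10,000 entries
    top = np.argsort(scores)[::-1][:top_k]    # highest-scoring candidates first
    return [(int(i), float(scores[i])) for i in top]

# A probe of pure noise, standing in for a forensic sketch or a doctored
# photo, still yields a confident-looking ranked list of "candidates."
junk_probe = rng.normal(size=128)
print(search(junk_probe))
```

The search never answers "no one"; it always nominates someone, which is why probe quality and careful human review matter so much.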

Beyond the quality of the algorithms and the processing of images by law enforcement, the use of facial recognition itself accelerates longstanding racial inequalities in the criminal justice system. This biased deployment is not theoretical, nor is it a thing of the past. Project Green Light, a Detroit surveillance program that pulls facial images from live camera feeds to run searches against facial recognition algorithms, was deployed mostly in majority-Black neighborhoods even though its database included nearly every Michigan resident. Similarly, New York City has deployed more surveillance cameras compatible with facial recognition in non-white communities. A 2022 study found that the use of facial recognition technology actively "contributes to greater racial disparity in arrests."

Law enforcement already have indiscriminate access to facial image databases and surveillance cameras, enabling real-time tracking and identification of individuals in crowds.[2] This comprehensive real-time surveillance chills freedom of speech and protest, and the effects are already widespread: several guides on how to protest safely advise facial coverings and other measures to avoid detection in crowds.[3]

EPIC strongly urges law enforcement agencies to immediately stop the use of facial recognition technologies. The dangers are far too great, and federal regulation is unlikely to meaningfully reduce all the risks that stem from the technology.

Person-Based Predictive Policing Tools

Person-based predictive policing tools are technologies that "[try] to measure the risk that a given individual will commit crimes." A wide range of tools fall under this umbrella, from classic tools that predict risk of offense based on prior contact with the criminal justice system, to tools that try to infer criminal intent, to comprehensive surveillance tools that try to predict human behavior. These tools enshrine and accelerate biased policing norms and are increasingly being used on younger populations.

Law enforcement have been experimenting with untested technologies that are rife with bias. Criminal intent, for example, is a vague, internal process that is difficult to identify and measure against objective, external standards. Many technologies premised on detecting criminal intent, such as auditory aggression detectors and face-based emotion recognition systems, are based on pseudoscience. The Government Accountability Office (GAO) has repeatedly requested empirical evidence supporting TSA's behavioral detection and analysis techniques and has found TSA's responses lacking. In fact, the auditory aggression sensors being deployed in schools to prevent school shootings were tested under wholly unrelated conditions (a noisy, pub-filled street in Europe), raising major questions about their efficacy. Law enforcement have repeatedly been forced to abandon, reform, or rebrand costly projects due to their lack of efficacy and public outcry over the entrenchment of harmful policing practices.

These tools are increasingly being deployed against minors and used to funnel individuals into the criminal justice system by purporting to detect crime or its precursors. Schools mistakenly increase police presence to deter crime, but this only leads to feelings of unease and anxiety as well as disproportionately violent punishments, particularly for students of color. More contact with law enforcement means more time spent in the criminal justice system and serious mental health consequences. Children are subject to heightened legal protections from criminal punishment because the brain develops impulse control later in life, leaving early adolescents in a "period of heightened vulnerability to risk taking."[4] Because juveniles are more likely to desist from criminal activity as they mature, they are less likely to be seen as "incorrigible criminals" and more likely to receive help addressing the root causes of criminality. Schools should not deploy faulty aggression detectors or send disciplinary records to law enforcement in a misguided attempt to lead students to a crime-free future; doing so only exposes students to court fees, legal challenges, and extended police harassment.

Social Media Surveillance Tools

Social media surveillance is a pervasive and ever-expanding system of intelligence gathering. Social media data presents unique technical challenges that make meaningful intelligence gathering impractical, if not impossible. Social media tools have already proven to add limited value in investigations, wasting the effort and expense poured into them. And social media surveillance builds on the extensive surveillance dragnet built by the federal government and stands to severely undermine First Amendment rights.

Social media surveillance is of limited investigatory use because of the inadequacies of both the technology and the humans who gather and analyze the information. Text-based artificial intelligence systems are trained on high-quality English-language samples, but social media is rife with low-quality content that creates noise for the algorithms: bad grammar, misspelled words, slang, and hundreds of non-English languages make it hard to identify and interpret targeted content. Furthermore, distinguishing true threats from hyperbole is difficult for artificial intelligence systems and humans alike. An internal review of DHS' monitoring of social media in Portland, Oregon during the civil unrest and protests in the summer of 2020 following the murder of George Floyd found that the human monitors were inadequately trained to differentiate true threats from mere hyperbole, which led to false alarms and reports full of "crap."
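As an illustration of why noisy social media text defeats automated monitoring, consider the naive keyword matcher below (a deliberately simplified sketch of our own; real tools are more sophisticated but face the same underlying problem). Slang and hyperbole trip the filter while conveying no threat at all.

```python
import re

# Hypothetical watchlist terms a monitoring tool might scan for.
THREAT_TERMS = re.compile(r"\b(kill|bomb|shoot|riot)\w*\b", re.IGNORECASE)

posts = [
    "our set is gonna KILL tonight lol",       # hyperbole and slang
    "that photo booth was da bomb",            # slang, no threat at all
    "protest at city hall 6pm, bring water",   # protected speech, no keyword
]

for post in posts:
    hits = THREAT_TERMS.findall(post)
    if hits:
        # Both slang posts get flagged: exactly the kind of false alarm
        # the DHS internal review described.
        print(f"FLAGGED {hits}: {post!r}")
```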

Beyond the potential for massive data breaches exposing sensitive data, the overbroad collection of information has led to a "virtual stakeout" of the entire world. Privacy in public and the related concept of intellectual privacy protect an individual's ability to generate ideas, form beliefs, and self-realize; they also support freedom of thought and association and prevent conformity of thought. DOJ and DHS do not monitor social media in isolation. This data is collated into databases that feed several programs, such as DHS' Automated Targeting System, which includes data on immigrants, no-fly lists, commercially available data, and classified intelligence. Individually, social media data can be damaging and highly sensitive, but combined with the hoard of data kept by the federal government, it is a ticking time bomb for civil liberties and civil rights.

First Amendment rights are particularly at stake because of the expressive quality of social media and the fact that protests are often organized and publicized there. As referenced above, protest guides already warn protestors to cover their faces to protect their identities; the guides go further and discourage posting pictures or videos with identifiable faces in frame, so that social media monitoring combined with facial recognition technology cannot implicate fellow protestors. Beyond protest monitoring, the monitoring of social media itself chills free expression. The Brennan Center for Justice and the Knight First Amendment Institute filed a lawsuit challenging DHS' collection of social media identifiers on visa forms. The suit alleges that the registration of social media accounts "deprive[s] visa applicants of the rights to anonymous speech and private association" and chills constitutionally protected speech and association, all while being poorly tailored to the government's stated interests. The U.S. District Court for the District of Columbia failed to protect the First Amendment, giving significant deference to immigration officials on national security grounds even while noting that the challenged policies do run the risk of chilling constitutionally protected speech and association. Plaintiffs filed an appeal in late 2023.

EPIC strongly urges law enforcement agencies to immediately stop the use of social media surveillance technologies. The dangers are far too great, and federal regulation is unlikely to meaningfully reduce all the risks that stem from the technology.

DNA Analysis Tools

DNA has earned a reputation for unparalleled reliability, but the past several decades have exposed cracks in that veneer. DNA analysis tools come with flaws at every level: the collection and retention of DNA, the processing of samples, the deployment of analysis tools, and the opaque nature of law enforcement's use of those tools.

When building up DNA databases, law enforcement's collection of DNA is severely overinclusive. DNA collected from newborns, immigrants, victims of crimes, juvenile offenders, and suspects who were never arrested is immediately added to law enforcement DNA databases. Law enforcement also has unprecedented access to new streams of DNA wholly unrelated to contact with the criminal justice system, through databases created by direct-to-consumer genetic testing companies (e.g., 23andMe) and through forensic genetic genealogical DNA analysis and searching. DNA is also often collected earlier than necessary in criminal investigations, even though DNA is not a smoking gun: it merely suggests presence or proximity.[5] Since DNA alone cannot indicate guilt or innocence, DNA collected by law enforcement or for other purposes should only be added to a database pursuant to a warrant or after the individual has been convicted of a crime. These databases also continue to grow because expungement is all but a myth. Few profiles have been expunged, and in the majority of states the process is "burdensome, costly, and must be initiated by the arrestee." Only a few states automatically destroy DNA samples if the suspect is acquitted, and some states with automatic expungement leave loopholes to retain voluntarily provided samples.

Once the databases are built and the infrastructure is in place, law enforcement can expand a narrow use-case tool into an expansive information-gathering ecosystem at the cost of privacy. Direct-to-consumer databases are particularly at fault, allowing law enforcement to get around Fourth Amendment warrant requirements under the plausible pretext that the official is looking for other crimes committed by the individual rather than using an expanded database for identification. DNA is uniquely vulnerable to scope creep because, as new analysis techniques arise, it can be used for far more than mere identification. DNA is not equivalent to fingerprinting: combined with the data in commercial databases, CODIS DNA data can reveal a person's race or ancestry, health conditions, and other sensitive information. In particular, DNA can be used to determine familial association and implicate individuals unconnected to a crime merely because they are related to someone whose DNA was analyzed. These unconnected individuals could then be prosecuted without law enforcement ever obtaining a search warrant or other court order.

Results from single-source DNA matching in a controlled lab environment are generally reliable, but in many cases the analysis conditions are far from pristine. There is a noted lack of standardization in DNA lab best practices, studies have found lab error rates as high as 5%,[6] and one prominent study even found that subjective factors may influence whether an analyst reports a match. Less rigorous methods of analysis, such as Rapid DNA testing done in a police station, introduce even greater error rates.
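A back-of-envelope calculation (our illustration, assuming independent comparisons, which real searches only approximate) shows why even modest error rates become alarming at database scale: the chance of at least one spurious hit grows rapidly with the number of profiles searched.

```python
def p_false_hit(error_rate: float, db_size: int) -> float:
    """Probability of at least one false match when one sample is
    compared against every profile in a database."""
    return 1 - (1 - error_rate) ** db_size

# From the 5% figure reported in lab studies down to far more optimistic rates.
for rate in (0.05, 0.01, 0.001):
    print(f"per-comparison error {rate:.1%}: "
          f"P(>=1 false hit in a 1,000-profile search) = {p_false_hit(rate, 1_000):.3f}")
```

Even at error rates fifty times lower than the reported lab figure, a 1,000-profile search is more likely than not to surface a false hit.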

Finally, the way law enforcement deploys and leverages DNA analysis tools heavily impacts civil rights and erodes public trust through lack of transparency. Law enforcement deploy DNA analysis to investigate constitutionally protected protests and other political speech, such as collecting DNA from a chain used by Occupy Wall Street protestors. DNA databases also enshrine disparate impact into daily practice because of a major, systemic flaw: minority communities who are overrepresented in the database will be overrepresented in possible matches and connections to future crimes, creating a feedback loop. This issue is compounded by "spit and acquit" programs that coerce individuals into turning over their genetic data. Marginalized groups may have fewer resources to effectively challenge weak cases or improper procedure, making them more susceptible to such tactics and further overrepresented in databases. Despite these risks, law enforcement conduct most of these analyses under a shroud of secrecy: they do not disclose how samples were obtained, when direct-to-consumer databases were used, or the error rates and match percentages resulting from analysis. The lack of oversight and accountability, driven by incentives that favor solving crimes over protecting civil rights, exacerbates these power dynamics and demands greater consideration of the costs of DNA-driven investigations.
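The feedback loop can be seen in a toy model (our illustration under deliberately simplified assumptions): even if every group offends at identical rates, searches surface candidates in proportion to database presence, and each hit enrolls another profile, so the initial skew reproduces itself indefinitely rather than correcting toward population shares.

```python
db = {"A": 600, "B": 400}            # group A holds 60% of profiles...
POP_SHARE = {"A": 0.3, "B": 0.7}     # ...despite being 30% of the population

for year in range(1, 6):
    total = sum(db.values())
    # Database-driven searches return candidates in proportion to presence.
    new_hits = {g: round(200 * n / total) for g, n in db.items()}
    for g in db:
        db[g] += new_hits[g]         # every hit adds yet another profile
    share_a = new_hits["A"] / sum(new_hits.values())
    print(f"year {year}: group A share of new matches = {share_a:.0%} "
          f"(population share {POP_SHARE['A']:.0%})")
```

Year after year, group A draws 60% of the matches while making up 30% of the population; the disparity never dissipates on its own.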

Recommendations

New developments in biometric analysis and artificial intelligence are accelerating and escalating risks to privacy, civil rights, and civil liberties at a lower cost than ever before. Imposing guardrails can mitigate some of these risks. The following ten recommendations would help safeguard those rights.

Recommendation One: DOJ and DHS should prohibit mass surveillance by ensuring that use of these technologies is context dependent and for explicit and legitimate purposes. 

These technologies are not used in a vacuum; they are all part of a larger surveillance ecosystem within DHS and DOJ. EPIC recommends limiting the use of these technologies to specific permissible purposes, explicitly delineating impermissible uses, reaffirming warrant requirements and closing warrant loopholes, minimizing the data kept in databases, and targeting the use of these technologies at individuals who have already been identified rather than using the tools for fishing expeditions.

Recommendation Two: DOJ and DHS should protect civil rights by prohibiting arrests based solely on untested surveillance technology.

Law enforcement cannot be permitted to rely on these untested, risky technologies. There are multiple documented instances of wrongful arrests based solely on the use of facial recognition and DNA analysis tools. DOJ and DHS must take the initiative to correct these issues because the courts have not adequately protected the interests of defendants. EPIC recommends denying search warrants based on results from these technologies unless corroborated by independent, objective evidence. DOJ and DHS should also draft guidelines governing when law enforcement may intervene in an individual's life (house visits, further investigation, or otherwise) based solely on outputs from predictive technologies and/or social media surveillance. Finally, DOJ and DHS should prohibit coercive tactics that grow DNA databases by placing acquittal behind a genetic data paywall.

Recommendation Three: DOJ and DHS should protect criminal defendants’ constitutional rights by requiring adequate notice of the use of these surveillance technologies and ensuring that the technology is subject to adversarial interrogation during criminal litigation.

Under Brady v. Maryland, 373 U.S. 83 (1963), and the Sixth Amendment's Confrontation Clause, criminal defendants have a constitutional right to disclosure of evidence that may be exculpatory in nature.[7] In particular, DOJ and DHS should expressly prohibit parallel construction, a common practice for avoiding scrutiny in which agents conceal the role of certain surveillance techniques by recreating evidentiary trails.[8] EPIC urges DOJ and DHS to require the disclosure of the following information during criminal litigation:

  • The fact that a certain type of technology or system identified the defendant as a suspect or possible suspect;
  • Information about the technology or system used for that identification, including the specific system used as well as results from any independent audits and/or internal reviews of the technology that may indicate reliability or accuracy issues;
  • The exact query and resulting output that law enforcement claims identifies the defendant;
  • The output from that technology or system that identified individuals other than the defendant as a suspect or possible suspect; and
  • Any other information that may be relevant to challenging the output of the technology or system.

Recommendation Four: DOJ and DHS should ensure that any surveillance technology it plans to use is provably non-discriminatory and prohibit the use of such technology unless this non-discrimination is verified.

Title VI prohibits recipients of federal financial assistance from discriminating based on race, color, and national origin.[9] DHS and DOJ are among the primary financial drivers of these technologies, funneling hundreds of millions of dollars into research and development. To ensure meaningful compliance with Title VI, EPIC urges DOJ and DHS to implement strong evaluation and testing protocols focused on bias and disparate impact during the procurement process, along with periodic assessments throughout the lifetime of the technology to ensure continuing non-discrimination. In particular, the privacy impact assessment process could be amended to include mandatory testing prior to deployment, with an explicit bar on operational use of any data collected during testing. A minimal sketch of such a bias gate follows.
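The sketch below shows one way such a pre-deployment gate could work (the field names and disparity threshold are our assumptions, not agency policy or anything EPIC's comments prescribe): compute false match rates per demographic group from benchmark trials and refuse procurement when the worst group's rate exceeds the best group's by more than a set ratio.

```python
from collections import defaultdict

MAX_RATIO = 1.25  # hypothetical ceiling: worst-group error rate may exceed
                  # the best group's by at most 25%

def false_match_rates(trials):
    """trials: iterable of (group, was_false_match) pairs from benchmark runs."""
    counts, errors = defaultdict(int), defaultdict(int)
    for group, was_false_match in trials:
        counts[group] += 1
        errors[group] += int(was_false_match)
    return {g: errors[g] / counts[g] for g in counts}

def passes_bias_gate(trials) -> bool:
    rates = false_match_rates(trials)
    worst, best = max(rates.values()), min(rates.values())
    # If any group shows a zero measured rate, fail conservatively:
    # the benchmark was too small to certify parity.
    return best > 0 and worst / best <= MAX_RATIO

# A system that misidentifies group "B" twice as often as group "A"
# would be barred from deployment.
trials = [("A", i % 100 == 0) for i in range(10_000)] + \
         [("B", i % 50 == 0) for i in range(10_000)]
print(passes_bias_gate(trials))  # -> False
```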

Recommendation Five: New surveillance technologies, and new uses for existing surveillance technologies, should be deployed only after an adequate evaluation of their purpose and objectives, their benefits, and their risks.

Privacy impact assessments (PIAs) are required under the E-Government Act of 2002, but agencies are given broad discretion over how PIAs are designed and implemented. DOJ and DHS should engage in robust PIAs to ensure that the technologies being acquired fit the mission of the department and adequately protect civil rights and civil liberties. EPIC recommends that PIAs be required to address the following factors:

  • Mission analysis and fit between the mission and the proposed use of the technology;
  • Needless over-collection of data;
  • Lack of consent from data subjects;
  • Failure to minimize data collection and retention;
  • Lack of transparency;
  • Lack of due diligence; and
  • Lack of accountability.  

Beyond PIAs, EPIC also urges DOJ and DHS to require adequate training for all personnel using the technologies and/or interpreting their outputs, including any third-party or private entities that collect, process, or retain data on behalf of DOJ and DHS.

Recommendation Six: DOJ and DHS should adopt stricter data minimization procedures, including a prohibition on retention of biometric data after identity is confirmed.

EPIC is a strong proponent of data minimization in all respects, but particularly in the law enforcement context, where there is a culture of overinclusive collection. DOJ and DHS should stop automatically adding collected data (probe images, DNA samples, etc.) to government databases. Additions to databases should be strictly limited to data connected to convictions, and DOJ and DHS should expunge data collected from individuals who were not convicted. There should also be limits on secondary processing of data, such as health data derived from DNA analysis and routine police data included in predictive policing algorithms. Original samples (including facial probe images and DNA samples) should be destroyed after identification has occurred.

Recommendation Seven: DOJ and DHS should ensure adequate security for all retained data, with correspondingly greater protections for more sensitive data.

The data collected by law enforcement is highly sensitive, so proper data security is crucial. The most important component of security is data minimization by design, as discussed above. DOJ and DHS must identify and fund adequate security protocols across all systems. Highly sensitive data such as faceprints, DNA samples, and other biometric data should be encrypted and stored separately from other data. In addition, special attention should be given to third-party vendors of these technologies to ensure that they provide a similar level of protection for this data.

Recommendation Eight: DOJ and DHS should require regular independent auditing of all surveillance technologies both prior to deployment and periodically thereafter.

EPIC strongly recommends independent audits as a prerequisite for acquisition and deployment of these technologies, repeated periodically throughout each technology's lifecycle. These audits should, at a minimum, have the following characteristics:

  • The audits should be conducted by a qualified, independent body such as NIST;
  • The audits should review both the inputs and the outputs of the technology;
  • The audits should assess the technology for biases and disparate impact;
  • Audits should be performed regularly, at least annually; and
  • The audits should assess the level of training given to all personnel procuring, deploying, and overseeing the use of the technology.

Recommendation Nine: DOJ and DHS should strengthen accountability and oversight mechanisms, including by requiring robust training, incident reporting, and consequences for misuse or other harms.

To ensure appropriate accountability and oversight of these technologies, DOJ and DHS should implement several layers of compliance mechanisms and should train personnel at all levels. These technologies need systems that log every query, including the agent who made the query, the substance of the query, and all outputs of the query; the logs must be retained and supervised to ensure agents are not engaging in misuse or abuse (a minimal sketch of such a log follows below). DOJ and DHS should implement an incident reporting system that ends in tangible consequences for misuse and abuse, including revocation of access to the technology, administrative sanctions, and more serious consequences for willful misconduct. Third-party vendors should be subject to additional scrutiny to ensure adequate protection measures are in place. DOJ and DHS should ensure robust and consistent training for all personnel who interact with the technology, including but not limited to: procurement personnel, personnel who deploy the technology, personnel who interpret its outputs, and personnel who audit or otherwise evaluate it.
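Here is a minimal sketch of the query logging described above (field names and the hash-chaining scheme are illustrative assumptions, not agency requirements): every query becomes an immutable record of who ran it, against which system, with what input, and everything it returned, and each entry's hash covers its predecessor so tampering is detectable on review.

```python
import hashlib
import json
import time
from dataclasses import dataclass, field, asdict

@dataclass(frozen=True)
class QueryRecord:
    agent_id: str     # who made the query
    system: str       # which surveillance technology was queried
    query: str        # the substance of the query (e.g., a probe image hash)
    outputs: tuple    # every output returned, not just the acted-upon "hit"
    timestamp: float = field(default_factory=time.time)

class AuditLog:
    """Append-only log; each entry's hash chains to the previous entry so
    supervisors can detect after-the-fact deletion or alteration."""

    def __init__(self):
        self._entries = []
        self._last_hash = "0" * 64

    def append(self, record: QueryRecord) -> None:
        payload = json.dumps(asdict(record), sort_keys=True) + self._last_hash
        self._last_hash = hashlib.sha256(payload.encode()).hexdigest()
        self._entries.append((record, self._last_hash))

log = AuditLog()
log.append(QueryRecord("agent-042", "frt-gallery", "probe:3f9c", ("candidate-17",)))
```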

Recommendation Ten: DOJ and DHS should advance public trust, prioritize transparency, and require substantiation of claims relating to surveillance technology.

DHS and DOJ must center transparency to advance their goal of strengthening public trust. PIAs are the most common form of publicly available information about these technologies, and DOJ and DHS must engage in the PIA process consistently, thoroughly, and in a timely manner. The PIA process should be revised to be more robust and to include information on the limitations of the technology, including but not limited to error rates and disclosures of the number of queries and of non-compliant use. Artificial intelligence and other new technologies enjoy a veneer of legitimacy, and law enforcement must be clear with the public about the technology's actual capabilities. This advances public trust and ensures criminal defendants have accurate grounds upon which to challenge the use of the technology.


[1] For a sampling of law enforcement abuse and misuse of surveillance technology, see: https://epic.org/ices-privacy-impact-assessment-on-surveillance-technologies-is-an-exercise-in-disregarding-reality/; https://epic.org/documents/epic-comments-pclob-investigation-of-section-702-surveillance/; https://epic.org/sen-wyden-reveals-new-details-about-the-massive-hemisphere-surveillance-program/; https://epic.org/dhs-oig-report-secret-service-and-ice-illegally-used-cell-site-simulators/; https://epic.org/documents/epic-v-doj-csli-section-2703d-orders/; https://epic.org/documents/epic-v-doj-pen-register-reports/; https://epic.org/documents/epic-v-doj-prism/.

[2] See, e.g., U.S. Gov’t Accountability Office, GAO-21-518, Facial Recognition Technology: Federal Law Enforcement Agencies Should Better Assess Privacy and Other Risks 17 (June 3, 2021), https://www.gao.gov/assets/gao-21-518.pdf (finding that at least six agencies used facial recognition to surveil Black Lives Matter protestors); Benjamin Powers, Eyes Over Baltimore: How Police Use Military Technology to Secretly Track You, Rolling Stone (Jan. 6, 2017), https://www.rollingstone.com/culture/culture-features/eyes-over-baltimore-how-police-use-military-technology-to-secretly-track-you-126885/ (reporting that the Baltimore Police Department used facial recognition and social media surveillance to surveil protestors following the death of Freddie Gray). 

[3] Natural Resources Defense Council, How To Protest Safely (Jan. 26, 2022), https://www.nrdc.org/stories/how-protest-safely; Louryn Strampe and Lauren Goode, How to Protest Safely: What to Bring, What to Do, and What to Avoid, WIRED (Jun. 24, 2022), https://www.wired.com/story/how-to-protest-safely-gear-tips/; Center on Race, Inequality, and the Law, NYU School of Law, Protest Tips and Resources, https://www.law.nyu.edu/centers/race-inequality-law/protest-tips.

[4] Nadia Rossbach, Innocent Until Predicted Guilty: How Premature Predictive Policing Can Lead to a Self-Fulfilling Prophecy of Juvenile Delinquency, 75 Fla. L. Rev. 167, 184 (2023); see, e.g., Roper v. Simmons, 543 U.S. 551 (2005) (sentencing someone to death for a crime committed while under the age of 18 is unconstitutional); Graham v. Florida, 560 U.S. 48 (2010) (sentencing someone to life imprisonment without the possibility of parole for a non-homicide crime committed while under the age of 18 is unconstitutional); Miller v. Alabama, 567 U.S. 460 (2012) (mandatory life imprisonment without the possibility of parole for a homicide committed while under the age of 18 is unconstitutional).

[5] In one study, individuals shook hands for two minutes and later handled knives; on 20% of the knives tested, the primary or sole DNA contributor was not the person who handled the knife but the person with whom they had shaken hands. Mary Graw Leary, Touch DNA and Chemical Analysis of Skin Trace Evidence: Protecting Privacy While Advancing Investigations, 26 Wm. & Mary Bill Rts. J. 251, 273 (2017).

[6] Adrienne N. Kitchen, Genetic Privacy and Latent Crime Scene DNA of Nonsuspects: How the Law can Protect an Individual’s Right to Genetic Privacy While Respecting the Government’s Important Interest in Combatting Crime, 52 No. 2 Crim. L. Bulletin 

[7] See DOJ, Justice Manual 9-5.001, https://www.justice.gov/jm/jm-9-5000-issues-related-trials-and-other-court-proceedings#9-5.001 (DOJ policy regarding disclosure of exculpatory information).

[8] See Hum. Rts. Watch, Dark Side: Secret Origins of Evidence in US Criminal Cases (Jan. 9, 2018), https://www.hrw.org/report/2018/01/09/darkside/secret-origins-evidence-us-criminal-cases.

[9] 42 U.S.C. § 2000d; see also EPIC's Title VI petition regarding ShotSpotter, a location-based predictive policing tool: EPIC Letter to Attorney General Garland Re: Title VI Compliance and Predictive Algorithms (Jul. 6, 2022), https://epic.org/documents/epic-letter-to-attorney-general-garland-re-title-vi-compliance-and-predictive-algorithms/.
