APA Comments
Comments of EPIC to OMB on Privacy Impact Assessments
2024-01756
By notice published on January 30, 2024, the Office of Management and Budget (OMB) requested comments regarding Privacy Impact Assessments (PIAs).[1] More specifically, OMB requested input “on how privacy impact assessments (PIAs) may be more effective at mitigating privacy risks, including those that are further exacerbated by artificial intelligence (AI) and other advances in technology and capabilities.”[2] The Electronic Privacy Information Center (EPIC) submits these comments to urge OMB to ensure agencies comply with current PIA requirements, improve transparency around PIAs and associated documents, and update guidance around conducting PIAs to make them more detailed and capable of addressing the privacy risks of newer technologies.
EPIC is a public interest research center in Washington, D.C., established in 1994 to focus public attention on emerging civil liberties issues and to secure the fundamental right to privacy in the digital age for all people through advocacy, research, and litigation.[3] EPIC has a particular interest in accountability, civil rights, privacy, and civil liberties with respect to the government’s use of personally identifiable information. PIAs play an important role in government accountability and determining the privacy risks of systems that use personally identifiable information (PII). A properly conducted PIA enables an agency to identify privacy risks, determine if and how those risks can be mitigated, and make an informed decision whether the proposed collection or system can be justified in light of its privacy impact. Additionally, a PIA informs the public of data collection or an information system that poses a threat to privacy. PIAs not only help to protect privacy but in doing so inherently help to protect civil rights and civil liberties.
EPIC has a long history of advocating for improvements to PIAs, using the Freedom of Information Act to make PIAs public and trying to force agencies to conduct PIAs. EPIC’s work has made clear the shortcomings of PIAs and the consistent failure of agencies to comply with the E-Government Act’s PIA requirement.
- EPIC’s work exposing the failure of federal agencies to comply with the PIA requirement
Over the past decade, EPIC has identified numerous instances in which the DHS, FBI, DEA, United States Postal Service, and other agencies have failed to complete required PIAs under the E-Government Act for activities implicating personal data.
In 2015, EPIC sued the FBI over its FOIA request for all unpublished PTAs and PIAs, particularly those related to facial recognition technology, license plate readers, and domestic drone surveillance—documents which had not been publicly updated for years, if at all.[4] EPIC filed the FOIA request because, in the several years prior to the request, it had come to light that the FBI was using technology in ways that should have required a PIA, and the agency had indicated it planned to complete a number of PIAs that were not publicly available at the time of the request.
For example, in July 2012, the Senate Subcommittee on Privacy, Technology and the Law held a hearing on “What Facial Recognition Technology Means for Privacy and Civil Liberties,” at which the FBI stated it was updating its PIA on facial recognition.[5] In a statement for the record, Jerome Pender, Deputy Assistant Director of the Information Services Branch of the FBI’s Criminal Justice Information Services Division, stated that “the 2008 [Interstate Photo System] PIA is currently in the process of being renewed by way of Privacy Threshold Analysis, with an emphasis on facial recognition. An updated PIA is planned and will address all evolutionary changes since the preparation of the 2008 PIA.”[6]
Similarly, in June 2013 the Senate Judiciary Committee held a hearing on “Oversight of the Federal Bureau of Investigation.”[7] During that hearing, Senator Chuck Grassley asked FBI Director Robert Mueller about the FBI’s use of drones. Director Mueller responded that the FBI did use drones domestically for surveillance. During that same exchange, Senator Grassley asked about the development of policies, procedures, and operational limits on the FBI’s use of drones and the privacy impact on Americans. Director Mueller indicated that the FBI was at the beginning stages and was “exploring not only the use but also the necessary guidelines for that use.”[8]
In 2013, through a FOIA request to the FBI, EPIC obtained emails from 2012 indicating that the FBI was required to complete a PIA for its license plate reader (“LPR”) program and make that PIA publicly available.[9] Additionally, the emails indicated that a draft PIA existed for the LPR program.[10] Despite receiving hundreds of pages of documents from the FBI in EPIC’s 2015 FOIA lawsuit against the Bureau for all PTAs and PIAs, EPIC did not receive any PIAs for the FBI’s use of facial recognition technology, drones, or license plate readers, despite evidence that such documents should have been completed.
In 2015 EPIC also filed a lawsuit against the DEA over its FOIA request for all unpublished PTAs and PIAs, particularly those related to the Hemisphere telephone record collection program, the National License Plate Reader Program, and the DEA Internet Connectivity Endeavor data aggregation and sharing program—programs for which there was no publicly available PTA or PIA documentation.[11] At the time of EPIC’s 2015 FOIA request, it was known that the Hemisphere program was funded by the DEA and the White House’s Office of National Drug Control Policy and that, since at least 2007, the program had allowed the DEA and other law enforcement agencies to access billions of phone records of AT&T customers as well as other non-customers whose communications were routed through an AT&T switch.[12]
On May 21, 2012, the U.S. House of Representatives Subcommittee on Border and Maritime Security held a field hearing on “Stopping the Flow of Illicit Drugs In Arizona By Leveraging State, Local And Federal Information Sharing.”[13] At the hearing, Douglas W. Coleman, Special Agent in Charge, Phoenix Field Division of the DEA, was one of the witnesses. In his statement for the record, Mr. Coleman indicated that “[i]n December 2008, DEA launched a National License Plate Reader (LPR) Initiative in direct response to the smuggling of illicit drug monies out of the United States, primarily via the U.S.-Mexico border.”[14] According to Mr. Coleman’s statement for the record, the DEA’s LPR program monitors and targets vehicles, uses existing database technology, and promotes information sharing.[15] In 2015, the DEA’s LPR program came under scrutiny by U.S. news media.[16] Additionally, in January 2015 then-Chairman Chuck Grassley and Ranking Member Patrick Leahy of the Senate Judiciary Committee sent a letter to Attorney General Eric Holder regarding the privacy concerns related to the government’s use of LPRs.[17]
During the same May 2012 field hearing, Mr. Coleman’s statement for the record identified another program, the DEA Internet Connectivity Endeavor (DICE), that “… enables any participating federal, state, local and tribal law enforcement agency to de-conflict investigative information, such as phone numbers, email addresses, bank accounts, plane tail numbers and license plates, to identify investigative overlaps.”[18] DICE provided access to information collected through the LPR program (among other information) and made such data accessible through the Internet.[19] DICE reportedly contained approximately a billion records, including phone log data, at the time.[20] Despite the DEA programs described above and EPIC’s FOIA lawsuit, the DEA did not produce a single PTA or PIA relevant to those programs.
In addition to EPIC’s work to uncover whether PIAs have been conducted for various information systems and if so to make them public, EPIC has also filed lawsuits to compel agencies to produce required PIAs under the E-Government Act. In 2017, EPIC sued the now-defunct Presidential Advisory Commission on Election Integrity for failing to conduct a PIA before seeking citizens’ personal voting information, eventually securing the deletion of the unlawfully collected data.[21] In 2018, EPIC sued the Department of Commerce and U.S. Census Bureau to compel the agencies to complete a PIA for the census’s addition of a citizenship question—a question which was later dropped.[22] In 2021, EPIC sued the U.S. Postal Inspection Service under the E-Government Act for failing to produce a PIA for its Internet Covert Operations Program, highlighting the Service’s surveillance of protesters and other individuals using facial recognition and social media monitoring services.[23]
Despite some positive results EPIC has had bringing lawsuits against agencies for the failure to conduct a PIA, enforcement by civil society is not a reliable means to force the completion of a PIA. Indeed, courts have generally not ruled in EPIC’s favor regarding its standing to compel an agency to conduct a PIA. Furthermore, it is clear from EPIC’s experience that the failure of agencies to conduct a PIA is not isolated to a few incidents but a widespread occurrence that goes beyond EPIC’s means to rectify even if lawsuits could reliably force agencies to conduct PIAs.
- Role of PIAs in addressing and mitigating privacy risks
The E-Government Act of 2002 established Privacy Impact Assessments as an important step agencies must take before engaging in activities that risk Americans’ privacy and civil rights. But PIAs are not currently doing the job they are meant to do. PIAs should require a thorough evaluation of the potential harms of an information collection system that helps an agency decide whether to implement that system. Instead, agencies treat PIAs as a box-checking exercise to complete after information collection systems are in place, removing the decision-making value of the document. PIAs often omit important information that could help the public better understand the risks of federal systems—that is, if the agency conducts and publishes the PIA at all. To address the deficiencies in PIAs across the government, OMB should update its guidance to require that agencies:
- Conduct PIAs as required and publish them promptly;
- Conduct PIAs before systems are in place so that PIAs are pre-decisional documents, not post-hoc rationalizations;
- Produce PIAs that are sufficiently detailed to give the public a full accounting of agency activities and the risks they create; and
- Fully disclose and evaluate the risks created by using third-party technology and third-party data.
Only if the deficiencies in current implementation of PIAs are addressed can PIAs be, as OMB put it, “one of the most valuable tools Federal agencies use to ensure compliance with applicable privacy requirements and manage privacy risks.”[24]
- Agencies must conduct PIAs on all information collection systems in a timely manner.
The most basic requirement of Section 208 of the E-Government Act is unfortunately one of the most frequently flouted. Agencies regularly fail to complete PIAs at all, or do so on a timeline of decades instead of weeks and months. Failure to produce PIAs usually leaves agencies without any analysis of the privacy impacts and potential flaws in their systems, and it always leaves the public without critical information. For agencies that don’t want to comply with the spirit of the E-Government Act, the current guidance offers loopholes that agencies can lean on to excuse blatant non-compliance.
The U.S. Postal Inspection Service’s (USPIS) failure to complete a PIA for a controversial secret intelligence program leveraging advanced technology illustrates the harms that can propagate when agencies choose not to comply with the E-Government Act. USPIS is a law enforcement agency housed within the Postal Service, tasked with enforcing mail fraud and protecting postal workers.[25] In 2021, EPIC sued the U.S. Postal Inspection Service under the E-Government Act for failing to produce a PIA for its Internet Covert Operations Program (iCOP).[26]
This program runs a bevy of cutting-edge information collection systems, including “‘cryptocurrency tracking, open source intelligence and social media analysis, geospatial mapping, and data visualization, and USPS backend and network data exploitation’” alongside Clearview AI’s facial recognition software and specialized social media software for creating fake online identities.[27] iCOP operates with little functional oversight and virtually no transparency. In 2020-21, the iCOP program tasked its analysts with tracking and collecting online evidence of both left-wing and right-wing protests.[28] Lacking even the oversight and transparency requirements of DHS, iCOP was able to secretly perform controversial, First Amendment-infringing surveillance and then distribute the intelligence it gathered across both federal and local law enforcement agencies. Without a PIA to FOIA, there was functionally no way to discover the program until its existence was leaked to a journalist.
Outside of our litigation work, EPIC regularly calls out agencies for PIAs that are conducted and published long after harmful mass surveillance systems are put in place, or updated so infrequently that the document no longer serves to meaningfully inform the public about what an agency is doing. In one egregious example, DHS subcomponent Immigration and Customs Enforcement (ICE) ran the notorious Alternatives to Detention/ISAP program for nearly 20 years without conducting a PIA, despite using the most invasive forms of surveillance technology, like GPS ankle monitors and facial recognition-equipped smartphone apps.[29] ICE claimed to be in compliance with the E-Government Act by grandfathering the rapidly expanding ISAP program under an existing System of Records and PIA for the ENFORCE system.[30] But the agency has faced no consequences for delaying a real analysis of the privacy harms of surveilling immigrants for two decades, nor for failing to meaningfully account for those harms. Similarly, DHS’ Office of Inspector General found that three agencies failed to implement PIAs before collecting sensitive geolocation data, in one of the most egregious privacy failures in recent agency history.[31] Although agencies regularly shirk their responsibilities to conduct PIAs at all, when it comes to preventing harmful agency actions a late PIA is little better than a nonexistent PIA.
- PIAs should be pre-decisional, not an exercise in post-hoc justifications and box-checking.
PIAs are modelled after the Environmental Impact Statements (EIS) required by the National Environmental Policy Act of 1969,[32] but agencies regularly fail to use PIAs in the same way that EISs are used. By federal law, any federal agency must complete an EIS and consider viable alternatives before breaking ground on a new project that might have significant environmental impact.[33] The E-Government Act of 2002 is clear that agencies “shall” conduct a Privacy Impact Assessment “before developing or procuring information technology … or initiating a new collection of information …”[34] But even by agencies’ own accounting, most do not meet this requirement.
Agencies fail to complete PIAs on time even though the same agencies recognize that PIAs are helpful. A GAO report on compliance with privacy protections found that only 6 of the 24 agencies surveyed “always” initiated PIAs early enough in the system development process to impact the design or outcome of the system.[35] Only half of agencies claimed to be able to regularly hold staff accountable for failing to conduct a PIA in a timely manner, and one agency even claimed it could never hold staff accountable for the failure.[36] And this issue is not new: the GAO found as far back as 2007 that DHS was not completing and publishing PIAs in a timely manner, reducing the decision-making impacts and transparency effects of PIAs.[37] Neither internal oversight nor external pressure from organizations like EPIC has worked to compel agencies to conduct timely PIAs.
A loose requirement that PIAs be pre-decisional is likely part of the problem. OMB can do more to make it clear to agencies that PIAs must be completed before any covered system is implemented. For agencies, this means clearer guidance that if there is no PIA in place, then the system cannot be activated, full stop.
PIAs simply cannot be meaningful if they are completed after the fact, because the system is already in place and likely already in use. When PIAs identify systemic problems with the information being collected or the technology being used, it is hard for agencies to reverse course and re-engineer existing systems. It also becomes much harder to police compliance after a system is already operational.
In particular, OMB should narrow or eliminate the guidance allowing agencies to postpone completing a PIA when the technology has been assessed by another PIA or pertains only to internal agency data.[38] As described above, agencies regularly misuse these loopholes to effectively grandfather in new systems that may behave differently from legacy systems, or may collect and analyze different data that creates different privacy risks.
- PIAs must be more detailed to ensure that agencies make full and accurate evaluations of privacy harms.
Most PIAs that agencies currently publish do not provide enough detail for the public to fully understand federal agency systems, nor for the agencies themselves to make a meaningful accounting of potential privacy harms. As a result, the public is unaware of significant risks created by agency action, and agencies are failing to implement low or no-cost privacy protections. The worst results of these oversights are massive data breaches, infringements on individuals’ civil rights, and errors leading to wrongful denials of benefits or wrongful arrests alongside reputational and other privacy harms.[39] OMB can do more in its guidance to direct agencies to fully consider the consequences of data breaches, unauthorized access to sensitive systems, and the downstream impacts of systems that are interconnected.
PIAs regularly fail to account for the potential harms of data breaches by implementing weak data minimization requirements. Data minimization is one of the most effective privacy protections because information that is not collected cannot be breached or abused.[40] Very few PIAs undertake a meaningful analysis of the necessity of collecting voluminous amounts of information, and most tend to defer to agency claims that limiting collection is harmful or impractical. For example, in ICE’s ATD PIA from 2023, the agency claims that it is fully implementing data minimization practices despite collecting, retaining, and giving agency employees and contractors full access to immigrants’ historical geolocation data.[41] But such access is entirely unnecessary for tracking down immigrants when they skip court, or for other legitimate enforcement reasons.[42] Agencies are similarly often cavalier about collecting Social Security Numbers (SSNs) despite repeated guidance from GAO and OMB that agencies should avoid collecting and storing SSNs where possible.[43] And data minimization routines rarely ask whether retention schedules are necessary. ICE’s privacy assessment for facial recognition services, for example, greenlights a 20-year retention schedule for all data without accounting for the potential damage that could be caused by retaining a false match for an extended period of time.[44]
While agencies are generally good about chronicling what broad types of information are collected, they generally fail on the details, under-reporting how much data is being collected and how that data is linked with other data. A 2023 DHS OIG report found that CBP had failed to account for how commercially obtained phone geolocation data could be used to identify and track individuals, meaning that the agency was vastly under-reporting the potential risks of its actions.[45] And ICE did little better in accounting for the potential for abuse of its advanced surveillance technologies last year, failing to identify areas where the agency needed to obtain a warrant to use cell-site simulators and otherwise under-describing both the technology ICE has access to and the impact of that technology.[46]
A particular area of concern is agencies’ failure to account for networked systems and data flows. PIAs tend to capture a single system, and do not often account for just how much data flows between systems.[47] This means that PIAs are often insubstantial checks on agency data transfers, as chronicled in EPIC’s 2022 report, DHS’ Data Reservoir: ICE and CBP’s Capture and Circulation of Location Information.[48] By maintaining discrete PIAs for various location data systems, ICE and CBP are able to skirt effective oversight of just how much location data the agencies buy, and who gets access to it. And the concerns with purchased data only get worse from there. The FBI similarly redacts what systems its facial recognition product FACE is attached to, denying the public the ability to understand how many law enforcement agencies can access the system.[49]
- Agencies must fully disclose and account for third-party data and third-party systems.
PIAs often do not name the vendors responsible for providing purchased data, data analysis, or off-the-shelf technology. This practice prevents the public from getting a full view of how data is being analyzed and transferred, who has access to sensitive information, and precisely how proprietary technology works. The vendor matters because the accuracy and security of surveillance products vary widely by vendor. There may be a significant difference between using a data broker like LexisNexis for identity verification and using a more bespoke or lower-tech identity verification provider. OMB should set out specific rules for disclosing information about third-party vendors. In particular:
- Agencies should identify the vendor selling any purchased data;
- Agencies should identify the vendor selling surveillance technologies or surveillance services; and
- Agencies should account for the risk that a vendor will get unauthorized access to sensitive data.
PIAs generally do not name the vendor responsible for providing technology or services to the federal government. In extreme cases, like ICE’s ATD program, the failure to identify known contractors crosses the border into farce. The ATD PIA refers to an “ATD Servicer” repeatedly, failing to disclose that the servicer is prison-surveillance giant BI Incorporated.[50] But BI’s particular history and corporate structure are highly relevant when assessing the risks of unauthorized use of data, failure to delete data, and other abuses. Agencies that use facial recognition systems similarly fail to disclose the vendor of the system, even though the vendor has a substantial impact on the accuracy of the facial recognition algorithm provided.[51] ICE’s PIA for facial recognition services fails to name the facial recognition vendors it contracts with, leaving out any analysis of the wide disparity between a particularly bad actor like Clearview AI and a still-harmful but more limited facial recognition database.[52]
A PIA simply cannot be effective, either as a means for analyzing privacy risks or as a means for informing the public, without considering the specific vendor and product being used. And even a PIA conducted early cannot be used as a way to engage multiple stakeholders if it lacks the information necessary for impacted communities and experts to determine how harmful a system might be. When it comes to third-party vendors and third-party systems, more disclosure is needed across the board.
- Role of PIAs in facilitating transparency
EPIC regularly uses PIAs and Privacy Threshold Assessments (PTAs) to learn more about federal agency activities, and to inform the public about the systems federal agencies use. PIAs are a particularly important tool to facilitate greater transparency through well-crafted FOIA requests. But PIAs often fall short when they are not published and easily searchable, fail to contain sufficient detail, are not written with enough context for an average person, and fail to consult stakeholders outside the agency. To improve the transparency function of PIAs, OMB’s new guidance should:
- Set up a single centralized and searchable database for PIAs, or at a minimum require agencies to publish PIAs in detailed, searchable agency databases; and
- Require agencies to complete Privacy Threshold Assessments and proactively publish them.
- PIAs are a key tool for non-profits and citizens to inform the public about the risks of federal information collection systems.
EPIC regularly relies on PIAs in our role as a public interest research center to understand the existence and impact of government systems. PIAs often disclose the existence of an important and potentially harmful system, provide information to craft narrow and effective FOIA requests, and allow for analysis of otherwise opaque federal programs. EPIC regularly refers to PIAs in all aspects of our work, particularly in comments to federal agencies,[53] advocacy before Congress,[54] and analysis we publish to inform the public.[55]
PIAs are also a crucial tool in crafting narrow FOIA requests that can uncover further information about federal agency activities. EPIC attorneys regularly consult PIAs when drafting FOIA requests and cite specifically to PIAs in the body of the request. We use PIAs to direct FOIA officers to the proper resources, provide keywords for more efficient searching, and save time by only consulting with the relevant sub-components of an agency. PIAs actually reduce FOIA office workloads by allowing the public to craft narrowly tailored FOIAs that do not require searching voluminous amounts of extraneous documents.
EPIC relied heavily on Immigration and Customs Enforcement’s (ICE) PIA on the agency’s use of facial recognition services in drafting a highly impactful FOIA request that uncovered a number of new documents detailing how the agency uses controversial facial recognition technology.[56] ICE’s 2020 PIA detailed the different sources of facial recognition technology that ICE agents could access, identified the forthcoming existence of training materials, and revealed that ICE was using commercial vendors to obtain facial recognition technology.[57] EPIC eventually litigated the FOIA request after ICE was unresponsive, resulting in thousands of pages of new documents that help the public understand how ICE was using facial recognition technology during times of significant public concern.[58]
- Agencies fail to publish PIAs in an easily useable manner.
Across the federal government, agencies are not doing enough to make PIAs easily available to the public. At best, agencies publish their PIAs on webpages that are difficult to search and lack key details like the type of data or system involved.[59] The Department of Veterans Affairs (VA) maintains a fairly detailed and searchable list of PIAs, labelled “Privacy Impact Assessment (PIA) Repository,” but buries this webpage behind several layers of click-throughs from the agency’s main webpage.[60] A diligent user can find the VA’s PIA page, but the experience is unnecessarily complicated.
More often, agencies publish their PIAs in disorganized lists with little, if any, information beyond the name of the system.[61] For example, the Department of Justice maintains a scattershot PIA webpage that directly houses some PIAs in PDF form for some agency subcomponents, links directly to the PIA page for other subcomponents like the FBI, and links to the general privacy webpage for still others like the Bureau of Prisons.[62] The FCC and many other agencies maintain similarly deficient PIA webpages. Across the DOJ, subcomponents fail to list when PIAs were conducted alongside the name of the system, making it difficult to know which PIAs are for new systems, which are for legacy systems, and which may be completely outdated. The ordinary citizen coming across DOJ’s PIA page would have no way to figure out which PIAs cover systems that impact their lives, and which cover only internal-facing systems of data on federal employees.
At worst, some agencies don’t publish their PIAs at all, instead providing only a list of documents that can be obtained through FOIA requests. The U.S. Postal Service simply provides a downloadable list of its PIAs (labelled Business Impact Assessments) and directs interested citizens to submit a FOIA request for the PIA in question.[63] It is not clear how often the Postal Service updates this downloadable list, making it difficult to determine which systems are still active. The FOIA process is an unnecessarily slow and labor-intensive way to access what should be publicly available documents. OMB should not allow agencies to skirt compliance with the E-Government Act by putting a wall of FOIA procedures between the public and PIAs.
PIAs should be proactively disclosed through a centralized and searchable database run by OMB, similar to regulations.gov. This approach would have myriad benefits. Currently, some federal agencies provide their own databases of PIAs, but centralization would encourage uniform publication of PIAs by all federal agencies.[64] Management by OMB would ensure oversight by a third-party agency. Centralization would encourage economies of scale and thus lead to a more functional and searchable database. Creating a robust database of PIAs would bring more than mere convenience: it would enable the PIA provision of the E-Government Act to fulfill Congress’s goals of encouraging agency accountability, public transparency, and public trust. With regulations.gov as a model, OMB could spin up a centralized website that would make it easier for agencies to publish PIAs and easier for the public to find them.
In the alternative, agencies should maintain an updated PIA webpage with PIAs available as PDFs. This webpage should be searchable, and PDFs should be filterable by sub-agency, topic, and date of publication. A good example of such a webpage is the Consumer Product Safety Commission’s Privacy Impact Assessments repository.[65] OMB should go further than the Consumer Product Safety Commission, though, and direct agencies to tag PIAs with keywords that allow the public to find the relevant PIAs based on basic types of information collected, e.g., Social Security numbers or fingerprints, as well as any advanced technologies the system uses, like AI or facial recognition.
- PTAs should be mandatory for all agencies and published in a timely manner alongside PIAs.
OMB should direct agencies to follow DHS’ model and conduct Privacy Threshold Assessments as a useful exercise for agencies and a transparency tool for the public. The DHS is required to assess and mitigate the privacy risks of the information technology systems and technologies they use through a four-part cycle, beginning with conducting a Privacy Threshold Analysis (PTA).[66] Depending on the results of the PTA, the DHS Privacy Office will reach a conclusion about whether the system or program requires additional privacy compliance documentation, like a Privacy Impact Assessment (PIA).[67] As such, these privacy assessments are crucial for the public to assess how new technologies intrude on the lives of ordinary people. However, the requisite PTAs for many DHS programs have not been released. Without published PTAs, it’s nearly impossible for organizations like EPIC to check agencies’ work and ensure that PIAs are conducted when necessary.
OMB should direct DHS and other agencies to proactively and consistently disclose PTAs soon after they are completed. PTAs identify privacy concerns and determine whether further privacy assessments are required. The results of PTAs therefore determine whether the public is entitled to disclosure about potentially privacy-threatening programs. Withholding PTAs from the public eye obscures one of the most important steps in the process of implementing or updating system and programs. This secrecy undermines the purpose of Section 208 of the E-Government Act, which is to ensure that “privacy considerations and protections are incorporated into all activities of the Department.”[68]
- Privacy risks associated with advances in technology and data capabilities
Advancing technology and data capabilities have increased the privacy risks associated with the government’s use of information systems, and privacy impact assessments have largely not kept up. The increasing use of AI and AI-enabled systems implicates privacy in new ways because these systems 1) are trained on large amounts of personal data from commercial databases and public records and 2) make inferences and assumptions and produce outputs based on personal data that go beyond the risks associated with the collection and dissemination of personal data.[69] These systems are used for purposes like risk scoring, eligibility screening, fraud detection, and predictive policing. The use of commercially available information (CAI) tends to exacerbate the privacy risks associated with data collection and undermines constitutional protections that would otherwise prevent government agencies from collecting the same data directly without a warrant.
AI systems implicate privacy in a number of ways, starting with model training and development. AI systems used by federal agencies are likely to be trained on commercial data that can include consumer data and public records. Since AI systems reflect the data they are trained on, procuring an AI system trained on personal information from commercial data and public records effectively constitutes a collection of that training data. Allowing these types of AI systems to operate on agency records produces inferences that would not otherwise occur. Accurate inferences may reveal private information about someone without their consent,[70] and incorrect inferences can restrict or undermine someone’s access to services and opportunities.[71] Additionally, the data collected to train AI systems can contain historical bias that will then be reflected in AI decisions, perpetuating harmful biases and disproportionately impacting historically marginalized groups.[72]
When the government procures an AI system from a private vendor, the vendor is often the one maintaining the system.[73] Consequently, to operate these systems a government agency will need to transmit government data that includes PII to the vendor through the vendor’s web portal. AI vendors may not properly separate government data from their own commercial and proprietary data, commingling sensitive government data with commercial datasets that are resold or otherwise transmitted to third parties.
The ways in which PII might be exposed go beyond the government handing it over to an AI vendor. PII can also be exposed through data leaks and security vulnerabilities. An agency might not intend to use an AI system to share PII, but some systems—especially large-language models (LLMs)—may unintentionally leak PII during use. ChatGPT, for example, has exposed people’s personal information.[74] Data leaks can also occur when an AI developer fails to secure PII in the training data. Microsoft, for example, accidentally leaked 38TB of data that included passwords and encryption keys while uploading an open-source LLM training data set to GitHub.[75] In addition to unintended data leaks, many AI systems are vulnerable to jailbreaking. This is particularly true for generative AI: despite guardrails designed to restrict what generative AI systems can output, hackers have easily circumvented these guardrails and tricked the systems into outputting PII.[76]
The use of AI requires updated guidance from OMB. First and foremost, OMB should specify that the procurement and use of AI require a PIA. As described above, the procurement and/or use of AI constitutes a new data collection that creates unique privacy risks that PIAs should address. Additionally, OMB should consider incorporating NIST guidance on AI risk management into its PIA guidance, for example, requiring that AI systems be tested for validity and reliability, and that their limitations be documented, before deployment.[77] Lastly, OMB should mandate specific PIA requirements for AI systems, including: 1) reporting additional information about the procurement and use of AI systems; 2) conducting regular AI testing and evaluation processes to identify any errors, biases, vulnerabilities, or privacy risks within AI systems; and 3) setting interagency privacy risk tolerance thresholds based on the NIST AI RMF. For additional details about EPIC’s recommendations to integrate AI requirements into PIAs, see EPIC’s August 8, 2023 memorandum to OMB, attached as Appendix 1.
Of course, it is not just AI systems that pose a threat to privacy in ways that PIAs, as currently conducted, are ill-equipped to handle. The purposeful aggregation of massive amounts of personal and sensitive data into commercial databases available for purchase by government agencies and other entities presents its own challenges to privacy. Commercially available information (CAI) includes large amounts of sensitive data that government agencies do not have the resources to collect on their own and could not collect in the first place without a warrant or some other court order. Consequently, CAI not only carries the traditional privacy concerns associated with the government’s use of PII; it also undermines constitutional protections that would prevent government agencies from collecting certain data in the first place. In particular, the availability of CAI acts as an end run around the Fourth Amendment, allowing government agencies to purchase data they could not otherwise obtain unless they could justify a warrant. This undermines a foundational aspect of our Constitution that protects our privacy and civil liberties. CAI could also have a chilling effect on our First Amendment-protected rights of religion and association because of the prevalence of location data in CAI, which could easily be used to determine who visits particular places of worship or who associates with whom. This may be the case even if government agencies do not use CAI in this way, given the public’s understanding that CAI is available to the government and is already used to get around Fourth Amendment requirements.
Additionally, the indiscriminate nature of the data collection underlying CAI risks amplifying data quality issues. The scale of the collection poses risks related to data access and abuse, and that scale, combined with the indiscriminate nature of the collection, risks overcollection. All of these risks will likely carry over to government agencies that buy large amounts of CAI.
As with AI systems, OMB should make clear that the potential purchase of CAI requires a PIA. This should not be left to interpretation, as some agencies have already tried to avoid any privacy compliance when it comes to using CAI.[78] Additionally, OMB needs to make clear that agencies must consider and address the privacy risks associated with the use of CAI regardless of whether it directly contains PII. Datasets with sensitive information, even information that is not traditional standalone PII, can easily be used to identify someone. Lastly, PIAs assessing CAI should directly address whether the agency could collect the information itself without a warrant or other court order. If an agency cannot collect the data directly without a warrant, then the agency should not purchase the data to avoid the warrant requirement.
- Conclusion
Privacy impact assessments have the potential to be a powerful tool of transparency that can anticipate and prevent problems and protect our privacy. But they must be conducted in a timely and thoughtful manner that grapples with the growing privacy risks created by advancing technologies. To that end, EPIC urges OMB to implement the recommendations described in this comment. For any further questions, please contact EPIC Senior Counsel Jeramie Scott at [email protected].
A summary of EPIC’s recommendations is provided below:
- Recommendations to improve PIAs generally:
- Conduct PIAs as required and publish them promptly;
- Conduct PIAs before systems are in place so that PIAs are pre-decisional documents, not post-hoc rationalizations;
- Produce PIAs that are sufficiently detailed to give the public a full accounting of agency activities and the risks they create; and
- Fully disclose and evaluate the risks created by using third-party technology and third-party data.
- Recommendations to improve transparency:
- Set up a single centralized and searchable database for PIAs, or at a minimum require agencies to publish PIAs in detailed, searchable agency databases; and
- Require agencies to complete Privacy Threshold Assessments and proactively publish them.
- Recommendations to improve how PIAs address AI systems:
- Specify that procurement and use of AI require a PIA to be conducted;
- Incorporate relevant NIST guidance on AI risk management into PIA guidance; and
- Mandate specific PIA requirements for AI systems that include reporting additional information, regular testing and evaluation, and setting privacy risk tolerance thresholds based on the NIST AI RMF.
- Recommendations to improve how PIAs assess commercially available information:
- Specify that purchasing CAI requires a PIA first;
- Make clear that a CAI PIA should be conducted even if there is no traditional PII present in the data; and
- Require PIAs for CAI to address whether a warrant or other court order would be required for the agency to collect the information directly, to prevent end runs around the Fourth Amendment.
[1] Notice of Request for Information (RFI) on Privacy Impact Assessments, 89 Fed. Reg. 5945 (Jan. 30, 2024), https://www.govinfo.gov/content/pkg/FR-2024-01-30/pdf/2024-01756.pdf.
[2] 89 Fed. Reg. at 5945.
[3] EPIC, About Us (2024), https://epic.org/about/.
[4] EPIC, EPIC v. FBI – Privacy Assessments (2017), https://epic.org/documents/epic-v-fbi-privacy-assessments/.
[5] What Facial Recognition Technology Means for Privacy and Civil Liberties: Hearing Before the Subcomm. on Privacy, Technology, and the Law of the S. Comm. on the Judiciary, 112th Cong. (2012).
[6] What Facial Recognition Technology Means for Privacy and Civil Liberties: Hearing Before the Subcomm. on Privacy, Technology, and the Law of the S. Comm. on the Judiciary, 112th Cong. 7 (2012) (statement of Jerome Pender, Deputy Assistant Director, FBI), available at https://www.govinfo.gov/content/pkg/CHRG-112shrg86599/pdf/CHRG-112shrg86599.pdf.
[7] Oversight Hearing of the Federal Bureau of Investigation: Hearing Before the Comm. on the Judiciary, 113th Cong. (2013).
[8] Oversight Hearing of the Federal Bureau of Investigation: Hearing Before the Comm. on the Judiciary, 113th Cong. 13 (2013), https://www.govinfo.gov/content/pkg/CHRG-113shrg88484/pdf/CHRG-113shrg88484.pdf.
[9] Jeramie D. Scott, License Plate Readers – Will the FBI Ever Address Their Privacy Implications (Jan. 28, 2014), https://blog.epic.org/2014/01/28/license-plate-readers-will-the-fbi-ever-address-their-privacy-implications/.
[10] Id.
[11] EPIC, EPIC v. DEA – Privacy Impact Assessments (2016), https://epic.org/documents/epic-v-dea-privacy-impact-assessments/.
[12] Scott Shane & Colin Moynihan, Drug Agents Use Vast Phone Trove, Eclipsing N.S.A.’s, N.Y. Times, Sept. 2, 2013, https://www.nytimes.com/2013/09/02/us/drug-agents-use-vast-phone-trove-eclipsing-nsas.html; Mike Levine, DEA Puts Phone Company Inside Government Offices, ABC News, Sept 1, 2013, https://abcnews.go.com/blogs/headlines/2013/09/dea-program-puts-phone-company-inside-government-offices.
[13] Stopping the Flow of Illicit Drugs In Arizona By Leveraging State, Local And Federal Information Sharing: Hearing Before the Subcomm. on Border & Maritime Security of the House Comm. on Homeland Security, 112th Cong. (2012).
[14] Stopping the Flow of Illicit Drugs In Arizona By Leveraging State, Local And Federal Information Sharing: Hearing Before the Subcomm. on Border & Maritime Security of the House Comm. on Homeland Security, 112th Cong. (2012) (statement for the record of Douglas W. Coleman, DEA special agent), https://www.justice.gov/d9/testimonies/witnesses/attachments/05/21/12//05-21-12-dea-coleman.pdf.
[15] Id.
[16] See, e.g., Devlin Barrett, U.S. Spies on Millions of Drivers, Wall St. J. (Jan. 26, 2015), https://www.wsj.com/articles/u-s-spies-on-millions-of-cars-1422314779.
[17] Letter from Senators Patrick Leahy and Charles Grassley to Attorney General Eric Holder on DEA License Plate Reader Privacy Concerns (Jan. 28, 2015), https://www.grassley.senate.gov/news/news-releases/grassley-leahy-raise-privacy-concerns-about-dea-license-plate-tracking-system.
[18] Stopping the Flow of Illicit Drugs In Arizona By Leveraging State, Local And Federal Information Sharing: Hearing Before the Subcomm. on Border & Maritime Security of the House Comm. on Homeland Security, 112th Cong. (2012) (statement for the record of Douglas W. Coleman, DEA special agent), https://www.justice.gov/d9/testimonies/witnesses/attachments/05/21/12//05-21-12-dea-coleman.pdf.
[19] Id.
[20] John Shiffman, How DEA program differs from recent NSA revelations, Reuters (Aug. 5, 2013), https://www.reuters.com/article/idUSBRE9740AI/.
[21] EPIC, EPIC v. Presidential Election Commission (2018), https://epic.org/documents/epic-v-presidential-election-commission/.
[22] EPIC, EPIC v. Commerce (Census Privacy) (2019), https://epic.org/documents/epic-v-commerce-census-privacy/.
[23] EPIC, EPIC v. U.S. Postal Service (2022), https://epic.org/documents/epic-v-u-s-postal-service/.
[24] Appendix II to Circular A-130, Responsibilities for Managing Personally Identifiable Information, 10, https://obamawhitehouse.archives.gov/sites/default/files/omb/assets/OMB/circulars/a130/a130revised.pdf.
[25] 39 C.F.R § 233.1 – Arrest and investigative powers of Postal Inspectors (2007), https://www.ecfr.gov/current/title-39/chapter-I/subchapter-D/part-233/section-233.1.
[26] EPIC, EPIC v. U.S. Postal Service (2022), https://epic.org/documents/epic-v-u-s-postal-service/.
[27] Joseph Cox, Here’s How the Post Office’s Internet Cops Describe Themselves, Vice (Aug. 31, 2021), https://www.vice.com/en/article/m7enk3/us-postal-inspection-service-icop-presentation (quoting an internal USPIS training presentation); Jana Winter, Facial recognition, fake identities and digital surveillance tools: Inside the post office’s covert internet operations program, Yahoo! News (May 18, 2021), https://news.yahoo.com/facial-recognition-fake-identities-and-digital-surveillance-tools-inside-the-post-offices-covert-internet-operations-program-214234762.html.
[28] Jana Winter, The Postal Service is running a ‘covert operations program’ that monitors Americans’ social media posts, Yahoo! News (Apr. 21, 2021), https://news.yahoo.com/the-postal-service-is-running-a-running-a-covert-operations-program-that-monitors-americans-social-media-posts-160022919.html.
[29] Jake Wiener, New ICE Privacy Impact Assessment Shows All the Ways the Agency Fails to Protect Immigrants’ Privacy, EPIC (Apr. 20, 2023), https://epic.org/new-ice-privacy-impact-assessment-shows-all-the-way-the-agency-fails-to-protect-immigrants-privacy/; American Immigration Council, DHS Publishes Privacy Document About ATDs and the Data They Collect – Two Decades Late (Apr. 23, 2023), https://immigrationimpact.com/2023/04/06/dhs-publishes-privacy-document-alternatives-to-detention/; for information on the ATD program see: Audrey Singer, Immigration: Alternatives to Detention (ATD) Programs, Cong. Research Serv. (Jul. 8, 2019), https://crsreports.congress.gov/product/pdf/r/r45804; American Immigration Council, Alternatives to Immigration Detention: An Overview (Jul. 11, 2023), https://www.americanimmigrationcouncil.org/research/alternatives-immigration-detention-overview.
[30] Privacy Act of 1974; Department of Homeland Security United States Immigration Customs and Enforcement-011 Criminal Arrest Records and Immigration Enforcement Records System of Records, 81 Fed. Reg. 72080 at 72081-3 (Oct. 19, 2016), https://www.federalregister.gov/documents/2016/10/19/2016-25197/privacy-act-of-1974-department-of-homeland-security-united-states-immigration-customs-and.
[31] Joseph V. Cuffari, OIG-23-61 CBP, ICE, and Secret Service Did Not Adhere to Privacy Policies or Develop Sufficient Policies Before Procuring and Using Commercial Telemetry Data (REDACTED), DHS OIG (Sept. 28, 2023), https://www.oig.dhs.gov/sites/default/files/assets/2023-09/OIG-23-61-Sep23-Redacted.pdf.
[32] Pub. L. 91–190, 42 U.S.C. § 4321 et seq. (hereinafter NEPA).
[33] NEPA Title I.
[34] § 208 (b)(1)(A) E-Government Act, Pub. L. No. 107-347, 116 Stat. 2899 (Dec. 17, 2002).
[35] Gov’t Accountability Off., GAO-22-105065 Federal Agency Privacy Programs at 42 (Sept. 2022), https://www.gao.gov/assets/gao-22-105065.pdf.
[36] Gov’t Accountability Off., GAO-22-105065 Federal Agency Privacy Programs at 43 (Sept. 2022), https://www.gao.gov/assets/gao-22-105065.pdf.
[37] Gov’t Accountability Off. GAO-07-522 DHS Privacy Office at 25-30 (Apr. 2007), https://www.gao.gov/assets/gao-07-522.pdf.
[38] Joshua B. Bolten, M-03-22 OMB Guidance for Implementing the Privacy Provisions of the E-Government Act of 2002, Off. of Management & Budget § II(B)(c) (Sept. 26, 2003), https://obamawhitehouse.archives.gov/omb/memoranda_m03-22/#5.
[39] Danielle Keats Citron & Daniel J. Solove, Privacy Harms, 102 Boston U. L. Rev. 793 (2022), https://www.bu.edu/bulawreview/files/2022/04/CITRON-SOLOVE.pdf.
[40] See generally, EPIC & Consumer Reports, How the FTC Can Mandate Data Minimization Through a Section 5 Unfairness Rulemaking, (Jan. 2022), https://epic.org/documents/how-the-ftc-can-mandate-data-minimization-through-a-section-5-unfairness-rulemaking/.
[41] U.S. Immigr. & Customs Enf., DHS/ICE/PIA-062 Alternatives to Detention (ATD) Program (Mar. 17, 2023), https://www.dhs.gov/publication/dhsicepia-062-alternatives-detention-atd-program.
[42] Jake Wiener, New ICE Privacy Impact Assessment Shows All the Ways the Agency Fails to Protect Immigrants’ Privacy, EPIC (Apr. 20, 2023), https://epic.org/new-ice-privacy-impact-assessment-shows-all-the-way-the-agency-fails-to-protect-immigrants-privacy/.
[43] OMB Memorandum 07-16, Safeguarding Against and Responding to the Breach of Personally Identifiable Information (May 22, 2007), https://georgewbush-whitehouse.archives.gov/omb/memoranda/fy2007/m07-16.pdf; GAO-17-553, Social Security Numbers: OMB Actions Needed to Strengthen Federal Efforts to Limit Identity Theft Risks by Reducing Collection, Use, and Display (Jul. 25, 2017), https://www.gao.gov/products/gao-17-553.
[44] U.S. Dep’t of Homeland Sec., Privacy Impact Assessment for the ICE Use of Facial Recognition Services, DHS/ICE/PIA-054 (May 13, 2020), https://www.dhs.gov/publication/dhsicepia-054-ice-use-facial-recognition-services.
[45] Joseph V. Cuffari, OIG-23-61 CBP, ICE, and Secret Service Did Not Adhere to Privacy Policies or Develop Sufficient Policies Before Procuring and Using Commercial Telemetry Data (REDACTED), DHS OIG at 6-8 (Sept. 28, 2023), https://www.oig.dhs.gov/sites/default/files/assets/2023-09/OIG-23-61-Sep23-Redacted.pdf.
[46] Kiran Wattamwar, ICE’s Privacy Impact Assessment on Surveillance Technologies is an Exercise in Disregarding Reality, EPIC (Oct. 5, 2023), https://epic.org/ices-privacy-impact-assessment-on-surveillance-technologies-is-an-exercise-in-disregarding-reality/.
[47] Government Databases, EPIC (last accessed Mar. 28, 2024), https://epic.org/issues/surveillance-oversight/government-databases/.
[48] Dana Khabbaz, DHS’s Data Reservoir: ICE and CBP’s Capture and Circulation of Location Information, EPIC (Aug. 2022), https://epic.org/documents/dhss-data-reservoir-ice-and-cbps-capture-and-circulation-of-location-information/.
[49] Ernest J. Babcock, Privacy Impact Assessment for the Facial Analysis, Comparison, and Evaluation (FACE) Services Unit, FBI (May 1, 2015), https://www.fbi.gov/how-we-can-help-you/more-fbi-services-and-information/freedom-of-information-privacy-act/department-of-justice-fbi-privacy-impact-assessments/facial-analysis-comparison-and-evaluation-face-services-unit.
[50] Jake Wiener, New ICE Privacy Impact Assessment Shows All the Ways the Agency Fails to Protect Immigrants’ Privacy, EPIC (Apr. 20, 2023), https://epic.org/new-ice-privacy-impact-assessment-shows-all-the-way-the-agency-fails-to-protect-immigrants-privacy/.
[51] Patrick Grother, Mei Ngan, & Kayee Hanaoka, Face Recognition Vendor Test Part 3: Demographic Effects, NIST (Dec. 2019), https://nvlpubs.nist.gov/nistpubs/ir/2019/NIST.IR.8280.pdf; Patrick Grother, Mei Ngan, & Kayee Hanaoka, Face Recognition Technology Evaluation (FRTE) Part 2: Identification, NIST 5 (Feb. 2022), https://pages.nist.gov/frvt/reports/1N/frvt_1N_report.pdf.
[52] U.S. Dep’t of Homeland Sec., Privacy Impact Assessment for the ICE Use of Facial Recognition Services, DHS/ICE/PIA-054 (May 13, 2020), https://www.dhs.gov/publication/dhsicepia-054-ice-use-facial-recognition-services.
[53] See e.g., EPIC Comments to GSA on Modified System of Records Notice for Login.gov (Dec. 21, 2022), https://epic.org/documents/epic-comments-modified-system-of-records-notice-for-login-gov/; EPIC, Consumer Federation of America, and Center for Digital Democracy Comments to OSTP on Public and Private Sector Uses of Biometric Technologies (Jan. 15, 2022), https://epic.org/documents/epic-comments-to-ostp-on-public-and-private-sector-uses-of-biometric-technologies/; EPIC Comments to DHS: Advance Collection of Photos at the Border, USCBP-2021-0038 (Nov. 29, 2021), https://epic.org/documents/epic-comments-to-dhs-advance-collection-of-photos-at-the-border/.
[54] Statement of Jeramie Scott at EPIC to House Committee on Homeland Security Subcommittee on Border Security, Facilitation, & Operations Hearing on “Assessing CBP’s Use of Facial Recognition Technology” (July 27, 2022), https://epic.org/wp-content/uploads/2022/07/Testimony-Scott-CBP-FRT-Use-2022.07.27.pdf.
[55] Maria Villegas Bravo, DHS Disregards Internal Policies and Avoids Fourth Amendment Protections to Track Your Location, EPIC (Feb. 8, 2024), https://epic.org/dhs-disregards-internal-policies-and-avoids-fourth-amendment-protections-to-track-your-location/; Kiran Wattamwar, ICE’s Privacy Impact Assessment on Surveillance Technologies is an Exercise in Disregarding Reality, EPIC (Oct. 5, 2023), https://epic.org/ices-privacy-impact-assessment-on-surveillance-technologies-is-an-exercise-in-disregarding-reality/; Jake Wiener, New ICE Privacy Impact Assessment Shows All the Ways the Agency Fails to Protect Immigrants’ Privacy, EPIC (Apr. 20, 2023), https://epic.org/new-ice-privacy-impact-assessment-shows-all-the-way-the-agency-fails-to-protect-immigrants-privacy/.
[56] EPIC v. ICE (Facial Recognition Services), EPIC, https://epic.org/documents/epic-v-ice-facial-recognition-services/ (2024).
[57] U.S. Dep’t of Homeland Sec., Privacy Impact Assessment for the ICE Use of Facial Recognition Services, DHS/ICE/PIA-054 (May 13, 2020), https://www.dhs.gov/publication/dhsicepia-054-ice-use-facial-recognition-services.
[58] EPIC v. ICE (Facial Recognition Services), EPIC, https://epic.org/documents/epic-v-ice-facial-recognition-services/ (2024).
[59] See e.g., Privacy Impact Assessments (PIA) Collection, Dep’t of Homeland Sec., https://www.dhs.gov/publications-library/collections/privacy-impact-assessments-%28pia%29 (2024); Privacy Impact Assessment (PIA) Reports, U.S. Consumer Product Safety Commission, https://www.cpsc.gov/About-CPSC/Agency-Reports/PIA-Reports (2024).
[60] Privacy Impact Assessment (PIA) Repository, Dep’t of Veterans Affairs
[61] See e.g., Department of Justice/FBI Privacy Impact Assessments (PIAs), Fed. Bureau of Investigation, https://www.fbi.gov/how-we-can-help-you/more-fbi-services-and-information/freedom-of-information-privacy-act/department-of-justice-fbi-privacy-impact-assessments, (2024); Privacy Act Information, Federal Communications Commission, https://www.fcc.gov/managing-director/privacy-transparency/privacy-act-information#pia (2024).
[62] DOJ Privacy Impact Assessments, Dep’t of Justice, https://www.justice.gov/opcl/doj-privacy-impact-assessments (2024).
[63] Privacy Impact Assessments (PIA), U.S. Postal Service, https://about.usps.com/who/legal/privacy-policy/privacy-impact-assessments.htm (2024).
[64] Privacy Impact Assessments (PIA) Collection, Dep’t of Homeland Sec., https://www.dhs.gov/publications-library/collections/privacy-impact-assessments-%28pia%29 (2022); Privacy Impact Assessment (PIA) Reports, U.S. Consumer Product Safety Commission, https://www.cpsc.gov/About-CPSC/Agency-Reports/PIA-Reports (2021).
[65] Privacy Impact Assessment (PIA) Reports, U.S. Consumer Product Safety Commission, https://www.cpsc.gov/About-CPSC/Agency-Reports/PIA-Reports (2024)
[66] Privacy Compliance Process, Dep’t of Homeland Sec., https://www.dhs.gov/compliance#:~:text=Privacy%20Threshold%20Analysis%20(PTA),-The%20first%20step&text=The%20DHS%20Privacy%20Office%20reviews,or%20when%20changes%2Fupdates%20occur (last updated Jan. 13, 2022).
[67] Privacy Compliance Process, Dep’t of Homeland Sec., https://www.dhs.gov/compliance#:~:text=Privacy%20Threshold%20Analysis%20(PTA),-The%20first%20step&text=The%20DHS%20Privacy%20Office%20reviews,or%20when%20changes%2Fupdates%20occur (last updated Jan. 13, 2022).
[68] Privacy Policy Guidance and Memorandum, Dep’t of Homeland Sec., https://www.dhs.gov/sites/default/files/publications/privacy_policyguide_2008-02_0.pdf (last accessed July 11, 2022).
[69] See Danielle Keats Citron & Daniel J. Solove, Privacy Harms, 102 B.U. L. Rev. 793, 830–60 (2022) (typologizing different privacy harms); Sandra Wachter & Brent Mittelstadt, A Right to Reasonable Inferences: Re-Thinking Data Protection Law in the Age of Big Data and AI, Colum. Bus. Rev., 2019, at 22–28 (exploring overlap between data inferences and personal data).
[70] See Citron & Solove, supra note 23, at 831–33, 853 (discussing physical harm and lack of control); Wachter & Mittelstadt, supra note 23, at 12–19 (discussing automated methods for inferring intimate details about someone’s identity and life).
[71] See Citron & Solove, supra note 23, at 817, 839–41 (discussing reputational harms caused by inaccuracies); Wachter & Mittelstadt, supra note 23, at 57 (discussing right to rectify inaccurate inferences).
[72] Grant Fergusson, Outsourced and Automated: How AI Companies Have Taken Over Government Decision-Making, Electronic Privacy Information Center (Sept. 2023), https://epic.org/wp-content/uploads/2023/09/FINAL-EPIC-Outsourced-Automated-Report-w-Appendix-Updated-9.26.23.pdf.
[73] See EPIC, Screened & Scored in the District of Columbia at 24-25 (Nov. 2022) (describing one such arrangement with Thomson Reuters), https://epic.org/wp-content/uploads/2022/11/EPIC-Screened-in-DC-Report.pdf; Grant Fergusson, Public Benefits, Private Vendors: How Private Companies Help Run Our Welfare Programs, EPIC Blog (Jan. 26, 2023), https://epic.org/public-benefits-private-vendors-how-private-companies-help-run-our-welfare-programs/.
[74] Jordan Pearson, ChatGPT Can Reveal Personal Information From Real People, Google Researchers Show, Vice (Nov. 29, 2023), https://www.vice.com/en/article/88xe75/chatgpt-can-reveal-personal-information-from-real-people-google-researchers-show.
[75] David Barry, Microsoft’s AI Data Leak Isn’t the Last One We’ll See, Reworked (Sept. 29, 2023), https://www.reworked.co/information-management/microsofts-ai-data-leak-isnt-the-last-one-well-see/.
[76] Mehul Srivastava and Cristina Criddle, Nvidia’s AI software tricked into leaking data, Financial Times (June 9, 2023), https://www.ft.com/content/5aceb7a6-9d5a-4f1f-af3d-1ef0129b0934.
[77] NIST, Artificial Intelligence Risk Management Framework (AI RMF 1.0), 29 (2023).
[78] See DHS Office of Inspector General, CBP, ICE, and Secret Service Did Not Adhere to Privacy Policies or Develop Sufficient Policies Before Procuring and Using Commercial Telemetry Data [REDACTED] (Sept. 2023), https://www.oig.dhs.gov/sites/default/files/assets/2023-09/OIG-23-61-Sep23-Redacted.pdf ; See also Maria Villegas Bravo, Blogpost: DHS Disregards Internal Policies and Avoids Fourth Amendment Protections to Track Your Location (Feb. 8, 2024), https://epic.org/dhs-disregards-internal-policies-and-avoids-fourth-amendment-protections-to-track-your-location/.