Privacy Impact Assessments
Privacy impact assessments are a potentially powerful mechanism for minimizing personal data collection, ensuring that institutions weigh the privacy risks of new projects, and keeping the public informed about the collection and use of their personally identifiable information.
A privacy impact assessment (or data protection impact assessment) is an analysis of how personally identifiable information will be collected, processed, stored, and transferred. When implemented properly, privacy impact assessments (PIAs) force government agencies and other institutions to carefully evaluate and publicly disclose the privacy risks of a proposed action, system, or project. Privacy impact assessments are required of U.S. federal agencies under the E-Government Act of 2002 and mandated for all projects that pose a “high risk” to personal data under the European Union’s General Data Protection Regulation. EPIC and others have also advocated for the adoption of algorithmic impact assessments, which would force entities to evaluate the privacy, equity, and human rights implications of AI and algorithmic decision-making systems before deployment.
How Privacy Impact Assessments Work
As Professor Gary T. Marx writes, the object of a privacy impact assessment is to “anticipate problems, seeking to prevent, rather than to put out fires.” When an institution is deciding whether to initiate a collection of personal data or to adopt a system that will process personal data, it should—and in some cases must—conduct a privacy impact assessment before proceeding. An impact assessment enables the entity to identify privacy risks, to determine how and if those risks can be mitigated, and to make an informed decision whether the proposed collection or system can be justified in light of its privacy impact. In many cases, an impact assessment also serves to inform the public of a data collection or system that poses a threat to privacy. Privacy impact assessments are analogous to the environmental impact statements that federal agencies and other entities must complete before initiating projects that will significantly affect the environment.
A privacy impact assessment is not a simple box-checking exercise or a static, one-off undertaking. As David Wright and Paul de Hert explain, “it is a process which should begin at the earliest possible stages, when there are still opportunities to influence the outcome of a project. It is a process that should continue until and even after the project has been deployed.” Or as the Office of Management and Budget warns federal agencies, a privacy impact assessment “is not a time-restricted activity that is limited to a particular milestone or stage of the information system or [personally identifiable information] life cycles. Rather, the privacy analysis shall continue throughout the information system and PII life cycles.”
The Rise of Privacy Impact Assessments
Privacy impact assessments date back at least as far as the 1990s. The New York Public Service Commission was reportedly one of the first regulators to require a privacy impact assessment in 1991, and the Internal Revenue Service published its first privacy impact assessment in 1996. The IRS’s assessment was later cited in 2000 as a model for other federal agencies by the federal Chief Information Officers Council.
In 2002, Congress enacted the E-Government Act with the aim of “mak[ing] the Federal Government more transparent and accountable.” In order to “ensure sufficient protections for the privacy of personal information,” section 208 of the E-Government Act requires federal agencies to complete, review, and publish a privacy impact assessment before initiating the collection of personal data or procuring information technology. Specifically, “before . . . initiating a new collection of information” in an identifiable form that “will be collected, maintained, or disseminated using information technology” from ten or more persons, the agency must “conduct a privacy impact assessment”; “ensure the review of the privacy impact assessment by the Chief Information Officer, or equivalent official”; and “make the privacy impact assessment publicly available through the website of the agency, publication in the Federal Register, or other means.”
Under the E-Government Act, a privacy impact assessment must be “commensurate with the size of the information system being assessed, the sensitivity of information that is in an identifiable form in that system, and the risk of harm from unauthorized release of that information[.]” Under OMB guidance, an assessment must address, in particular, what information is to be collected; why the information is being collected; the intended use of the information; with whom the information will be shared; what opportunities individuals have to decline to provide the information or to consent to particular uses of it; how the information will be secured; and whether a system of records is being created under the Privacy Act.
OMB regulations dictate that agencies must complete privacy impact assessments “from the earliest stages of” the information collection process and continue that analysis throughout it.
Though compliance with the E-Government Act varies widely from agency to agency and project to project, the statute has often succeeded in forcing agencies to notify the public about new collections of personal data and systems that will process personal data. Agencies including the Department of Homeland Security, the Department of Justice, the Department of Defense, and the Department of Commerce maintain substantial public databases of privacy impact assessments on their agency websites.
Privacy impact assessments, or data protection impact assessments, are also required by the European Union’s General Data Protection Regulation (GDPR) for all high-risk data processing activities. Article 35 of the GDPR states that “[w]here a type of processing in particular using new technologies, and taking into account the nature, scope, context and purposes of the processing, is likely to result in a high risk to the rights and freedoms of natural persons, the controller shall, prior to the processing, carry out an assessment of the impact of the envisaged processing operations on the protection of personal data.”
Article 35 also specifies that a data protection impact assessment is required for “a systematic and extensive evaluation of personal aspects relating to natural persons which is based on automated processing, including profiling, and on which decisions are based that produce legal effects concerning the natural person or similarly significantly affect the natural person”; for “processing on a large scale of special categories of data referred to in Article 9(1), or of personal data relating to criminal convictions and offences referred to in Article 10”; and for “a systematic monitoring of a publicly accessible area on a large scale.”
EPIC’s Work on Privacy Impact Assessments
EPIC has long worked to promote the use of privacy impact assessments and to ensure strict adherence to PIA requirements. In particular, EPIC has sought to enforce the obligation of federal agencies to conduct and publish privacy impact assessments under the E-Government Act. In EPIC v. DHS, No. 11-2261 (D.D.C. filed Dec. 20, 2011), EPIC obtained a privacy impact assessment and related records concerning a prior effort by the DHS to track social media users and journalists. In EPIC v. FBI, No. 14-1311 (D.D.C. filed Aug. 1, 2014), EPIC obtained unpublished privacy impact assessments from the Federal Bureau of Investigation concerning facial recognition technology. And in EPIC v. DEA, No. 15-667 (D.D.C. filed May 1, 2015), EPIC learned that the Drug Enforcement Administration had failed to produce privacy impact assessments for the agency’s license plate reader program, a telecommunications records database, and other systems of public surveillance.
More recently, in EPIC v. Presidential Advisory Commission on Election Integrity, No. 17-1320 (D.D.C. filed July 3, 2017), EPIC challenged the failure of the Presidential Advisory Commission on Election Integrity to undertake and publish a PIA prior to the collection of state voter data. EPIC’s suit led the now-defunct Commission to suspend its data collection and later delete all of the voter information that had been illegally obtained. In EPIC v. DHS, No. 18-1268 (D.D.C. filed May 30, 2018), EPIC succeeded in blocking the development of a Department of Homeland Security system designed to monitor journalists. In EPIC v. Commerce, 928 F.3d 95 (D.C. Cir. 2019), EPIC challenged the Census Bureau’s failure to conduct and publish a PIA before collecting citizenship data in the 2020 Census. The citizenship question was ultimately withdrawn by the Bureau. And in EPIC v. USPS, EPIC brought suit to stop the U.S. Postal Service’s law enforcement arm from using facial recognition and social media monitoring tools at least until the agency has completed required privacy impact assessments.
Since 2018, EPIC has also advocated for the wide adoption of algorithmic impact assessments, which would force entities to evaluate the privacy, equity, and human rights implications of AI and algorithmic decision-making systems before deployment. EPIC was instrumental in the introduction of the Algorithmic Accountability Act, which would require companies to conduct impact assessments to determine if their algorithms are “inaccurate, unfair, biased, or discriminatory.”
Recent Documents on Privacy Impact Assessments
EPIC v. U.S. Postal Service
US District Court for the District of Columbia
Seeking to stop the U.S. Postal Service's law enforcement arm from using facial recognition and social media monitoring tools.
EPIC v. DEA – Privacy Impact Assessments
US District Court for the District of Columbia
Seeking all privacy impact assessments created by the Drug Enforcement Administration that are not currently available on the agency's website, as well as other privacy-related documents created since 2007.
EPIC v. FBI – Privacy Assessments
US District Court for the District of Columbia
Seeking all privacy impact assessments created by the Federal Bureau of Investigation that are not currently available on the agency's website, as well as other privacy-related documents created since 2007.
Pub. L. No. 107-347, 116 Stat. 2899
OMB Circular A-130: Managing Information as a Strategic Resource
OMB | 2016
OMB Guidance for Implementing the Privacy Provisions of the E-Government Act of 2002
Joshua B. Bolten | 2003
Privacy Impact Assessment (PIA) Guide
Sec. & Exchange Comm’n | 2007
General Data Protection Regulation, Art. 35
Privacy Impact Assessment
David Wright & Paul de Hert, eds. | 2012
Algorithmic Impact Assessments Under the GDPR: Producing Multi-Layered Explanations
Margot Kaminski & Gianclaudio Malgieri | 2020