Data Minimization

Background

Data minimization offers a practical solution to a broken internet ecosystem by providing clear limits on how companies can collect and use data.

Data minimization is the idea that entities should only collect, use, and transfer personal data that is “reasonably necessary and proportionate” to provide or maintain a product or service requested by the individual. This standard better aligns business practices with what individuals expect and puts people back in control of their own data.

When individuals interact with a business online, they reasonably expect that their data will be collected and used for the limited purpose and duration necessary to provide the goods or services that they requested. For example, an individual using a map application to obtain directions would not reasonably expect that their precise location data would be disclosed to third parties and combined with other data to profile them. Policymakers across the country recognize that users lack adequate privacy protection, but so far, our laws have failed to address the fundamental problem that data-driven systems are not focused on serving the interests of their users.

Our current laws and standard business practices have turned privacy into a check-box compliance process that adds cost without adding any value to users or businesses. We are left with tens of thousands of pages of unread privacy policies, and billions of data points collected and trafficked every day without adequate protection. We need to reorient data practices, standards, and laws around the user-centered approach of data minimization.

Data minimization is not a new concept. As computers became commonplace throughout the 1970s and the government had more and more personal information on residents, data minimization became a core principle of privacy protection. The Privacy Act of 1974, a landmark privacy law regulating how federal agencies should handle personal information, requires data minimization. 

Privacy laws grounded in the concept of data minimization have been enacted in the United States and the European Union in recent years, and data minimization requirements have featured prominently in privacy proposals considered by Congress.

Data minimization requirements in practice

European Union: General Data Protection Regulation (GDPR)

The European Union’s General Data Protection Regulation (“GDPR”), passed in 2016, is one of the most robust privacy regimes in the world. At the heart of this comprehensive privacy framework is data minimization. 

The GDPR starts from the baseline assumption that personal data cannot be collected or processed—a fundamentally different starting point than what exists in the United States given the lack of any comprehensive federal privacy legislation. Under the GDPR, an entity may lawfully collect and process personal data only if specific conditions are met. The GDPR lists six permissible purposes that allow an entity to process personal data; if an entity cannot claim one of these six bases, it is not legally allowed to process personal data. Entities must also proactively declare which of these six legal bases they are relying on before processing any data, including collecting the data, and before data is processed for any secondary purposes.

Consent is one of the legal bases for processing. Consent under the GDPR must be “freely given, specific, informed, and unambiguous.” The GDPR’s high bar for consent means that, unlike in the U.S., actions like continuing to use a website or clicking “I accept” on a long, confusing privacy policy that does not meaningfully inform consumers about a company’s data practices do not qualify as consent under the GDPR. 

Once an entity has identified a legal basis for processing personal data, the GDPR relies on data minimization principles to dictate both how much and what type of data can be collected and how that data can be used. The GDPR requires data collection to be limited to a “specified, explicit and legitimate purpose.” Once personal data is collected, its use must be limited to what is “adequate, relevant and limited to what is necessary in relation to the purposes for which they are processed.” Entities cannot process personal data in a manner inconsistent with the purpose for which it was collected. These data minimization and purpose limitation requirements ensure that entities’ collection and use of personal data is in line with consumers’ expectations.

California: California Consumer Privacy Act (CCPA)

California passed its consumer privacy law, the California Consumer Privacy Act (“CCPA”), in 2018, two years after the GDPR passed, and it relies on similar data minimization concepts as the GDPR. California then amended the CCPA in 2020 by adopting the California Privacy Rights Act (“CPRA”) via ballot measure. California’s dedicated privacy agency, the California Privacy Protection Agency (“CPPA”), was created by CPRA and has rulemaking authority to promulgate regulations to help implement the CCPA. 

California’s Attorney General promulgated an initial round of regulations implementing the CCPA in August 2020. The Agency updated those regulations, which went into effect in March 2023. EPIC provided extensive input on the rules in November 2021, May 2022, August 2022, and November 2022, urging the Agency to clarify and strengthen the CCPA’s data minimization requirements.

Data minimization requirements in the CCPA

The CCPA incorporates strong data minimization requirements to protect Californians from harmful overcollection of personal information, out-of-context impermissible secondary data uses, and excessive data retention. According to a recent CPPA enforcement advisory, “[d]ata minimization is a foundational principle in the CCPA.” 

The Agency explained in its regulations that the data minimization rules mean that: 

[B]usinesses must limit the collection, use, and retention of your personal information to only those purposes that: (1) a consumer would reasonably expect, or (2) are compatible with the consumer’s expectations and disclosed to the consumer, or (3) purposes that the consumer consented to, as long as consent wasn’t obtained through dark patterns. 

For all of these purposes, the business’ collection, use, and retention of the consumer’s information must be reasonably necessary and proportionate to serve those purposes.

Essentially, businesses can only collect and use personal information if it is for a purpose that is in line with what consumers would expect. If businesses want to collect or use personal information for a purpose a consumer would not expect, then they must obtain the consumer’s consent. The Agency’s regulations also set out rules for what counts as consent and prohibit businesses from using confusing language or difficult-to-use options to obtain it.

Because California has a dedicated privacy agency that has full rulemaking authority under the CCPA, the regulations promulgated by the CPPA are an important resource in understanding what these requirements mean in practice. For example, California’s regulations provide factors and examples to help businesses determine whether their collection, use, retention, or disclosure of personal information is “reasonably necessary and proportionate” for the purpose for which it was collected:

(1) The minimum personal information that is necessary to achieve the purpose identified …. For example, to complete an online purchase and send an email confirmation of the purchase to the consumer, an online retailer may need the consumer’s order information, payment and shipping information, and email address. 

(2) The possible negative impacts on consumers posed by the business’s collection or processing of the personal information. For example, a possible negative impact of collecting precise geolocation information is that it may reveal other sensitive personal information about the consumer, such as health information based on visits to healthcare providers.

(3) The existence of additional safeguards for the personal information to specifically address the possible negative impacts on consumers …. For example, a business may consider encryption or automatic deletion of personal information within a specific window of time as potential safeguards.

11 CCR § 7002(d)

Similarly, the regulations also help businesses decide whether collection, use, retention, or disclosure of personal information would be compatible with the context of the interaction in which the personal information was collected. The regulations instruct businesses to consider “the reasonable expectations of the consumer(s) whose personal information is collected or processed concerning the purpose for which their personal information will be collected or processed” and “the other disclosed purpose for which the business seeks to further collect or process the consumer’s personal information.” 

If there is a strong link between the reason a business initially collected a consumer’s personal information and the new reason the business wants to collect or process the information, this further processing is likely compatible with the context of the relationship. If it is unrelated, however, the further processing is likely impermissible. The regulations give the example:

[A] weak link exists between the consumer’s reasonable expectations that the personal information will be collected to provide a requested cloud storage service at the time of collection, and the use of the information to research and develop an unrelated facial recognition service.

11 CCR § 7002(c)(3)

The CCPA’s data minimization standard and the regulations expanding upon these limitations are important resources for other states seeking to pass meaningful privacy laws to protect their residents. EPIC urges state legislators to look to California’s law and regulations as a framework on which to build a privacy law that focuses on data minimization rather than passing weak laws that allow abusive data practices to continue unchecked.

Maryland Online Data Privacy Act

In May 2024, Governor Wes Moore signed the Maryland Online Data Privacy Act. Maryland’s law includes data minimization requirements that limit the collection of personal data to what is reasonably necessary for the product or service requested by a consumer, prohibit the sale of sensitive personal data, ban targeted advertising to kids and teens, and prohibit the processing of personal data in ways that discriminate. The Maryland Online Data Privacy Act goes into effect on October 1, 2025.

Federal Trade Commission 

The Federal Trade Commission is the chief federal agency tasked with protecting consumer privacy. Under its mandate to protect consumers and promote competition, Section 5 of the FTC Act gives the Commission authority to prevent unfair and deceptive acts or practices—both by promulgating trade regulation rules that define with specificity acts or practices that are unfair or deceptive and through case-by-case enforcement actions. After the Commission has established a trade regulation rule through its rulemaking process, a violation of the rule constitutes an unfair or deceptive act or practice that carries a steep monetary penalty of $51,744 per violation.

In August 2022, the Federal Trade Commission announced an Advance Notice of Proposed Rulemaking (ANPR) to establish a trade regulation rule on commercial surveillance and data security, the first step in the Commission’s rulemaking process. EPIC has explained the Commission’s ANPR and rulemaking process in detail. The ANPR sought public comment on whether the Commission should implement a rule that regulates the ways in which companies collect, use, aggregate, protect, analyze, and retain consumer data. The ANPR posed 95 questions in the following categories: harms to consumers; harms to children; costs and benefits; regulations; automated systems; discrimination; consumer consent; notice, transparency, and disclosure; remedies; and obsolescence.

EPIC had previously urged the Commission to use its rulemaking authority to establish a data minimization rule that would prohibit all secondary data uses with limited exceptions. EPIC submitted extensive comments entitled “Disrupting Data Abuse” to the FTC on its Proposed Trade Regulation Rule on Commercial Surveillance & Data Security. Among other things, EPIC encouraged the Commission to establish a data minimization standard that would provide:

It is an unfair trade practice to collect, use, transfer, or retain personal data beyond what is reasonably necessary and proportionate to the primary purpose for which it was collected, consistent with consumer expectations and the context in which the data was collected.

EPIC explained the serious harms that consumers face when their data is used in out-of-context ways that violate consumers’ expectations. EPIC has continued to refine the arguments raised in “Disrupting Data Abuse” through our ongoing blog series on data minimization and remains active in the Commission’s rulemaking process. 

The FTC also has authority to prohibit deceptive and unfair acts or practices on a case-by-case enforcement basis. In several of its recent enforcement actions, the FTC has incorporated data minimization requirements into consent orders. These include proposed or final consent orders in the following FTC actions: Blackbaud, Avast, X-Mode Social, InMarket Media, RiteAid, Global Tel*Link, BetterHelp, Amazon Alexa, Easy Healthcare, Edmodo, GoodRX Holdings, Epic Games, Chegg, Drizly, CafePress, and Kurbo (f/k/a Weight Watchers). These consent orders impose different types of data minimization provisions, including requirements to:

  • Implement a comprehensive data security program with strong safeguards, including a data collection and retention schedule;
  • Destroy unnecessary data;
  • Restrict the consumer data that a company can collect and retain;
  • Provide customers with access and deletion rights for the information that the company collects about them;
  • Bolster data security by minimizing the amount of data the company collects and retains;
  • Direct third parties to delete consumer data that was shared with them;
  • Limit retention of data;
  • Publicly post the information a company collects and why such data collection is necessary;
  • Halt the use of any technology if the company cannot control potential risks to customers;
  • Stop sharing user health data with applicable third parties for advertising purposes;
  • Stop selling or licensing any precise location data, including sensitive location data that could be used to track whether people attended sensitive locations like medical clinics, places of religious worship, and domestic abuse shelters;
  • Obtain consent before collecting and using customers’ location data for advertising and marketing;
  • Stop selling or licensing web browsing data for advertising purposes; and
  • Stop sharing consumers’ health data, including sensitive information relating to mental health conditions, for advertising purposes.

The Commission secured bright-line bans on sharing sensitive health information for advertising purposes—a data minimization requirement with a strong purpose limitation—in five cases: Proposed Order, United States v. Cerebral, Inc.; Proposed Order, United States v. Monument; Order, United States v. GoodRx Holdings, Inc.; Order, In re BetterHelp, Inc.; Order, United States v. Easy Healthcare Corp. The proposed order in GoodRx permanently prohibited the company from sharing health information for advertising purposes with applicable third parties and required user consent before sharing health information with applicable third parties for other purposes. It also required the company to direct third parties to delete customer health data and to limit the future collection and retention of customer health information.

In a 2021 report, What the FTC Could Be Doing (But Isn’t) to Protect Privacy, EPIC pushed the Commission to use all of its authorities to establish strong privacy protections for consumers, including data minimization requirements. EPIC has regularly filed comments with the Commission regarding its proposed consent orders, applauding the Commission for including data minimization principles and pushing for the strongest possible requirements in final orders and future enforcement actions.

The Federal Communications Commission (FCC) has also required data minimization in a consent decree to better protect consumers’ privacy. The FCC and T-Mobile entered into a consent decree following several breaches that caused T-Mobile customers’ customer proprietary network information and personal information to be exposed. The consent decree required that the company adopt a data minimization and deletion program to limit its collection and retention of customer information.

U.S. legislative efforts to adopt data minimization 

The American Data Privacy and Protection Act (ADPPA)

In 2022, bipartisan leaders on the House Energy & Commerce Committee and Senate Commerce Committee took a significant step toward securing nationwide privacy protections by proposing the American Data Privacy and Protection Act (ADPPA). At the heart of ADPPA was a strong data minimization requirement that entities only collect, use, and transfer data that is reasonably necessary, proportionate, and limited to: (1) provide a specific product or service requested by the individual or (2) a communication reasonably anticipated within the context of the relationship, with some enumerated exceptions. 

On top of this baseline protection, ADPPA placed heightened restrictions on sensitive data: Covered entities could only collect and process sensitive data if it was strictly necessary to provide the product or service the consumer requested (or for a limited subset of the enumerated permissible purposes). The transfer of sensitive data was even further limited to only a handful of specific permissible purposes or with the consumer’s affirmative express consent. 

Notably, the bill would have prohibited targeted advertising based on a person’s activities over time and across websites and online services, or over time on high-impact social media. ADPPA also included strong civil rights protections that prohibited entities from processing data in ways that discriminate. 

The bill went through extensive negotiations between members of Congress of both parties, industry, civil rights groups, and consumer protection and privacy groups. EPIC supported ADPPA, including by testifying on the bill in the House Subcommittee on Consumer Protection and Commerce, joining nearly 50 other public interest groups in urging the House to vote on ADPPA, and sending a letter to the chairs of the Senate Commerce Committee urging them to hold a markup of the bill. 

ADPPA received overwhelming bipartisan support in the House Energy & Commerce Committee, where it was favorably approved on a 53-2 vote. Unfortunately, Congress failed to enact ADPPA. 

After ADPPA failed to pass, EPIC used the bill to craft the State Data Privacy and Protection Act, a state bill that mirrored ADPPA’s bipartisan compromise. Versions of the State Data Privacy and Protection Act were introduced in Maine, Massachusetts, and Illinois.

The American Privacy Rights Act (APRA) 

In April 2024, the American Privacy Rights Act of 2024 (APRA) was announced by Senate Commerce Committee Chair Maria Cantwell (D-WA) and House Energy and Commerce Committee Chair Cathy McMorris Rodgers (R-WA). This bicameral, bipartisan discussion draft builds on Senator Cantwell’s earlier Consumer Online Privacy Rights Act and on the American Data Privacy and Protection Act that Chair Rodgers and Ranking Member Pallone passed out of the House Energy and Commerce Committee in the summer of 2022.

APRA contains similar requirements to ADPPA, although it eliminates the two-tiered structure in favor of a single standard. Under APRA, covered entities and service providers are prohibited from collecting, processing, retaining, or transferring personal data beyond what is necessary, proportionate, and limited to provide the requested product or service (or for certain enumerated permissible purposes). The term “proportionate” helps provide sensitive data with heightened protection because what is proportionate depends on the type of data being collected.

The bill is still pending in Congress. EPIC has called on the House Energy & Commerce Committee to reinstate critical civil rights protections that were stripped from the bill before introduction. 

States

Of the 19 states that have passed “comprehensive” privacy legislation in recent years, only California’s and Maryland’s laws contain meaningful data minimization rules.

Despite the frequent claims of technology companies and industry lobbyists, the Virginia/Connecticut “model” and the state laws that have followed do not require real data minimization. These laws contain provisions that purport to be data minimization but only include language requiring controllers to limit collection of personal data to what is “adequate, relevant and reasonably necessary in relation to the purposes for which such data is processed, as disclosed to the consumer.” The key words “as disclosed to the consumer” mean that businesses are not really limited at all—they may collect and use data for any purposes they disclose in their privacy policies that no one ever reads.

These laws allow businesses to continue collecting whatever personal data they want and using it for any reason they want as long as they disclose that reason in their privacy policy—ensuring that the status quo of massive data collection and sale continues uninterrupted. In fact, it encourages companies to write those purposes as broadly as possible to cover any possible use of data they may want in the future. A real data minimization rule limits the data that companies can collect and use to what consumers expect.

To illustrate the difference, consider a person using a ride-share app to request a ride to a doctor’s office or a place of worship. That person expects the app to use their location to allow the driver to pick them up at their location and drop them off at their destination. The user does not expect the app to continuously track their location long after the ride has ended or to sell the fact that they were dropped off at these sensitive destinations to data brokers. 

True data minimization would not allow these unexpected secondary uses of personal data. The so-called data minimization found in the Virginia/Connecticut “model” and the other states that have adopted similar laws would allow these unexpected and unfair data practices to continue unchecked as long as these data uses were disclosed in a boilerplate privacy policy. 

To meaningfully protect privacy, laws and regulations should include real data minimization protections like those found in the GDPR, the CCPA, Maryland’s Online Data Privacy Act, and proposed federal legislation.

EPIC’s work on data minimization 

EPIC has been advocating for a strong comprehensive federal privacy law for 30 years. As part of that effort, EPIC has urged the adoption of data minimization standards in privacy laws and regulations at both the state and federal levels.

In January 2022, EPIC accelerated this effort when it released a joint white paper with Consumer Reports that provided a detailed roadmap for how the Federal Trade Commission (FTC) should issue a data minimization rule under its unfair practices authority. The paper urged the FTC to establish a Data Minimization Rule to prohibit all secondary data uses with limited exceptions, ensuring that people can safely use apps and online services without having to take additional action. 

As outlined above, the American Data Privacy and Protection Act of 2022, released later that year, contained similar data minimization rules. EPIC worked closely with policymakers and civil society partners to strengthen those provisions. We were invited to testify in support of the bill in the House, and while Congress unfortunately was not able to pass the ADPPA in 2022, EPIC seized on this momentum to call on federal regulators to include data minimization standards in federal rules. In the fall of 2022, EPIC submitted extensive comments entitled “Disrupting Data Abuse” to the FTC on its Proposed Trade Regulation Rule on Commercial Surveillance & Data Security.

Similar rules were proposed in the American Privacy Rights Act in 2024.

EPIC has also taken a leadership role in promoting the enactment of stronger state privacy laws that include data minimization standards. We built on ADPPA’s momentum to propose a new model for state legislation to counter Big Tech’s influence on state privacy laws. EPIC converted ADPPA’s bipartisan compromise language into a model state bill and worked with legislators in Maine, Massachusetts, and Illinois to introduce versions of “State ADPPA.” EPIC staff have spent the last year working in those states and many others to educate lawmakers and their staffs about the importance of incorporating data minimization language into state privacy laws and to provide expert technical assistance to help draft those provisions. As outlined above, the Maryland Online Data Privacy Act includes key data minimization provisions from State ADPPA. Versions of the bill came close to passage in Vermont and Maine. A committee in Massachusetts gave the bill a favorable report.

But we have heard from many lawmakers that they would prefer to strengthen existing state laws rather than enact a new framework. With this in mind, EPIC recently released a new model bill for states – the State Data Privacy Act – with Consumer Reports. We chose the Connecticut Data Privacy Act (CTDPA) as base text to build from, as the CTDPA is the bill most often cited by industry as the model they would like states to adopt. CTDPA is far too weak, but it is an established framework that many state lawmakers are already familiar with. Strengthening the CTDPA provides consistency for businesses while giving consumers meaningful privacy protections. 

EPIC is happy to work with any policymakers interested in data minimization rules and frameworks.
