Data Minimization: A Pillar of Data Security, But More Than That Too
June 22, 2023
This is the third in a series of blog posts about EPIC’s proposal for a data minimization standard to limit commercial surveillance and protect consumer privacy. In our first post, my colleague Suzanne Bernstein explained that data minimization is a framework for limiting the collection, use, transfer, and retention of personal information and discussed how minimization is a way to fulfill the reasonable expectations of consumers concerning the use and protection of their personal data. In our second post, my colleague Sara Geoghegan talked about the harms that often flow from secondary uses of personal data and how purpose limitations—a critical part of any data minimization framework—fit into the Federal Trade Commission’s authority to regulate unfair commercial data practices.
Today’s post highlights the important role data minimization can play in data security while also underscoring that a robust minimization framework must do more than protect against breaches and unauthorized access of personal data.
Last summer, the FTC published a lengthy request for comment signaling that it intended to adopt new rules governing the commercial processing of personal data—something EPIC had previously urged the Commission to do. In November, EPIC filed extensive comments with the FTC setting out the scale of today’s data privacy crisis, discussing the Commission’s legal authority to establish robust rules, and identifying specific harmful business practices that the Commission should regulate. Those comments cover a lot of ground (summary here), but they lead with EPIC’s long-running call for a data minimization rule—specifically, a declaration by the FTC that it is an unfair trade practice for a business to collect, use, retain, or transfer a consumer’s personal data beyond what is reasonably necessary to provide the product or service the consumer requested.
By defining this practice as unfair, the FTC can unlock the ability to impose significant fines on violators that collect, process, and transfer excessive personal data. Briefly stated, the Commission’s authority to issue trade rules extends to business practices that are both (1) unfair or deceptive and (2) prevalent. A business practice is considered “unfair” if it’s likely to cause substantial injury that consumers can’t reasonably avoid and that isn’t outweighed by countervailing benefits to consumers or competition; it’s considered “deceptive” if it involves a representation or omission likely to mislead consumers. To establish that a harmful business practice is “prevalent,” the Commission can rely on two types of evidence: (1) past cease and desist orders concerning that business practice, or (2) “any other information available to the Commission [that] indicates a widespread pattern of unfair or deceptive acts or practices.”
Data Security Means Data Minimization
Alongside commercial surveillance, data security is one of the two principal areas of concern highlighted in the FTC’s initial proposal for the (aptly named) Trade Regulation Rule on Commercial Surveillance and Data Security. EPIC has often criticized the FTC’s failure to safeguard the privacy of consumers over the past two decades. But a relative bright spot has been the Commission’s data security work, particularly the cases the FTC has pursued against businesses that fail to secure personal data from breach or unauthorized access. Since 2000, the Commission has brought more than 90 enforcement actions alleging unfair, deceptive, or otherwise unlawful data security practices by companies entrusted with consumers’ personal information. This includes cases against major firms like Microsoft, Twitter, Rite Aid, CVS, Petco, DSW, Oracle, Snapchat, Fandango, Credit Karma, Wyndham, PayPal, Office Depot, Equifax, Zoom, and Amazon. The Commission has also used its targeted statutory authorities to establish data security requirements through the Safeguards Rule and the Children’s Online Privacy Protection Rule and to flesh out reporting requirements for companies that fail to protect personal health information through the Health Breach Notification Rule.
This body of data security work is an important foundation for the FTC’s rulemaking. First, it helps the Commission establish the factual and legal basis for adopting trade rules on data security. It’s hard to look at the steady drumbeat of FTC enforcement actions over the past twenty years and deny that harmful data security practices are prevalent in today’s economy or that consumers suffer a panoply of privacy and financial harms as a result. (You’ll find much more on this topic beginning on page 181 of our comments from the fall.)
The FTC’s work in this area also highlights some of the specific elements that should be incorporated into industry-wide rules on data security. As we noted in our comments, the FTC’s data security cases clearly establish (1) that it is unfair for businesses to fail to maintain reasonable administrative, technical, and physical measures to secure consumers’ personal data against unauthorized access, and (2) that it is deceptive for businesses to claim that they secure the personal data they collect and process if, in fact, they fail to implement the necessary safeguards. The Commission has also identified some of the common hallmarks of inadequate data security, such as failing to maintain vulnerability disclosure policies, failing to patch known software vulnerabilities, failing to segment database servers, storing Social Security numbers in unencrypted plain text, transmitting personal information in plain text, and failing to perform vulnerability and penetration testing as part of timely security reviews.
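To make one of those hallmarks concrete, here is a minimal sketch of field-level encryption at rest, using the symmetric Fernet scheme from Python’s widely used cryptography library. It is an illustration only, not a compliance recipe: the field names are hypothetical, and a real system would keep the key in a key management service rather than in application code.

```python
# Hypothetical sketch: encrypt a sensitive field before storage instead of
# keeping it in plain text. Fernet provides authenticated symmetric encryption.
from cryptography.fernet import Fernet

# Assumption: in production this key would come from a key management
# service, never from source code or the same database as the ciphertext.
key = Fernet.generate_key()
fernet = Fernet(key)

def store_ssn(ssn: str) -> bytes:
    """Return ciphertext safe to persist; a database breach exposes no SSNs."""
    return fernet.encrypt(ssn.encode())

def retrieve_ssn(token: bytes) -> str:
    """Decrypt only when an authorized business purpose requires the value."""
    return fernet.decrypt(token).decode()

ciphertext = store_ssn("123-45-6789")
assert retrieve_ssn(ciphertext) == "123-45-6789"
```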
So what does all of this have to do with data minimization? Quite a bit, it turns out. The relationship between data security and data minimization is perhaps best summarized by the maxim “You don’t have to protect what you don’t collect.” Every piece of personal information collected and retained by a business is inherently at risk of unauthorized access and use. Technical and physical safeguards are certainly vital to limiting that risk, but one surefire strategy is for businesses to limit the data they collect and process in the first place. Practicing data minimization makes businesses less attractive targets for data thieves and hackers, limits the harm to consumers if and when breaches do occur, and fully eliminates the risk of breach for data elements that are never collected to begin with. Indeed, the FTC’s data security work has increasingly come to reflect this relationship between security and minimization. The Commission’s recent orders against Chegg, Drizly, and CafePress all incorporate data minimization requirements, and the FTC’s initial notice of proposed rulemaking on commercial surveillance and data security expressly defines data security to include data minimization.
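The maxim is easy to express in code. The sketch below is a hypothetical illustration with assumed field names: a handler that discards every submitted data element not needed for the primary purpose, so the discarded fields can never be breached, sold, or repurposed.

```python
# Hypothetical sketch of collection-layer minimization: keep only the fields
# the primary purpose requires and discard the rest before anything is stored.
REQUIRED_FIELDS = {"email", "display_name"}  # assumed primary-purpose fields

def minimize(submitted: dict) -> dict:
    """Drop every data element that isn't necessary for the primary purpose."""
    return {k: v for k, v in submitted.items() if k in REQUIRED_FIELDS}

form = {"email": "a@example.com", "display_name": "A", "phone": "555-0100"}
record = minimize(form)
assert "phone" not in record  # never retained, so it can never be breached
```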
But Data Minimization Means More Than Data Security
EPIC is encouraged by the Commission’s focus on data minimization, data security, and the link between the two. But as the FTC lays down commercial surveillance rules, it’s important for the Commission to focus on another component of data minimization: restrictions on how, why, and to what extent entities can process personal data that they are (at least nominally) entitled to use and retain. (The FTC and others often describe these as purpose limitations; EPIC considers them a subset of data minimization, too.) Often consumers are harmed not by an unauthorized user obtaining access to their personal data, but rather by secondary, out-of-context uses of personal data by the businesses that have collected it. It’s essential that the FTC’s commercial surveillance rules address both of these scenarios.
To illustrate the difference: Twitter has (so far) been subject to two FTC enforcement actions concerning its personal data practices. The first, in 2011, resulted from “serious lapses in the company’s data security [that] allowed hackers to obtain unauthorized administrative control of Twitter, including both access to non-public user information and tweets that consumers had designated as private, and the ability to send out phony tweets from any account.” The second, in 2022, resulted from Twitter “ask[ing] users to give their phone numbers and email addresses to protect their accounts,” then “profit[ing] by allowing advertisers to use this data to target specific users.” (The FTC approached the latter as a deceptive trade practice because Twitter affirmatively misled its users, but even without that misdirection, we’d argue that covertly using security information for marketing purposes constitutes an unfair trade practice.)
A data minimization trade rule that only addresses the former type of violation (failing to safeguard personal data against unauthorized access) may fail to capture the latter type of violation (using personal data for secondary purposes that violate the expectations of the consumer). Such a rule would invite further data gamesmanship by Twitter and other businesses and could allow many of the most pervasive forms of data abuse—e.g., targeted advertising and the tracking it incentivizes—to continue largely unchecked. A narrow rule to this effect would be a devastating missed opportunity for the Commission.
A Data Minimization Rule Should Do Both
The good news is that the FTC is on firm footing to address both types of data protection violations using a minimization framework. First, the Commission’s own enforcement precedents extend well beyond inadvertent data security failures to address many of the harmful personal data practices baked into companies’ core business models. Just in the past few years, the FTC has taken action against Microsoft for illegally retaining children’s data, against Everalbum for its misuse of facial recognition technology, against Epic Games for unfair default privacy settings, against OpenX for wrongful collection and use of geolocation data, and against SpyFone for surreptitious data harvesting and use. The Commission should build on these and other cases to adopt a rule that both establishes purpose limitations on the use of personal data and imposes liability for traditional data security violations.
Second, as noted, the Commission’s commercial surveillance and data security rules need not be restricted to the exact practices identified in prior FTC enforcement actions. If “any other information available to the Commission indicates a widespread pattern of unfair or deceptive acts or practices,” those practices can be properly regulated by the FTC through a trade rule. This includes harmful secondary uses of personal data that the Commission may not have targeted yet through case-by-case enforcement. EPIC and dozens of peer organizations have gathered voluminous evidence of harmful data practices that the FTC should rely on in scoping its regulations, and each day brings new and alarming media reports about the U.S. data protection crisis that further support the need for robust commercial surveillance rules.
Finally, the Commission should recognize (as it has begun to do) that data security safeguards and purpose limitations fundamentally protect against the same types of harm to consumers: unwanted, unexpected, and out-of-context uses of personal information. The misuse of an individual’s personal information inherently causes harm by depriving them of control over their data—whether that misuse is the work of a hacker or a business entrusted to protect the individual’s data. This principle is reflected in the Commission’s recent enforcement of and guidance concerning the Health Breach Notification Rule, which makes clear that a “breach” includes “a company’s disclosure of covered information without a person’s authorization.” The Commission should continue to clarify this alignment of privacy and data security protections as it promulgates commercial surveillance rules.
EPIC is encouraged by the FTC’s attention to data security safeguards, but it is crucial that the Commission seize on this rulemaking opportunity to set substantive limits on secondary uses of personal data. An effective data minimization framework must be structured to cover both.