Data Minimization: Bolstering The FTC’s Health Data Privacy Authority

July 13, 2023 | Suzanne Bernstein, EPIC Law Fellow

Among the categories of personal data collected seemingly without restriction in the United States, consumer health data is both particularly sensitive and particularly lucrative. In recent years, there has been a dramatic increase in telehealth, health and fitness apps, and other online health products, resulting in an enormous volume of health data collection and abuse that falls outside of the narrow protections of the Health Insurance Portability and Accountability Act (HIPAA). The Federal Trade Commission (FTC) works to fill this gap by using its broad Section 5 authority and by enforcing the Health Breach Notification Rule (HBNR). But as the FTC considers promulgating a new rule to govern the commercial processing of personal data, it has the opportunity to take a more comprehensive approach to protecting consumer health data privacy and security.

EPIC has long urged the Commission to issue a data minimization rule using its Section 5 authority to regulate unfair and deceptive trade practices. An effective data minimization rule would establish that it is an unfair trade practice to collect, use, transfer, or retain personal data beyond what is reasonably necessary and proportionate to the primary purpose for which it was collected, consistent with consumer expectations and the context in which the data was collected. The previous blog posts in this series have elaborated on critical aspects of data minimization for the FTC’s forthcoming commercial surveillance and data security rule: centering reasonable consumer expectations, limiting the scope of permissible data uses, and recognizing the key role that data minimization can play in data security. While a data minimization-focused commercial surveillance rule would apply broadly to various types of consumer data, this blog post will focus on health data as a case study, illustrating how a data minimization rule would give consumers stability, predictability, and security with respect to their health data.

Consumer Health Data Collection and Use

Consumer health data collection has skyrocketed in recent years. The broad availability and convenience of smartphones and internet access have enabled “Americans to turn to apps and other technologies to track diseases, diagnoses, treatment, medications, fitness, fertility, sleep, mental health, diet and other vital areas[.]” Our understanding of what constitutes health data has grown as data analysts and data brokers have demonstrated their ability to infer health-related insights from a widening range of data sources. For example, location data can become sensitive health data: GPS records showing that someone has visited a methadone or abortion clinic reveal healthcare activity and related behavior.

Unbeknownst to many consumers, most of this health data collection is not regulated by HIPAA. Most of the apps, platforms, and companies that collect, share, and sell health information online fall outside of HIPAA’s narrow scope, as the HIPAA Privacy Rule and related requirements apply only to covered entities, including healthcare providers, health plans, and clearinghouses, as well as business associates that handle Protected Health Information (PHI) on behalf of a covered entity. PHI is also narrowly defined to include only information created or received by a covered entity. If, for example, a patient or consumer “discloses PHI to a third-party, non-covered entity, the information is no longer protected by HIPAA.” As a result, there is a tremendous amount of health data in the hands of commercial entities that HIPAA does not regulate. (However, a far broader expanse of personal health data falls under the Health Breach Notification Rule enforced by the FTC, which will be discussed in the following section.)

Gaps in the regulation of commercial health data practices pose significant risks to consumers. The mismanagement or breach of sensitive health data can result in a range of privacy injuries from stigma and humiliation to financial and reputational injuries. What’s more, the largely unregulated data brokerage ecosystem that constantly collects, analyzes and sells health data without consumer knowledge or consent poses stark privacy and security risks to consumers. Data brokers sell health data, including mental health information, to willing buyers including commercial entities, health insurance companies, law enforcement, and nearly any interested individual. While health data collection and sale are one piece of the enormous commercial surveillance apparatus, they pose unique risks to consumers. For example, health insurance companies can purchase and use information collected by data brokers to determine healthcare rates. Health, demographic, and “lifestyle” information collected from any online activity—like purchasing plus-sized clothing or posting about feeling anxious or depressed from a recent divorce—can yield inferences for predicting health costs. All of this, from the surveillance and data collection to the sale and use of health data to make an insurance policy decision, is largely beyond the control of consumers. A recent AARP report detailed the concern of older adults about big data in healthcare: “The vast majority (78%) are concerned that the use of big health data may limit access to insurance and three of four (75%) worry about companies using their data without their consent.”

Another context to consider is reproductive health data privacy. Reproductive health data collected from various sources like period and fertility tracking apps, online searches for contraceptives, or location data from a visit to an abortion clinic can be sold to advertisers, law enforcement, or data brokers to be further aggregated and exploited. As EPIC recently detailed in comments to the Department of Health and Human Services about the Department’s effort to update the HIPAA Privacy Rule in the wake of Dobbs: “the collection and disclosure of reproductive health information can be harmful in myriad ways, spanning from the invasion of privacy to the threat of criminalization, social consequences, and chilling of access to health care.”

The Commission can address and prevent future health data abuse by articulating a data minimization rule that disrupts the unfair collection and sale of health data at the source. By limiting data collection and use to what is reasonably necessary and proportionate to the purpose for which it was collected, the Commission can ensure that data collection remains consistent with consumer expectations, establishing trust, limiting security risks, and mitigating health privacy concerns.

The FTC’s Authority to Protect Consumers’ Health Data

The Commission should continue to build on its recent health privacy enforcement actions by centering data minimization in the FTC’s forthcoming commercial surveillance and data security rule. The Commission has made great strides in using its Section 5 authority to combat unfair and deceptive consumer health privacy violations. Actions against Kochava, Flo Health and Premom signified the Commission’s commitment to reproductive health privacy, while the recent FTC settlement with GoodRx emphasized that it is unlawful to disclose consumers’ personal health information to third parties without authorization. The FTC’s enforcement actions against BetterHelp and Vitagene demonstrated the breadth of personal information that should be considered health data, targeting violations of mental health and genetic privacy, respectively. And the Commission’s recent use of the Health Breach Notification Rule authority illustrates the FTC’s determination to use all available enforcement authority to protect consumer health privacy.

Currently, the FTC’s Section 5 and HBNR authorities generally allow the Commission to address health privacy abuses only after an unfair or deceptive data practice, such as a security breach or unauthorized disclosure, has already occurred. Without a trade rule in place, the Commission is generally unable to impose civil penalties on companies unless they have previously violated Section 5, while the HBNR requires only post hoc notice to consumers of unauthorized access to their personal health information. Centering data minimization in the proposed trade regulation rule would enable the Commission to better combat health privacy abuses throughout the data lifecycle: from initial collection to the use and sale of the data. To reiterate, an effective data minimization rule would establish that it is an unfair trade practice to collect, use, transfer, or retain personal data beyond what is reasonably necessary and proportionate to the primary purpose for which it was collected, consistent with consumer expectations and the context in which the data was collected. In the context of health data privacy, a data minimization rule would standardize health data collection, bring security and predictability to consumers, and unlock additional FTC enforcement authority.

Data Minimization and Consumer Health Data Privacy

The previous posts in this series discussed discrete aspects of data minimization applied to all consumer data: reasonable consumer expectation, permissible data use, and data security. Analyzing these concepts in the health data privacy context further underscores the need for a data minimization standard. For example, data security is critical for highly sensitive information like consumer health data. Each piece of health data collected and retained by a business or data broker is at risk of breach or other unauthorized access and use. Data minimization would effectively limit the collection, retention, and use of health data, thereby lowering the data security risk. As my colleague John Davisson explained, “Technical and physical safeguards are certainly vital to limiting [data security] risk, but one surefire strategy is for business to limit the data they collect and process in the first place.” In turn, consumers would feel safer and be more likely to trust innovative healthcare apps and online services.

A data minimization framework would also incorporate the related concepts of a reasonable consumer expectation standard and the scope of permissible data use. The evaluation of consumers’ reasonable expectations necessarily reaches beyond a disclosure or privacy policy (which consumers likely do not read or understand) to consider the context of the interaction between the consumer and the business collecting data. When a consumer engages with an online business, “they reasonably expect that their data will be collected and used for the limited purpose and duration necessary to provide the goods or services that they requested.” In the health data context, it would likely exceed reasonable consumer expectations for a fertility tracking app to share consumer health information with, or sell it to, a third-party advertiser seeking to serve that consumer ads for baby clothes. When enrolling in the fertility tracking app, the reasonable consumer likely did not expect that their health information would be sold or shared with third parties, even if they skimmed and agreed to a lengthy, wordy privacy policy. The reasonable consumer expectation standard is meant to challenge the existing notice-and-choice regime that relies on nominal “consent” to enable unrestricted data collection and use.

Relatedly, a data minimization standard necessarily considers the scope of permissible data use. Out-of-context secondary uses of data can cause substantial harm to consumers. However, some secondary uses may not be harmful, typically because they are within the reasonable expectations of the consumer. Continuing with the fertility tracking app example, the consumer would likely expect that after signing up for the service with their email, they would receive an email communication or even a newsletter from the fertility tracking app. But sharing or selling the consumer’s health data to an advertiser or data broker (resulting in the targeted baby clothes ad) would constitute a harmful, out-of-context secondary use. As my colleague Sara Geoghegan warned: “Secondary uses of data can be harmful not only because they violate the core privacy principle that personal information should be used within the context of the primary purpose for which it was collected, but also because they are used to justify broad collection and indefinite retention of sensitive data.”

Conclusion

In its upcoming trade regulation rule, the Commission can provide meaningful new protections to one of the highest-risk categories of personal information subject to commercial abuse: health data. Health data misuse, from collection and retention to unauthorized disclosure, can lead to grave privacy, financial and social harms and can even facilitate criminal prosecution. The FTC has already taken important steps to protect consumer health privacy using its existing enforcement authority. Centering a data minimization framework in the commercial surveillance and data security rule would significantly further this goal, protecting consumer health data and promoting trust and security in the marketplace.
