A Look at EPIC’s Report to the FTC on Commercial Surveillance & Data Security

November 30, 2022 | John Davisson, Director of Litigation

Last week, EPIC published a report titled Disrupting Data Abuse: Protecting Consumers from Commercial Surveillance in the Online Ecosystem. The report responds to a call for comments from the Federal Trade Commission, which is considering a rule on commercial surveillance and data security. Over 230 pages, we detail the harms inflicted by exploitative commercial data practices, establish the Commission’s authority to regulate those practices, and call on the FTC to impose specific privacy, security, transparency, algorithmic fairness, and anti-discrimination obligations on businesses.

EPIC and coalition partners have repeatedly urged the Commission to undertake a trade regulation rulemaking that would define unfair and deceptive commercial data practices and unlock the FTC’s dormant enforcement power. The Commission, which is the de facto data privacy regulator in the United States, typically lacks the ability to impose fines for first-time privacy and security violations. But a trade rule would establish across-the-board obligations backed by the threat of civil penalties—a major step forward for U.S. data protection. EPIC is heartened to see the Commission considering such a rule now.

Our report begins by laying out the stakes of today’s data privacy crisis:

The lack of comprehensive privacy laws and regulations has allowed abusive data practices to flourish, creating a persistent power imbalance that threatens both individual rights and competition. Due to the failure of policymakers in the U.S. to establish adequate data protection standards, online firms have been allowed to deploy commercial surveillance systems that collect and commodify every bit of our personal data. … The notice and choice approach that has dominated the United States’ response to this uncontrolled data collection over the last several decades simply does not work.

Next, we lay out the Commission’s legal authority to break from the failed approaches of the past and establish robust rules for the commercial processing of personal data. Briefly put, the FTC can use its trade rulemaking authority to prohibit particular commercial data practices as deceptive (i.e., materially misleading) or unfair (i.e., causing substantial and unavoidable injury to consumers that is not outweighed by benefits to consumers or competition). As we explain, “substantial injury” includes harms that may not always be economically quantifiable, such as invasions of privacy, reputational damage, and discrimination. The practices must also be “prevalent”—a standard that is easily met for the types of data uses a trade rule would target. Having declared certain data practices unlawful, the Commission can then impose prophylactic obligations on businesses to prevent those practices from occurring.

We then turn to our substantive recommendations for the Commission’s rule.

Data minimization

EPIC argues that a business’s collection, use, retention, or transfer of a consumer’s personal information beyond what is reasonably necessary and proportionate to achieve the primary purpose for which it was collected (consistent with consumer expectations and the context in which the data was collected) is an unfair trade practice. These out-of-context secondary uses of data and the overcollection that feeds them are inconsistent with the reasonable expectations of online consumers. We call for a data minimization rule to ensure that businesses only collect the data that they need to provide the goods and services that consumers request, and that they don’t use or transfer data in ways that defy reasonable consumer expectations.

Limits on automated decision-making systems

EPIC argues that it is an unfair and deceptive practice to use an automated decision-making system without first demonstrating that it is effective, accurate, and free from impermissible bias. Commercial entities frequently use automated decision-making systems without substantiating the claims made about the systems, verifying their accuracy, or evaluating them for disparate impact. These automated systems—which can encompass a broad range of statistical or machine-learning tools that perform operations on data to aid or replace human decision-making—cause substantial injury to consumers when used without proper disclosure and oversight. Companies should also be required to publicly substantiate their claims about automated decision-making systems that implicate consumers’ interests, articulate the purposes of those systems, evaluate their accuracy, and analyze their potential disparate impacts. Finally, the Commission should categorically ban algorithmic systems that have been shown to cause serious and systemic harms, such as one-to-many facial recognition and systems that purport to provide “emotion recognition” capabilities.

Anti-discrimination provisions

Throughout its report, EPIC highlights the ways that commercial surveillance and algorithmic decision-making systems disproportionately harm marginalized communities. Targeting and profiling systems are designed to divide, segment, and score individuals based on their characteristics, their demographics, and their behaviors. In many cases, this means that consumers are sorted and scored in ways that reflect and entrench systemic biases. EPIC argues the FTC should issue a rule that prohibits discrimination as an unfair trade practice.

Notice and transparency requirements

EPIC argues that the Commission should issue a rule declaring it an unfair and deceptive practice to collect, use, retain, or transfer personal data without first assessing, justifying, and providing adequate notice of such collection, use, retention, or transfer. EPIC further argues that the Commission should require businesses to promptly honor an individual’s request to access all data the business maintains on them; to have such data corrected if it is in error; or to secure the deletion of all such data. EPIC notes, however, that even the most effective notice and transparency requirements cannot, by themselves, fully protect against the abuse of personal data.

Heightened protections for minors

It is clear that children and teens require heightened protections when it comes to the collection and use of their personal data. Minors are uniquely vulnerable to profiling and the outputs of commercial surveillance systems, which are necessarily designed to suggest and shape preferences and beliefs. We argue the Commission should issue a rule declaring it an unfair practice to collect, process, retain, or transfer the personal data of minors under the age of 18 unless strictly necessary to achieve the minor’s specific purpose for interacting with the business or to achieve certain essential purposes. The Commission should also ban targeted advertising to minors and issue a rule declaring it to be an unfair and deceptive practice for companies to make intentional design choices in order to facilitate the commercial surveillance of minors.

Data security standards

Data security and privacy go hand in hand. EPIC asks the Commission to declare that a business’s failure to implement reasonable security measures is an unfair trade practice, and that any entity which represents that it protects the security of consumer data but fails to adopt reasonable data security measures has engaged in a deceptive trade practice. Consumers are facing an epidemic of data breaches and resulting identity theft due to a lack of investment in and commitment to data security. For over two decades, the FTC has tried to remedy the situation through case-by-case enforcement and the encouragement of industry self-regulation, but it is clear those approaches are not sufficient.

Prohibition on deceptive design practices

Finally, EPIC urges the Commission to declare that it is an unfair practice for a business to use manipulative design or dark patterns to nudge consumers to “accept” terms or options that broaden the scope of personal data that the business collects, uses, or discloses. Dark patterns are especially harmful in the data protection context. Companies have pushed for decades to frame data collection and processing as an issue of consumer “choice” while deploying manipulative choice architecture to ensure that consumers always “choose” to permit more data collection, broader uses of data, and loose or non-existent data sale and transfer restrictions.

The Commission’s trade rulemaking process is a long one, and FTC staff now have thousands of pages of comments to review from civil society, industry, academics, experts, and members of the public (some of which are linked on our rulemaking page). The next major step in the process will be the publication of a proposed rule, though the timing of that publication is uncertain. But as EPIC’s report notes, we are eager to continue working with the FTC to ensure that this process yields the strongest possible privacy and civil rights protections for consumers.

