The Electronic Privacy Information Center (EPIC) submits these comments in response to the Federal Trade Commission’s Advance Notice of Proposed Rulemaking regarding a Trade Regulation Rule on Commercial Surveillance and Data Security. The unchecked spread of commercial surveillance over the last two decades has led to a data privacy crisis for consumers in the United States. Without any comprehensive privacy laws or regulations, abusive data practices have flourished. The ability to monitor, profile, and target consumers at a mass scale has created a persistent power imbalance that robs individuals of their autonomy and privacy, stifles competition, and undermines democratic systems. It is far past time to disrupt this data abuse, set rules of the road for our online ecosystem, and ensure that companies cannot extract private value from personal data in ways that undermine the public good.
Section 5 of the Federal Trade Commission Act provides that unfair or deceptive trade practices are unlawful and empowers the Commission to prevent such practices and protect consumers from them. Under Section 18 of the FTC Act, the Commission can issue a trade regulation rule defining unfair and deceptive surveillance practices and establishing strong privacy and data security standards for all.
In these comments, EPIC provides an overview of the systems that facilitate commercial surveillance and the injuries they inflict on consumers. A business’s collection, use, retention, or transfer of a consumer’s personal information beyond what is reasonably necessary and proportionate to achieve the primary purpose for which it was collected (consistent with consumer expectations and the context in which the data was collected) is an unfair trade practice. These out-of-context secondary uses of data—including its sale to and use by data brokers, surveillance advertising firms, and other entities trafficking in consumer profiles—and the overcollection that feeds them are inconsistent with the reasonable expectations of online consumers. These unfair commercial surveillance practices lead to invasive, discriminatory targeting that violates the privacy and autonomy of consumers. EPIC argues that the FTC should establish a data minimization rule to ensure that businesses only collect the data that they need to provide the goods and services that consumers request, and that they do not use or transfer data in ways that defy reasonable consumer expectations.
EPIC further argues that the Commission should issue a rule declaring it an unfair and deceptive practice to use an automated decision-making system without first demonstrating that it is effective, accurate, and free from impermissible bias. Commercial entities frequently use automated decision-making systems without substantiating the claims made about the systems, verifying their accuracy, or evaluating them for disparate impact. These automated systems—which can encompass a broad range of statistical or machine-learning tools that perform operations on data to aid or replace human decision-making—cause substantial injury to consumers when used without proper disclosure and oversight.
The Commission should also find that it is an unfair and deceptive practice to use an automated decision-making system implicating the interests of consumers without providing adequate notice of such use, including meaningful, readable, and understandable disclosure of the logic, factors, inputs, and training data on which the system relies. Companies should be required to publicly substantiate their claims about automated decision-making systems implicating the interests of consumers, articulate the purposes of those systems, evaluate the accuracy of those systems, and analyze potential disparate impacts of those systems.
The Commission should also categorically ban algorithmic systems that have been shown to cause serious and systemic harms. Specifically, systems that enable one-to-many facial recognition and systems that purport to provide “emotion recognition” capabilities have been shown to exacerbate biases and produce harmful outcomes. There is mounting evidence that such systems cannot be operated in a way that is fair to consumers or in a way that serves the public interest.
Throughout these comments, EPIC highlights the ways that commercial surveillance and algorithmic decision-making systems disproportionately harm marginalized communities. Targeting and profiling systems are designed to divide, segment, and score individuals based on their characteristics, their demographics, and their behaviors. In many cases, this means that consumers are sorted and scored in ways that reflect and entrench systemic biases. EPIC believes the FTC should issue a rule that prohibits discrimination as an unfair trade practice.
It is clear that children and teens require heightened protections when it comes to the collection and use of their personal data. Minors are uniquely vulnerable to profiling and the outputs of commercial surveillance systems, which are necessarily designed to suggest and shape preferences and beliefs. Therefore, the Commission should issue a rule declaring it an unfair practice to collect, process, retain, or transfer the personal data of minors under the age of 18 unless strictly necessary to achieve the minor’s specific purpose for interacting with the business or to achieve certain essential purposes. The Commission should also ban targeted advertising to minors and issue a rule declaring it to be an unfair and deceptive practice for companies to make intentional design choices in order to facilitate the commercial surveillance of minors.
Data security and privacy go hand in hand. The Commission’s rule should declare that a business’s failure to implement reasonable security measures is an unfair trade practice, and that any entity that represents that it protects the security of consumer data but fails to adopt reasonable data security measures has engaged in a deceptive trade practice. Consumers are facing an epidemic of data breaches and resulting identity theft due to a lack of investment in and commitment to data security. For over two decades, the FTC has tried to remedy the situation through case-by-case enforcement and the encouragement of industry self-regulation, but it is clear those approaches are not sufficient.
Lastly, the Commission should issue a rule affirming that it is an unfair practice for a business to use manipulative design or dark patterns to nudge consumers to “accept” terms or options that broaden the scope of personal data that the business collects, uses, or discloses. Dark patterns are especially harmful in the data protection context. Companies have pushed for decades to frame data collection and processing as an issue of consumer “choice” while deploying manipulative choice architecture to ensure that consumers always “choose” to permit more data collection, broader uses of data, and loose or non-existent data sale and transfer restrictions. The Commission has already taken steps to crack down on manipulative design techniques and dark patterns and should take this opportunity to declare them an unfair trade practice.
In response to the Commission’s questions regarding notice, transparency, and consent, EPIC argues that the agency should issue a rule declaring it an unfair and deceptive practice to collect, use, retain, or transfer personal data without first assessing, justifying, and providing adequate notice of such collection, use, retention, or transfer. EPIC further argues that the Commission should require businesses to promptly honor an individual’s request to access all data the business maintains on them; to have such data corrected if it is in error; or to secure the deletion of all such data. EPIC notes, however, that even the most effective notice and transparency requirements cannot, by themselves, fully protect against the abuse of personal data. Commercial surveillance practices are simply too complex and numerous for even the most sophisticated consumer to understand. Transparency and user rights are only valuable in conjunction with substantive limits on data collection and use.