Coalition Letter Re: Order to Show Cause and Proposed Order In re Facebook, Inc.
Chair Khan, Commissioner Bedoya, and Commissioner Slaughter:
The undersigned privacy, tech accountability, and consumer protection organizations write in support of the Federal Trade Commission’s action to modify its 2020 privacy order with Meta Platforms, Inc. (formerly Facebook, Inc.). Meta has violated the law and its consent decrees with the Commission repeatedly and flagrantly for over a decade, putting the privacy of all users at risk. In particular, we support the proposal to prohibit Meta from profiting from the data of children and teens under 18. This measure is justified by Meta’s repeated offenses involving the personal data of minors and by the unique and alarming risks its practices pose to children and teens.
The proposed modifications are appropriate and urgently necessary based on Meta’s record of data privacy and security violations. The 2012 Complaint alleged, among other things, that Meta violated Section 5 of the FTC Act by misrepresenting the extent to which users could control certain uses of their private and non-public personal information.[1] In response to the 2012 Complaint, Meta agreed to a consent order that, among other things, prohibited privacy and security misrepresentations about user data and required Meta to establish and maintain a privacy program.[2]
In 2019, in addition to a record-breaking $5 billion settlement related to violations of the 2012 order, the Commission sought to amend the original order for the first time, imposing unprecedented new restrictions on Meta’s business operations and instituting multiple accountability mechanisms.[3] The Complaint alleged that Meta failed to comply with the 2012 Order because Meta did not establish and implement a sufficient privacy program, and Meta misrepresented both the user information that was shared with or accessible to third parties and users’ ability to control the privacy of their data. As a result, Meta agreed to the 2020 Order modifying and expanding various aspects of the 2012 Order.[4] The 2020 Order mandated a more expansive privacy review of new or modified products to ensure risk mitigation before implementation. It also imposed security restrictions and required more attentive oversight of third-party apps and developers. Despite the record-breaking fine and the unprecedented procedural constraints placed on Meta, it appears that the FTC’s reasonable attempts in 2020 to rein in Meta’s unlawful practices have been ineffective.
The Commission’s recent Order to Show Cause demonstrates that Meta has once again failed to fulfill its legal obligations. Though we do not know the full extent of the violations of the 2020 Order, independent assessor Protiviti described widespread violations of the privacy program requirement that would require “substantial additional work” to address.[5] The establishment of an effective privacy program was not optional under the Commission’s previous orders, nor was it sufficient for the company to merely “try its best” to implement a program. Meta was required by law to establish and operate a “comprehensive privacy program… that protects the privacy, confidentiality, and Integrity of” consumer data, and it failed.[6]
The Order to Show Cause also describes violations related to minors’ privacy and safety on the Messenger Kids app between December 2017 and July 2019. The 2020 Order only extinguishes the Section 5 claims that the Commission knew about before June 12, 2019, so the Section 5 violations that the Commission did not know about in June 2019 were not resolved by the 2020 Order. Accordingly, the Order violations related to Messenger Kids constitute a change of facts that warrants a modification. Further, COPPA violations were not resolved or affected by the 2020 Order, regardless of whether the Commission knew about them.
Meta’s failure to comply with the terms of the 2012 and 2020 Orders puts users on all of its platforms at risk. First, Meta’s practices harmed all users by violating their privacy as well as by perpetuating discrimination and bias, thus diminishing many users’ experiences and opportunities. These risks are especially acute for minors, who do not share adults’ conception of privacy or understanding that their data is used by commercial actors to influence their behavior.[7]
Moreover, data-driven marketing and targeted advertising cause particular and substantial harm to children and teens by cultivating their interest in harmful products and activities and by undermining their autonomy via data-driven algorithms that Meta has optimized for user engagement.[8] This harms minors’ physical and mental wellbeing and increases the risk of problematic internet use.[9] Children and teenagers’ data is particularly sensitive, as minors are more susceptible to commercial surveillance practices due to their developmental vulnerabilities.[10] Furthermore, there are few areas of a minor’s life that are not exposed to those exploitative practices: they play, learn, communicate, experiment, and grow up online. Meta’s illegitimate acquisition of minors’ data from this critical life stage and its profiling of minors has the potential to impact their online experience well into their adulthood.
Based on the foregoing, the Federal Trade Commission has the authority and prerogative to modify its order with Meta.[11] The Commission may, after notice and the opportunity for a hearing, modify in whole or in part any order made under 15 U.S.C. §45(b). To begin this process, the Commission must identify “conditions of fact or of law that have so changed as to require such action or if the public interest shall so require.”[12] Meta’s widespread violations of the 2020 Order constitute a change of conditions—and a pattern of privacy violations—that the FTC must address in the public interest. Further, the arguments in Meta’s recent motion and supporting memo before the court are baseless, as the Commission has long had the authority to reopen administrative orders under Section 5.[13]
The undersigned emphasize that the Commission’s proposed prohibition on monetizing minors’ data is squarely within the public interest. While Meta’s disregard for users’ privacy did not single out young users in its treatment, the effect on minors is singular. As many of our organizations have emphasized to the Commission for many years, Meta has repeatedly demonstrated its failure to prioritize youth privacy and wellbeing over profit.[14] Minors’ unique vulnerabilities must be accounted for and their privacy and wellbeing safeguarded. The FTC’s impetus to secure limitations on minors’ data reflects minors’ unique vulnerability to Meta’s repeated violations of the law, and is well-founded under the Commission’s authority.
Ultimately, Meta appears unable or unwilling to safeguard user data in accordance with the Commission’s orders or by following the law. Further, the current approach is not sufficiently holding Meta accountable for consistently side-stepping the Commission’s orders: unprecedented violations of previous consent decrees and ongoing violations of the law require unprecedented remedies and hence justify the proposed modification of the order. The Commission’s remedies must effectively address any impression that companies can continue to violate FTC orders and the law and absorb FTC penalties as the cost of doing business. The proposed modifications must move from procedural to structural remedies, which are necessary to address Meta’s serious and repeated violations of previous consent decrees. The Commission has proposed a solution that would make it far more difficult for Meta to put children’s privacy at further risk by limiting the collection, use, and sharing of data for the purpose of monetization. The terms of the order modification bolster the Commission’s credibility and send the message that tech giants cannot evade regulation and meaningful accountability.
Respectfully Submitted,
Center for Digital Democracy
Electronic Privacy Information Center
Fairplay
U.S. Public Interest Research Group (PIRG)
Accountable Tech
ADL (Anti-Defamation League)
Becca Schmill Foundation
Berkeley Media Studies Group
Beyond the Screen
Campaign for Accountability
Children and Screens: Institute of Digital Media and Child Development
Civics Unplugged
Common Sense Media
Consumer Action
Consumer Federation of America
Cyber Collective
Design It For Us Coalition
Eating Disorders Coalition for Research, Policy, & Action
Ekō
Encode Justice
Enough Is Enough
Friends of the Earth
LookUp.live
Lynn’s Warriors
The National Alliance to Advance Adolescent Health
Parents Television and Media Council
Peace Educators Allied for Children Everywhere, Inc. (P.E.A.C.E.)
Public Citizen
Public Good Law Center
Tech Oversight Project
Turning Life On
[1] Complaint, In re Facebook, Inc., Docket No. C-4365 (July 7, 2012).
[2] Decision and Order, In re Facebook, Inc., Docket No. C-4365 (July 27, 2012).
[3] Complaint for Civil Penalties, Injunction and Other Relief, In re Facebook, Inc., Case No. 19-cv-2184 (D.D.C. filed July 24, 2019).
[4] Order Modifying Prior Decision and Order, In re Facebook, Inc., Docket No. C-4365 (April 23, 2020).
[5] Order to Show Cause Why the Commission Should Not Modify the Order and Enter the Proposed New Order, In re Facebook Inc., Docket No. C-4365 (May 3, 2023).
[6] Order Modifying Prior Decision and Order, In re Facebook, Inc., Docket No. C-4365 (April 23, 2020).
[7] Comments of Center for Digital Democracy, Fairplay, et al. in the Matter of Trade Regulation Rule on Commercial Surveillance and Data Security at 29-33, Docket No. FTC-2022-053-0001 (submitted Nov. 21, 2022), https://fairplayforkids.org/wp-content/uploads/2022/11/ANPRM_comments.pdf (citing Kaiwen Sun et al., They See You’re a Girl if You Pick a Pink Robot with a Skirt: A Qualitative Study of How Children Conceptualize Data Processing and Digital Privacy Risks, CHI Conference on Human Factors in Computing Systems at 2 (May 2021), https://dblp.org/rec/conf/chi/SunSASGRS21; Mariya Stoilova et al., Digital by Default: Children’s Capacity to Understand and Manage Online Data and Privacy, 8 Media and Commc’n 197, 200 (2020), http://dx.doi.org/10.17645/mac.v8i4.3407; Priya Kumar et al., No Telling Passcodes Out Because They’re Private: Understanding Children’s Mental Models of Privacy and Security Online, 1 Proceedings of the ACM on Human-Computer Interaction 64, at 3 (November 2017), https://pearl.umd.edu/wp-content/uploads/2017/08/kumar-etal2018-CSCW-Online-First.pdf; EPIC, Disrupting Data Abuse: Protecting Consumers from Commercial Surveillance in the Online Ecosystem at 167–81 (Nov. 2022), https://epic.org/wp-content/uploads/2022/12/EPIC-FTC-commercial-surveillance-ANPRM-comments-Nov2022.pdf).
[8] Spence v. Meta Platforms, N.D. Cal. Case No. 3:22-cv-03294 at 82 (June 6, 2022).
[9] Emily A. Vogels et al., Teens, Social Media and Technology 2022, Pew Research Center (Aug. 10, 2022), https://www.pewresearch.org/internet/2022/08/10/teens-social-media-andtechnology-2022; Chloe Wilkinson et al., Screen Time: The Effects on Children’s Emotional, Social, and Cognitive Development at 6 (2021), https://informedfutures.org/wp-content/uploads/Screen-time-Theeffects-on-childrens-emotional-social-cognitive-development.pdf.
[10] See note 8, supra.
[11] See Order to Show Cause, In re Facebook Inc., FTC File No. 212-3091 (May 3, 2023).
[12] 15 U.S.C. §45(b).
[13] Id.
[14] Center for Digital Democracy, Meta’s Virtual Reality-based Marketing Apparatus Poses Risks to Teens and Others (May 3, 2023), https://www.democraticmedia.org/article/metas-virtual-reality-based-marketing-apparatus-poses-risks-teens-and-others; Fairplay, Meta Has a Long History of Failing to Protect Children Online (May 3, 2023), https://fairplayforkids.org/wp-content/uploads/2023/05/meta_-fails_to_-protect_children.pdf; Center for a Commercial-Free Childhood (now Fairplay) et al., Request for Investigation of Facebook’s Messenger Kids (submitted Oct. 3, 2018), https://fairplayforkids.org/wp-content/uploads/archive/devel-generate/wab/FTC%20FB%20Messenger%20Kids%20Letter.pdf.