Over the past two decades, social media platforms have become vast and powerful tools for connecting, communicating, sharing content, conducting business, and disseminating news and information. Today, billions of users populate major social networks including Facebook, Instagram, TikTok, Snapchat, YouTube, Twitter, and LinkedIn, as well as dating apps like Grindr and Tinder.
But the extraordinary growth of social media has given platforms extraordinary access to, and influence over, the lives of users. Social networking companies harvest sensitive data about individuals’ activities, interests, personal characteristics, political views, purchasing habits, and online behaviors. In many cases this data is used to algorithmically drive user engagement and to sell behavioral advertising—often with distortive and discriminatory impacts.
The privacy hazards of social networks are compounded by platform consolidation, which has enabled some social media companies to acquire competitors, exercise monopolistic power, and severely limit the rise of privacy-protective alternatives. Personal data held by social media platforms is also vulnerable to being accessed and misused by third parties, including law enforcement agencies.
As EPIC has long urged, Congress must enact comprehensive data protection legislation to place strict limits on the collection, processing, use, and retention of personal data by social networks and other entities. The Federal Trade Commission should also make use of its existing authority to rein in abusive data practices by social media companies, and both the FTC and Congress must take swift action to prevent monopolistic behavior and promote competition in the social media market.
Social Media & Surveillance Advertising
Social media companies—and in particular, Facebook—collect vast quantities of personal data in order to “microtarget” advertisements to users. This practice, also known as surveillance advertising or behavioral advertising, is deeply harmful to privacy, the flow of information, and the psychological health of social media users.
As former FTC Commissioner Rohit Chopra wrote in his dissent from the FTC’s 2019 Facebook order, “Behavioral advertising generates profits by turning users into products, their activity into assets, their communities into targets, and social media platforms into weapons of mass manipulation.”
Notably, tracking and behavioral advertising by social media companies is not limited to the platforms themselves. Firms like Facebook use hard-to-detect tracking techniques to follow individuals across a variety of apps, websites, and devices. As a result, even those who intentionally opt out of social media platforms are affected by their data collection and advertising practices.
Social Media & Competition
Data collection is at the core of many social media platforms’ business models. For this reason, mergers and acquisitions involving social networks pose acute risks to consumer privacy. Yet in recent years, platforms that promised to protect user privacy have repeatedly been acquired by companies that fail to honor those commitments.
One of the most notable examples of this trend is Facebook’s 2014 purchase of WhatsApp, a messaging service that attracted users precisely because of strong commitments to privacy. WhatsApp’s founder stated in 2012 that, “[w]e have not, we do not and we will not ever sell your personal information to anyone.” Although EPIC and the Center for Digital Democracy urged the FTC to block the proposed Facebook-WhatsApp deal, the FTC ultimately approved the merger after both companies promised not to make any changes to WhatsApp user privacy settings.
However, Facebook announced in 2016 that it would begin acquiring the personal information of WhatsApp users, directly contradicting the companies’ previous promises to honor user privacy. Antitrust authorities in the EU fined Facebook $122 million in 2017 for making deliberately false representations about the company’s ability to integrate the personal data of WhatsApp users. Yet the FTC took no further action at the time. It wasn’t until the FTC’s 2020 antitrust lawsuit against Facebook—six years after the merger—that the Commission publicly identified Facebook’s acquisition of WhatsApp as part of a pattern of anticompetitive behavior.
For many years, the United States stood virtually alone in its unwillingness to address privacy as an important dimension of competition in the digital marketplace. With the 2020 wave of federal and state antitrust lawsuits against Facebook and Google—and with a renewed interest in antitrust enforcement at the FTC—that dynamic may finally be changing. But moving forward, it is vital that antitrust enforcers take data protection and privacy into account in their enforcement actions and assessments of market competition. If the largest social media platforms continue to buy up new market entrants and assimilate their users’ data into the existing platforms, there will be no meaningful opportunity for other firms to compete with better privacy and data security practices.
Social Media & Data Breaches
The massive stores of personal data that social media platforms collect and retain are vulnerable to hacking, scraping, and data breaches, particularly if platforms fail to institute critical security measures and access restrictions. Depending on the network, the data at risk can include location information, health information, religious identity, sexual orientation, facial recognition imagery, private messages, personal photos, and more. The consequences of exposing this information can be severe: from stalking to the forcible outing of LGBTQ individuals to the disclosure of one’s religious practices and movements.
Without comprehensive federal privacy legislation, users often have little protection against data breaches. Although social media companies typically publish privacy policies, these policies are wholly inadequate to protect users’ sensitive information. Privacy policies are disclaimers published by platforms and websites that purport to operate as waivers once users “consent” to them. But these policies are often vague, hard to interpret, full of loopholes, subject to unilateral changes by the platforms, and difficult or impossible for injured users to enforce.
EPIC’s Work on Social Media Privacy
For more than a decade, EPIC has advocated before Congress, the courts, and the Federal Trade Commission to protect the privacy of social media users.
Beginning in 2008, EPIC warned of the exact problem that would later lead to the Facebook Cambridge Analytica scandal. In Senate testimony in 2008, then-EPIC President Marc Rotenberg stated that, “on Facebook … third party applications do not only access the information about a given user that has added the application. Applications by default get access to much of the information about that user’s friends.”
In 2009, EPIC and nine other public interest organizations filed a complaint with the FTC detailing how Facebook changed its privacy settings to begin disclosing to third-party applications and the public information that users had sought to keep private. Facebook implemented these changes without obtaining affirmative consent from its users or even giving them the ability to opt out. In 2011, the FTC announced that Facebook had settled charges that it deceived users by failing to keep its privacy promises and credited EPIC with providing the factual basis for its complaint against Facebook.
In 2014, EPIC filed a complaint with the FTC alleging that Facebook “altered the News Feeds of Facebook users to elicit positive and negative emotional responses.” Facebook had teamed up with researchers to conduct a psychological experiment by exposing one group of users to positive emotional content and another group of users to negative emotional content to determine whether users would alter their own posting behavior. The study found that “emotional states can be transferred to others via emotional contagion, leading people to experience the same emotions without their awareness.” EPIC alleged that the researchers who conducted the study “failed to follow standard ethical protocols for human subject research.” EPIC further alleged that Facebook engaged in unfair and deceptive practices in violation of Section 5 of the FTC Act by not informing users that they were potentially subject to behavioral testing. Finally, EPIC alleged that Facebook’s psychological study violated the 2011 FTC Consent Order by misrepresenting its data collection practices.
In 2014, when Facebook entered a deal to acquire the text-messaging application WhatsApp, EPIC and the Center for Digital Democracy filed a complaint with the FTC urging the Commission to block Facebook’s acquisition of WhatsApp unless adequate privacy safeguards were established. Although the FTC approved the merger, the Commission sent a letter to Facebook and WhatsApp notifying the companies of their obligations to honor their privacy promises. In 2016, WhatsApp announced its plans to transfer users’ personal information to Facebook for use in targeted advertising.
In March 2018, news broke that Facebook had allowed Cambridge Analytica, a political data mining firm associated with the Trump campaign, to access personal information on 87 million Facebook users. EPIC and a coalition of consumer organizations immediately wrote a letter to the FTC urging it to investigate this unprecedented disclosure of personal data. The groups made clear that by exposing users’ personal data without their knowledge or consent, Facebook had violated the 2011 Consent Order with the FTC, which made it unlawful for Facebook to disclose user data without affirmative consent. The groups wrote that, “The FTC’s failure to enforce its order has resulted in the unlawful transfer of  million user records … [i]t is unconscionable that the FTC allowed this unprecedented disclosure of Americans’ personal data to occur. The FTC’s failure to act imperils not only privacy but democracy as well.”
EPIC also submitted an urgent FOIA request to the FTC following the Cambridge Analytica revelations. The request sought all the privacy assessments required by the FTC’s 2011 Order and all communications between the FTC and Facebook regarding those privacy assessments. Following the FTC’s release of heavily redacted versions of the assessments, EPIC filed a Freedom of Information Act lawsuit to obtain the full, unredacted reports from the FTC.
In 2019, following a proposed settlement between the FTC and Facebook in connection with the Cambridge Analytica breach, EPIC moved to intervene in United States v. Facebook to protect the interests of Facebook users. EPIC argued in the case that the settlement was “not adequate, reasonable, or appropriate.”
In 2020, following President Trump’s threat to effectively ban social network TikTok from the United States, Oracle reached a tentative agreement to serve as TikTok’s U.S. partner and to “independently process TikTok’s U.S. data.” In response, EPIC sent demand letters to Oracle and TikTok warning both of their legal obligation to protect the privacy of TikTok users if the companies entered a partnership. The deal would have paired one of the largest brokers of personal data with a network of 800 million users, creating grave privacy and legal risks. “Absent strict privacy safeguards, which to our knowledge Oracle has not established, [the] collection, processing, use, and dissemination of TikTok user data would constitute an unlawful trade practice,” EPIC wrote. In 2021, the Oracle-TikTok deal was effectively scuttled.
Also in 2020, EPIC and a coalition of child advocacy, consumer, and privacy groups filed a complaint urging the Federal Trade Commission to investigate and penalize TikTok for violating the Children’s Online Privacy Protection Act. TikTok had paid a $5.7 million fine for violating the children’s privacy law in 2019. Nevertheless, TikTok failed to delete personal information previously collected from children and was still collecting kids’ personal information without notice to and consent of parents.
Recent Documents on Social Media Privacy
US Supreme Court: Whether the First Amendment prevents nearly all regulation of social media companies’ content-hosting and content-arranging decisions.
Pennsylvania Supreme Court: Whether the First Amendment protects a public employee from being fired for a Facebook post.
US Court of Appeals for the Ninth Circuit: Whether Facebook violated the privacy rights of users by tracking their web browsing history even after they logged out of the platform.
US District Court for the District of Columbia: Seeking disclosure of Facebook assessments, reports, and related records required by the 2012 FTC Consent Order.
US District Court for the District of Columbia: Charging that Facebook’s facial recognition practice lacks privacy safeguards and violates the 2011 Consent Order with the FTC.
Measuring Americans’ comfort with research uses of their social media data
Sarah Gilbert, Jessica Vitak, & Katie Shilton | 2021
The Other Coup
Shoshana Zuboff | 2021
Inside the Making of Facebook’s Supreme Court
Kate Klonick | 2020
Zucked: Waking Up to the Facebook Catastrophe
Roger McNamee | 2019
The Age of Surveillance Capitalism
Shoshana Zuboff | 2019
Privacy, Sharing, and Trust: The Facebook Study
Ari Ezra Waldman | 2016