FOIA Cases
EPIC v. FTC (Facebook Assessments)
Case No. 18-942 (2018)
US District Court for the District of Columbia
Background
In the Freedom of Information Act lawsuit EPIC v. FTC, EPIC is seeking the Facebook assessments, reports, and related records required by the 2012 Consent Order. From 2009 to 2011, EPIC and a coalition of consumer organizations pursued several complaints with the Federal Trade Commission (“FTC”) alleging, among other points, that Facebook had changed user privacy settings and disclosed users’ personal data to third parties without their consent. EPIC conducted extensive research and documented instances in which Facebook overrode users’ privacy settings to reveal personal information and disclosed, for commercial benefit, user data and the personal data of friends and family members to third parties without their knowledge or affirmative consent.
FTC’s Investigation Into Facebook
In response to the complaints from EPIC and consumer privacy organizations, the FTC issued a Preliminary Order against Facebook in 2011 and then a Final Order in 2012. The FTC stated that Facebook “deceived consumers by telling them they could keep their information on Facebook private, and then repeatedly allowing it to be shared and made public.”
Under the settlement, Facebook is barred from misrepresenting its privacy and security practices, as well as its compliance with any privacy program; is required to give users clear and prominent notice and obtain their affirmative express consent before sharing their information; is required to remove user information within 30 days after a user deletes their account; is required to establish a comprehensive privacy program; and is required, every two years for the next 20 years, to obtain independent third-party audits certifying that it has a privacy program in place that complies with the Final Order.
Cambridge Analytica
On March 16, 2018, Facebook published a press release admitting the unlawful transfer of data from 50 million user profiles to the firm Cambridge Analytica, which harvested the data without user consent. Cambridge Analytica, hired by President Trump’s 2016 presidential campaign, was able to collect the private information of approximately 270,000 users and their extensive friend networks under the false pretense of a research-driven application. The users had participated in an online survey created by Cambridge University researcher Aleksandr Kogan; they consented to having their data collected but were told it was for “academic use.” The third-party application subsequently scraped the data of these users’ friends without their knowledge or consent and transferred the data to Cambridge Analytica. The estimated number of affected users has since increased to 87 million, making it one of the largest unlawful data transfers in Facebook’s history.
The transfer of data is a violation of the 2012 Consent Order, which states that Facebook “shall not misrepresent in any manner, expressly or by implication . . . the extent to which [Facebook] makes or has made covered information accessible to third parties; and the steps [Facebook] takes or has taken to verify the privacy or security protections that any third party provides.”
Facebook’s Initial Assessment was due on April 13, 2013, and the subsequent reports were due in 2015 and 2017. Cambridge Analytica engaged in the illicit collection of Facebook user data from 2014 to 2016, a span encompassed by the reporting periods of the requested assessments.
On March 26, 2018, the FTC announced an investigation to determine whether Facebook violated the 2012 Consent Order. Acting Director Tom Pahl stated, “Companies who have settled previous FTC actions must also comply with FTC order provisions imposing privacy and data security requirements. Accordingly, the FTC takes very seriously recent press reports raising substantial concerns about the privacy practices of Facebook.”
Following the Cambridge Analytica scandal, lawmakers in the U.S. and abroad swiftly demanded answers from Facebook. On April 10, 2018, Mark Zuckerberg testified publicly at a joint hearing of the Senate Judiciary and Senate Commerce Committees, and the next day he testified before the House Energy and Commerce Committee. Several state attorneys general, including those of Massachusetts, New York, New Jersey, and Missouri, have opened joint and independent investigations into Facebook’s involvement with Cambridge Analytica. Moreover, it has been reported that Department of Justice Special Counsel Robert Mueller has requested e-mails from Cambridge Analytica as part of his investigation into Russian interference in the 2016 presidential election.
EPIC’s Interest
EPIC President Marc Rotenberg has stated: “It’s not clear why a company that has asked us to give up so much privacy should be allowed to maintain so much secrecy.” There is a profound and urgent public interest in the release of the Facebook Assessments and related records. The release of the full audits is crucial for Congress, the State Attorneys General, and the public to evaluate how the Cambridge Analytica breach occurred and how the FTC, Facebook, and the selected independent third-party auditor fulfilled their obligations under the 2012 Consent Order.
Central to EPIC’s mission is oversight and analysis of government activities. Through its Consumer Privacy Project, EPIC has brought numerous complaints and petitions to the Federal Trade Commission concerning business practices that implicate consumer privacy. Notably, EPIC has brought several complaints concerning Facebook’s business practices. In In re Facebook, EPIC brought a complaint focusing on Facebook’s unfair and deceptive trade practices with respect to the sharing of user information with third-party application developers, which led to the 2011 Consent Decree. A year later, EPIC brought a second complaint, In re Facebook II, addressing Facebook’s latest round of changes. In In re Facebook (Psychological Study), EPIC filed a complaint concerning Facebook’s “secretive and non-consensual use of personal information to conduct an ongoing psychological experiment on 700,000 Facebook users, i.e. the company purposefully messed with people’s minds.” Lastly, in In re Facebook (Facial Recognition), EPIC and a coalition of consumer groups filed a complaint asserting that Facebook’s use of facial recognition techniques threatens user privacy and violates the 2012 Consent Order.
FOIA Documents
As a result of EPIC’s request and lawsuit, the Federal Trade Commission has released hundreds of pages of communications between Facebook and the FTC related to the agency’s enforcement of the 2012 Consent Order. The documents reveal the cordial relationship between the Commission and Facebook and provide insight into the FTC’s failure to make use of its existing enforcement authority.
E-mails from the early years following the 2012 Consent Order reveal disagreement between Facebook and the FTC over potential enforcement action concerning Facebook’s proposed changes to its Data Use Policy and Statement of Rights and Responsibilities.
In a September 11, 2013 e-mail, FTC counsel wrote that the agency was “greatly disappointed that [Facebook] did not provide [the FTC with] the information [the FTC] requested to assess Facebook’s compliance with the Commission’s orders.” The e-mail alludes to an earlier phone call in which Facebook would not answer the agency’s questions on eight specific issues, “essentially making the call a waste of time.” Facebook responded by stating that it was “surprised and concerned by the suggestion” that it did not address the FTC’s questions, and that it does not “believe there is any credible basis to assert that [the FTC’s] questions relate to Facebook’s obligation under the Consent Order.” Following this exchange, Facebook cooperated with the FTC’s request for information.
On September 20, 2013, the FTC sent a follow-up letter to Facebook asking the company to address additional concerns about Facebook’s proposed changes to its Data Use Policy and Statement of Rights and Responsibilities. One concern involved mobile users not having the same access to settings for Facebook ads that desktop users have. The FTC wrote:
The failure to include these ads settings for mobile users appears to implicate Part I.B of the Order, which prohibits Facebook from misrepresenting the extent to which a consumer can control the privacy of any covered information maintained by Facebook and the steps a consumer must take to implement such controls.
Instead of definitively stating that Facebook’s failure to include these ad settings for mobile users violates the 2012 Consent Order, the FTC allowed Facebook to explain itself by stating “If Facebook contends this discrepancy does not implicate the Order, please explain the basis for this contention.”
Facebook responded to the FTC’s concerns, stating that the provided information “reflects Facebook’s continued commitment to cooperation and collaboration with [the FTC].” The FTC did not challenge Facebook’s contention that the discrepancy does not violate the 2012 Consent Order or pose additional follow-up questions.
Communications since 2013 reflect a similar lack of commitment by the FTC to enforce the original 2012 Consent Order. For example, in a June 4, 2015 letter to Facebook, the FTC expressed concerns about the scope of Facebook’s 2015 assessment:
Though PricewaterhouseCoopers LLP’s (“PwC”) cover letter to the Assessment notes that Facebook made acquisitions during the Reporting Period . . . it states it excluded “any independently operated affiliates” from the Assessment. Despite Facebook’s assertions, PwC’s report does not demonstrate whether and how Facebook addressed the impact of acquisitions on its Privacy Program.
In a June 1, 2017 letter to Facebook, the FTC expressed similar concerns about the 2017 assessment and whether the audit evaluated the impact of the company’s acquisitions on Facebook’s privacy program. The FTC accepted at face value, without additional inquiry, the response letters from Facebook and its auditor assuring the agency that the auditor had addressed the impact of acquisitions on Facebook’s privacy program.
The records uncovered by EPIC’s FOIA request also show that the FTC, throughout the years, never explicitly declared any of Facebook’s actions to be in violation of the consent order. For example, in an April 2, 2015 letter, the FTC provided a gentle reminder of Facebook’s compliance obligations when the company experiences corporate changes such as acquisitions.
Whenever Facebook updated the agency on new product launches or policy changes, the FTC asked questions if necessary or provided timid “feedback.” For example, in an April 12, 2016 letter to Facebook counsel about its soon-to-be announced Account Kit product, the FTC wrote:
[A]lthough the Account Kit interface links to Facebook’s Terms and Data Use Policy, such links to lengthy and general documents are insufficient to disclose an unexpected use of personal information, such as the use of authentication data for advertising. Adding a “Learn More” hyperlink is likewise insufficient. Furthermore, making the disclosures after consumers have provided their telephone number or email address likely creates deception.
- EPIC’s FOIA Request (March 20, 2018)
- FTC Acknowledgement Letter (March 29, 2018)
- FTC Unusual Circumstances E-mail (April 17, 2018)
- First Interim Production: Facebook Assessment Records (September 10, 2018)
- Second Interim Production: Communications Between FTC and Facebook About Compliance with Consent Order (October 12, 2018)
- Third Final Production: Additional Communications Between FTC and Facebook about Compliance with Consent Order (October 19, 2018)
Facebook Privacy Assessments
Facebook’s Privacy Assessments are critical to understanding how Facebook allowed the unauthorized disclosure of 87 million user records to Cambridge Analytica while it was under an FTC Order. The Privacy Assessments were required as part of the 2012 Consent Order. The Order required Facebook to implement a “comprehensive privacy program” designed to identify “reasonably foreseeable” risks that could result in the “unauthorized collection, use, or disclosure” of personal data. To assess the effectiveness of that program, Facebook was required to “obtain initial and biennial assessments and reports” from an independent third party. To date, there are three published assessment periods: 2013, 2015, and 2017.
- 2017 Independent Privacy Assessment (reprocessed June 26, 2018)
- 2015 Independent Privacy Assessment (reprocessed June 26, 2018)
- 2013 Independent Privacy Assessment (reprocessed June 26, 2018)
The Privacy Assessments were performed by PricewaterhouseCoopers LLP (PwC), an auditing and accounting firm. PwC performed its assessments by conducting interviews with Facebook employees and making observations, whether “physically or online,” to assess the “effectiveness of the controls and safeguards implemented.”
What is most notable about these assessments is that they cover the period during which the Cambridge Analytica scandal occurred, yet PwC found each time that “Facebook’s privacy controls were operating with sufficient effectiveness to provide reasonable assurance to protect the privacy of covered information.”
In particular, the 2013-15 Assessment covers the period during which Dr. Aleksandr Kogan, a researcher at Cambridge University, developed an app that allowed him to collect the data of 87 million Facebook users without their consent and sell it to Cambridge Analytica in violation of Facebook’s terms. The 2015-17 report covers the period during which Facebook discovered this data transfer had occurred. Both Assessments state:
[Facebook] has implemented mechanisms to ensure that Facebook obtains consent from users prior to disclosing non-public personal information to third-party developers … Facebook requires developers who access non-public APIs to agree to Facebook’s Data Use Policy, Terms, and Platform Policies.
In May 2018, following the Cambridge Analytica scandal, Facebook revealed that it had suspended 200 apps over possible data misuse. In June 2018, it was reported that Facebook had overridden users’ privacy settings to grant at least 60 device makers secret access to user data. Neither issue is mentioned in the Assessments. The 2015-17 Assessment also fails to mention a major change Facebook made to its platform in 2015, when it restricted third-party access to the API that allowed Cambridge Analytica to gain access to friends’ data.
EPIC told Congress that, “[t]he transfer of 87 million user records to Cambridge Analytica could have been avoided if the FTC had done its job. The 2012 Consent Order against Facebook was issued to protect the privacy of user data.” The Privacy Assessments were supposed to “certify that [Facebook’s] privacy controls [were] operating with sufficient effectiveness.” Had the Assessments complied with this requirement, they would have alerted the FTC to problems with Facebook’s privacy controls long before the public learned about them in 2018. The FTC also would not have waited until the Cambridge Analytica scandal to launch an investigation into Facebook’s privacy practices.
Legal Documents
U.S. District Court for the District of Columbia (No. 18-942)
- EPIC Complaint (April 20, 2018)
- FTC Answer (May 24, 2018)
- Facebook Motion to Intervene (May 3, 2019)
- EPIC Opposition to Facebook’s Motion to Intervene (June 14, 2019)
- Order Granting Facebook’s Motion to Intervene (Aug. 28, 2019)
- Facebook’s Intervenor Answer (Aug. 28, 2019)
- EPIC’s Motion for Partial Summary Judgment on Attorney’s Fees and Costs (Nov. 22, 2019)
- FTC Opposition to EPIC’s Motion (Dec. 13, 2019)
- EPIC’s Reply (Dec. 20, 2019)
- Magistrate’s Report and Recommendation on EPIC’s Motion (June 16, 2020)
- Stipulation of Dismissal (June 24, 2020)
News
- FTC, Facebook Say $5B Privacy Deal Benefits Consumers, Law360, January 28, 2020
- FTC settlement requires ‘sea change’ for Facebook privacy, DOJ says, POLITICO Pro, January 24, 2020
- Facebook still hasn’t paid that $5B FTC fine, but what happens when it does?, Mashable, January 17, 2020
- Facebook’s FTC Privacy Settlement Challenged in Court, BankInfoSecurity, January 10, 2020
- Brazil Fines Facebook Over Cambridge Analytica Data Sharing, Law360, January 3, 2020