In re Facebook and Facial Recognition (2018)
Summary
On April 6, 2018, EPIC and a coalition of consumer privacy organizations filed a complaint with the Federal Trade Commission, charging that Facebook’s facial recognition practice lacks privacy safeguards and violates the 2011 Consent Order with the Commission.
The complaint addresses Facebook’s business change, which went into effect in early 2018 and enables Facebook to routinely scan photos posted by users for biometric facial matches without the consent of either the image subject or the person who uploaded the photo. EPIC and consumer groups emphasized to the FTC that “the scanning of facial images without express, affirmative consent is unlawful and must be enjoined.”
In the complaint, EPIC and the groups ask the FTC to investigate Facebook, determine the extent of the harm to consumer privacy and safety, require Facebook to cease the collection and use of users’ biometric data without their affirmative and express opt-in consent, prohibit the deployment of further facial recognition techniques, delete all facial templates and biometric identifiers wrongly obtained, establish appropriate security safeguards, limit the disclosure of user information to third parties, and seek appropriate injunctive and compensatory relief. The following organizations signed on to the complaint: Electronic Privacy Information Center, Campaign for a Commercial Free Childhood, Center for Digital Democracy, Constitutional Alliance, Consumer Action, Consumer Federation of America, Consumer Watchdog, Cyber Privacy Project, Defending Rights & Dissent, Government Accountability Project, Patient Privacy Rights, Southern Poverty Law Center, and U.S. Public Interest Research Group.
Background
EPIC’s Previous Complaints on Facebook and Facial Recognition
EPIC has urged the Commission to prohibit Facebook’s facial recognition techniques on multiple occasions.
In June 2011, EPIC and a coalition of consumer organizations filed a complaint with the FTC alleging that Facebook’s covert deployment of its facial recognition technology was unfair and deceptive. EPIC stated that Facebook’s “Tag Suggestions” technique “converts the photos uploaded by Facebook users into an image identification system under the sole control of Facebook. This has occurred without the knowledge or consent of Facebook users and without adequate consideration of the risks to Facebook users.” EPIC warned that “unless the Commission acts promptly, Facebook will routinely automate facial identification and eliminate any pretense of user control over the use of their own images for online identification.” EPIC emphasized that the Commission’s “failure to act on pending consumer complaints concerning Facebook’s unfair and deceptive trade practices may have contributed to Facebook’s decision to deploy facial recognition.”
In December 2011, EPIC urged the Commission to strengthen its proposed settlement with Facebook by requiring it to “cease creating facial recognition profiles without users’ affirmative consent.” EPIC contended that while the Order’s broad prohibition on privacy misrepresentations already covered Facebook’s deceptive use of facial recognition, the Order should have been amended to proscribe the practice explicitly.
In January 2012, EPIC submitted extensive comments in response to the FTC’s workshop “Facing Facts: A Forum on Facial Recognition Technology.” EPIC again emphasized that Facebook’s facial recognition practice “entirely fails at informing users how their photo data will be used or to provide any meaningful consent for use,” as required by the Order. EPIC advised the Commission that “Commercial actors should not deploy facial techniques until adequate safeguards are established. As such safeguards have not yet been established, EPIC would recommend a moratorium on the commercial deployment of facial recognition techniques.”
2018 FTC Complaint
EPIC’s 2018 FTC complaint on Facebook and Facial Recognition is signed by a number of other consumer privacy organizations, including the Campaign for a Commercial Free Childhood, Center for Digital Democracy, Constitutional Alliance, Consumer Action, Consumer Federation of America, Consumer Watchdog, Cyber Privacy Project, Defending Rights & Dissent, Government Accountability Project, Patient Privacy Rights, Privacy Rights Clearinghouse, Southern Poverty Law Center, and U.S. Public Interest Research Group.
The complaint concerns recent changes in Facebook’s business practices that threaten user privacy and violate the 2011 Consent Order with the Federal Trade Commission. Facebook has begun to routinely scan photos posted by users for biometric facial matches without the consent of either the image subject or the person who uploaded the photo.
The complaint alleges that Facebook seeks to perfect its facial recognition techniques by enlisting Facebook users in the process of confirming their image identity. This automated, deceptive, and unnecessary identification of individuals undermines user privacy, ignores the privacy settings of Facebook users, and is contrary to law in many parts of the world. The Commission is required by law to undertake an investigation, to enjoin these unlawful practices, and to provide appropriate remedies to users of the service.
Facebook’s Deployment of Facial Recognition
On December 19, 2017, Facebook announced in a press release that its expanded facial recognition features would roll out to users in the United States in early 2018 on an opt-out basis. Facebook automatically enrolled users whose privacy settings enabled Tag Suggestions, a technology that Facebook had turned on by default for all users in 2013 without their knowledge or affirmative express consent (the company had 1.19 billion monthly active users at the time).
Facebook’s Tag Suggestions technique converts the photos uploaded by Facebook users into an image identification system under the sole control of Facebook. Many users remain unaware that Tag Suggestions was applied to them by default in 2013, or that they have the choice to opt out. Facebook’s reliance on this prior setting to infer consent for additional facial recognition practices, which give Facebook unprecedented control over facial templates, therefore reflects a disregard for consumer privacy and an urgent need for FTC intervention.
The change in business practice subjects consumers to secretive, unnecessary, and undesired facial recognition and to the disclosure of personal data to third parties without affirmative express consent or clear and prominent notice. It therefore violates the 2011 FTC Consent Order, which requires Facebook to “obtain consumers’ affirmative express consent before enacting changes that override their privacy preferences.”
Privacy Implications of Facebook Facial Scanning
Biometric data can be analyzed to produce sensitive inferences about personal traits, demographics, and behaviors. With more than 350 million photos uploaded to Facebook every day, Facebook’s unprecedented repository of facial recognition data puts the sensitive personal information of 2.13 billion users at risk of misuse by app developers, advertisers, data brokers, foreign state actors, and government agencies. The interests of approximately 214 million of those users fall within the jurisdiction of the United States Federal Trade Commission.
Facebook’s changes to its facial recognition techniques operate through an expansive machine learning system that automatically scans photos and notifies users when their biometric face print is detected in an image, even if the uploader has not tagged them. Users are notified so that they can “find” photos they appear in but have not been tagged in, as long as the photo’s privacy settings allow the user to view it as part of a Friends, Public, or Custom audience.
After the 2018 changes, Facebook users can identify themselves in an otherwise anonymous, untagged image. Serious privacy implications arise from enabling photo subjects to contact the uploader about a facial recognition notification: a stranger could learn an individual’s identity on Facebook merely by photographing them, even if they appear only in the background of an image. The unnecessary identification of individuals also reveals personal and private details about where the image subject was, who they were with, and at what time. This technology imperils both consumer privacy and physical safety, exposing users to harassment and stalking.
Prohibitions of Facebook Facial Recognition
Facebook’s extension of facial recognition technology excludes users in Canada and Europe, where regulators have imposed strict limits on how companies can collect and store biometric data. Facebook’s deployment of its facial recognition technology would be an illegal invasion of citizens’ privacy rights in Canada and Europe, yet it continues to proliferate in the U.S. without restriction.
Facebook’s invasive facial recognition techniques may also contravene several state privacy laws that prohibit or limit the collection, use, and dissemination of biometric data. Illinois has enacted the Biometric Information Privacy Act, under which Facebook faces a class action lawsuit over its Tag Suggestions technology. Texas and Washington also have biometric identifier laws that allow their state attorneys general to bring legal action against prohibited biometric data practices.
Violations of 2011 Consent Order
Facebook has violated Part I(A)-(B) of the Consent Order by misrepresenting the extent to which the user can control the privacy of biometric information, and the extent of Facebook’s collection and disclosure of the facial templates and photo comparison data to third parties.
Facebook has violated Part II(B) of the Consent Order by failing to obtain affirmative express consent before implementing business changes to facial recognition techniques. Any claim of inferred or continuing consent based on the user’s prior Tag Suggestions setting is invalid, as Facebook has never given users a choice to opt in to facial recognition.
Facebook’s recent notice to users about the changes to the extent of facial recognition does not conspicuously present an opt-out button; it merely links to a “Go to Settings” button. The announcement appeared at the top of the news feed as a text box but did not clearly and prominently identify itself as notice of a significant change to the user’s privacy settings.
Facebook did not persistently remind users to manage their privacy settings in light of this significant change in biometric data practices. The notice at the top of the news feed disappeared after the page was refreshed during the day.
Contrary to Part II(A) of the 2011 Consent Order, Facebook did not provide “clear and prominent” notice to users before disclosing users’ nonpublic information to third parties and materially exceeding the restrictions imposed by users’ privacy settings.
About FTC Consent Orders
The effectiveness of the FTC depends primarily upon the agency’s willingness to enforce the legal judgments it obtains. However, the FTC routinely fails to enforce its consent orders, which promotes industry disregard for the agency. Companies under consent decree have no incentive to protect consumer data if they do not expect the FTC to hold them accountable when they violate those decrees.
EPIC has routinely called attention to the numerous changes Facebook has made to its privacy settings without obtaining users’ affirmative consent, in violation of the terms of its FTC consent decree.
FTC Authority to Act
The Commission has a non-discretionary obligation to enforce a final order.
To date, the FTC has failed to take any action with respect to Facebook’s changes in biometric privacy practices. Critically, the Commission has not filed a lawsuit pursuant to the Federal Trade Commission Act, which states that the FTC “shall” obtain injunctive relief and recover civil penalties against companies that violate consent orders. 15 U.S.C. § 45(l).
The FTC has exclusive authority over the enforcement of its consent orders. The enforcement provision of the FTC Act, Section 5(l), makes clear that agency action is not discretionary; a violating party “shall forfeit” a penalty and be subject to an enforcement action.
The FTC is charged with performing a “discrete agency action.” A “discrete agency action” is a “final agency action” under the Administrative Procedure Act. “Agency action unlawfully withheld” is defined as “discrete agency action that [the agency] is required to take.”
Agency action is the “whole or part of an agency rule, order, license, sanction, relief, or the equivalent or denial thereof, or failure to act.” 5 U.S.C. § 551(13). Agency action, including a “failure to act,” is subject to judicial review.
EPIC may “compel agency action unlawfully withheld” pursuant to the Administrative Procedure Act. 5 U.S.C. § 706(1).
Legal Documents
- EPIC’s FTC Complaint in In re Facebook and Facial Recognition (filed June 10, 2011).
- EPIC’s Previous FTC Complaint in In re Facebook (filed May 5, 2010).
- EPIC’s Previous FTC Complaint in In re Facebook (filed December 17, 2009).
- EPIC’s Previous Supplemental Complaint in In re Facebook (filed January 14, 2010).
Resources
- In re Facebook and the Facial Identification of Users: Concerning Facebook’s covert biometric data collection, and the subsequent use of this data for online identification.
- In re Facebook: EPIC’s complaint focuses on the unfair and deceptive trade practices of Facebook with respect to sharing of user information with third-party application developers.
- In re Facebook II: EPIC’s complaint focuses on the unfair and deceptive trade practices of Facebook with respect to sharing of user information with third-party application developers.
- Enforcement of Privacy Laws: There are two main forms of enforcement in U.S. privacy laws: government enforcement, typically by an agency of relevant jurisdiction or State Attorneys General, and private right of action.
- Social Media Privacy: Too many social media platforms are built on excessive collection, algorithmic processing, and commercial exploitation of users’ personal data. That must change.