Schoolwork Without Surveillance: The FTC’s Crackdown on Ed Tech Privacy Abuses
May 26, 2022
On May 19, the Federal Trade Commission unanimously voted to approve a Children’s Online Privacy Protection Act (COPPA) policy statement that warns against surveillance as a condition of accessing educational tools. The statement explains that “Children should not have to needlessly hand over their data and forfeit their privacy in order to do their schoolwork or participate in remote learning, especially given the wide and increasing adoption of ed tech tools.” The FTC emphasized COPPA’s prohibition against mandatory collection, limitations on the use and retention of data, and security requirements.
The education technology (ed tech) sector expanded dramatically during the COVID-19 pandemic as schools and students shifted to remote learning. Although ed tech can serve a valuable role in classroom settings, an alarming number of ed tech systems have been found to surveil children and exploit personal data. These tools can pose serious threats to students’ and children’s privacy through overcollection of personal information, reliance on biometric data, lack of security, opaque algorithms, and unreliable AI. The FTC’s statement highlights the special risks that ed tech tools present, explaining that “Concerns about data collection are particularly acute in the school context, where children and parents often have to engage with ed tech tools in order to participate in a variety of school-related activities.”
The Commission’s COPPA statement “demands enforcement of meaningful substantive limitations on operators’ ability to collect, use, and retain children’s data, and requirements to keep that data secure.” It lays out the FTC’s commitment to ensuring that ed tech tools “and their attendant benefits do not become an excuse to ignore critical privacy protections for children.”
In investigating potential COPPA violations, the Commission will focus on the requirements highlighted in its statement: the prohibition against mandatory collection, limitations on the use and retention of children’s data, and data security obligations.
What Else the FTC Can Do
EPIC urges the FTC to build on its COPPA policy statement and make full use of its authorities and resources to protect children’s privacy. While EPIC has long advocated for a dedicated data protection agency and for comprehensive privacy legislation, there is much the FTC can do in the meantime to protect children, students, and the public at large from abusive data practices.
- Enact a Data Minimization Rule
In January, EPIC and Consumer Reports published a white paper urging the FTC to use its broad unfairness authority to promulgate a data privacy rule. The FTC’s COPPA statement explained how targeted advertising can threaten children’s privacy: “The development of ever more sophisticated targeting practices, in some cases based on comprehensive collection of users’ activities across the Internet, has raised concerns that businesses might engage in harmful conduct and led to calls for strengthening children’s privacy protections.” Since the Commission revised the COPPA Rule in 2013, “companies’ information collection practices have continued to become more extensive, and concerns remain that children’s information may be used to target them.”
EPIC and Consumer Reports’ white paper proposes a data minimization rule that would prohibit all secondary data uses with limited exceptions. The paper further encourages the FTC to adopt data transparency obligations for primary uses of data; civil rights protections against discriminatory data processing; nondiscrimination rules, so that users cannot be charged for making privacy choices; data security obligations; access, portability, correction, and deletion rights; and a prohibition on the use of dark patterns with respect to data processing.
- Regulate Commercial Uses of AI
In February 2020, EPIC filed a petition with the FTC calling on the agency to conduct a rulemaking concerning the use of artificial intelligence in commerce. EPIC encourages the FTC to promulgate rules that would regulate AI-driven ed tech. “Given the scale of commercial AI use, the rapid pace of AI development, and the very real consequences of AI-enabled decision-making for consumers, the Commission should immediately initiate a rulemaking to define and prevent consumer harms resulting from AI,” EPIC urged in its petition. EPIC called on the FTC to enforce the AI standards established in the OECD AI Principles, the OMB AI Guidance, and the Universal Guidelines for AI. EPIC’s petition followed two prior EPIC complaints to the FTC about the use of AI in employment screening and the secret scoring of young athletes.
- Investigate the Data Practices of Ed Tech Firms
EPIC encourages the FTC to use its investigative powers to scrutinize the data practices of ed tech companies, including the online test proctoring firms implicated in EPIC’s complaint to the DC Attorney General.
- Develop and Leverage Unused Statutory Authorities
In 2021 EPIC released a report—What the FTC Could Be Doing (But Isn’t) to Protect Privacy—highlighting several statutory authorities that the FTC has failed to fully exercise to protect privacy. EPIC encourages the FTC to use all of the authorities in its statutory toolbox to address abusive data practices and to safeguard the privacy of students and children.
EPIC’s Prior Work on Children’s and Student Privacy
- In May 2022, EPIC Law Fellow Sara Geoghegan delivered comments commending the FTC’s new COPPA policy statement and encouraging the Commission to use all of its authorities to protect children’s and students’ privacy, including by promulgating a data minimization rule.
- In May 2021, EPIC’s work on its Student Privacy Project and online test proctoring complaint was selected for inclusion in the spring 2021 Tech Spotlight Casebook, a publication of the Harvard Kennedy School’s Belfer Center for Science and International Affairs. The casebook “recognizes projects and initiatives that demonstrate a commitment to public purpose in the areas of digital, biotech, and future of work.”
- In December 2020, EPIC filed a complaint with the D.C. Attorney General against five providers of online test proctoring software and services that use a range of tools to surveil or enable the surveillance of test-takers.
- In May 2020, EPIC and a coalition of consumer groups filed an FTC complaint against TikTok for violating COPPA.
- In December 2019, EPIC filed comments with the FTC concerning the agency’s regulatory review of COPPA. The FTC previously considered EPIC’s recommendations in a 2005 review of the COPPA Rule and incorporated several of EPIC’s recommendations in the 2013 regulations.
- In February 2019, EPIC and a coalition of consumer groups filed an FTC complaint charging that Facebook engaged in unfair and deceptive practices and violated COPPA after court documents from a 2012 class action lawsuit revealed that Facebook encouraged children to make credit card purchases on Facebook’s platform.
- In August 2018, the FTC unanimously voted to approve recommendations from EPIC and others to strengthen safeguards for children’s data in the gaming industry.
- In March 2018, EPIC filed an amicus brief in Jackson et al. v. McCurry et al. arguing that the Fourth Amendment restricts school administrators’ power to access a student’s cell phone without consent.
- In 2015, EPIC filed an amicus brief in Commonwealth v. White arguing against the warrantless seizure of students’ cell phones by law enforcement.
- In 2013, EPIC filed a FOIA lawsuit seeking records relating to the Department of Education’s use of private debt collectors.
- In 2011, EPIC filed an amicus brief in Chicago Tribune v. University of Illinois underscoring the privacy protections that FERPA provides for student records.
- In 2009, EPIC filed an FTC complaint against Echometrix, alleging that the company engaged in unfair and deceptive trade practices by representing that its software protected children online while simultaneously collecting and disclosing information about children’s online activity.
- In 2003, EPIC and 11 consumer organizations filed an FTC complaint alleging that Amazon illegally collected and disclosed children’s personal information in violation of COPPA.
- In 1996, EPIC Executive Director Marc Rotenberg testified before the House Judiciary Committee to highlight the “unique problems in the collection and disclosure of data about children that argues in favor of strong privacy protection.”