Schoolwork Without Surveillance: The FTC’s Crackdown on Ed Tech Privacy Abuses

May 26, 2022 | Sara Geoghegan, EPIC Law Fellow

On May 19, the Federal Trade Commission unanimously voted to approve a Children’s Online Privacy Protection Act (COPPA) policy statement that warns against surveillance as a condition of accessing educational tools. The statement explains that “Children should not have to needlessly hand over their data and forfeit their privacy in order to do their schoolwork or participate in remote learning, especially given the wide and increasing adoption of ed tech tools.” The FTC emphasized COPPA’s prohibition against mandatory collection, limitations on the use and retention of data, and security requirements.

The education technology (ed tech) sector expanded dramatically during the COVID-19 pandemic as schools and students shifted to remote learning. Although ed tech can serve a valuable role in classroom settings, an alarming number of ed tech systems have been found to surveil children and exploit personal data. These tools can pose serious threats to students’ and children’s privacy through overcollection of personal information, reliance on biometric data, lack of security, opaque algorithms, and unreliable AI. The FTC’s statement highlights the special risks that ed tech tools present, explaining that “Concerns about data collection are particularly acute in the school context, where children and parents often have to engage with ed tech tools in order to participate in a variety of school-related activities.”

The Commission’s COPPA statement “demands enforcement of meaningful substantive limitations on operators’ ability to collect, use, and retain children’s data, and requirements to keep that data secure.” It lays out the FTC’s commitment to ensuring that ed tech tools “and their attendant benefits do not become an excuse to ignore critical privacy protections for children.”

In investigating potential COPPA violations, the Commission will focus on:

Prohibition Against Mandatory Collection: COPPA-covered companies, including ed tech providers, must not condition participation in any activity on a child disclosing more information than is reasonably necessary for the child to participate in that activity. These businesses cannot bar students from an ed tech activity for declining to provide information beyond what is reasonably needed to administer their participation in that activity. For example, if an ed tech provider does not reasonably need to be able to email students, it cannot condition students’ access to schoolwork on providing their email addresses. Students must not be required to submit to unnecessary data collection in order to do their schoolwork.

Use Prohibitions: COPPA-covered companies, including ed tech providers, are strictly limited in how they can use the personal information they collect from children. For example, operators of ed tech that collect personal information pursuant to school authorization may use such information only to provide the requested online education service. In this context, ed tech companies are prohibited from using that information for marketing, advertising, or any other commercial purpose unrelated to the provision of the school-requested online service.

Retention Prohibitions: COPPA-covered companies, including ed tech providers, must not retain personal information collected from a child longer than reasonably necessary to fulfill the purpose for which it was collected. It is unreasonable, for example, for an ed tech provider to retain children’s data for speculative future potential uses.

Security Requirements: COPPA-covered companies, including ed tech providers, must have procedures to maintain the confidentiality, security, and integrity of children’s personal information. For example, even absent a breach, COPPA-covered ed tech providers violate COPPA if they lack reasonable security.

What Else the FTC Can Do

EPIC urges the FTC to build on its COPPA policy statement and make full use of its authorities and resources to protect children’s privacy. While EPIC has long advocated for a dedicated data protection agency and for comprehensive privacy legislation, there is much the FTC can do in the meantime to protect children, students, and the public at large from abusive data practices.

  • Enact a Data Minimization Rule

    In January, EPIC and Consumer Reports published a white paper urging the FTC to use its broad unfairness authority to promulgate a data privacy rule. The FTC’s COPPA statement explained how targeted advertising can threaten children’s privacy: “The development of ever more sophisticated targeting practices, in some cases based on comprehensive collection of users’ activities across the Internet, has raised concerns that businesses might engage in harmful conduct and led to calls for strengthening children’s privacy protections.” Since the Commission revised the COPPA Rule in 2013, “companies’ information collection practices have continued to become more extensive, and concerns remain that children’s information may be used to target them.”

    EPIC and Consumer Reports’ white paper proposes a data minimization rule that would prohibit all secondary data uses, with limited exceptions. The paper further encourages the FTC to adopt data transparency obligations for primary uses of data; civil rights protections against discriminatory data processing; nondiscrimination rules, so that users cannot be charged for making privacy choices; data security obligations; access, portability, correction, and deletion rights; and a prohibition on the use of dark patterns with respect to data processing.
  • Regulate Commercial Uses of AI

    In February 2020, EPIC filed a petition with the FTC calling on the agency to conduct a rulemaking concerning the use of artificial intelligence in commerce. EPIC encourages the FTC to promulgate rules that would regulate AI-driven ed tech. “Given the scale of commercial AI use, the rapid pace of AI development, and the very real consequences of AI-enabled decision-making for consumers, the Commission should immediately initiate a rulemaking to define and prevent consumer harms resulting from AI,” EPIC urged in its petition. EPIC called on the FTC to enforce the AI standards established in the OECD AI Principles, the OMB AI Guidance, and the Universal Guidelines for AI. The petition followed two earlier EPIC complaints to the FTC about the use of AI in employment screening and the secret scoring of young athletes.
  • Investigate the Data Practices of Ed Tech Firms

    EPIC encourages the FTC to use its investigative powers to scrutinize the data practices of ed tech companies, including the online test proctoring firms implicated in EPIC’s complaint to the DC Attorney General.
  • Develop and Leverage Unused Statutory Authorities

    In 2021 EPIC released a report—What the FTC Could Be Doing (But Isn’t) to Protect Privacy—highlighting several statutory authorities that the FTC has failed to fully exercise to protect privacy. EPIC encourages the FTC to use all of the authorities in its statutory toolbox to address abusive data practices and to safeguard the privacy of students and children.

EPIC’s Prior Work on Children’s and Student Privacy

EPIC has long fought for legal safeguards for children’s privacy and student privacy.

  • In May 2021, EPIC’s work on its Student Privacy Project and online test proctoring complaint was selected for inclusion in the spring 2021 Tech Spotlight Casebook, a publication of the Harvard Kennedy School’s Belfer Center for Science and International Affairs. The casebook “recognizes projects and initiatives that demonstrate a commitment to public purpose in the areas of digital, biotech, and future of work.”
  • In December 2020, EPIC filed a complaint with the D.C. Attorney General against five providers of online test proctoring software and services that use a range of tools to surveil or enable the surveillance of test-takers.
  • In May 2020, EPIC and a coalition of consumer groups filed an FTC complaint against TikTok for violating COPPA.
  • In February 2019, EPIC and a coalition of consumer groups filed an FTC complaint charging that Facebook engaged in unfair and deceptive practices and violated COPPA after court documents from a 2012 class action lawsuit revealed that Facebook encouraged children to make credit card purchases on Facebook’s platform.
  • In August 2018, the FTC unanimously voted to approve recommendations from EPIC and others to strengthen safeguards for children’s data in the gaming industry.
  • In March 2018, EPIC filed an amicus brief in Jackson et al. v. McCurry et al. arguing that the Fourth Amendment restricts school administrators’ power to access a student’s cell phone without consent.
  • In 2015, EPIC filed an amicus brief in Commonwealth v. White arguing against the warrantless seizure of students’ cell phones by law enforcement.
  • In 2013, EPIC filed a FOIA lawsuit seeking records relating to the Department of Education’s use of private debt collectors.
  • In 2009, EPIC filed an FTC complaint against Echometrix, alleging that the company engaged in unfair and deceptive trade practices by representing that its software protected children online while simultaneously collecting and disclosing information about children’s online activity.
  • In 2003, EPIC and 11 consumer organizations filed an FTC complaint alleging that Amazon illegally collected and disclosed children’s personal information in violation of COPPA.
  • In 1996, EPIC Executive Director Marc Rotenberg testified before the House Judiciary Committee to highlight the “unique problems in the collection and disclosure of data about children that argues in favor of strong privacy protection.”
