In recent years, the growing use of surveillance technology and digital learning tools in school settings has posed new threats to student privacy. Although FERPA and statutes against unfair trade practices generally prohibit the misuse of students’ personal data, enforcement of these laws has often lagged, leaving students vulnerable to invasive data collection, exploitative uses of their personal information, and unfair and discriminatory automated decision-making systems. EPIC has filed complaints to defend students’ privacy and advocated for comprehensive legislation to protect students’ privacy rights.
FERPA and Other Student Privacy Statutes
The most significant federal statute concerning student privacy is the Family Educational Rights and Privacy Act (FERPA). FERPA protects the confidentiality of educational records while also giving students the right to review their own records. Enacted in 1974, FERPA remains the most recent federal student privacy legislation passed by Congress.
The Americans with Disabilities Act (ADA) prohibits disability discrimination by all public entities, including schools. The Department of Education’s Office for Civil Rights enforces the ADA along with other federal civil rights laws that protect students. The ADA requires all covered entities to provide reasonable accommodations to people with disabilities.
The Federal Trade Commission Act (FTC Act) prohibits unfair and deceptive trade practices, including the misuse of student data. The FTC enforces the FTC Act to protect consumers and competition. Each state has a consumer protection law that similarly prohibits unfair and deceptive acts and practices.
Invasive Systems in Schools
In recent years, schools have increasingly employed facial recognition and other biometric technologies to monitor students. These systems collect vast amounts of biometric and personal information, rely on opaque and unproven algorithms, and are often biased against students of color and students with disabilities. During the COVID-19 pandemic, the use of these digital surveillance tools increased dramatically, and their rapid expansion has exacerbated these inequities.

One such technology is online proctoring software. Online proctoring services purport to maintain academic integrity by detecting and flagging signs of cheating, and proctoring companies claim that they can find cheating reliably and consistently. Despite these claims, the efficacy of these systems remains unproven. Moreover, these systems are often inaccurate for students of color and students with disabilities, falsely flagging these students and subjecting them to further review. Students may have to provide a room scan to an unknown and unseen proctor, effectively being required to share footage of their bedrooms, homes, or personal spaces in exchange for their education. Students may also have to submit to facial recognition software, eye scans, keystroke logging, and other invasive technologies as an academic requirement. Students should not have to trade away their privacy or be subjected to privacy invasions in order to receive an education.
EPIC’s Work on Student Privacy
EPIC has advocated on behalf of students’ privacy and worked to protect students. On December 9, 2020, EPIC filed a complaint with the Office of the Attorney General for the District of Columbia alleging that five major providers of online test proctoring services have engaged in unfair and deceptive trade practices in violation of the D.C. Consumer Protection Procedures Act (DCCPPA) and the Federal Trade Commission Act. Specifically, EPIC’s complaint charges that Respondus, ProctorU, Proctorio, Examity, and Honorlock have engaged in excessive collection of students’ biometric and other personal data and have routinely relied on opaque, unproven, and potentially biased AI analysis to detect alleged signs of cheating.

On the same day, EPIC sent letters to all five firms warning that EPIC is prepared to bring suit under the DCCPPA unless the companies agree to limit their collection of personal data, comply with basic requirements for trustworthy AI, and submit to annual third-party audits. EPIC aims to ensure that students are not subjected to unfair, unreliable AI determinations or forced to choose between preserving their privacy and receiving an education.

Moreover, the fundamental fairness of online proctoring systems has been called into serious doubt. Research has shown that AI, particularly facial recognition systems, can encode bias and disproportionately harm students of color and students with disabilities. Test-takers subjected to AI-based proctoring have reported that the systems struggle to recognize faces of color and flag students with certain disabilities at higher rates.

EPIC’s complaint identifies four categories of unfair and deceptive trade practices that the five test proctoring companies have engaged in, each of which violates both the DCCPPA and the FTC Act: (1) Unfair and Deceptive Collection of Excessive Personal Data; (2) Unfair Use of Opaque, Unproven AI Systems; (3) Proctorio and Honorlock’s Deceptive Uses of Facial Recognition; and (4) Deceptive Claims About the Reliability of Test Proctoring Systems.
EPIC’s work on its Student Privacy Project and online test proctoring complaint was selected for inclusion in the spring 2021 Tech Spotlight Casebook, a publication of the Harvard Kennedy School’s Belfer Center for Science and International Affairs. The casebook “recognizes projects and initiatives that demonstrate a commitment to public purpose in the areas of digital, biotech, and future of work.” The book highlights EPIC’s recent efforts to halt the use of unfair, unreliable, and invasive remote proctoring tools and the D.C. consumer protection complaint EPIC filed against online proctoring firms. “Through meticulous research, the Student Privacy Project revealed the extent to which these companies collect and process student personal and biometric data,” the casebook explains. “The complaint attempts to hold the five companies accountable for their practices by demonstrating how the data collection and processing practices may violate existing law.” The casebook also recognizes recent work around census privacy protections, community control over police surveillance, racially biased speech recognition tools, and the use of “garbage” facial recognition to identify criminal suspects.
EPIC leads a campaign to ban face surveillance through the Public Voice coalition. In December 2020, New York enacted a law that suspends the use of facial recognition and other biometric technology by New York State schools. The ban will last for two years or until the State Education Department completes a study and finds that facial recognition technology is appropriate for use in schools, whichever occurs later.
Recent Documents on Student Privacy
US Court of Appeals for the Fourth Circuit
Whether police and school officials need to obtain a search warrant when searching a child’s phone in school for the purpose of generating evidence to be used against the child in criminal proceedings.
Concerning five providers of online test proctoring software and services that use a range of tools to surveil or enable the surveillance of test-takers.
US Court of Appeals for the Eleventh Circuit
Whether school administrators may access and search the contents of a student's cell phone without consent.
Massachusetts Supreme Judicial Court
Whether schools may turn over to the police a student's cell phone without a warrant.
US District Court for the District of Columbia
Seeking records relating to the Department of Education's use of private debt collection.
Disease and Data in Society: How the Pandemic Expanded Data Collection and Surveillance Systems
Alan Butler and Enid Zhou | 2021
Triggering Tinker: Student Speech in the Age of Cyberharassment
Ari Ezra Waldman | 2017
Privacy in Pandemic: Law, Technology, and Public Health in the COVID-19 Crisis
Tiffany C. Li | 2020
Learner Privacy in MOOCs and Virtual Education
Elana Zeide and Helen Nissenbaum | 2018