Vendor of School-Based Face Surveillance Systems Lied About Bias, Accuracy

December 2, 2020

Documents obtained by Motherboard show that a key vendor of school-based facial recognition tools lied to school officials about the accuracy and racial bias of its surveillance product. The records reveal that SN Technologies’ AEGIS system misidentifies Black students at alarmingly high rates and mistakes objects like broom handles for guns. Despite these errors, at least one New York school district has configured the system to automatically alert police when it detects a weapon or an individual on the district’s watchlist. The use of face surveillance systems in schools increases unnecessary interactions between police and students and can accelerate the school-to-prison pipeline. SN Technologies’ algorithm was included in the 2019 NIST study that found extensive racial bias in face surveillance systems. EPIC advocates for a moratorium on facial recognition technologies and urges policymakers to increase algorithmic accountability and transparency around the adoption and use of these tools.
