A pair of recent discoveries about Zoom's and Twitter's facial recognition algorithms highlights the discriminatory impact of such systems and reinforces EPIC's call for a moratorium on face surveillance. Technologist Colin Madland recently tweeted images showing that Zoom's facial recognition tool failed to detect a black colleague's face when he used a virtual background, even though it easily identified Madland's own face. Subsequent tweets in the same thread revealed that Twitter's image preview system also showed a strong bias toward centering cropped images on white faces over black faces. Twitter said it had previously tested the system for bias, but the company will now "open source [its] work so others can review and replicate." A 2019 NIST study of facial recognition algorithms from a majority of industry vendors found significant rates of racial bias. In addition to calling for a moratorium on face surveillance, EPIC advocates for algorithmic transparency and a comprehensive federal data privacy law.