NIST Study Finds Extensive Bias in Face Surveillance Technology
December 20, 2019
A National Institute of Standards and Technology study of face recognition software found that false positives are up to 100 times more likely for Asian and African American faces than for White faces. NIST examined 189 software algorithms from 99 developers, a "majority of the industry," according to the federal agency. The highest false positive rates were found for African American females, which NIST says is "particularly important because the consequences could include false accusations." EPIC has called for a global moratorium on the use of face surveillance technology. The Public Voice declaration in support of the moratorium has been endorsed by more than 100 organizations and 1,000 individuals in over 40 countries.