ICE uses facial recognition services (FRS) to identify individuals through its investigatory branch, Homeland Security Investigations (HSI). The agency performs facial recognition searches against federal government databases, state databases such as driver's license photo repositories, and commercial databases compiled by third-party companies. Within the federal government, ICE can query at least two databases, the Automated Biometric Identification System (IDENT) and Homeland Advanced Recognition Technology (HART), and is in the process of connecting to databases at the FBI, DoD, and State Department as well. The agency also queries some state driver's license databases, including Maryland's.
Clearview AI is a provider of facial recognition technology. The company scrapes images from social media sites like Facebook and YouTube to compile a database of over three billion images, to which it sells access. The product compares a photo against Clearview's database and returns links to the social media accounts and other information associated with matching images. Clearview has received cease-and-desist letters from Facebook, Twitter, Google, and other companies for violating their terms of service by scraping images from those sites.
ICE's use of FRS is of interest to EPIC because of the risks created by facial recognition technology. Facial recognition technology can be deployed "covertly, even remotely, and on a mass scale," and there is little an individual can do to prevent collection of his or her image. Facial images and other types of biometric data are especially sensitive because, "unlike other means of identification . . . it cannot be changed." Abuse of facial recognition technology is of particular concern because it can enable comprehensive surveillance: both social media and public life now generate trails of images that can be used in facial recognition matching. Due to the sensitivity of this information, EPIC calls for a complete ban on face surveillance.
ICE released a Privacy Impact Assessment for its use of facial recognition services. The PIA identified 13 potential privacy risks arising from the use of FRS. To mitigate many of these risks, the PIA references training materials and guidance for HSI agents but does not provide details on those materials. ICE's policies are of particular concern after a data breach at CBP last year exposed over 184,000 facial recognition images.
EPIC is pursuing these FOIA requests to make public ICE's use of facial recognition, particularly its use of Clearview AI's controversial service, and to increase transparency about ICE's efforts to mitigate the privacy risks of FRS.