Whether a defendant who was identified using a facial recognition system is entitled to detailed discovery on the system and the specifics of how he was identified.
Facial recognition is an increasingly common tool used in police investigations to identify unknown suspects. But facial recognition systems make mistakes and have in several cases wrongly identified innocent individuals as perpetrators of crimes, leading to wrongful arrests and attempted prosecutions. At least three Black men have been wrongfully identified and arrested based on facial recognition searches; in two of those cases, subsequent photo lineups did not correct the error.
Police can choose from a wide variety of facial recognition systems, which show differing accuracy rates in controlled testing by the National Institute of Standards and Technology. Real-world accuracy tends to be lower than accuracy achieved in testing because police often run searches on low-quality and inherently flawed images. Despite being in use in some capacity for more than 20 years, there are few if any
On November 29, 2019, there was an armed robbery of a store in West New York, NJ. Francisco Arteaga was eventually identified as a suspect and charged with the robbery. Before Mr. Arteaga was identified, New Jersey police found that witnesses at the scene could not describe the perpetrator, and a facial recognition search run in-state by New Jersey’s fusion center, the Regional Operations Intelligence Center, returned no results. Police then turned out of state to the New York Police Department, which ran a facial recognition search using still images cropped from street security camera footage. The search returned Mr. Arteaga as one of the results, and the NYPD’s facial recognition analyst reported only him to the New Jersey police as a “potential match” for the security camera footage. Police then placed Mr. Arteaga’s picture in a photo lineup, where two witnesses eventually identified him, though the procedures used to conduct the lineups were deficient.
Mr. Arteaga requested detailed discovery on the facial recognition system the NYPD used to identify him, the original photo and any edits the NYPD made to it before running the search, and information on the analyst who performed the search that identified him. The New Jersey district court denied his motion to compel discovery.
EPIC, along with partners the Electronic Frontier Foundation (EFF) and the National Association of Criminal Defense Lawyers (NACDL), filed a brief explaining to the court how errors in facial recognition systems occur and how bias can arise in those systems, and arguing that discovery represents a defendant’s last chance to correct those errors. EPIC and its partners lay out the series of steps required to conduct a facial recognition search, each of which involves human decisions that can introduce variability in accuracy and increase the chance of a misidentification. The brief then argues that each facial recognition search presents a unique risk of misidentification that requires detailed discovery to assess, and that human review after a search is performed does not cure algorithmic mistakes. The brief also highlights known cases of misidentification and argues that discovery is the only way for defendants to understand the evidence presented against them.