NY City Council Bills 1014-23 and 1024-23 Banning Biometric Surveillance in Places of Public Accommodation and Housing

May 5, 2023

New York City Council

Committee on Technology

Committee on Civil and Human Rights

New York City Hall

City Hall Park

New York, NY 10007

Re: Testimony of EPIC on Bills 1014-2023 and 1024-2023

Dear Chair Gutiérrez, Chair Williams, and Council Members,

EPIC writes to urge you to pass both Bill 1014 and Bill 1024 into law to protect New Yorkers from rapidly growing facial surveillance systems that destroy our privacy and harm society. We also urge you to add a private right of action to Bill 1024, and to ensure that both bills protect employees, delivery drivers, and other people who may not be covered by the language of the current bills.

The Electronic Privacy Information Center (EPIC) is a public interest research center established in 1994 to focus public attention on emerging privacy and civil liberties issues.[1] EPIC has long advocated for a ban on facial recognition and strict limits on the collection and use of biometric data.[2]

EPIC studies advanced surveillance technologies including facial recognition, the flaws in these systems, and their impacts on society. As advocates for privacy and civil liberties, we are impressed with the City Council’s proposed approach. Earlier this year, EPIC Senior Counsel Jeramie Scott urged the Council to pass a ban on facial recognition in places of public accommodation.[3]

Facial recognition is a dystopian technology, frequently flawed, and even more dangerous when it works perfectly.[4] The Council’s approach here is largely correct: a ban on use in places of public accommodation and apartment buildings is the only appropriate response to a technology that can destroy our privacy and effectively close off traditionally public spaces. 

We urge the Council to consider completely banning private use of facial recognition technology by businesses and landlords. These bills come very close but leave a few exceptions that permit using facial recognition to identify potential shoplifters and others who do not qualify as consumers under Bill 1014 or as tenants and guests under Bill 1024.

The protections for “consumers” in Bill 1014 may not ban the non-consensual use of facial recognition systems on employees, delivery drivers, emergency first responders, and other people who might enter a store without the intent to shop there. The safest way to avoid anyone being wrongfully surveilled by a facial recognition system is to ban the system itself.

Similarly, Bill 1024 leaves the door open to systems that monitor service entrances or are tailored to delivery drivers, contract workers, and other non-tenant visitors. Workers deserve equal privacy protections. And Bill 1024 lacks a private right of action, which is necessary for the bill to work as intended.

Private use of facial recognition, like the system deployed at Madison Square Garden, is not an effective method for providing security. Rather, it is a form of gatekeeping: an attempt to close businesses and housing to whomever the owner deems undesirable. At its worst, private use of facial recognition can enable discrimination by allowing comprehensive monitoring of businesses traditionally open to the public, with owner-curated hotlists of people to exclude. Such a practice has no place in a tolerant, democratic society.

Facial recognition systems like the one deployed at MSG will also create more comprehensive and nuanced records of our public movements. A system that identifies individuals when they enter a store effectively records their location and time of arrival. Those records may be accessible by police without a warrant and can also be sold to advertisers, data brokers, and anyone willing to pay for them. Although current city code prohibits such sales, city officials made clear to the Council at this hearing that the city is not tracking private use of facial recognition and cannot be relied upon to prevent all sales and abuses of biometric data.

And in contrast to most other forms of identification, and even most other forms of location tracking, biometric monitoring is effectively unavoidable. If a person wants more privacy, she can leave her phone at home, but she can’t leave her face behind. The same is true for her voice, the way she walks, and other potential ways to identify her based on her physical characteristics. 

Banning biometric surveillance in places of public accommodation and apartments will not hurt businesses; if anything, it will save them from spending money on advanced surveillance systems with no evidence-based record of reducing crime or preventing harm. At most, these systems may displace crime into the poorest and least-resourced neighborhoods; there is no evidence that surveillance technologies, including facial recognition, reduce crime overall.

We also want to commend the Council on the strong private right of action in Bill 1014. Especially in a city as large as New York, a private right of action is one of the strongest ways to ensure citizens' rights are protected. While you may hear concerns about a flood of litigation, the reality is that this bill is exceedingly easy to comply with and provides a very reasonable opportunity to cure violations. Lawsuits are likely to target only bad actors.

Finally, at the hearing the Council was rightly concerned with the NYPD's use and misuse of facial recognition technology. The NYPD has a deeply flawed track record of abusing facial recognition technology and openly flouting the very transparency laws intended to address such abuses.[5] EPIC and a coalition recently urged the Council to schedule a hearing specifically addressing the NYPD's repeated failures to comply with the Public Oversight of Surveillance Technologies (POST) Act.[6] We also urge the Council to review two reports from the Georgetown Center on Privacy and Technology that lay out in detail the ways that the NYPD ignores science and plays fast and loose with its facial recognition systems.[7] These abuses do not just impact New Yorkers. EPIC recently filed an amicus brief in New Jersey, where a man, Mr. Arteaga, was identified by the NYPD at the request of New Jersey police despite having no connection to New York whatsoever.[8] His public defenders received no discovery on the reliability of Mr. Arteaga's quite possibly erroneous identification, making an effective defense for a potentially innocent man difficult.

We urge the Council to protect citizens' privacy and guarantee equal access to important spaces by passing these bills. If possible, we urge the Council to amend them to include a complete ban on private use of facial recognition and other biometric monitoring technologies in these locations.

Thank you for the opportunity to testify. Please reach out with any questions to EPIC Senior Counsel Jeramie Scott at [email protected] or EPIC Counsel Jake Wiener at [email protected].


Jake Wiener


EPIC Counsel

Jeramie Scott


EPIC Senior Counsel

Director, Project on Surveillance Oversight

[1] EPIC, About EPIC, https://epic.org/epic/about.html.

[2] EPIC, Ban Face Surveillance, https://epic.org/campaigns/ban-face-surveillance/; see, e.g., Brief for EPIC as Amicus Curiae, Patel v. Facebook, 932 F.3d 1264 (9th Cir. 2019), https://epic.org/amicus/bipa/patel-v-facebook/.

[3] EPIC, EPIC to NYC Council: Take Action on Facial Recognition Now (Feb. 24, 2023), https://epic.org/epic-to-nyc-council-take-action-on-facial-recognition-now/.

[4] See EPIC, Face Surveillance and Biometrics, https://epic.org/issues/surveillance-oversight/face-surveillance/.

[5] See, e.g., EPIC Comments to the NYPD on POST Act Disclosures (Feb. 25, 2021), https://epic.org/documents/nypd-post-act-disclosures/.

[6] Coalition Letter to City Council Calling for Hearing on NYPD POST Act Violations (Apr. 13, 2023), https://epic.org/wp-content/uploads/2023/04/Coalition-Letter-NYPD-POST-Act-Violations-Apr2023.pdf.

[7] Clare Garvie, A Forensic Without the Science: Face Recognition in U.S. Criminal Investigations, Georgetown Center on Privacy and Technology (Dec. 6, 2022), https://www.law.georgetown.edu/privacy-technology-center/publications/a-forensic-without-the-science-face-recognition-in-u-s-criminal-investigations/; Clare Garvie, Garbage In, Garbage Out: Face Recognition on Flawed Data, Georgetown Center on Privacy and Technology (May 16, 2019), https://www.flawedfacedata.com.

[8] New Jersey v. Arteaga, Docket No. A-3078-21T1, Brief of EPIC, EFF, and NACDL as Amici Curiae, https://epic.org/documents/new-jersey-v-arteaga/.