EPIC Pushes NIST to Focus its Approach to Generative AI Risks Around Who and How AI Harms

June 4, 2024

This past weekend, EPIC submitted comments to the National Institute of Standards and Technology (NIST) on three draft documents the agency produced in response to President Biden’s AI Executive Order, offering recommendations on generative AI risk classification, AI transparency, and stakeholder engagement.

EPIC’s comments target NIST guidance on generative AI risk management, synthetic content risk mitigation, and global engagement on AI standards. First, EPIC recommended that NIST treat generative AI risks as extensions of the general AI risks outlined in the NIST AI Risk Management Framework and in the typologies of AI harm that EPIC uses in its own generative AI harms research; many of the same risk management techniques, such as data minimization, impact assessments, and AI red-teaming, are effective across different AI technologies. Second, EPIC provided specific feedback on the twelve generative AI risks identified in NIST’s draft documents, highlighting opportunities for NIST to refine its risk categories and to add a risk of data degradation caused by synthetic content contaminating AI training data. Third, EPIC cautioned that present limitations of AI labeling and watermarking techniques undermine their effectiveness as risk mitigation tools unless other measures are in place. Finally, EPIC highlighted the work of international civil society groups like Data Privacy Brazil to show why NIST must meaningfully engage with civil society, academia, and impacted communities throughout its global AI standards development process.

EPIC’s submission is the latest in a series of comments on NIST’s AI obligations under the Biden Administration and builds on previous comments to the OMB, NTIA, and other agencies on AI risks and transparency.
