FedScoop: Pipeline safety agency’s proposed pilot for ChatGPT in rulemaking raises questions  

September 5, 2023

“The idea that agencies will use a tool notorious for factual inaccuracies for development of rules that forbid arbitrary and capricious rule-making processes is concerning,” Ben Winters, an attorney and the leader of the AI and Human Rights project at the Electronic Privacy Information Center, said in an email to FedScoop. “Especially, the PHMSA, whose rules often concern potentially life-altering exposure to hazardous materials.” 

Winters questioned whether ChatGPT is an appropriate technology for the rulemaking process. He argued that relevance analysis could ultimately result in an agency missing a novel point it hadn't considered before, and added that sentiment analysis isn't a "relevant consideration" under the Administrative Procedure Act's rulemaking process. 

“[S]ummaries by ChatGPT are prone to factual inaccuracies and a limited and outdated corpus of information,” he said. “Most of these functions could not be reliably achieved by ChatGPT.” 

Read more here.
