ITEM 10: How a Small Legal Aid Team Took on Algorithmic Black Boxing at Their State’s Employment Agency (And Won)

December 1, 2022 | Virginia Eubanks, EPIC Scholar in Residence

In June 2022, Legal Aid of Arkansas won a significant victory in its ongoing work to compel Arkansas’ employment agency to disclose crucial details about how it uses automated decision-making systems to detect and adjudicate fraud. The Arkansas Division of Workforce Services (DWS) asserted that records Legal Aid requested through FOIA were exempt from disclosure because: 1) the agency was acting in a law enforcement capacity when it investigated unemployment fraud; and 2) releasing the information would give a “competitive advantage” to “bad actors” over legitimate claimants. Legal Aid of Arkansas sued to compel release of the requested information. The Arkansas Supreme Court found the state’s arguments unconvincing, stating that “DWS is unequivocally not a law-enforcement agency” and that “applicants for unemployment benefits are not ‘competitors’ or ‘bidders’” whose material interests would be harmed by disclosing details of the agency’s algorithmic decision-making.

In August, I spoke with Legal Aid of Arkansas Senior Associate Attorney Trevor Hawkins and Director of Advocacy Kevin De Liban about what the decision means for unemployed workers and algorithmic transparency. The interview has been edited for length and clarity.

Can you tell us how the Arkansas Division of Workforce Services v. Legal Aid of Arkansas case came about?

Hawkins: This case began all the way back in mid- to late-2020, right at the height of the pandemic. We started getting a huge wave of phone calls from clients about really long processing delays, asking why they hadn’t received unemployment benefits yet, even though they filed in the spring.

We knew the state told a reporter that they were using an algorithm to process unemployment claims. That’s a red flag for us. The work that we’ve done in the past suggested that there was a good chance that algorithm was tied to the delays.

De Liban: We know algorithms generally mean pain for our clients, and that Arkansas has not demonstrated any interest in harm reduction.

Hawkins: We made a FOIA request. We got a copy of a contract between Arkansas and ProTech Solutions that said the company was hired by DWS to create an algorithm to make decisions about unemployment applications. So, we expanded our FOIA request using the specific word “algorithm.” That was in October of 2020.

The agency was dragging its feet, so in January 2021, we started pushing really hard. That led to us filing a lawsuit against DWS for violation of the Freedom of Information Act. We won and they were ordered to start producing information – tens of thousands of pages. Obviously, we knew that any sort of identifying information about claimants would be redacted. But there were these other redactions that were obviously not tied to personal identifying information. There were pages that were completely blacked out from top to bottom. DWS argued that they were exempt from disclosing this information because they are acting as a law enforcement agency when they go after fraud. So, we went back to court to argue that this was an absurd claim and to ask the court to compel disclosure of the redacted information.

The good news is that we won. In the June decision, the Arkansas Supreme Court finally told DWS that they have to provide us this information.

But the bad news is that the folks who were calling us in 2020, they were really suffering while we fought to compel the state to disclose. Families were trying to figure out how to pay bills, to make ends meet. They were getting evicted, living in their cars. They did not know how they were going to feed their children. That’s why we fought so hard to get as much information as we could as quickly as possible.

I know you can’t reveal client information, but can you share one or two stories that are characteristic of what they were facing?

Hawkins: Big picture? I’ve been doing this Legal Aid work for five years and late 2020 was the most difficult time I’ve ever had. The situations folks were going through, the things that they were expressing to us…there’s so many stories that come to mind. A single mother with two children who was about to have her car repo’ed. She was having to decide between feeding her children and making her car payment. Folks who were not able to pay their utility bill. Clients with thoughts of self-harm. Legal Aid received these phone calls in late 2020, and many of these folks didn’t receive benefits until late 2021. Some never got them. Many gave up.

The world was crashing down on folks during the pandemic. Unemployment, the one benefit that was supposed to keep that from happening, never came through for them, and that was largely attributable to an algorithm.

It’s interesting that the item that caused all the uproar in your FOIA request was Item 10, which asked for “All public records, including communications…between March 1, 2020 and present, that contain the words ‘algo’ or ‘algorithm.’” Can you say more about why you believe we need better information about how algorithms are working in public benefits?

De Liban: The whole process of agency decision-making is really weighted towards the agency. DWS has all this runway to develop systems while shielded from outside scrutiny. When you’re fighting these systems from the outside, you usually only get your chance after a new project is implemented. Which means that the ways we can fight are limited. We can sue. The public can try to organize around an issue and speak out about it. We can try to convince journalists that, “Hey, this is really bad.”

FOIA request follow-up letter from Legal Aid of Arkansas to Don Denton, General Counsel, Arkansas Division of Workforce Services, December 8, 2020

We can try to affect what levers of power there are. But you can only do that well when you have good information. There’s this enormous information asymmetry. The state may be making decisions based on terrible data. But the state doesn’t generally have to justify its decisions, especially in a political climate that is hostile towards public benefits.

Providing good information to people who are receiving benefits – for whom this process is totally opaque – is crucial. Access to information is the foundational element of any resistance. That’s why you have to fight so hard for it, always. And it’s always an uphill battle. The state already has so much advantage and inertia by the time you start resisting.

There are two interesting claims the Division of Workforce Services tried to make. First, the agency claimed it should qualify for a law enforcement exemption from FOIA disclosure because it investigates fraud. Second, it argued it should qualify for a competitive advantage exemption, because it needs to protect proprietary business secrets to guard against bad actors who might commit fraud. Can you say a little bit about those two claims?

Hawkins: The law enforcement exemption was the one they really held on to. That’s always the selling point for these algorithms – they help detect fraud at scale and it’s easier for the state. Despite this algorithm applying to every single applicant for unemployment benefits and making initial decisions about whether they’re eligible or not, the state tried to argue that the algorithm was solely implemented to detect fraud.

It’s troubling for a state administrative agency to claim that they are a law enforcement agency, that they are tasked with bringing criminal charges against claimants for unemployment benefits. DWS went as far as saying that they had the authority to criminally prosecute claimants.

In the decision, the Justice said that DWS is, unequivocally, not a law enforcement agency. But they fought hard on that. That’s the first time – as far as I can tell from my research – that a state employment agency has claimed to be law enforcement.

The competitive advantage exemption was a stretch from the beginning, because it is intended to protect bidders seeking to provide services to the state. DWS was trying to say, “Hey, there’s good and bad people applying for unemployment benefits. They’re competing for these funds. We’re stepping in to protect the good people. Releasing this information gives bad applicants competitive advantage over the good ones.”

This is an absurd suggestion. It was not supported by case law. The court rejected it. But the claim goes hand in hand with the idea that DWS is a law enforcement agency and that investigating fraud is the primary purpose of implementing this algorithm and keeping how it operates secret.

There are good reasons that economic support programs and law enforcement have traditionally been kept separate – in theory if not always in practice! Can you reflect on what it means that DWS is both claiming to be law enforcement and seeing their clients as competitors?

De Liban: It reflects underlying hostility regarding social welfare and economic security programs. In general, these benefits are already hard to get. Algorithms further weaponize the flaws of existing systems and make them harder to successfully navigate.

Agencies are savvy and they don’t want to give up information. They know there’s deference to law enforcement baked into freedom of information laws. Claiming to be law enforcement is a magic talisman to prevent any public accountability.

DWS wasn’t just trying to hide information about how the algorithm itself works. They weren’t saying, “We can’t tell you the eight factors the algorithm uses to determine whether or not your case is going to get flagged.” They redacted any information about anything related to the algorithm. They wanted zero transparency.

What’s next for you in this work?

Hawkins: The big overarching goal is to stop algorithms that lead to quantifiable harm to folks relying on public benefit safety nets. An intermediate step to that goal is more open dialogue among advocates and between advocates and state agencies so that this won’t be uncharted territory in the future. Time is of the essence for our clients, and we need resources to help advocates understand what’s going on and to take steps to address algorithmic harm.

De Liban: In collaboration with the National Health Law Program and Upturn, we recently launched the Benefits Tech Advocacy Hub. It’s a place where folks can come to learn how to fight these technologies in public benefit systems and also contribute to a growing community of other people involved in similar fights.

We’re trying to build a system that provides immediate support to public benefits recipients and advocates while building long term power by collecting knowledge, sharing advocacy ideas, trying out new approaches, and sharing the results. The Hub offers immediate strategies for stopping active benefits cuts, case studies about how technology is being used to administer public programs, frameworks for understanding how to intervene, and readings and other resources for understanding austerity politics, which underpin all the issues that arise in public benefits technology. 

Moving ahead, we’re also going to focus on changing incentive structures. If a state agency implements an algorithm that causes massive harm, they’re usually doing it with the blessing of an executive official, so they’re not going to face repercussions. They might face some public embarrassment or some bad media coverage, but agency heads don’t face any personal or monetary consequences for implementing a really devastating algorithm. Vendors don’t face any consequences for selling algorithms that cause widespread suffering. There’s a real disconnect between the power to implement these systems and accountability when they go wrong.

One way we can change those incentives is to challenge the qualified immunity of state agency officials, so that they face individual personal damages in situations where they flagrantly fail to protect the rights of people receiving benefits. We also need to pay more attention to the vendors that sell and implement these systems.

Is there anything you didn’t get a chance to say? Any final thoughts?

De Liban: All kinds of sectors – civil society, Legal Aid advocates, community organizers, benefits recipients, journalists, policy organizations – have to be prepared for legislative fights around FOIA. You can imagine a state legislature responding to victories like Trevor’s by creating a new FOIA exemption. Those are battles that we have to prepare for in the coming years.

Hawkins: That decision in June, it was just so emotional. It was such a long fight. Now that we have the information we were asking for, we hope to make some of it public, when we can. Our whole team is thankful this battle paid off – we won’t face those same issues if they turn on a new algorithm in the state of Arkansas. I genuinely hope word of these kinds of successes spreads across the country.
