Washington Lawyer: Algorithmic Accountability: When Systems Get Things Wrong
May 1, 2024
Washington, D.C., for example, was using nearly 30 automated decision-making systems to surveil, screen, and score District residents in areas such as public benefits, health care, policing, and housing, according to an investigation by the Electronic Privacy Information Center (EPIC) in 2022. “Automated decision-making is threaded throughout a wide variety of public services in D.C.,” EPIC points out in its report “Screened and Scored in the District of Columbia.”
“They assign children to schools, inform medical decisions about patients, and impact policing decisions about where to patrol and whom to target,” the report states. For example, to help caseworkers determine who gets housing assistance first, the city was using the Vulnerability Index-Service Prioritization Decision Assistance Tool (VI-SPDAT), which EPIC’s investigation showed produced unintended racial disparities.
EPIC sent a copy of the report to D.C. Councilmembers, urging them to pass the Stop Discrimination by Algorithms Act, which seeks to prohibit discriminatory use of algorithms to determine an individual’s eligibility for or access to education, housing, or employment.