ABA Urges Withdrawal of Algorithmic Safe Harbor Rule for Disparate Impact Claims in Housing

April 6, 2021

In September 2020, the Department of Housing and Urban Development released a final rule creating a defense to a disparate impact claim under the Fair Housing Act where a “predictive analysis” tool is not “overly restrictive on a protected class” or where it “accurately assessed risk.” Shortly afterward, a federal judge in Massachusetts blocked the rule, finding that the regulation would “run the risk of effectively neutering disparate impact liability under the Fair Housing Act.” Today, American Bar Association President Patricia Lee Refo urged the agency to “act immediately to withdraw the 2020 FHA Rule and to adopt new guidance and a new rule to ensure the danger of algorithmic bias is adequately tackled.” When the rule was first announced, EPIC and several other groups warned HUD that providing such a safe harbor for the use of algorithms in housing, without imposing transparency, accountability, or data protection requirements, would exacerbate harms to individuals subject to discrimination. EPIC has long called for greater accountability in the use of automated decision-making systems, including the adoption of the Universal Guidelines for Artificial Intelligence (UGAI) and requirements for algorithmic transparency.
