Court Blocks Rule That Would Okay Algorithmic Housing Decisions, Limit Discrimination Claims

October 29, 2020

A federal judge in Massachusetts has blocked a federal regulation that would have made it significantly harder to sue landlords and lenders for housing discrimination under the Fair Housing Act. The rule would have created a defense to any disparate impact claim in which a "predictive analysis" tool was used to make a housing decision, so long as that tool "accurately assessed risk" or was not "overly restrictive on a protected class." The court found that the regulation would "run the risk of effectively neutering disparate impact liability under the Fair Housing Act." In 2019, EPIC and others warned the Department of Housing and Urban Development that sanctioning the use of algorithms for housing decisions would exacerbate discrimination unless the agency imposed transparency, accountability, and data protection requirements. The Alliance for Housing Justice called the rule "a vague, ambiguous exemption for predictive models that appears to confuse the concepts of disparate impact and intentional discrimination." EPIC has called for greater accountability in the use of automated decision-making systems, including the adoption of the Universal Guidelines for Artificial Intelligence and requirements for algorithmic transparency.