New Housing Regulation Limits Disparate Impact Housing Claims Based on Algorithms

September 29, 2020

Individuals alleging that a landlord discriminated against them by using a tenant-screening algorithm will face a higher burden of proof under a new rule that went into effect last Thursday. The rule creates a defense to a discrimination claim under the Fair Housing Act where the “predictive analysis” tools used were not “overly restrictive on a protected class” or where they “accurately assessed risk.” Last October, EPIC and several other groups warned the Department of Housing and Urban Development that providing such a safe harbor for the use of algorithms in housing, without imposing transparency, accountability, or data protection requirements, would exacerbate harms to individuals subject to discrimination. The agency did modify its rule in response to comments from EPIC and others, removing complete defenses based on the use of an “industry standard” algorithm or on a showing that the algorithm was not the “actual cause” of the disparate impact. But the final rule simply replaces the word “algorithm” with “predictive analysis” and introduces vague “overly restrictive” and “accurate assessment” standards. The Alliance for Housing Justice called the rule “a vague, ambiguous exemption for predictive models that appears to confuse the concepts of disparate impact and intentional discrimination.” EPIC has called for greater accountability in the use of automated decision-making systems, including adoption of the Universal Guidelines for Artificial Intelligence (UGAI) and requirements for algorithmic transparency.
