EPIC Urges FTC to Investigate Airbnb’s Opaque and Unproven “Anti-Party” Algorithm

August 18, 2022

In a letter following up on its 2020 complaint against Airbnb, EPIC warned the Federal Trade Commission today about a newly announced Airbnb algorithm used to identify and block would-be renters who are deemed likely to throw a party. As with the “trustworthiness” algorithm targeted in EPIC’s original complaint, “the company has not disclosed the logic or full range of factors on which the system relies and has failed to establish that the system is accurate, fair, or free from the impermissible bias routinely exhibited by automated decision-making systems.”

If Airbnb’s “anti-party technology” determines that a user presents a high risk of hosting a party during their stay, the system may “prevent [the] reservation attempt from going through” and restrict the user to booking only a private room or hotel room. “The factors relied on by Airbnb’s ‘anti-party technology’ pose a high risk of disparate and unfair impact,” EPIC explained, because the system makes housing determinations based on age, length of stay, and distance from rental properties, “among many other[]” factors.

EPIC has filed complaints with the FTC and the D.C. Attorney General about opaque algorithms and unfair surveillance used in housing, education, and hiring; advocates for a ban on face surveillance; and routinely calls on state, federal, and international decisionmakers to privilege privacy and human rights over aggressive adoption of AI.
