
DOJ and HUD File Statement on SafeRent Algorithm Bias

In a joint statement released by the Department of Justice (DOJ) and the Department of Housing and Urban Development (HUD), the two agencies announced that they have filed a Statement of Interest in Louis et al. v. SafeRent et al., a lawsuit currently pending in the U.S. District Court for the District of Massachusetts. The lawsuit alleges that the defendant’s use of an algorithm-based scoring system to screen tenants discriminates against Black and Hispanic rental applicants in violation of the Fair Housing Act (FHA).

The FHA prohibits discrimination in the sale and rental of housing on the basis of race, color, national origin, religion, sex, familial status, and disability. According to the complaint, the plaintiffs allege that SafeRent scores have a disparate impact on Black and Hispanic rental applicants because the underlying algorithm relies on factors that disproportionately disadvantage those applicants, such as credit history and non-tenancy-related debts, while failing to consider one highly relevant factor: that the use of HUD-funded housing vouchers makes such tenants more likely to pay their rent.

“Housing providers and tenant screening companies that use algorithms and data to screen tenants are not absolved from liability when their practices disproportionately deny people of color access to fair housing opportunities,” said Assistant Attorney General Kristen Clarke of the Justice Department’s Civil Rights Division. “This filing demonstrates the Justice Department’s commitment to ensuring that the Fair Housing Act is appropriately applied in cases involving algorithms and tenant screening software.”

“Algorithms are written by people. As such, they are susceptible to all of the biases, implicit or explicit, of the people that create them,” said U.S. Attorney Rachael S. Rollins for the District of Massachusetts. “As the housing industry and other professions adopt algorithms into their everyday decisions, there can be disparate impacts on certain protected communities. Stable and affordable housing provides a unique pathway to success, opportunity, and safety. We must fiercely protect the rights and protections promulgated in the Fair Housing Act. Today’s filing recognizes that our 20th-century civil rights laws apply to 21st-century innovations.”

“Tenant screening policies are not exempt from the Fair Housing Act’s protections just because decisions are made by algorithm,” said HUD General Counsel Damon Smith. “Housing providers and tenant screening companies must ensure that all policies that exclude people from housing opportunities, whether based on an algorithm or otherwise, do not have an unjustified disparate impact because of race, national origin, or another protected characteristic.”


The Fair Housing Act prohibits discrimination in housing on the basis of race, color, religion, sex, familial status (having one or more children under 18), national origin, and disability. More information about the Civil Rights Division and the laws it enforces is available at justice.gov/crt.