The U.S. Department of Justice today announced that it has entered into an agreement with Meta, Facebook’s parent company, to resolve a lawsuit alleging that Meta engaged in discriminatory advertising in violation of the Fair Housing Act (FHA). The proposed settlement is subject to review and approval by a district judge in the Southern District of New York, where the lawsuit was originally filed. If it moves forward, Meta has agreed to develop a new system for housing ads and to pay a roughly $115,000 penalty, the maximum fine under the FHA.
“When a company develops and deploys technology that deprives users of housing opportunities based in whole or in part on protected characteristics, it has violated the Fair Housing Act, just as when companies engage in discriminatory advertising using more traditional advertising methods,” U.S. Attorney Damian Williams said in a statement. “Because of this ground-breaking lawsuit, Meta will — for the first time — change its ad delivery system to address algorithmic discrimination. But if Meta fails to demonstrate that it has sufficiently changed its delivery system to guard against algorithmic bias, this office will proceed with the litigation.”
The lawsuit was the Justice Department’s first to challenge algorithmic bias under the FHA. It claimed that the algorithms Meta uses to determine which Facebook users receive housing ads relied in part on characteristics like race, color, religion, sex, disability, familial status, and national origin, all of which are protected under the FHA. Academic studies have provided evidence in support of the Justice Department’s claims, including a 2020 Carnegie Mellon paper showing that biases in Facebook’s ad platform exacerbated socioeconomic inequalities.
Meta said that, under the settlement with the Justice Department, it will stop using Special Ad Audience, an advertising tool for housing ads that allegedly relied on a discriminatory algorithm to find users who “look like” other users based on FHA-protected characteristics. Meta will also develop a new system over the next six months to “address racial and other disparities caused by its use of personalization algorithms in its ad delivery system for housing ads,” according to a press release, and implement that system by December 31, 2022.
An independent, third-party reviewer will investigate and verify on an ongoing basis whether Meta’s new system meets the standards agreed to by the company and the Justice Department. Meta must also notify the Justice Department if it intends to add any targeting options in the future.
If the Justice Department concludes that the new system doesn’t sufficiently address the discrimination, the settlement will be terminated.