Meta settles lawsuit with Department of Justice over ad serving algorithms – TechCrunch

The US Department of Justice today announced that it has reached an agreement with Meta, Facebook’s parent company, to resolve a lawsuit alleging Meta engaged in discriminatory advertising in violation of the Fair Housing Act (FHA). The proposed settlement is subject to review and approval by a judge in the Southern District of New York, where the lawsuit was originally filed. But assuming it goes ahead, Meta said it has agreed to develop a new system for real estate listings and to pay a penalty of approximately $115,000, the maximum allowed by the FHA.

“When a company develops and deploys technology that deprives users of housing opportunities based in whole or in part on protected characteristics, it has violated the Fair Housing Act, just as when companies engage in discriminatory advertising using more traditional advertising methods,” U.S. Attorney Damian Williams said in a statement. “Because of this groundbreaking lawsuit, Meta will – for the first time – change its ad serving system to combat algorithmic discrimination. But if Meta fails to demonstrate that it has changed its delivery system enough to guard against algorithmic bias, this office will pursue litigation.”

The lawsuit was the Justice Department’s first challenge to algorithmic bias under the FHA. It claimed that the algorithms Meta uses to determine which Facebook users receive real estate ads relied in part on characteristics such as race, color, religion, sex, disability, marital status and national origin – all of which are protected by the FHA. External investigations have provided evidence supporting the Justice Department’s claims, including a 2020 paper from Carnegie Mellon showing that biases in Facebook’s advertising tools exacerbate socio-economic inequalities.

Meta said that as part of the agreement with the Department of Justice, it will stop using a real estate ad tool, Special Ad Audience, which allegedly relied on a discriminatory algorithm to find users who “resemble” other users based on FHA-protected characteristics. Meta will also develop a new system over the next six months to “address racial and other disparities caused by its use of personalization algorithms in its ad serving system for real estate listings,” according to a press release, and will implement the system by December 31, 2022.

An independent third-party reviewer will continually investigate and verify whether Meta’s new system meets the standards agreed upon by the company and the Department of Justice. If the Department of Justice concludes that the new system does not sufficiently address discrimination, the settlement will be terminated.

Meta must also inform the Department of Justice if it intends to add new targeting options.

In a blog post, Meta said its new system will apply to ads related to jobs and credit as well as housing — likely a response to criticism of the company’s ad targeting system in those areas as well. The aforementioned Carnegie Mellon study found that ads on Facebook related to credit cards, loans and insurance were disproportionately shown to men, while job ads were shown to a greater proportion of women. The co-authors also found that users who chose not to identify their gender, or who self-identified as non-binary or transgender, rarely, if ever, saw credit ads of any type.

US officials have long accused Meta of discriminatory ad targeting practices. In 2018, then-Secretary of the U.S. Department of Housing and Urban Development Ben Carson filed a formal complaint against Facebook over ad systems that “unlawfully discriminate” against users based on race, religion, and other categories. A separate lawsuit filed by the National Fair Housing Alliance in 2019, since settled, alleged that Meta provided discriminatory targeting options to advertisers.

Sharon D. Cole