Justice Department Reaches Groundbreaking Settlement Agreement with Meta Platforms, Formerly Known as Facebook, to Resolve Discriminatory Advertising Allegations


The Department of Justice today announced that it has obtained a settlement agreement resolving allegations that Meta Platforms Inc., formerly known as Facebook Inc., engaged in discriminatory advertising in violation of the Fair Housing Act (FHA). The proposed settlement resolves a lawsuit filed today in the U.S. District Court for the Southern District of New York alleging that Meta’s housing advertising system discriminates against Facebook users based on their race, color, religion, sex, disability, familial status and national origin. The settlement will not take effect until approved by the court.

Among other things, the complaint alleges that Meta uses algorithms to determine which Facebook users receive real estate ads, and that those algorithms rely, in part, on features protected by the FHA. This is the department’s first case challenging algorithmic bias under the Fair Housing Act.

Under the settlement, Meta will stop using an advertising tool for housing ads (known as the “Special Ad Audience” tool) which, according to the department’s complaint, relies on a discriminatory algorithm. Meta will also develop a new system to address racial and other disparities caused by its use of personalization algorithms in its ad delivery system for housing ads. That system will be subject to Department of Justice approval and court oversight.

This settlement marks the first time that Meta will be subject to judicial oversight for its targeting and ad serving system.

“As technology rapidly evolves, companies like Meta have a responsibility to ensure that their algorithmic tools are not used in a discriminatory manner,” said Assistant Attorney General Kristen Clarke of the Justice Department’s Civil Rights Division. “This settlement is historic, marking the first time that Meta has agreed to terminate one of its algorithmic targeting tools and modify its delivery algorithms for housing ads in response to a civil rights lawsuit. The Justice Department is committed to holding Meta and other technology companies accountable when they abuse algorithms in ways that unlawfully harm marginalized communities.”

“When a company develops and deploys technology that deprives users of housing opportunities based in whole or in part on protected characteristics, it has violated the Fair Housing Act, just as when companies engage in discriminatory advertising using more traditional advertising methods,” said U.S. Attorney Damian Williams for the Southern District of New York. “Because of this groundbreaking lawsuit, Meta will – for the first time – change its ad delivery system to address algorithmic discrimination. But if Meta fails to demonstrate that it has sufficiently changed its delivery system to guard against algorithmic bias, this office will proceed with litigation.”

“It is not just housing providers who have a duty to abide by fair housing laws,” said Demetria McCain, Principal Deputy Assistant Secretary for Fair Housing and Equal Opportunity at the Department of Housing and Urban Development (HUD). “Parties who discriminate in the housing market, including those engaging in algorithmic bias, must be held accountable. This type of behavior hurts us all. HUD appreciates its continued partnership with the Department of Justice as they seek to uphold our country’s civil rights laws.”

The United States’ Lawsuit

The United States’ complaint challenges three key aspects of Meta’s ad targeting and delivery system. Specifically, the department alleges that:

  • Meta enabled and encouraged advertisers to target their housing ads based on race, color, religion, sex, disability, familial status and national origin, determining which Facebook users would be eligible, and ineligible, to receive housing ads.
  • Meta created an ad targeting tool known as “Lookalike Audience” or “Special Ad Audience.” The tool uses a machine learning algorithm to find Facebook users who share similarities with groups of individuals selected by an advertiser using several options provided by Facebook. Facebook allowed its algorithm to consider FHA-protected characteristics — including race, religion, and sex — in finding Facebook users who “look like” the advertiser’s source audience and are therefore eligible to receive housing ads.
  • Meta’s ad delivery system uses machine learning algorithms that rely in part on FHA-protected characteristics — such as race, national origin, and sex — to help determine which subset of an advertiser’s targeted audience will actually receive a housing ad.

The complaint alleges that, through these three aspects of its advertising system, Meta targeted and delivered housing-related ads to some Facebook users while excluding others based on FHA-protected characteristics.

The department’s lawsuit alleges both disparate treatment and disparate impact discrimination. The complaint alleges that Meta is liable for disparate treatment because it intentionally classifies users based on FHA-protected characteristics and designs algorithms that rely on users’ FHA-protected characteristics. The department further alleges that Meta is liable for disparate impact discrimination because the operation of its algorithms affects Facebook users differently based on their membership in protected classes.

Settlement Agreement

The key features of the parties’ settlement agreement are as follows:

  • By Dec. 31, 2022, Meta must stop using the advertising tool for housing ads known as “Special Ad Audience” (previously called “Lookalike Audience”), which relies on an algorithm that, the United States alleges, discriminates on the basis of race, sex and other FHA-protected characteristics in identifying which Facebook users will be eligible to receive an ad.
  • Meta has until December 2022 to develop a new system for housing ads to address disparities of race, ethnicity and sex between advertisers’ targeted audiences and the group of Facebook users to whom Facebook’s personalization algorithms actually deliver the ads. If the United States concludes that the new system adequately addresses the discriminatory disparities introduced by Meta’s algorithms, then Meta will fully implement the new system by Dec. 31, 2022.
  • If the United States concludes that Meta’s changes to its ad delivery system do not adequately address the discriminatory disparities, the settlement agreement will terminate and the United States will litigate its case against Meta in federal court.
  • The parties will select an independent third-party reviewer to investigate and verify on an ongoing basis whether the new system meets the compliance standards agreed to by the parties. Under the agreement, Meta must provide the reviewer with any information necessary to verify compliance with these standards. The court will have ultimate authority to resolve disputes regarding what information Meta must disclose.
  • Meta will not provide any targeting options for housing advertisers that directly describe or relate to FHA-protected characteristics. Under the agreement, Meta must notify the United States if it intends to add any targeting options. The court will have authority to resolve any disputes between the parties regarding proposed new targeting options.
  • Meta must pay a civil penalty of $115,054 to the United States, the maximum penalty available under the Fair Housing Act.

The Justice Department lawsuit is based in part on an investigation and charge of discrimination by HUD, which found that all three aspects of Meta’s ad serving system violated the Fair Housing Act. When Facebook elected to have HUD’s charge heard in federal court, HUD referred the case to the Department of Justice for litigation.

This case is being handled jointly by the Justice Department’s Civil Rights Division and the U.S. Attorney’s Office for the Southern District of New York.

Assistant Attorney General Kristen Clarke and U.S. Attorney Damian Williams thanked the Department of Housing and Urban Development for its efforts in the investigation.

The Fair Housing Act prohibits discrimination in housing on the basis of race, color, religion, sex, familial status, national origin, and disability. More information about the Civil Rights Division and the laws it enforces is available at www.justice.gov/crt. More information about the U.S. Attorney’s Office for the Southern District of New York is available at www.justice.gov/usao-sdny. Individuals who believe they have been victims of housing discrimination may submit a report online at www.civilrights.justice.gov, or may contact the Department of Housing and Urban Development at 1-800-669-9777 or through its website at www.hud.gov.
