Justice Department and Meta settle historic housing discrimination case

Facebook owner Meta has agreed to revamp the social network’s targeted advertising system in a sweeping settlement with the U.S. Department of Justice, after the company was accused of allowing landlords to market their housing ads in a discriminatory manner.

The settlement, which stems from a 2019 Fair Housing Act lawsuit brought by the Trump administration, is the second in which the company has agreed to change its advertising systems to prevent discrimination. But Tuesday’s settlement goes further than the first, forcing Facebook to overhaul its powerful internal ad-targeting tool, known as Special Ad Audiences, which was built on its Lookalike Audiences technology. Government officials said the product enabled housing discrimination by allowing advertisers to target housing-related ads based on race, gender, religion and other sensitive characteristics.

As part of the settlement, Facebook will build a new automated advertising system that the company says will help ensure housing-related ads are served to a fairer mix of the population. The settlement requires the social media giant to submit the system to a third party for review. Facebook, which last year renamed its parent company Meta, also agreed to pay a fine of $115,054, the maximum penalty available under the law.

“This settlement is historic, marking the first time Meta has agreed to terminate one of its algorithmic targeting tools and change its delivery algorithms for housing ads in response to a civil rights lawsuit,” said Assistant Attorney General Kristen Clarke of the Justice Department’s Civil Rights Division.

According to Facebook spokesman Joe Osborne, advertisers will still be able to target their ads to users in particular locations, though not by zip code alone, and to users with a limited set of interests.

Facebook Vice President of Civil Rights Roy Austin said in a statement that the company will use machine learning technology to try to more evenly distribute who sees housing-related ads, regardless of how marketers targeted those ads, by taking into account the age, gender and likely race of users.

“Discrimination in housing, employment and credit is a deep-rooted issue with a long history in the United States, and we are committed to expanding opportunities for marginalized communities in these spaces and others,” Austin said. “This type of work is unprecedented in the advertising industry and represents a significant technological advancement in how machine learning is used to deliver personalized ads.”
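
Meta has not published how the new system works under the hood. As a rough illustration of the general idea described above, the hypothetical Python sketch below compares the demographic mix of users who actually saw a housing ad against the mix of the eligible audience, and computes boost factors for any group the ad under-reached. All names, figures and the adjustment rule are illustrative assumptions, not Meta’s implementation.

```python
# Hypothetical sketch only: Meta has not published its system, so the
# names, numbers and adjustment rule below are illustrative assumptions.

# Share of each demographic group in the eligible audience vs. the share
# actually reached by a housing ad (made-up figures).
ELIGIBLE_MIX = {"group_a": 0.50, "group_b": 0.30, "group_c": 0.20}
DELIVERED_MIX = {"group_a": 0.72, "group_b": 0.18, "group_c": 0.10}
TOLERANCE = 0.10  # assumed maximum acceptable relative shortfall


def delivery_adjustments(eligible, delivered, tolerance):
    """Return per-group boost factors for groups the ad under-reached."""
    boosts = {}
    for group, target in eligible.items():
        actual = delivered.get(group, 0.0)
        shortfall = (target - actual) / target  # relative under-delivery
        if shortfall > tolerance:
            # Upweight delivery to the under-reached group to close the gap.
            boosts[group] = target / max(actual, 1e-9)
    return boosts


print(delivery_adjustments(ELIGIBLE_MIX, DELIVERED_MIX, TOLERANCE))
# -> {'group_b': 1.67, 'group_c': 2.0} (approximately)
```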

Federal law prohibits housing discrimination based on race, color, religion, national origin, sex, disability or familial status.

The agreement follows a series of legal complaints against Facebook from the Justice Department, a state attorney general and civil rights groups, arguing that the company’s algorithmic marketing tools, which give advertisers a unique ability to target ads to thin slices of the population, have discriminated against minorities and other vulnerable groups in housing, credit and employment.

In 2019, Facebook agreed to stop allowing advertisers to use gender, age and zip codes, which often act as proxies for race, to market housing, credit and job listings to its users. The change came after an investigation by the Washington state attorney general and a report by ProPublica found that Facebook was letting advertisers use its microtargeting tools to conceal housing ads from African American and other minority users. Facebook subsequently said it would no longer let advertisers use the “ethnicity” category for housing, credit and employment ads.

But since the company agreed to those changes, researchers have found that Facebook’s systems can continue to perpetuate discrimination even when advertisers are barred from checking specific boxes for gender, race or age. In some cases, its software detects that people of a certain race or gender frequently click on a specific ad, and then begins to reinforce those biases by showing the ad to “similar audiences,” said Peter Romer-Friedman, a principal at the law firm Gupta Wessler PLLC.

The result can be that only men see a certain housing ad, even when the advertiser has not specifically tried to show the ad only to men, said Romer-Friedman, who has filed multiple civil rights lawsuits against the company, including a 2018 case in which the company agreed to limit ad-targeting categories.
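
To see how such a feedback loop can compound, consider the small simulation below. It is not Facebook’s code; the groups, click rates and allocation rule are invented for illustration. Two groups start with only a small difference in click-through rate, yet a delivery engine that allocates each round’s impressions in proportion to past clicks drifts steadily toward showing the ad mostly to the higher-clicking group, with no targeting choice by the advertiser.

```python
# Toy simulation of a click-driven delivery feedback loop.
# All rates and groups are invented assumptions for illustration.
import random

random.seed(0)

TRUE_CTR = {"men": 0.06, "women": 0.05}  # assumed small underlying gap
clicks = {"men": 1, "women": 1}          # smoothed click counts seen by the engine
IMPRESSIONS_PER_ROUND = 1_000

for round_no in range(1, 31):
    total = sum(clicks.values())
    # Reinforcement step: allocate impressions in proportion to past clicks,
    # a miniature version of "show this ad to people similar to past clickers."
    share = {group: clicks[group] / total for group in clicks}
    for group, frac in share.items():
        shown = int(IMPRESSIONS_PER_ROUND * frac)
        clicks[group] += sum(random.random() < TRUE_CTR[group] for _ in range(shown))
    if round_no % 10 == 0:
        print(f"after round {round_no}: share of impressions shown to men = {share['men']:.2f}")
```

In this toy model the men’s share of impressions climbs each round, because every extra impression produces extra clicks that further tilt the next allocation; the small initial gap compounds rather than washing out.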

Romer-Friedman said the settlement was a “huge achievement” as it was the first time a platform had been willing to make major changes to its algorithms in response to a civil rights lawsuit.

For years, Facebook has grappled with complaints from civil rights activists and people of color, who argue that the social network sometimes unfairly removes content in which people complain about discrimination. In 2020, the company submitted to an independent civil rights audit, which found that the company’s policies were a “huge setback” for civil rights.
