Facebook settles with DOJ over discriminatory housing ads

Facebook owner Meta agreed to revamp the social network’s targeted advertising system under a sweeping settlement with the U.S. Justice Department, after the company was accused of allowing landlords to market their housing ads in discriminatory ways.

The settlement, which stems from a 2019 Fair Housing Act lawsuit brought by the Trump administration, is the second in which the company has agreed to change its ad systems to prevent discrimination. But Tuesday’s settlement goes further than the first, requiring Facebook to overhaul its powerful internal ad targeting tool, known as Lookalike Audiences. Government officials said the product enabled housing discrimination by allowing advertisers to target housing-related ads by race, gender, religion or other sensitive characteristics.

Under the settlement, Facebook will build a new automated advertising system that the company says will help ensure that housing-related ads are delivered to a more equitable mix of the population. The settlement requires the social media giant to submit the system to a third party for review. Facebook, which last year renamed its parent company Meta, also agreed to pay a $115,054 fee, the maximum penalty available under the law.

“This settlement is historic, marking the first time that Meta has agreed to terminate one of its algorithmic targeting tools and modify its delivery algorithms for housing ads in response to a civil rights lawsuit,” said Assistant Attorney General Kristen Clarke of the Justice Department’s Civil Rights Division.

Advertisers will still be able to target their ads to users in particular locations, though not by Zip code alone, and to users with a limited set of interests, according to Facebook spokesperson Joe Osborne.

Facebook Vice President of Civil Rights Roy Austin said in a statement that the company will use machine learning technology to distribute housing-related ads more equitably, taking into account the age, gender and probable race of users, regardless of how marketers targeted their ads.
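Meta has not published how that delivery system works. As a rough, hypothetical sketch of the general idea (the group names, baseline mix and all numbers below are invented for illustration), such a system could compare the demographic mix of users who have already seen an ad against the mix of the eligible audience, then re-weight delivery toward underrepresented groups:

```python
# Hypothetical sketch of the idea behind an "equitable delivery" system.
# Group names, the baseline mix and all numbers are invented; Meta has not
# published the internals of its actual system.

ELIGIBLE_BASELINE = {"group_a": 0.5, "group_b": 0.5}  # assumed eligible-audience mix

def delivery_weights(delivered_so_far):
    """Boost groups underrepresented among users who have already seen the ad,
    nudging the delivered mix back toward the eligible-audience baseline."""
    total = sum(delivered_so_far.values())
    weights = {}
    for group, target_share in ELIGIBLE_BASELINE.items():
        actual_share = delivered_so_far.get(group, 0) / total if total else target_share
        # Weight > 1 boosts an underrepresented group; < 1 suppresses an overrepresented one.
        weights[group] = target_share / actual_share if actual_share > 0 else 2.0
    return weights

# Example: the ad has so far reached group_a twice as often as group_b.
print(delivery_weights({"group_a": 800, "group_b": 400}))  # {'group_a': 0.75, 'group_b': 1.5}
```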

“Discrimination in housing, employment and credit is a deep-rooted problem with a long history in the US, and we are committed to broadening opportunities for marginalized communities in these spaces and others,” Austin said in a statement. “This type of work is unprecedented in the advertising industry and represents a significant technological advancement for how machine learning is used to deliver personalized ads.”

Federal law prohibits housing discrimination based on race, religion, national origin, gender, disability or family status.

The agreement follows a string of legal complaints against Facebook from the Justice Department, a state attorney general and civil rights groups, arguing that the company’s algorithmic marketing tools, which give advertisers a unique ability to target ads to thin slices of the population, have discriminated against minorities and other vulnerable groups in the areas of housing, credit and employment.

In 2019, Facebook agreed to stop allowing advertisers to use gender, age and Zip codes, which often act as proxies for race, to target housing, credit and job ads to its users. That change came after a probe by Washington state’s attorney general and a ProPublica report that found Facebook was letting advertisers use its microtargeting tools to conceal housing ads from African American users and other minorities. Afterward, Facebook said it would no longer let advertisers use the “ethnic affinities” category for housing, credit and employment ads.

But since the company agreed to those settlements, researchers have found that Facebook’s systems could continue to enable discrimination even when advertisers were banned from checking specific boxes for gender, race or age. In some instances, its software detects that people of a certain race or gender are clicking frequently on a specific ad, and then begins to reinforce that bias by delivering the ad to “look-alike audiences,” said Peter Romer-Friedman, a principal at the law firm Gupta Wessler PLLC.

The result could be that only men are shown a certain housing ad, even when the advertiser did not specifically try to show the ad only to men, said Romer-Friedman, who has brought several civil rights cases against the company, including the 2018 settlement in which the company agreed to limit its ad targeting categories.
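To see how such a loop can skew delivery without any explicit targeting, consider a toy simulation (the group labels and click rates below are invented): if each round’s impressions are reallocated in proportion to observed clicks, even a small engagement gap snowballs.

```python
# Hypothetical simulation of the feedback loop described above. The click
# rates and group labels are invented for illustration only.

CLICK_RATE = {"men": 0.05, "women": 0.04}  # a small underlying difference

def simulate(rounds=10):
    share = {"men": 0.5, "women": 0.5}  # start with an even delivery split
    for _ in range(rounds):
        # Expected clicks from each group given its current share of impressions.
        clicks = {g: share[g] * CLICK_RATE[g] for g in share}
        total = sum(clicks.values())
        # Reallocate the next round's impressions in proportion to clicks:
        # the optimizer "learns" one group engages more and shows it more ads.
        share = {g: clicks[g] / total for g in share}
    return share

print(simulate())  # after 10 rounds: men ~0.90, women ~0.10
```

In this toy model the simulated audience drifts to roughly 90 percent men after ten rounds, the kind of skew the settlement’s new delivery system is meant to detect and correct.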

Romer-Friedman said the settlement was a “huge achievement,” because it was the first time a platform was willing to make major changes to its algorithms in response to a civil rights lawsuit.

For years, Facebook has faced complaints from civil rights advocates and people of color, who argue that the company’s enforcement systems sometimes unfairly removed content in which people complained about discrimination. In 2020, the company submitted to an independent civil rights audit, which found that company policies were a “tremendous setback” for civil rights.
