Meta agrees to change ad system as part of US government deal

Facebook parent Meta has settled a lawsuit filed by the US government, which alleged the internet giant's machine-learning algorithms broke the law by blocking some users from viewing real estate listings online based on their nationality, race, religion, sex, and marital status.

Specifically, it was claimed Meta violated the US Fair Housing Act (FHA), which protects people seeking to buy or rent property from discrimination: under the law, it is illegal for landlords to refuse to sell or rent their homes, to advertise homes only to certain demographics, or to evict tenants, on the basis of those demographics.

This week, prosecutors sued Meta in New York, alleging the mega-corporation's algorithms discriminated against Facebook users by unfairly targeting people with real estate ads based on their "race, color, religion, sex, disability, familial status, and national origin."

Meta agreed to pay a $115,054 penalty to settle the case and, crucially, to modify its ad-targeting system. The US government could not impose a heavier fine, as that figure is the maximum penalty for violating the FHA; we suspect the main purpose of the lawsuit was to force the software change.

"When a company develops and deploys technology that denies users housing opportunities based in whole or in part on protected characteristics, it violates the FHA, just as when companies engage in discriminatory advertising using more traditional advertising methods," Damian Williams, US Attorney for the Southern District of New York, said in a statement.

"As technology rapidly evolves, companies like Meta have a responsibility to ensure their algorithmic tools are not used in a discriminatory manner," added Assistant Attorney General Kristen Clarke of the Justice Department's Civil Rights Division. "This settlement is historic, marking the first time Meta has agreed to terminate one of its algorithmic targeting tools and change its delivery algorithms for housing ads in response to a civil rights lawsuit."

Meta allowed advertisers to target specific users through its "Lookalike Audience" and "Special Ad Audience" tools, it was alleged. The software uses machine learning to group users from similar backgrounds together so that advertisers can serve ads to people of the same race, religion, or gender, and block those outside those groups from seeing them. Which would be a no-no.
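Neither the complaint nor Meta has published the tool's internals, but the general lookalike pattern is well understood: represent a seed audience as feature vectors, then rank everyone else by similarity. This minimal Python sketch, using entirely hypothetical feature data, shows why such a system can discriminate even without an explicit race or gender filter; if demographic signals correlate with the features, they leak into who gets selected.

```python
import numpy as np

def lookalike_audience(seed_vectors, candidate_vectors, k):
    """Rank candidates by cosine similarity to the seed audience centroid.

    A generic lookalike sketch, not Meta's actual model: production
    systems use learned embeddings rather than raw feature columns.
    """
    centroid = seed_vectors.mean(axis=0)
    centroid /= np.linalg.norm(centroid)
    scores = candidate_vectors @ centroid / np.linalg.norm(candidate_vectors, axis=1)
    return np.argsort(scores)[::-1][:k]  # indices of the k most seed-like users

# Made-up feature matrices. If any column correlates with a protected
# class -- say, a proxy for neighborhood demographics -- the "similar"
# users selected below will skew the same way, with no explicit race,
# religion, or gender filter needed to produce a skewed audience.
rng = np.random.default_rng(0)
seed = rng.normal(loc=1.0, size=(100, 8))          # an advertiser's existing customers
candidates = rng.normal(loc=0.0, size=(10_000, 8)) # the wider user base
audience = lookalike_audience(seed, candidates, k=500)
```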

The Special Ad Audience system was introduced as an alternative to the Lookalike Audience tool. As part of the deal with Uncle Sam, Meta has pledged to discontinue the Special Ad Audience system for housing-related ads as well as other sensitive categories, such as employment and credit ads.

Advertisers will have to comply with Meta's non-discrimination policies, and their ability to target specific users will be curtailed. For example, Meta has banned age and gender targeting for ads related to housing, jobs, and credit, and geo-targeting for those ads will be restricted to a minimum 15-mile radius, as the sketch below illustrates.
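Terms like these amount to a rules layer sitting in front of the ad server. Here is a hypothetical sketch of how such a guardrail might be enforced in code; the restricted categories and the 15-mile floor mirror the terms above, but the field names and logic are invented for illustration, not Meta's actual system.

```python
from dataclasses import dataclass, field

RESTRICTED_CATEGORIES = {"housing", "employment", "credit"}
MIN_RADIUS_MILES = 15.0  # per the settlement, geo-targets may not be narrower

@dataclass
class TargetingSpec:
    category: str                      # e.g. "housing"
    radius_miles: float | None = None  # geo-target radius, if any
    attributes: set[str] = field(default_factory=set)  # targeting dimensions

def validate(spec: TargetingSpec) -> list[str]:
    """Return the rule violations for a proposed ad-targeting spec."""
    errors = []
    if spec.category in RESTRICTED_CATEGORIES:
        banned = spec.attributes & {"age", "gender"}
        if banned:
            errors.append(f"{spec.category} ads may not target by {sorted(banned)}")
        if spec.radius_miles is not None and spec.radius_miles < MIN_RADIUS_MILES:
            errors.append(
                f"geo radius {spec.radius_miles} mi is below the "
                f"{MIN_RADIUS_MILES} mi minimum"
            )
    return errors

print(validate(TargetingSpec("housing", radius_miles=5, attributes={"age"})))
# ["housing ads may not target by ['age']", 'geo radius 5 mi is below the 15.0 mi minimum']
```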

"We're making this change in part to address feedback we've heard from civil rights groups, policymakers, and regulators about how our advertising system delivers certain categories of personalized ads, especially when it comes to fairness … Discrimination in housing, employment, and credit is a deep-rooted issue with a long history in the United States, and we are committed to expanding opportunities for marginalized communities in these spaces and others," Meta said in a statement. ®
