Meta Agrees to Alter Ad Technology in Settlement With US

SAN FRANCISCO – Meta agreed Tuesday to alter its ad technology and pay a penalty of $115,054 in a settlement with the Justice Department over claims that the company’s ad systems discriminated against Facebook users by restricting who was able to see housing ads on the platform based on their race, gender and ZIP code.

Under the agreement, Meta, the company formerly known as Facebook, said it would change its technology and use a new computer-assisted method that aims to regularly check whether those who are targeted and eligible to receive housing ads are, in fact, seeing those ads. The new method, which is referred to as a “variance reduction system,” relies on machine learning to ensure that advertisers are delivering housing-related ads to specific protected classes of people.
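To make the idea of “variance” concrete, here is a minimal sketch in Python of the kind of gap such a system might measure: the difference between the demographic make-up of the audience eligible for an ad and the audience that actually saw it. The function names, the sample data and the use of total variation distance are illustrative assumptions, not details of Meta’s actual system.

# Illustrative sketch only: measures the gap between the eligible
# audience for an ad and the audience that actually saw it.
# All names and the metric are assumptions, not Meta's implementation.
from collections import Counter

def demographic_distribution(users: list[str]) -> dict[str, float]:
    """Return the share of each demographic group in a list of users."""
    counts = Counter(users)
    total = sum(counts.values())
    return {group: n / total for group, n in counts.items()}

def audience_variance(eligible: list[str], delivered: list[str]) -> float:
    """Total variation distance between eligible and delivered audiences.
    0.0 means delivery mirrors eligibility; 1.0 means total skew."""
    p = demographic_distribution(eligible)
    q = demographic_distribution(delivered)
    groups = set(p) | set(q)
    return 0.5 * sum(abs(p.get(g, 0.0) - q.get(g, 0.0)) for g in groups)

# Example: the eligible audience is evenly split, but delivery skewed.
eligible = ["group_a"] * 50 + ["group_b"] * 50
delivered = ["group_a"] * 80 + ["group_b"] * 20
print(audience_variance(eligible, delivered))  # 0.3 -- a measurable skew

A system built on a measurement like this would aim to drive that number toward zero for protected classes, which is what “variance reduction” suggests.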

“Meta will – for the first time – change its ad delivery system to address algorithmic discrimination,” Damian Williams, the US attorney for the Southern District of New York, said in a statement. “But if Meta fails to demonstrate that it has sufficiently changed its delivery system to guard against algorithmic bias, this office will proceed with the litigation.”

Facebook, which has become a business colossus by collecting its users’ data and letting advertisers target ads based on the characteristics of an audience, has faced complaints for years that some of those practices are biased and discriminatory. The company’s ad systems have allowed marketers to choose who saw their ads from thousands of different characteristics, which also let those advertisers exclude people who fell under a number of protected categories, such as race, gender and age.

The Justice Department filed its suit against Meta and announced the settlement on Tuesday. In its suit, the agency said it had concluded that “Facebook could achieve its interests by maximizing its revenue and providing relevant ads to users through less discriminatory means.”

While the settlement pertains specifically to housing ads, Meta said it also plans to apply its new system to check the targeting of ads related to employment and credit. The company has previously faced blowback for allowing bias against women in job ads and excluding certain age groups from seeing credit card ads.

The issue of biased ad targeting has been particularly debated in housing ads. In 2016, Facebook’s potential for ad discrimination was revealed in an investigation by ProPublica, which showed that the company’s technology made it simple for marketers to exclude specific ethnic groups for advertising purposes.

In 2018, Ben Carson, who was the Secretary of the Department of Housing and Urban Development, announced a formal complaint against Facebook, accusing the company of having ad systems that “unlawfully discriminated” based on categories such as race, religion and disability. In 2019, HUD sued Facebook for engaging in housing discrimination and violating the Fair Housing Act. The agency said Facebook’s systems did not deliver ads to “a diverse audience,” even if an advertiser wanted the ad to be seen broadly.

“Facebook is discriminating against people based on who they are and where they live,” Mr. Carson said at the time. “Using a computer to limit a person’s housing choices can be just as discriminatory as slamming a door in someone’s face.”

The Justice Department’s lawsuit and settlement are based on HUD’s 2019 investigation and discrimination charge against Facebook.

In its own tests related to the issue, the US Attorney’s Office for the Southern District of New York found that Meta’s ad systems directed housing ads away from certain categories of people, even when advertisers were not aiming to do so. The ads were steered “disproportionately to white users and away from Black users, and vice versa,” according to the Justice Department’s complaint.

Many housing ads in neighborhoods where most of the people were white were also directed to white users, while housing ads in areas that were largely Black were shown mainly to Black users, the complaint added. As a result, the complaint said, Facebook’s algorithms “actually and predictably reinforce or perpetuate segregated housing patterns because of race.”

In recent years, civil rights groups have also been pushing back against the vast and complex advertising systems that underpin some of the largest Internet platforms. The groups have argued that those systems have inherent biases built into them, and that tech companies like Meta, Google and others should do more to combat those biases.

The area of study, known as “algorithmic fairness,” has been a significant topic of interest among computer scientists in the field of artificial intelligence. Leading researchers, including former Google scientists like Timnit Gebru and Margaret Mitchell, have sounded the alarm about such biases for years.

In the years since, Facebook has clamped down on the types of categories that marketers can choose from when purchasing housing ads, cutting the number down to hundreds and eliminating options to target based on race, age and ZIP code.

Chancela Al-Mansour, executive director of the Housing Rights Center in Los Angeles, said it was “essential” that “fair housing laws should be aggressively enforced.”

“Housing ads have become tools for unlawful behavior, including segregation and discrimination in housing, employment and credit,” she said. “Most users had no idea they were being targeted or denied housing ads based on their race and other characteristics.”

Meta’s new ad technology, which is still in development, will occasionally check who is being served ads for housing, employment and credit, and make sure those audiences match up with the people marketers want to target. If the ads being served skew heavily toward white men in their 20s, for example, the new system would theoretically recognize this and shift the ads to be served more equitably among broader and more varied audiences.
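The check-and-correct loop described above can be sketched simply: snapshot who received the ad, compare that against the intended audience, and nudge delivery weights toward under-served groups. The reweighting rule below is a hedged illustration of that idea, with hypothetical names and numbers; it is not Meta’s method.

# A hedged sketch of a periodic check-and-correct pass: raise delivery
# weight for groups seen less than intended, lower it for groups seen
# more. Illustrative only, not Meta's actual rebalancing rule.
def rebalance_weights(intended: dict[str, float],
                      observed: dict[str, float],
                      weights: dict[str, float],
                      step: float = 0.5) -> dict[str, float]:
    """Move each group's weight a fraction `step` of the gap between its
    intended share of the audience and its observed share."""
    new_weights = {}
    for group, target in intended.items():
        gap = target - observed.get(group, 0.0)  # positive => under-served
        new_weights[group] = max(0.0, weights.get(group, 1.0) + step * gap)
    return new_weights

# One pass: young white men are over-served relative to the intended mix.
intended = {"white_men_20s": 0.25, "other": 0.75}
observed = {"white_men_20s": 0.60, "other": 0.40}
weights = {"white_men_20s": 1.0, "other": 1.0}
print(rebalance_weights(intended, observed, weights))
# {'white_men_20s': 0.825, 'other': 1.175}

Run repeatedly, a loop like this would keep pulling the delivered audience back toward the intended one, which is the behavior the settlement requires Meta to demonstrate.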

“We’re going to be taking a snapshot of marketers’ audiences occasionally, seeing who they target, and removing as much variance as we can from that audience,” Roy L. Austin, Meta’s vice president of civil rights and a deputy general counsel, said in an interview. He called it “a significant technological advancement for how machine learning is used to deliver personalized ads.”

Meta said it would work with HUD over the coming months to incorporate the technology into Meta’s ad targeting systems, and agreed to a third-party audit of the new system’s effectiveness.

The company also said it would no longer use a feature called “special ad audiences,” a tool that helped advertisers expand the groups of people their ads would reach. The Justice Department said the tool also engaged in discriminatory practices. Meta said the tool was an early effort to combat biases, and that its new methods would be more effective.

The $115,054 penalty that Meta agreed to pay in the settlement is the maximum available under the Fair Housing Act, the Justice Department said.

“The public should know the latest abuse by Facebook was worth the same amount of money Meta makes in about 20 seconds,” said Jason Kint, chief executive of Digital Content Next, an association for premium publishers.

As part of the settlement, Meta did not admit to any wrongdoing.
