SAN FRANCISCO – Meta agreed to alter its ad-targeting technology and pay a penalty of $115,054 on Tuesday, in a settlement with the Justice Department over claims that the company engaged in housing discrimination by letting advertisers restrict who was able to see ads on its platform based on their race, gender and ZIP code.
Under the agreement, Meta, the company formerly known as Facebook, said it would change its technology and use a new computer-aided method that aims to regularly check whether the audiences who are targeted and eligible to receive housing ads are, in fact, seeing those ads. The new method, referred to as a “variance reduction system,” relies on machine learning to ensure that advertisers are delivering housing-related ads to specific protected classes of people.
Meta also said it would no longer use a feature called “special ad audiences,” a tool that helped advertisers expand the groups of people their ads would reach. The company said the tool was an early effort to combat biases, and that its new methods would be more effective.
“We’re going to be taking a snapshot of marketers’ audiences occasionally, seeing who they target, and removing as much variance as we can from that audience,” Roy L. Austin, Meta’s vice president of civil rights and a deputy general counsel, said in an interview. He called it “a significant technological advancement for how machine learning is used to deliver personalized ads.”
Facebook, which became a business colossus by collecting its users’ data and letting advertisers target ads based on the characteristics of an audience, has faced complaints for years that some of those practices are biased and discriminatory. The company’s ad systems have allowed marketers to choose who saw their ads by using thousands of different characteristics, which also let those advertisers exclude people who fall under a number of protected categories.
While Tuesday’s settlement pertains to housing ads, Meta said it also plans to apply its new system to ads related to employment and credit. The company has faced blowback for allowing bias against women in job ads and for excluding certain groups of people from seeing credit card ads.
“Because of this groundbreaking lawsuit, Meta will – for the first time – change its ad delivery system to address algorithmic discrimination,” Damian Williams, a U.S. attorney, said in a statement. “But if Meta fails to demonstrate that it has sufficiently changed its delivery system to guard against algorithmic bias, this office will proceed with the litigation.”
The issue of biased ad targeting has been particularly debated in housing ads. In 2018, Ben Carson, who was the secretary of the Department of Housing and Urban Development at the time, announced a formal complaint against Facebook, accusing the company of having ad systems that “unlawfully discriminated” based on categories such as race, religion and disability. Facebook’s potential for ad discrimination was also revealed in a 2016 investigation by ProPublica, which showed that the company made it simple for marketers to exclude specific ethnic groups for advertising purposes.
In 2019, HUD sued Facebook for engaging in housing discrimination and violating the Fair Housing Act. The agency said Facebook’s systems did not deliver ads to “a diverse audience,” even if an advertiser wanted the ad to be seen broadly.
“Facebook is discriminating against people based on who they are and where they live,” Mr. Carson said at the time. “Using a computer to limit a person’s housing choices can be just as discriminatory as slamming a door in someone’s face.”
The HUD suit came amid a broader push from civil rights groups claiming that the vast and complex advertising systems that underpin some of the largest internet platforms have biases built into them, and that tech companies like Meta, Google and others should do more to bat back those biases.
The area of study, known as “algorithmic fairness,” has been a significant topic of interest among computer scientists in the field of artificial intelligence. Leading researchers, including former Google scientists like Timnit Gebru and Margaret Mitchell, have sounded the alarm on such biases for years.
In the years since, Facebook has clamped down on the types of categories that marketers can choose from when purchasing housing ads, cutting down the number to hundreds and eliminating options based on race, age and ZIP code.
Meta’s new system, which is still in development, will occasionally check on who is being served ads for housing, employment and credit, and make sure those audiences match up with the people marketers want to reach. If the ads being served skew heavily toward white men in their 20s, for example, the new system would theoretically recognize this and shift the ads to be served more equitably among broader and more varied audiences.
Meta said it would work with HUD over the coming months to incorporate the technology into its ad targeting systems, and agreed to a third-party audit of the new system’s effectiveness.
The penalty that Meta is paying in the settlement is the maximum available under the Fair Housing Act, the Justice Department said.