Facebook will remove 5,000 ad targeting categories to prevent discrimination

The company’s ad platform is still being heavily scrutinized for its targeting tools

Illustration by Alex Castro / The Verge

Facebook’s latest attempt at placating activists and lawmakers who say its advertising platform permits discrimination is to remove 5,000 options that can be used to exclude certain religious and ethnic minority groups.

The company’s response, outlined in a blog post today, arrives just a few days after the US Department of Housing and Urban Development (HUD) filed an official complaint against the company alleging it violated the Fair Housing Act. The HUD investigation findings were just the latest in a multi-month series of mea culpas from Facebook representatives and ongoing litigation from nonprofit groups. The HUD complaint also opened the door to a federal lawsuit, reports The Washington Post, perhaps prompting Facebook to take more definitive action.

The removal is scheduled to happen this fall, and it will prevent advertisers from excluding identifiers like “Native American culture,” “Islamic culture,” and “Buddhism,” among thousands of others, reports BuzzFeed. Those categories are based on optional behavior on the part of Facebook users, including likes and participation in groups and pages. However, they can serve as proxies for racially discriminating against certain users in ads for housing, employment, and other federally regulated markets, according to investigative journalists who have repeatedly bypassed Facebook’s safeguards to purchase test ads.

Facebook is taking steps following a federal complaint that may lead to a lawsuit

“We’re committed to protecting people from discriminatory advertising on our platforms. That’s why we’re removing over 5,000 targeting options to help prevent misuse,” reads Facebook’s blog post. “While these options have been used in legitimate ways to reach people interested in a certain product or service, we think minimizing the risk of abuse is more important. This includes limiting the ability for advertisers to exclude audiences that relate to attributes such as ethnicity or religion.”

In 2016, ProPublica began an extensive look at Facebook’s ad platform, initially revealing that Facebook’s targeting tools permitted advertisers to exclude black Americans from housing ads, which is a violation of federal law. This was possible because Facebook had historically offered targeting options under the name “ethnic affinity,” claiming that a user’s affinity for certain pages, groups, and other Facebook content did not technically correspond to their race or religious background. (The company has since renamed the targeting category from “ethnic affinity” to “multicultural affinity” and reclassified it as a behavior instead of a demographic.)

In numerous subsequent reports, ProPublica and other news outlets discovered that those same tools let advertisers exclude multiple ethnic groups, religious groups, and other protected classes from not just housing ads, but ads for employment and insurance. Investigations also found that by using ZIP codes and other targeting techniques, advertisers could even bypass relying on racial identifiers by engaging in what’s called redlining, an age-old discrimination technique that deprives certain geographic areas of resources. Even after Facebook claimed to address the problem, the discrimination tools remained intact and the company’s algorithmic and human review systems failed to catch obvious examples.

Facebook has tried and failed to fix the problem multiple times over the last 18 months

Earlier this year, Facebook permanently removed advertisers’ ability to use these multicultural affinity groups for any type of advertising, following a temporary ban on the activity last year. It also signed an agreement with Washington state in July pledging to scrub its platform of discriminatory targeting tools that went beyond race and religion and included veteran and military status, disability status, national origin, and sexual orientation.

Now, it looks like Facebook is going even further by removing thousands of additional identifiers across the board, and not just for ads in categories that invite federal oversight like housing and employment.

“We want to help educate advertisers about their obligations under our policies. For over a year, we have required advertisers we identify offering housing, employment or credit ads to certify compliance with our non-discrimination policy. In the coming weeks, this new certification will roll out gradually to US advertisers via our Ads Manager tool,” reads the blog post. “Advertisers will be required to complete this certification in order to continue advertising on Facebook. We’ve designed this education in consultation with outside experts to underscore the difference between acceptable ad targeting and ad discrimination.”