Facebook will study whether its algorithms are racially biased

The new equity teams will study the company’s AI efforts

Illustration by James Bareham / The Verge

Facebook is forming new internal teams dedicated to studying its main social network and Instagram for racial bias, in particular whether its algorithms trained using artificial intelligence adversely affect Black, Hispanic, and other underrepresented groups. The team will be “tasked with ensuring fairness and equitable product development are present in everything we do,” a Facebook spokesperson tells The Verge. “We will continue to work closely with Facebook’s Responsible AI team to ensure we are looking at potential biases across our respective platforms.”

The new equity team — as it’s called within Instagram (a similar team is being formed to study Facebook’s main app and website) — marks a departure for the company, which has historically resisted internal efforts to study the effects of racial bias on its platforms. The news of Facebook’s new research teams was first reported on Tuesday by The Wall Street Journal.

“The racial justice movement is a moment of real significance for our company. Any bias in our systems and policies runs counter to providing a platform for everyone to express themselves,” Vishal Shah, the vice president of product for Instagram, said in a statement given to The Verge. “While we’re always working to create a more equitable experience, we are setting up additional efforts to continue this progress — from establishing the Instagram Equity Team to Facebook’s Inclusive Product Council.”

The company’s decisions today arrive as protests against racism and police brutality continue daily in American cities following George Floyd’s death in late May. Facebook is also enduring the final stages of a month-long advertising boycott under the #StopHateForProfit movement organized by the Anti-Defamation League, Color of Change, the NAACP, and other civil rights groups. The boycott has included large spenders like Coca-Cola, Disney, McDonald’s, Starbucks, and Walmart, among dozens of others, although CEO Mark Zuckerberg was called “disappointing” and accused of offering shallow excuses when he met with boycott organizers earlier this month.

Last month, Instagram CEO Adam Mosseri pledged to overhaul how the company tries to solve problems that Black and other underrepresented groups face on the platform. “It starts with accounting for the experiences and challenges that underrepresented groups, such as our Black community, face when they use Instagram,” Mosseri wrote. “We’ve done a lot of work to better understand the impact our platform has on different groups, and that’s helped us get to where we are today. But I think there’s more to do across some key areas, which fit into our broader company commitments.” Mosseri specifically called out harassment, account verification, distribution, and algorithmic bias as areas he thinks Instagram needs to improve on to better serve Black users.

Facebook has a rocky history with racial bias. The company was found to have been allowing advertisers to exclude certain minority groups when advertising within federally regulated markets like housing and jobs. Facebook often sidestepped the concerns using technicalities, such as classifying users not by race but by so-called “ethnic” or “multicultural” affinities, or self-selecting identity groups consistent with what a majority of Black, Hispanic, or other minorities might statistically like on the platform. Only after rigorous coverage from media outlets, most prominently ProPublica, did Facebook eventually disable ad targeting options for housing, job, and credit ads as part of a legal settlement with civil rights groups.

Black employees also make up less than 4 percent of the company’s workforce, and last year company officials barred employees from studying the racial effects of its platform without approval from the most senior leaders, according to the WSJ. As a result, Facebook software that automates moderation, like account suspensions on Instagram, has been found to disproportionately affect users the platform suspects are Black, the WSJ reports.

Facebook has pledged to donate $200 million to support Black-owned businesses in the wake of Black Lives Matter protests around the country. The company also said last month it plans to boost the percentage of Black and other underrepresented minorities composing its leadership team by 30 percent over the next five years, as well as trying to double the number of Black and Hispanic employees by 2023.

An 89-page civil rights audit conducted by law firm Relman Colfax over the course of two years and released on July 8th called for Facebook to study and eliminate racial bias in its AI-trained software and systems, although those conducting the audit were not permitted to study internal company research and AI models.