Facebook’s ad delivery system still has gender bias, new study finds


Women are excluded from seeing some job listings ‘beyond what can be legally justified’

Photo: Michele Doying / The Verge

An audit by researchers at the University of Southern California found that Facebook’s ad delivery system discriminates against women, showing them different ads than it shows to men and excluding women from seeing some ads.

“Facebook’s ad delivery can result in skew of job ad delivery by gender beyond what can be legally justified by possible differences in qualifications,” the researchers wrote in their report, “thus strengthening the previously raised arguments that Facebook’s ad delivery algorithms may be in violation of anti-discrimination laws.”

The team of researchers bought ads on Facebook for delivery driver job listings with similar qualification requirements at different companies. The ads did not target any specific demographic. One was an ad for Domino’s pizza delivery drivers, the other for Instacart drivers. According to the researchers, Instacart has more female drivers while Domino’s has more male drivers. Sure enough, the study found that Facebook showed the Instacart delivery job to more women and the Domino’s delivery job to more men.

The researchers conducted a similar experiment on LinkedIn, where they found the platform’s algorithm showed the Domino’s listing to as many women as it showed the Instacart ad.

Two other pairs of similar job listings the researchers tested on Facebook revealed similar findings: a listing for a software engineer at Nvidia and a job for a car salesperson were shown to more men, while a Netflix software engineer job and a jewelry sales associate listing were shown to more women. Whether the algorithm had learned each job’s current demographics and targeted the ads accordingly is unclear, since Facebook is tight-lipped about how its ad delivery works.

“Our system takes into account many signals to try and serve people ads they will be most interested in, but we understand the concerns raised in the report,” Facebook spokesperson Tom Channick said in an email to The Verge. “We’ve taken meaningful steps to address issues of discrimination in ads and have teams working on ads fairness today. We’re continuing to work closely with the civil rights community, regulators, and academics on these important matters.”

This isn’t the first time research has found Facebook’s ad targeting system discriminating against some users, however. A 2016 investigation by ProPublica found that Facebook’s “ethnic affinities” tool could be used to exclude Black or Hispanic users from seeing specific ads. If such ads were for housing or job opportunities, the targeting could have been considered in violation of federal law. Facebook said in response it would bolster its anti-discrimination efforts, but a second ProPublica report in 2017 found the same problems persisted.

And in 2019, the US Department of Housing and Urban Development filed charges against Facebook for housing discrimination, after finding there was reasonable cause to believe Facebook had served ads in violation of the Fair Housing Act.

In its complaint, HUD said Facebook’s targeting tools were reminiscent of redlining practices: they allowed advertisers to exclude men or women from seeing particular ads, and included a map tool “to exclude people who live in a specified area from seeing an ad by drawing a red line around that area.” Facebook settled the lawsuit and said in 2019 it had dropped ad targeting options for housing and job ads.

Updated April 9th 11:53AM ET: Adds comment from Facebook spokesperson