In 2019, Facebook settled a lawsuit with civil rights organizations following the revelation that advertisers could use the targeting options on its platform to exclude specific demographic groups from seeing their ads. It’s now more difficult for an unscrupulous advertiser to use Facebook’s platform to discriminate.
However, even when human bias is removed from the targeting step, Facebook’s ad delivery algorithms can still produce biased outcomes. According to researchers at Northeastern University, Facebook sometimes displays ads to highly skewed audiences based solely on the content of the ad.
By purchasing ads with deliberately neutral targeting options, the researchers found that the algorithmically determined audience for job ads for cleaners, secretaries, nurses, and preschool teachers was mostly women, while job ads for fast food workers, supermarket cashiers, and taxi drivers skewed toward Black users.
As we show in the video above, this research shows that by targeting “relevant” users, these systems can reinforce existing disparities in our interests and our opportunities. Users who are comfortable with being stereotyped for their taste in shoes or music might not feel the same way about being stereotyped for job ads or political messages.
Open Sourced is made possible by Omidyar Network. All Open Sourced content is editorially independent and produced by our journalists.