1/ For several years, Facebook’s ad delivery algorithms have been known to introduce bias and create echo chambers in job, housing, and political ads. Our latest research shows that for job ad delivery, Facebook’s algorithms are not merely biased, but discriminatory under US law.

2/ We develop a new auditing methodology to distinguish skew introduced by job ad delivery algorithms that may be explainable by differences in qualifications (permissible by law) from skew due to other factors, such as the ad platform’s optimization for its business objectives.
3/ The idea is to simultaneously run ads for jobs with identical qualification requirements but gender distributions that are skewed in reality. The pairing controls for factors such as competition from other advertisers, allowing us to isolate the delivery algorithm’s role in reproducing biases.
4/ We apply our methodology to Facebook and LinkedIn using gender-neutral job ads for delivery drivers, software engineers, and sales associates, and find a statistically significant gender skew in delivery on Facebook for all three job categories.
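For readers curious what "statistically significant skew" between a pair of ads can look like, here is a minimal sketch using a standard two-proportion z-test. The counts and the choice of test are illustrative assumptions, not the paper's actual data or exact statistical procedure:

```python
import math

def two_proportion_z(men_a, total_a, men_b, total_b):
    """z-statistic for whether the male share of ad A's delivered
    audience differs from the male share of ad B's audience."""
    p_a = men_a / total_a
    p_b = men_b / total_b
    # pooled proportion under the null hypothesis of no difference
    p = (men_a + men_b) / (total_a + total_b)
    se = math.sqrt(p * (1 - p) * (1 / total_a + 1 / total_b))
    return (p_a - p_b) / se

# Hypothetical delivery counts for two simultaneously run,
# identically targeted job ads (e.g., paired job categories).
z = two_proportion_z(620, 1000, 480, 1000)
print(abs(z) > 1.96)  # significant at the 5% level
```

Because the paired ads run at the same time with the same targeting, a significant difference in delivered audiences cannot be blamed on advertiser choices or auction competition; it points at the delivery algorithm itself.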
5/ The gender skew in delivery persists even when an advertiser chooses to optimize for “reach”, i.e., aims to show the job ad to a broad audience rather than only to the people most likely to click it.
6/ Although there may be debate about how to allocate responsibility for discrimination between advertiser and platform when the advertiser asks to optimize for clicks, the responsibility for any discrimination observed in “reach” ads rests squarely with the ad platform.
7/ Our work shows that despite commitments made in response to prior studies, settlements, and a civil rights audit, Facebook hasn't made visible progress in addressing discrimination in its ad delivery algorithms. We call for meaningful algorithmic transparency to be mandated by law.
8/ Our paper https://ant.isi.edu/datasets/addelivery/Discrimination-Job-Ad-Delivery.pdf and presentation will appear later this month at @TheWebConf.