The enormous financial success of online advertising platforms is partially due to the precise targeting features they offer. Although researchers and journalists have found many ways that advertisers can target—or exclude—particular groups of users seeing their ads, comparatively little attention has been paid to the implications of the platform's ad delivery process, which comprises the platform's choices about which users see which ads. It has been hypothesized that this process can "skew" ad delivery in ways that advertisers do not intend, making some users less likely than others to see particular ads based on their demographic characteristics. In this talk I will demonstrate that such skewed delivery occurs on Facebook, due to market and financial optimization effects as well as the platform's own predictions about the "relevance" of ads to different groups of users. We observe significant skew in delivery along gender and racial lines for "real" ads for employment and housing opportunities, despite neutral targeting parameters. We also measure skews in the delivery of political ads along inferred political leaning. Our results demonstrate mechanisms that can lead to potentially discriminatory ad delivery, even when advertisers set their targeting parameters to be highly inclusive. This underscores the need for policymakers and platforms to carefully consider the role of the ad delivery optimization run by ad platforms themselves—and not just the targeting choices of advertisers—in preventing discrimination in digital advertising.


Piotr Sapiezynski is an Associate Research Scientist at the Khoury College of Computer Sciences at Northeastern University in Boston, MA. He received a PhD in Network Science/Data Science from the Technical University of Denmark. The core of his work is auditing platforms and their algorithms for fairness and privacy. He investigates systems that are optimized for corporate profit yet drive many aspects of our daily lives. All too often he finds that these systems have (possibly unintended but often predictable) side effects that bring harm to individuals and society.

His findings on Facebook's use of personal data without consent were part of a record $5 billion settlement with the US Federal Trade Commission. The Department of Justice settled with Facebook over issues documented through his research, in the first ever case in which the federal government challenged algorithmic discrimination under the Fair Housing Act. He has briefed Members of the House Financial Services Committee on discriminatory effects in ad delivery, and presented at a public hearing of the European Parliament's Internal Market and Consumer Protection Committee on price differentiation and the negative effects of personalization in political advertising. He was a member of the Public Health, Surveillance, and Human Rights Network in the wake of the COVID-19 pandemic.