Lambrecht, A. and Tucker, C. E. (2020) Apparent Algorithmic Discrimination and Real-Time Algorithmic Learning. Working Paper. Social Science Research Network.
Abstract
An important concern is that algorithms can inadvertently discriminate against minority groups and reinforce existing inequality. Typically, the worry is that when classification algorithms are trained on a dataset that itself reflects bias, this bias may be reinforced. However, in the world of digital content, many algorithms make judgements in real time to determine what content will be engaging. We revisit the context of a classic study documenting that searches on Google for black names were more likely than searches for white names to return ads highlighting the need for a criminal background check. We document that one explanation for this finding is that if an algorithm receives less data in real time about one group, it will learn at different speeds across groups. Since black names are less common, the algorithm learns about the quality of the underlying ad more slowly, and as a result an ad, including an undesirable ad, is more likely to persist for searches next to black names even if the algorithm judges the ad to be of low quality. We extend this result by presenting evidence that ads targeted towards searches for religious groups persist for longer for religious groups that are less searched for. This suggests that the process of real-time algorithmic learning can lead to differential outcomes between those whose characteristics are more common and those who are rarer in society.
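The mechanism the abstract describes can be sketched in a few lines of Python. This is a minimal illustrative simulation, not the paper's model: it assumes a platform that retires an ad once it has a minimum number of impressions and the observed click-through rate falls below a quality bar (the rates, thresholds, and function names here are our own hypothetical choices). Because a rarer query group generates impressions more slowly, the same low-quality ad survives for more rounds before being dropped.

```python
import random

def rounds_until_dropped(search_rate, true_ctr=0.02, quality_bar=0.05,
                         min_impressions=50, max_rounds=100_000, seed=0):
    """Simulate a platform serving one low-quality ad to one query group.

    search_rate: probability per round that a user from this group searches.
    true_ctr:    the ad's true click-through rate (below quality_bar, so the
                 ad deserves to be retired once enough evidence accumulates).
    The platform drops the ad once it has at least min_impressions and the
    observed CTR is below quality_bar. Returns the round at which the ad
    is dropped (or max_rounds if never).
    """
    rng = random.Random(seed)
    impressions = clicks = 0
    for t in range(1, max_rounds + 1):
        if rng.random() < search_rate:          # a user from this group searches
            impressions += 1
            clicks += rng.random() < true_ctr   # did they click the ad?
            if impressions >= min_impressions and clicks / impressions < quality_bar:
                return t                        # enough evidence: ad retired
    return max_rounds

# Identical ad, identical quality bar; only the volume of searches differs.
common = rounds_until_dropped(search_rate=0.9)  # frequently searched group
rare = rounds_until_dropped(search_rate=0.1)    # rarely searched group
print(common, rare)  # the rarer group sees the low-quality ad persist longer
```

The point of the sketch is that no term in the rule treats the groups differently: the differential outcome arises purely because evidence about ad quality accrues at the rate searches arrive.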
More Details
| Item Type: | Monograph (Working Paper) |
|---|---|
| Subject Areas: | Marketing |
| Date Deposited: | 20 Jan 2021 13:01 |
| Last Modified: | 21 Dec 2024 02:54 |
| URI: | https://lbsresearch.london.edu/id/eprint/1638 |