
Algorithmic discrimination causes less moral outrage than human discrimination.
Author(s) -
Yochanan E. Bigman,
Desman Wilson,
Mads N. Arnestad,
Adam Waytz,
Kurt Gray
Publication year - 2023
Publication title -
Journal of Experimental Psychology: General
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 2.521
H-Index - 161
eISSN - 1939-2222
pISSN - 0096-3445
DOI - 10.1037/xge0001250
Subject(s) - outrage , attribution , prejudice , psychology , praise , social psychology , law
Companies and governments are using algorithms to improve decision-making for hiring, medical treatments, and parole. Algorithms hold promise for overcoming human biases in decision-making, but they frequently make decisions that discriminate. Media coverage suggests that people are morally outraged by algorithmic discrimination, but here we examine whether people are less outraged by algorithmic discrimination than by human discrimination. Eight studies test this algorithmic outrage deficit hypothesis in the context of gender discrimination in hiring practices across diverse participant groups (online samples, a quasi-representative sample, and a sample of tech workers). We find that people are less morally outraged by algorithmic (vs. human) discrimination and are less likely to hold the organization responsible. The algorithmic outrage deficit is driven by reduced attribution of prejudicial motivation to algorithms. Just as algorithms dampen outrage, they also dampen praise: companies enjoy less of a reputational boost when their algorithms (vs. their employees) reduce gender inequality. Our studies also reveal a downstream consequence of the algorithmic outrage deficit: people are less likely to find a company legally liable when the discrimination was caused by an algorithm (vs. a human). We discuss the theoretical and practical implications of these results, including the potential weakening of collective action to address systemic discrimination. (PsycInfo Database Record (c) 2023 APA, all rights reserved.)