Biased Humans, (Un)Biased Algorithms?

Journal of Business Ethics 183 (3):637-652 (2022)

Abstract

Previous research has shown that algorithmic decisions can reflect gender bias. The increasingly widespread use of algorithms in critical decision-making domains (e.g., healthcare or hiring) can thus lead to broad and structural disadvantages for women. However, women often experience bias and discrimination through human decisions and may turn to algorithms in the hope of receiving neutral and objective evaluations. Across three studies (N = 1107), we examine whether women’s receptivity to algorithms is affected by situations in which they believe that their gender identity might disadvantage them in an evaluation process. In Study 1, we establish, in an incentive-compatible online setting, that unemployed women are more likely to choose to have their employment chances evaluated by an algorithm if the alternative is an evaluation by a man rather than a woman. Study 2 generalizes this effect by placing it in a hypothetical hiring context, and Study 3 proposes that relative algorithmic objectivity, i.e., the perceived objectivity of an algorithmic evaluator as compared with a human evaluator, is a driver of women’s preferences for evaluations by algorithms rather than by men. Our work sheds light on how women make sense of algorithms in stereotype-relevant domains and highlights the need to educate those at risk of being adversely affected by algorithmic decisions. Our results have implications for the ethical management of algorithms in evaluation settings. We advocate improving algorithmic literacy so that evaluators and evaluatees (e.g., hiring managers and job applicants) can acquire the abilities required to reflect critically on algorithmic decisions.


Similar books and articles

When AI Is Gender-biased. Galit P. Wellner - 2020 - Humana Mente 13 (37).
Democratizing Algorithmic Fairness. Pak-Hang Wong - 2020 - Philosophy and Technology 33 (2):225-244.
Is Critical Thinking Culturally Biased? Robert H. Ennis - 1998 - Teaching Philosophy 21 (1):15-33.
Complementarity of Behavioral Biases. Toru Suzuki - 2012 - Theory and Decision 72 (3):413-430.
