
Algorithmic equity in the hiring of underrepresented IT job candidates

Lynette Yarger (College of Information Sciences and Technology, Pennsylvania State University, University Park, Pennsylvania, USA)
Fay Cobb Payton (Department of Information Systems and Business Technology, North Carolina State University, Raleigh, North Carolina, USA)
Bikalpa Neupane (College of Information Sciences and Technology, Pennsylvania State University, University Park, Pennsylvania, USA)

Online Information Review

ISSN: 1468-4527

Article publication date: 31 December 2019

Issue publication date: 9 June 2020




Purpose

The purpose of this paper is to offer a critical analysis of talent acquisition software and its potential for fostering equity in the hiring process for underrepresented IT professionals. The underrepresentation of women, African-American and Latinx professionals in the IT workforce is a longstanding issue that both contributes to and is impacted by algorithmic bias.


Design/methodology/approach

Sources of algorithmic bias in talent acquisition software are identified, and feminist design thinking is presented as a theoretical lens for mitigating that bias.


Findings

Data are only one tool for recruiters; human expertise remains necessary. Even well-intentioned algorithms are not neutral and should be audited for morally and legally unacceptable decisions. Feminist design thinking provides a theoretical framework for considering equity in the hiring decisions made by talent acquisition systems and their users.

Social implications

This research suggests that algorithms may codify deep-seated biases, keeping IT work environments as homogeneous as they are today. If bias exists in talent acquisition software, the potential for propagating inequity and harm is especially significant and widespread given the homogeneity of the specialists who create artificial intelligence (AI) systems.


Originality/value

This work uses equity as a central concept for considering algorithmic bias in talent acquisition. Feminist design thinking provides a framework for fostering a richer understanding of what fairness means and for evaluating how AI software might impact marginalized populations.



This paper forms part of the special section on Social and cultural biases in information, algorithms and systems.


Yarger, L., Cobb Payton, F. and Neupane, B. (2020), "Algorithmic equity in the hiring of underrepresented IT job candidates", Online Information Review, Vol. 44 No. 2, pp. 383-395.



Copyright © 2019, Emerald Publishing Limited
