The purpose of this paper is to propose that tackling the question of bias in algorithms requires a systemic, sociotechnical and holistic perspective. Drawing on the term “algorithmic culture,” the paper postulates the interconnectedness and mutual shaping of society and technology. A sociotechnical approach requires translational work between and across disciplines, and this conceptual paper undertakes such work. It exemplifies how gender and diversity studies, which bring expertise on addressing bias and structural inequalities, provide a crucial resource for analyzing and mitigating bias in algorithmic systems.
After introducing the sociotechnical context, the paper provides an overview of the contemporary discourse around bias in algorithms, debates around algorithmic culture, knowledge production and bias identification, as well as common solutions. The key concepts of gender studies (situated knowledges and strong objectivity) and concrete examples of gender bias then serve as a backdrop for revisiting contemporary debates.
These key concepts reframe the discourse on bias and on concepts such as algorithmic fairness and transparency by contextualizing and situating them. The paper includes specific suggestions for researchers and practitioners on how to account for social inequalities in the design of algorithmic systems.
The paper offers a systemic, gender-informed approach to addressing algorithmic bias and lays out a concrete, applicable methodology toward a situated understanding of such bias, providing an important contribution to an urgent multidisciplinary dialogue.
The authors wish to thank the anonymous reviewers for their suggestions and additional references for this paper.
This paper forms part of the special section on Social and cultural biases in information, algorithms and systems.
Draude, C., Klumbyte, G., Lücking, P. and Treusch, P. (2020), "Situated algorithms: a sociotechnical systemic approach to bias", Online Information Review, Vol. 44 No. 2, pp. 325-342. https://doi.org/10.1108/OIR-10-2018-0332
Copyright © 2019, Emerald Publishing Limited