Search results

1 – 5 of 5
Content available
Article
Publication date: 1 July 1999

D.M. Hutton

Details

Kybernetes, vol. 28 no. 5
Type: Research Article
ISSN: 0368-492X

Content available
Book part
Publication date: 21 March 2017

Details

Grassroots Leadership and the Arts for Social Change
Type: Book
ISBN: 978-1-78635-687-1

Content available
Book part
Publication date: 15 September 2017

Details

Including a Symposium on the Historical Epistemology of Economics
Type: Book
ISBN: 978-1-78714-537-5

Open Access
Article
Publication date: 31 March 2022

Kun Tracy Wang, Guqiang Luo and Li Yu

Abstract

Purpose

The purpose of this study is to examine whether and how analysts’ foreign ancestral origins affect analysts’ earnings forecasts in particular and, ultimately, firms’ information environment in general.

Design/methodology/approach

By inferring analysts’ countries of ancestry from their surnames, this study empirically examines whether analysts’ ancestral origins affect their earnings forecast errors.
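
The abstract does not describe the mechanics of this inference, so the following is only a minimal sketch of how a surname-based ancestry lookup and a group comparison of forecast errors might be wired together. The `SURNAME_ORIGIN` table, the analyst records and the error values are hypothetical placeholders, not the authors’ actual data or method.

```python
# Hypothetical sketch of surname-based ancestry inference and a simple
# comparison of forecast errors by group. Illustrative only; not the study's code.

from statistics import mean

# Illustrative surname -> ancestral country lookup (a real study would use a
# much larger, validated reference list covering 110+ countries).
SURNAME_ORIGIN = {
    "smith": "USA",
    "nakamura": "Japan",
    "okafor": "Nigeria",
    "garcia": "Spain",
}

def infer_origin(surname: str) -> str:
    """Map an analyst's surname to an inferred country of ancestry."""
    return SURNAME_ORIGIN.get(surname.lower(), "Unknown")

# Toy analyst records: (surname, absolute earnings forecast error).
analysts = [
    ("Smith", 0.8),
    ("Nakamura", 1.4),
    ("Okafor", 1.6),
    ("Garcia", 1.1),
]

# Compare mean forecast errors for US-surname vs foreign-surname analysts.
by_group = {"USA": [], "Foreign": []}
for surname, error in analysts:
    group = "USA" if infer_origin(surname) == "USA" else "Foreign"
    by_group[group].append(error)

for group, errors in by_group.items():
    print(group, round(mean(errors), 2))
```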

Findings

Using novel data on analysts’ foreign ancestral origins from more than 110 countries, this study finds that, relative to analysts with common American surnames, analysts with common foreign surnames tend to have higher earnings forecast errors. The positive relation between analysts’ foreign surnames and earnings forecast errors is more likely to be observed for African-American analysts and analysts whose countries of ancestry are geographically distant from the USA. In contrast, this study finds that when analysts’ foreign countries of ancestry are aligned with those of the CEOs, analysts exhibit lower earnings forecast errors relative to analysts with common American surnames. More importantly, the results show that firms followed by more analysts with foreign surnames tend to exhibit higher earnings forecast errors.

Originality/value

Taken together, the findings of this study are consistent with the conjecture that geographical, social and ethnic proximity between managers and analysts affects firms’ information environment. Therefore, this study contributes to research on the determinants of analysts’ earnings forecast errors and adds to the literature on firms’ information environment.

Details

China Accounting and Finance Review, vol. 24 no. 1
Type: Research Article
ISSN: 1029-807X

Content available
Article
Publication date: 14 March 2023

Paula Hall and Debbie Ellis

Abstract

Purpose

Gender bias in artificial intelligence (AI) should be addressed as a priority before AI algorithms become ubiquitous and perpetuate and accentuate that bias. While the problem has been identified as an established research and policy agenda, a cohesive review of existing research specifically addressing gender bias from a socio-technical viewpoint is lacking. Thus, the purpose of this study is to determine the social causes and consequences of, and proposed solutions to, gender bias in AI algorithms.

Design/methodology/approach

A comprehensive systematic review followed established protocols to ensure accurate and verifiable identification of suitable articles. The process identified 177 articles within the socio-technical framework, 64 of which were selected for in-depth analysis.

Findings

Most previous research has focused on technical rather than social causes of, consequences of and solutions to AI bias. From a social perspective, gender bias in AI algorithms can be attributed equally to algorithmic design and to training datasets. Social consequences are wide-ranging, with amplification of existing bias the most common, at 28%. Social solutions were concentrated on algorithmic design, specifically improving diversity in AI development teams (30%), increasing awareness (23%), human-in-the-loop approaches (23%) and integrating ethics into the design process (21%).

Originality/value

This systematic review is the first of its kind to focus on gender bias in AI algorithms from a social perspective within a socio-technical framework. Identification of key causes and consequences of bias and the breakdown of potential solutions provides direction for future research and policy within the growing field of AI ethics.

Peer review

The peer review history for this article is available at https://publons.com/publon/10.1108/OIR-08-2021-0452

Details

Online Information Review, vol. 47 no. 7
Type: Research Article
ISSN: 1468-4527
