Search results

1 – 3 of 3
Open Access
Article
Publication date: 26 July 2019

Helen Creswick, Liz Dowthwaite, Ansgar Koene, Elvira Perez Vallejos, Virginia Portillo, Monica Cano and Christopher Woodard

Abstract

Purpose

The voices of children and young people have been largely neglected in discussions of the extent to which the internet takes into account their needs and concerns. This paper aims to highlight young people’s lived experiences of being online.

Design/methodology/approach

Results are drawn from the UnBias project's youth-led discussions ("Youth Juries") with young people predominantly aged between 13 and 17 years.

Findings

Whilst the young people are able to exercise their agency online in some circumstances, many often experience feelings of disempowerment and resignation, particularly in relation to the terms and conditions and user agreements that are ubiquitous across digital technologies, social media platforms and other websites.

Practical implications

Although changes are afoot as part of the General Data Protection Regulation (hereinafter the GDPR) to simplify the terms and conditions of online platforms (European Union, 2016), the regulation offers little practical guidance on how these requirements should be implemented for children. The voices and opinions of children and young people are put forward as suggestions for how the "clear communication to data subjects" required in particular by Article 12 of the GDPR should be implemented, including recommendations for making terms and conditions more accessible.

Originality/value

Children and young people are an often overlooked demographic of online users. This paper argues for the importance of this group being involved in any changes that may affect them, by putting forward recommendations from the children and young people themselves.

Details

Journal of Information, Communication and Ethics in Society, vol. 17 no. 2
Type: Research Article
ISSN: 1477-996X

Article
Publication date: 9 April 2019

Helena Webb, Menisha Patel, Michael Rovatsos, Alan Davoust, Sofia Ceppi, Ansgar Koene, Liz Dowthwaite, Virginia Portillo, Marina Jirotka and Monica Cano

Abstract

Purpose

The purpose of this paper is to report on empirical work conducted to open up algorithmic interpretability and transparency. In recent years, significant concerns have arisen regarding the increasing pervasiveness of algorithms and the impact of automated decision-making in our lives. Particularly problematic is the lack of transparency surrounding the development of these algorithmic systems and their use. It is often suggested that to make algorithms more fair, they should be made more transparent, but exactly how this can be achieved remains unclear.

Design/methodology/approach

An empirical study was conducted to begin unpacking issues around algorithmic interpretability and transparency. The study involved discussion-based experiments centred on a limited resource allocation scenario that required participants to select their most and least preferred algorithms in a particular context. In addition to quantitative data about preferences, qualitative data captured participants' expressed reasoning behind their selections.

Findings

Even when provided with the same information about the scenario, participants made different algorithm preference selections and rationalised their selections differently. The study results revealed diversity in participant responses but consistency in the emphasis they placed on normative concerns and the importance of context when accounting for their selections. The issues raised by participants as important to their selections resonate closely with values that have come to the fore in current debates over algorithm prevalence.

Originality/value

This work developed a novel empirical approach that demonstrates the value in pursuing algorithmic interpretability and transparency while also highlighting the complexities surrounding their accomplishment.

Details

Journal of Information, Communication and Ethics in Society, vol. 17 no. 2
Type: Research Article
ISSN: 1477-996X

Content available
Article
Publication date: 4 September 2019

Marty J. Wolf, Alexis M. Elder and Gosia Plotka

Details

Journal of Information, Communication and Ethics in Society, vol. 17 no. 2
Type: Research Article
ISSN: 1477-996X
