“… They don’t really listen to people”: Young people’s concerns and recommendations for improving online experiences

Helen Creswick (Horizon Digital Economy Research Institute, University of Nottingham, Nottingham, UK)
Liz Dowthwaite (Horizon Digital Economy Research Institute, University of Nottingham, Nottingham, UK)
Ansgar Koene (Horizon Digital Economy Research Institute, University of Nottingham, Nottingham, UK)
Elvira Perez Vallejos (NIHR Biomedical Research Centre for Mental Health, University of Nottingham, Nottingham, UK)
Virginia Portillo (Horizon Digital Economy Research Institute, University of Nottingham, Nottingham, UK)
Monica Cano (Horizon Digital Economy Research Institute, University of Nottingham, Nottingham, UK)
Christopher Woodard (Department of Philosophy, University of Nottingham, Nottingham, UK)

Journal of Information, Communication and Ethics in Society

ISSN: 1477-996X

Article publication date: 26 July 2019

Issue publication date: 4 September 2019

Abstract

Purpose

The voices of children and young people have been largely neglected in discussions of the extent to which the internet takes into account their needs and concerns. This paper aims to highlight young people’s lived experiences of being online.

Design/methodology/approach

Results are drawn from the UnBias project’s youth-led discussions, “Youth Juries”, with young people predominantly aged between 13 and 17 years.

Findings

Whilst the young people are able to use their agency online in some circumstances, many often experience feelings of disempowerment and resignation, particularly in relation to the terms and conditions and user agreements that are ubiquitous to digital technologies, social media platforms and other websites.

Practical implications

Although changes are afoot as part of the General Data Protection Regulation (herein the GDPR) to simplify the terms and conditions of online platforms (European Union, 2016), the regulation offers little practical guidance on how these changes should be implemented for children. The voices and opinions of children and young people are put forward as suggestions for how the “clear communication to data subjects” required in particular by Article 12 of the GDPR should be implemented, for example through recommendations about how terms and conditions can be made more accessible.

Originality/value

Children and young people are an often overlooked demographic of online users. This paper argues for the importance of this group being involved in any changes that may affect them, by putting forward recommendations from the children and young people themselves.

Citation

Creswick, H., Dowthwaite, L., Koene, A., Perez Vallejos, E., Portillo, V., Cano, M. and Woodard, C. (2019), "“… They don’t really listen to people”: Young people’s concerns and recommendations for improving online experiences", Journal of Information, Communication and Ethics in Society, Vol. 17 No. 2, pp. 167-182. https://doi.org/10.1108/JICES-11-2018-0090

Publisher: Emerald Publishing Limited

Copyright © 2019, Helen Creswick, Liz Dowthwaite, Ansgar Koene, Elvira Perez Vallejos, Virginia Portillo, Monica Cano and Christopher Woodard.

License

Published by Emerald Publishing Limited. This article is published under the Creative Commons Attribution (CC BY 4.0) licence. Anyone may reproduce, distribute, translate and create derivative works of this article (for both commercial and non-commercial purposes), subject to full attribution to the original publication and authors. The full terms of this licence may be seen at http://creativecommons.org/licences/by/4.0/legalcode


Introduction

A body of scholarly research has emerged that considers how digital technologies and the online world intersect with the experiences of children and young people, revealing many ethical concerns. Research has identified the risks (e.g. of physical harm) that may be posed to children online (Livingstone et al., 2010; Livingstone and Haddon, 2009), as well as the impact of digital technologies on children’s well-being and development (Children’s Commissioner, 2018; Kidron and Rudkin, 2017; McDool et al., 2016; Royal Society for Public Health and Youth Health Movement, 2017). Social media has been shown to have some potentially damaging effects on children’s well-being, including worsening children’s anxiety, disrupting their sleep patterns and heightening their concerns around body image (Royal Society for Public Health and Youth Health Movement, 2017). Moreover, a study by McDool et al. (2016) found that increased time spent on social media was correlated with children having a decreased level of overall satisfaction with their lives. Such findings indicate a concerning lack of ethical consideration by internet providers of their effects on children and young people, leading some to argue that the digital world is not fit for purpose in meeting the needs of these groups (Children’s Commissioner, 2017; Kidron and Rudkin, 2017).

Research addressing the intersection of digital technologies and children has revealed other ethical concerns in relation to their data. Children’s use of the internet has increased rapidly (Frith, 2017), with 2016 marking a substantial shift in children’s behaviour: they now spend more time online than they do watching television (Childwise, 2016). The internet has become an intrinsic part of young people’s lives as they grow up in a digital age; however, young people risk becoming “datafied”, meaning that substantial amounts of information are collected about their lives, posing concerns for their privacy and ability to consent (Lievens and Verdoodt, 2017; Lupton and Williamson, 2017). This datafication of children’s lives has led some scholars to express apprehension that little is being done to tackle the issue: “there remains little evidence that specific instruments to safeguard children’s rights in relation to dataveillance have been developed or implemented, and further attention needs to be paid to these issues” (Lupton and Williamson, 2017, p. 780). Internet providers and platforms have been accused of harbouring a “cavalier” attitude to the rights and needs of children, behaving unethically and failing to account for those needs (Kidron and Rudkin, 2017, p. 4). The digital world has not been designed with children and young people in mind, and this is having a detrimental effect on their safety and well-being.

That said, the intersection of digital technologies and children is attracting growing attention. The importance of internet platforms and websites behaving ethically towards children and young people has become a prominent issue amongst legislators, for example in the Children and the Internet inquiry by the UK House of Lords (Department for Digital, Culture, Media and Sport, 2017), and amongst national (e.g. 5Rights, in the UK) and international (Unicef) organisations that protect the rights of children (Kidron and Rudkin, 2017; Third et al., 2014). It is also slowly drawing the attention of political parties from across the political spectrum. For example, the UK Government is currently in the process of introducing an “Age Appropriate Design Code” to serve as an amendment to the Data Protection Bill in an effort to champion children’s digital rights (Information Commissioner’s Office, 2018).

This paper argues that whilst moves have been made to promote the adoption of a more ethical approach towards children and young people in the digital world across the scholarly, political and policy-making spectrum, considerable work still remains to improve young people’s online experiences. Children play a central role in the UnBias project as they are invited to share their lived experiences of their online activity in Youth Juries, which are sessions designed to promote reflection, discussion and debate, and which are discussed in more detail below. By drawing attention to lived experiences, this paper uncovers important findings related to feelings of disempowerment and resignation. Previous research has discussed children’s sense of disempowerment in relation to the impact of “sharenting”, which refers to when parents post pictures and videos of their children on social media, without necessarily having asked for their child’s consent (Children’s Commissioner, 2018). This paper explores other ways that children may feel disempowered by their internet experiences, focussing in particular on their experiences of online terms and conditions. It should be noted that in this paper the term “online terms and conditions” is used as an overarching term that also includes “terms of use”, which “[…] establish the rights and obligations that will govern an agreement between two parties” (Wauters et al., 2014, p. 12), in accord with how the young people themselves talk about and understand such agreements.

The use of plain and transparent language in official, legal and technical documents aimed at all sectors of society has been campaigned for over a number of years, including through several calls for evidence around this topic from the UK government (Koene, 2016a, 2016b). The use of plain and intelligible language also forms part of the Consumer Rights Act 2015 (Legislation.gov.uk, 2015), for example in §64 and §68 (Conklin et al., 2019). Furthermore, a growing body of literature discusses how online terms and conditions and click-through agreements are inadequate for ensuring that users are fully informed, are rarely read by users, and are often inaccessible even to adults (Elshout et al., 2016; Plaut and Bartlett, 2012; Wauters et al., 2014). Online terms and conditions also often include clauses that are mostly irrelevant to the average user; for example, a clause in Apple’s Mac App Store states that a customer is forbidden from downloading a book for the purpose of developing nuclear weapons. Such clauses “[…] add to the length and complexity of T&Cs and thus to the unattractiveness for consumers to reading them” (Elshout et al., 2016, p. 15). Moreover, research has highlighted adults’ recommendations that online user agreements should be revised to be shorter and clearer, to increase readability and comprehension, and to increase the overall levels of trust that users have in their fairness (Elshout et al., 2016; Plaut and Bartlett, 2012).

There is a need to go beyond this literature and also take into account the opinions of children and young people, since they have been born into a digital age and spend increasing amounts of time online (Coleman et al., 2017; Perez Vallejos et al., 2015). The Children’s Commissioner asked a law firm (Schillings) to rewrite Instagram’s terms and conditions to be more user-friendly, and children were then able to understand the agreement with ease (Children’s Commissioner, 2017). Thus, the significance of changing online terms and conditions should not be underestimated.

Furthermore, it is encouraging that some efforts are being made to improve the user experience of documentation, including online terms and conditions, through the General Data Protection Regulation (GDPR) (European Union, 2016), which came into force in May 2018. Article 12 states that information relating to the processing of data should be clearly communicated “[…] in a concise, transparent, intelligible and easily accessible form, using clear and plain language, in particular for any information addressed specifically to a child” (European Union, 2016). Recital 58 of the GDPR acknowledges the importance of safeguarding children on the internet by ensuring that any information for children is communicated in “[…] a clear and plain language that the child can easily understand” (European Union, 2016). However, the GDPR “offers little clarity as to the actual implementation and impact of a number of provisions that may significantly affect children and their rights, leading to a legal uncertainty for data controllers, parents and children” (Lievens and Verdoodt, 2017, p. 1). Whilst there are multiple areas of uncertainty in relation to guarding the rights of children in the GDPR overall (see Lievens and Verdoodt, 2017, for further information), moves are being made to add clarity through the “Age Appropriate Design Code” (Information Commissioner’s Office, 2018). This paper also builds upon Article 12 of the GDPR in relation to young people’s experiences of, and recommendations to improve, online terms and conditions.

This paper serves as a timely reminder to social media platforms, policymakers and others about the need to ensure that the changes in the stipulations of the GDPR (European Union, 2016) are implemented in a way that is accessible to young people. Moreover, this paper argues for the importance of involving young people in these discussions.

These claims reflect the ethical importance of considerations of well-being and autonomy. Here, “well-being” is understood as the value of a person’s life for that person, or how well their life goes for them. This is a broad conception of well-being, whose importance is recognised by nearly all ethical traditions, not just utilitarianism (on the concept of well-being, see Bradley, 2015). Accessibility matters to young people’s well-being because it mediates their access to and experience of online services, which are deeply embedded in many aspects of life. “Autonomy”, on the other hand, is the idea of self-governance: of an agent who makes their own decisions and is capable of implementing them (for a review of some different conceptions of autonomy, see Mackenzie, 2014, pp. 16-19). Plausibly, then, autonomous use of online services requires some understanding of their terms and conditions.

This suggests that, to protect and promote young people’s well-being and to enable them to access online services autonomously, the terms and conditions of those services must be comprehensible and accessible to them. Why not go further and claim that, in addition, young people’s voices should feed into the process of ensuring this comprehensibility and accessibility? A first, straightforward but powerful answer is that, in practice, co-production is the best way to achieve this goal (Filipe et al., 2017), as it ensures that young people play a significant part in the research process. It is all too easy to jump to false conclusions about what is comprehensible and accessible to others; if these goals are to be taken seriously, the relevant audience should be consulted. A second answer is that failure to consult may constitute a specific form of injustice. If failure to consult reflects a presumption that young people have nothing worthwhile to contribute to discussions in this area, then they may be subject to testimonial injustice, in which someone’s views are given less credibility than they should receive, due to prejudice (Fricker, 2007). Co-production is not without its challenges, however: some cast doubt on the effectiveness of such methods, believing that it may not lead to an authentic representation of the voices of children, or that it may lead to tokenism, whereby young people’s voices are channelled into pre-existing adult-driven agendas (James, 2009; Todd, 2012).

Nevertheless, there are strong ethical reasons to want to make user agreements accessible to young people, and to want to involve them in this process. Considerations of well-being and autonomy both speak in favour of these things. These reasons do not depend on the precise conception of “well-being” or “autonomy”, or the exact way in which these considerations feature in any specific ethical theory. Access to online services is such a fundamental part of contemporary life that any plausible way of protecting and promoting well-being, and of respecting autonomy, will require that terms and conditions and user agreements are accessible to young people. Moreover, including young people’s voices in this process is a necessary means of achieving this goal, as well as, perhaps, a requirement of justice.

Coleman et al. (2017) postulate that whilst considerable and important attention has gravitated towards introducing measures to safeguard and protect children on the internet, “[…] this should not be allowed to overwhelm other questions about the kind of Internet that children and young people want” (Coleman et al., 2017, p. 9). It is imperative that research also addresses the latter by consulting children and seeking their recommendations for improving their internet experiences. The changes in the GDPR (European Union, 2016) offer an opportunity for social media providers and others to take note of the recommendations of children and young people, allowing them to be involved in the production of such agreements, with a view to improving the online experiences of this group.

Method

This paper draws on data from the UnBias project’s first wave of Youth Juries, which took place in February 2017. A total of fourteen two-hour juries were conducted, including 140 young people aged between 12 and 18, plus two 19-year-olds, one 20-year-old and one 23-year-old; this gave 144 participants with a mean age of 15 years. The Youth Jury method is highly regarded as an effective way to involve young people in a discussion that is youth-led and that centres on discussion and reflection. Participants are encouraged to share experiences, discuss and debate issues, form and change opinions, and ultimately put forward recommendations for how an issue may be solved. The method is designed to elicit the thoughts, ideas and recommendations of young people about their experiences of the internet, through facilitated group discussions around specific scenarios that were co-created with age-matched peers (Coleman et al., 2017; Perez Vallejos et al., 2015). The Youth Jury method enables the tackling of challenges related to the online world by engaging young people with the problems they may encounter, and by encouraging them to reflect on their online behaviours.

The overall purpose of the Youth Juries was to bring the voices of young people to the fore, by understanding how they use the internet and gathering a sense of their experiences online. The juries set about eliciting the views of young people by using a facilitator to stimulate discussion on different aspects of algorithmic decision-making processes. The content of the juries is dynamic and changes in response to the knowledge and experience of the young people attending, although a common format ran throughout all sessions. Focussing on youth-led discussion and debate, the facilitator used slides to introduce the concept of algorithms and how they affect the online world, across three main themes: personalisation through algorithms, including consideration of filter bubbles and echo chambers; “gaming the system”, as reported for autocomplete, search results and fake news; and algorithm transparency and regulation. In general, the topic was introduced and a couple of examples given; the jurors were encouraged to share their opinions and experiences of the issues, and were then asked how they would like to see things change or how they would like the issues to be tackled. Additional prompts were used as needed. The audio from the Youth Juries was recorded with the permission of participants, transcribed by a University-approved, GDPR-compliant external company, and then thematically analysed. A single researcher used an inductive approach to code all of the transcripts, before grouping the codes into themes using NVivo. Three other members of the research team independently coded a random selection of the transcripts. The results were validated collectively as a team, and any discrepancies were discussed and reconciled. Key themes were then identified from this analysis.

Results and discussion

Many scholars have identified the paradigm shift within social research that now views children as social actors and agents in their own right (James, 2009; Porter et al., 2012; Todd, 2012). Holt (2011) argues that scholars focussing on the importance of emphasising children’s agency have helped to counteract prevalent views suggesting that children lack agency. This paper illustrates the importance of recognising children’s agency, although it acknowledges the methodological and theoretical challenges of adopting this viewpoint (see Todd, 2012, for a detailed discussion). It is important to note that the agency of children may differ from that of adults; children now grow up in a digital age, which often means that they come under considerable peer pressure to use social media frequently, and fear social exclusion if they do not take part (Children’s Commissioner, 2018). The following results highlight the multiple ways in which jurors were able to use their agency online, before exploring how the agency of young people may be constrained by the structures that govern the online world. Discussions are framed in relation to terms and conditions, as these were identified by the jurors as critically in need of change.

Agency and its constraints

Some jurors demonstrated their use of agency by identifying various strategies to help them to assess the validity of the information that they read online: “If I saw it on Facebook and I don’t know if it was true, I would search it up on Google to see if there’s any more about it […]” Whilst some will search for other similar reports online, others will pay close attention to the source of the online news, for example one juror pointed out that they would trust BBC Bitesize “because that’s made for schools so you can normally trust it.” Many jurors expressed how they used their agency by weighing up in their own minds whether something appeared to be credible: “my first filter, like personal filter when looking at something would definitely just be common sense”, with one juror postulating that fake news was “quite obvious” to identify. Others were able to look for visual clues to help them to decide whether something was credible or not. It is encouraging that young people are able to undertake these activities, as the UK curriculum does not currently teach children how to identify fake news (Children’s Commissioner, 2017).

However, not all young people will necessarily engage in such verification practices, or indeed arrive at a conclusion that is credible. Indeed, one juror pointed out that although they were able to rely on their prior knowledge to determine whether something that they read online is true or not, they acknowledged that they may not necessarily be able to adopt the same approach successfully when it came to information that was less well known: “but then what about all the other little stupid things?”

The Youth Juries also revealed that young people often displayed agency in finding ways to counteract the power of an algorithm: “I can change my IP address and use other browsers so that I can trick the algorithm.” Others pointed to alternative strategies, for example using applications to block advertisements: “what I usually do to prevent all the sponsored stuff is like turn on AdBlock”, or by turning away from applications altogether if they are bothersome, as one juror pointed out that they did this because they had “favorited a bunch of things and it kept recommending things I didn’t like and it was really annoying”. Whilst it is encouraging that some young people are able to readily utilise their agency to try to influence the behaviour or the effect of an algorithm, their strategies also indicate that many of the young people felt overwhelmed and annoyed by some of the features of algorithms. For example, many of the young people complained about being exposed to relentless advertising online, “whenever I play one game after each level ad, ad, ad. It doesn’t stop”, and felt that their countermeasures appeared to be ineffective: “well, I just get annoyed, because yesterday I was on Facebook and it was just coming up with a lot of ads like ad after ad, so I just decided to block all of them but that didn’t stop it.” Thus, whilst on one level, many of the young people actively apply methods to soften the sometimes burdensome effect of targeting by algorithms, when probed further, some responses pointed to their agency being constrained by the powerful nature of the algorithm.

During the Youth Juries, many of the young people showed a strong awareness of the importance of protecting their personal data: “I wouldn’t let anything go on social media that I wouldn’t want anyone to see.” Some expressed ways in which they were able to do this, for example, in assuming an online persona: “I generally don’t use my actual name if I do so it would be pretty hard to find me”, or in closely guarding what personal data they choose to share with other social media platforms:

I only give my name and my email address, because that’s the only information they really need and I don’t want them having my private information because then they can do stuff with that, but they can’t really do much with my name.

Despite results showing that young people actively demonstrate their agency in a digital context, in verifying information, counteracting annoying content, and protecting their personal data, there were also many ways that the young people expressed feelings of disempowerment in their online experiences. This is not to argue that young people cannot be social actors with agency, but that the nature of the online world, which has so often disregarded the needs of children, constrains their agency (Children’s Commissioner, 2017). Expressions of disempowerment permeated the discussions amongst young people, as many alluded to a disparity in power between the platform and the individual:

[…] and it’s just scary how much information they have about you, just like you sharing some of your information, because that shouldn’t be how it is, because you should be able to do your own stuff.

Jurors expressed how they “[…] can’t really talk to the creator of the website because they don’t listen to people”, indicating that some feel that they are overlooked and silenced by a digital world that does not take account of their needs and opinions.

Many of the young people recognised the benefits of algorithms in being able to personalise online searches to the users’ interests (“I like it because it’s quicker!”); however, other jurors raised concerns in relation to feeling disempowered:

It feels like, I know this sounds a bit extreme but they’re like taking away your human rights. Because it’s like you have a right to give information and keep information to yourself as well. So they’ve sort of like got rid of that line between having a choice and not a choice.

Disempowerment as a result of terms and conditions

The young people’s feelings of being excluded and disempowered by their online experiences were particularly exacerbated by their experiences of online user agreements as specified in the form of terms and conditions. This part of the paper will focus on the lived experiences of young people by identifying the current problems that they have with terms and conditions. Focussing on this particular concern allows a timely response to be given to the changes suggested by the GDPR (European Union, 2016) by putting forward the recommendations of young people as to how to improve online terms and conditions.

The jurors overwhelmingly believed that online terms and conditions were not accessible: “it is Standard English but it’s not written in a way that you can easily understand it.” One juror believed that terms and conditions were particularly exclusionary towards young people: “especially for people our age, they definitely don’t target it towards us.” Such findings support existing studies, which have found that Instagram’s terms and conditions are written in such a complex way that they are only accessible to those with a postgraduate qualification (Children’s Commissioner, 2017). Some jurors raised concerns that online terms and conditions were not transparent and that users remained wholly unaware of what happens to their information when signing up to applications: “them selling off your data to external companies, not actually being consented at all. They should make that more obvious that they’re going to do that.” A general unawareness that their data is being sold to third-party companies has also been found amongst adults (doteveryone, 2018a, 2018b). However, the sale of data specifically contributes to the “dataveillance” of children, both through online technologies and media such as educational software (Lupton and Williamson, 2017), exacerbating an already serious problem for young people; the experiences and views of this vulnerable group in response to this knowledge are therefore also important to consider.
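
Readability claims of this kind typically rest on standard readability formulas, which map average sentence length and word length onto a school grade level. Purely as an illustrative sketch (this is not part of the original study, the clause below is hypothetical, and the syllable counter is deliberately crude), a Flesch-Kincaid grade estimate of a terms-and-conditions clause might look as follows:

```typescript
// Illustrative sketch: estimating the Flesch-Kincaid grade level of a
// terms-and-conditions excerpt. The syllable counter simply counts vowel
// groups, so results are rough approximations only.

function countSyllables(word: string): number {
  const groups = word.toLowerCase().match(/[aeiouy]+/g);
  return Math.max(1, groups ? groups.length : 0);
}

function fleschKincaidGrade(text: string): number {
  const sentences = text.split(/[.!?]+/).filter(s => s.trim().length > 0);
  const words = text.match(/[A-Za-z']+/g) ?? [];
  const syllables = words.reduce((sum, w) => sum + countSyllables(w), 0);
  // Flesch-Kincaid grade = 0.39*(words/sentences) + 11.8*(syllables/words) - 15.59
  return 0.39 * (words.length / sentences.length)
       + 11.8 * (syllables / words.length)
       - 15.59;
}

// A hypothetical clause written in the style of a typical user agreement.
const clause =
  "The licensee irrevocably consents to the collection, processing and " +
  "onward transfer of personal data to affiliated third parties.";
console.log(fleschKincaidGrade(clause).toFixed(1)); // roughly 16: degree-level reading
```

Long, legalistic sentences push such scores towards university-level grades, which is consistent with the Children’s Commissioner (2017) finding cited above.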

Other jurors believed that internet platforms and companies deliberately obfuscate their terms and conditions to ensure that individuals sign up, prioritising their profits over any moral or ethical obligation to ensure that users fully understand and consent to how they operate: “companies do it on purpose, they blatantly make them as confusing as possible so people don’t take them up on it.” This is a concerning indication that internet platforms are attracting young people to engage in an environment that contravenes numerous articles of the UN Convention on the Rights of the Child (e.g. Article 42 “knowledge of rights”, Unicef, 2010), and UK guidelines for online child safety (UK Council for Child Internet Safety, 2015).

For some, this has fostered a sense of resignation; for example, many jurors said that they accepted the terms and conditions because they wished to use the internet platform, despite believing that they were neither accessible nor transparent: “I might read the bottom bit but most of the time I don’t understand it […] so I just tick it anyway.”

Terms of use are, more often than not, created by the service providers without any consultation with the user, and many young people disliked the fact that they had little alternative but to sign them (Plaut and Bartlett, 2012; Wauters et al., 2014). One juror had attempted to use a website without accepting the terms and conditions; however, they eventually relented and signed when the site did not allow them to continue their activity:

I thought that because I tried it once on some random website and I scrolled down and I said no I do not agree and then it wouldn’t let me click the signup account like signup button until I actually said yes I agree.

This is an example of the agency of users, including children, being restricted by the current structures of digital technologies. Moreover, the social media platform may hold dominance within the market, and there may be few alternatives that offer a similar service for the user to choose instead (Wauters et al., 2014). Thus, young people may find themselves stuck between a rock and a hard place: they are denied any meaningful understanding of online terms and conditions, but have little choice except to accept them anyway, to avoid the exclusion of not being able to use their chosen website. Existing research points out that young people often feel immense pressure from their peers to join particular social media platforms and to always be available on them, which may further limit their ability to resist agreeing to online terms and conditions (Children’s Commissioner, 2018).

A minority but important viewpoint expressed during the Youth Juries was that often people did not actually wish to know the terms and conditions, because they will use the website or online platform regardless of what the terms say:

I think the problem is people don’t want to know because if they’re using those sites, you’re using them and your friends are using them so you’re probably going to use them anyway. But the fact that you don’t know what they’re using them for is a bit worrying but you’re not going to find out, because you just don’t want to know.

Thus, this paper argues that the digital world has been complacent in failing to offer any form of meaningful transparency to its users; this is unethical, as it has fuelled a sense of exclusion amongst many young people and adults alike. Whilst efforts are under way to ensure that terms and conditions are accessible to all, through the changes in the GDPR (European Union, 2016), it is not yet clear how this will be implemented. Wauters et al. (2014) have already identified the importance of considering the needs of the user as a basis for any agreement.

Young people’s recommendations

This paper argues that social media platforms must take into account the experiences and views of young people, and that changes to terms and conditions should be co-produced with the young people themselves. Whilst recognising the challenges and limitations of co-production (James, 2009; Porter et al., 2012; Todd, 2012), this paper argues that it is imperative that young people are consulted in how social media platforms and others communicate their terms and conditions in their user agreements. Young people across all the Youth Juries overwhelmingly agreed that online terms and conditions needed to be improved. As part of the Youth Juries, the voices of the young people were brought to the fore by not only asking them about their experiences of existing online terms and conditions, but also by asking them for their recommendations for how terms and conditions may be improved, particularly for their age group.

As a result, the jurors made many recommendations about how to improve the accessibility and transparency of online platforms, including their terms and conditions, to ensure that meaningful informed consent is achieved: “[…] it should be made fair so that the public can understand at a general level what’s going on when they’re agreeing to the terms and conditions,” and to counteract the sense of powerlessness that many of the young people alluded to when “consenting” to existing terms and conditions of online services. Such findings support the work of Eslami et al. (2015) who recognise the importance of transparency in building overall levels of trust amongst all users of social media.

The next part of this paper considers how changes to the layout, content and overall structure or format of online terms and conditions may improve their accessibility, not just for young people but for all internet users. The young people believed that terms and conditions should be made more accessible “to average people”, and in particular to their age group. One juror summarised their concerns about the dangers of young people not being properly informed about the terms and conditions:

[…] it should be like a lot clearer because there’s so many much younger children using things like Instagram and they might not understand that their photo is being posted like so many different places […]. So I feel as though they should make it, so like a more concise or just like easier version so that people do understand.

There was much debate around how existing terms and conditions could be changed to make them more user-friendly. Many suggested that bullet points would be a useful way of communicating the terms and conditions; however, there was some disagreement as to how many would suffice: “ten bullet points and quick to the point, no long thing”. Others believed that there should be fewer bullet points: “if there were […] three bullet points then I’d probably read them but otherwise I can’t really see myself reading them”. Even in the absence of a consensus, it is clear that the young people wanted online terms and conditions to be presented in a clearer format, ideally with the use of bullet points.

The jurors also made many suggestions in terms of the type of language that should be used in the user agreements of social media platforms. Agreements are often filled with “legalese”: difficult and complex legal terminology that excludes users and deters them from reading such agreements (Milne and Culnan, 2004; Wauters et al., 2014). The jurors believed that the language of user agreements should be simple, and that the agreements should be as short as possible. In addition, the young people spoke of their recommendations for the content of terms and conditions: “I’d like to know how long they keep your data for.” It is disappointing that existing terms and conditions already state this information, yet are so inaccessible that young people (and indeed, many adults) are not aware of it. Moreover, a primary concern for many jurors was for social media platforms to have greater transparency about where and how their personal data is shared with third parties and others: “a bad thing how they don’t directly tell you they’re going to sell information to make money”. There was some disparity between the youth jurors with regard to the extent of their concerns; however, even amongst those who appeared to be less concerned, many wanted more transparency in relation to where their data is sold:

If they’re going to sell it I’m not really bothered about that but I would be concerned about, I’d maybe like to know where it went or who was going to use it.

To many of the young people, the sharing and selling of their information to third parties by social media platforms was a revelation. Moreover, many expressed a great unease that the social media platforms were able to do this (“that’s really creepy”), especially if their location data was revealed: “yeah, like location data, you don’t want that being sold because that’s sensitive and there’s a lot of users that are under the age that Facebook says.” As this appeared to be a primary concern amongst many jurors, this paper argues for the importance of making this clear and transparent in the terms and conditions.

As well as changes being made to existing terms and conditions, some jurors suggested other innovative ways in which this information might be communicated after the signing of such agreements. For example, one juror explained that the user should receive an email after the social media platform sells their data, which should provide details of where the data had been sold to and give the user the opportunity to contact the company if they do not wish their data to be used. The juror believed that this would not only increase transparency between the platform and the user and empower the user by giving them the opportunity to withdraw their data from third parties, but also that the process itself would be quite time-consuming, which might deter some users and therefore benefit the company:

[…] so it’s a long process, so people are less likely to do it so they still make money but still do it as an option, so people can still opt out, but it’s just a bit more difficult.
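
Purely as an illustrative sketch of this juror’s suggestion (no platform is known to implement it, and all names and types below are hypothetical), the proposed notification might be modelled as follows:

```typescript
// Illustrative sketch of the juror's proposal: after a platform sells user
// data, the user is emailed details of the sale and an opt-out route.
// All types, field names and the sendEmail callback are hypothetical.

interface DataSaleNotice {
  userEmail: string;
  buyer: string;            // the third party the data was sold to
  dataCategories: string[]; // e.g. ["email address", "browsing history"]
  soldAt: Date;
  optOutUrl: string;        // where the user can ask for their data to be withdrawn
}

function notifyUserOfSale(
  notice: DataSaleNotice,
  sendEmail: (to: string, body: string) => void
): void {
  const body =
    `On ${notice.soldAt.toDateString()} we shared the following data with ` +
    `${notice.buyer}: ${notice.dataCategories.join(", ")}. ` +
    `If you do not wish ${notice.buyer} to use this data, visit ${notice.optOutUrl}.`;
  sendEmail(notice.userEmail, body);
}
```

The deliberate friction the juror describes would then sit in the opt-out route itself, which, as they note, benefits the company while still leaving the user a genuine choice.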

The jurors also came up with some interesting suggestions for how the design of engagement between the user and the social media platform might be rethought. Some jurors felt that the terms and conditions could be made more “user-friendly” by using pictures or videos to convey their messages. Haapio et al. (2012) discuss the opportunities presented by using visuals in contracts, as they may allow complex information to be communicated more simply. However, using visuals is not without its challenges: they must suitably fit the content of the agreement so that it is understood properly, and they must be age-appropriate (Haapio et al., 2012; Wauters et al., 2014). Moreover, such methods may not account for the heterogeneity of the audience, which may include children with multiple and diverse learning needs (Lievens and Verdoodt, 2017).

Some youth jurors also explained that they wanted to be able to personalise terms and conditions so that they were able to opt in and out of different parts of the agreement, particularly in relation to what personal data is shared and who it is shared with:

They could add check boxes so it’s like will you let us sell the data without consent and you can either tick it or not tick.

Given the need for companies to make money, personalised terms and conditions that allow users to decide if their data is sold on to third-party companies may seem optimistic. However, it is vital that the voices of children and young people are considered, and that platforms and others recognise and work with these recommendations to make them practicable.
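
To make the idea concrete, the following is a minimal sketch (with hypothetical permission names, not drawn from any existing platform) of “personalised” terms modelled as separate opt-in choices rather than a single all-or-nothing agreement:

```typescript
// Illustrative sketch of "personalised" terms and conditions: each data use
// is a separate opt-in checkbox rather than one blanket agreement.
// The permission names are hypothetical.

interface ConsentChoices {
  shareWithThirdParties: boolean; // the jurors' "will you let us sell the data" box
  personalisedAdverts: boolean;
  shareLocationData: boolean;
}

// Default everything to off: the user must actively tick each box.
const defaultChoices: ConsentChoices = {
  shareWithThirdParties: false,
  personalisedAdverts: false,
  shareLocationData: false,
};

// The platform would then check the relevant flag before each use of the data.
function maySellData(choices: ConsentChoices): boolean {
  return choices.shareWithThirdParties;
}
```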

Whilst the jurors agreed that the terms and conditions should be more transparent, some also believed that it should be made harder for the user to simply agree without reading them. One juror recommended that there should be a minimum amount of time that the user should have to stay on the page: “I think they should force you to stay on that page for a certain amount of time.” Others suggested that the terms and conditions should be made so that the user does not have to click on a separate page to read them:

[…] and to have it somewhere where you’re kind of forced to read it. Like a lot of places you’ve got to click on to a separate page to read them. It’s like nobody’s that bothered.

Indeed, a report by the European Commission (Elshout et al., 2016) found that internet users are considerably more likely to read terms and conditions when they are available on the same internet page.
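
Purely as an illustrative sketch of these two suggestions combined (inline terms and a minimum reading time) in browser code, with hypothetical element ids and an assumed 30-second threshold:

```typescript
// Illustrative sketch of the jurors' suggestions: show the terms on the same
// page and keep the "I agree" button disabled until a minimum time has passed.
// Element ids and the 30-second threshold are hypothetical.

const MIN_READING_TIME_MS = 30_000;

const agreeButton = document.getElementById("agree") as HTMLButtonElement;
const termsPanel = document.getElementById("terms-inline") as HTMLElement;

termsPanel.hidden = false;  // terms shown inline, not behind a separate page
agreeButton.disabled = true;

setTimeout(() => {
  agreeButton.disabled = false; // the user can only agree after the delay
}, MIN_READING_TIME_MS);
```

Whether such friction genuinely increases reading, rather than merely delaying an unread click, is of course an open empirical question.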

Conclusions

The voices of young people have been systematically ignored by digital technology companies, despite young people being pivotal users of the digital world. This paper has put forward the opinions and experiences of young people to convey their feelings of powerlessness and resignation. Some young people showed digital agency in their ability to counteract the effect and impact of algorithms on their internet searches, to verify information, and to stay safe online by closely guarding particular data, especially related to their location. However, the agency of young people is often constrained by the systems and structures that currently govern the way the internet works. In particular, online terms and conditions were identified by the young people as a point of frustration and resigned inevitability. Whilst adults too may have similar experiences with terms and conditions, this paper argues that children and young people are more likely to feel pressure to sign up to platforms, and their voices are often forgotten. Their current experiences of online terms and conditions are evidence of this, and signify that such agreements are little understood.

In this paper, it has been argued that young people should play an instrumental role in shaping the digital world. It is vital that solutions are put forward that promote and cultivate agency amongst young people. Through the Youth Jury methodology, the paper has reiterated the value of eliciting the views of young people and encouraging them to make recommendations for how the internet could be improved. The paper has also used young people’s recommendations to provide guidance to social media platforms and others on how to implement the stipulations of Article 12 of the GDPR (European Union, 2016) effectively, and in particular how to ensure that young people are listened to and involved in co-producing such changes. The findings from the Youth Juries reveal that whilst young people actively use their agency when online, their sense of agency is constrained by the way that the internet is currently governed.

It is important to involve young people in the production of the terms and conditions for social media platforms and similar sites. This paper has presented some of the jurors’ recommendations to help to remedy the young people’s sense of exclusion, and to contribute to fostering a more ethical and empowering internet for all. Whilst it focussed on the views of young people, for the reasons detailed above, it is important to note that adults are also affected by unclear terms and conditions, so the proposed solutions are not applicable only to young people. These recommendations are of value to all stakeholders, as solutions that improve the transparency and clarity of such contracts for young people are also likely to make them more usable by others. A summary of suggestions as to how to improve the accessibility and transparency of online terms and conditions across all sections of the population is below:

  • use simple language;

  • use bullet points to convey the information;

  • summarise the key information and make explicit any information relating to the sharing and selling of data to third parties; and

  • offer “personalised” terms and conditions, which allow users to opt in and out of different parts of the agreement.

Both industry and the government must recognise the importance of the opinions of children, who are often overlooked despite being a key demographic for the online world. They must take responsibility for adapting the digital world to inform, protect and empower young people in ways that are meaningful, so that they may become informed citizens who are aware of their digital rights. Doing so forms a large part of making the digital world safer and more accessible for all.

References

Bradley, B. (2015), Well-Being, Polity Press, Cambridge.

Children’s Commissioner (2017), “Growing up digital. A report of the growing up digital taskforce”, available at: www.childrenscommissioner.gov.uk/wp-content/uploads/2017/06/Growing-Up-Digital-Taskforce-Report-January-2017_0.pdf (accessed 21 January 2018).

Children’s Commissioner (2018), “Life in ‘likes’: Children’s Commissioner report into social media use amongst 8-12 year olds”, available at: www.childrenscommissioner.gov.uk/wp-content/uploads/2018/01/Childrens-Commissioner-for-England-Life-in-Likes-3.pdf (accessed 5 January 2018).

Childwise (2016), “Childhood 2016”, available at: www.childwise.co.uk/uploads/3/1/6/5/31656353/childwise_press_release_-_monitor_2016.pdf (accessed 23 January 2018).

Coleman, S., Pothong, K., Perez Vallejos, E. and Koene, A. (2017), “The internet on our own terms: How children and young people deliberated about their digital rights”, available at: http://casma.wp.horizon.ac.uk/wp-content/uploads/2016/08/Internet-On-Our-Own-Terms.pdf (accessed 4 January 2018).

Conklin, K., Hyde, R. and Parente, F. (2019), “Assessing plain and intelligible language in the consumer rights act: a role for reading scores?”, Legal Studies, pp. 1-20.

Department for Digital, Culture, Media and Sport (2017), “House of Lords Communications Select Committee report on growing up with the internet: government response”, London, available at: www.parliament.uk/documents/lords-committees/communications/children-internet/governmentresponsegrowingupwiththeinternet.pdf (accessed 14 May 2018).

doteveryone (2018a), “People, power and technology: the 2018 digital attitudes report”, available at: http://attitudes.doteveryone.org.uk/files/People%20Power%20and%20Technology%20Doteveryone%20Digital%20Attitudes%20Report%202018.pdf (accessed 14 May 2018).

doteveryone (2018b), “People, power and technology: the 2018 digital understanding report”, available at: http://understanding.doteveryone.org.uk/files/Doteveryone_PeoplePowerTechDigitalUnderstanding2018.pdf (accessed 14 May 2018).

Elshout, M., Elsen, M., Leenheer, J., Loos, M. and Luzak, J. (2016), “Study on consumers’ attitudes towards terms and conditions (T&Cs)”, available at: http://ec.europa.eu/consumers/consumer_evidence/behavioural_research/docs/terms_and_conditions_final_report_en.pdf (accessed 22 January 2018).

Eslami, M., Rickman, A., Vaccaro, K., Aleyasen, A., Vuong, A., Karahalios, K., Hamilton, K., et al. (2015), “‘I always assumed that I wasn’t really that close to [her]’: reasoning about invisible algorithms in news feeds”, Proceedings of the 33rd Annual ACM Conference on Human Factors in Computing Systems, ACM Press, pp. 153-162.

European Union (2016), “Regulation (EU) 2016/679 of the European Parliament and of the Council of 27 April 2016 on the protection of natural persons with regard to the processing of personal data and on the free movement of such data, and repealing Directive 95/46/EC (General Data Protection Regulation)”, available at: http://eur-lex.europa.eu/legal-content/EN/TXT/?uri=OJ%3AL%3A2016%3A119%3ATOC (accessed 6 February 2018).

Filipe, A., Renedo, A. and Marston, C. (2017), “The co-production of what? Knowledge, values, and social relations in health care”, PLOS Biology, Vol. 15 No. 5, e2001403.

Fricker, M. (2007), Epistemic Injustice: Power and the Ethics of Knowing, Oxford University Press, Oxford.

Frith, E. (2017), “Social media and children’s mental health: a review of the evidence”, available at: https://epi.org.uk/wp-content/uploads/2017/06/Social-Media_Mental-Health_EPI-Report.pdf (accessed 23 January 2018).

Haapio, H., Berger-Walliser, G., Walliser, B. and Rekola, K. (2012), “Time for a visual turn in contracting?”, Journal of Contract Management, Vol. 10, pp. 49-57.

Holt, L. (2011), “Introduction: geographies of children, youth and families: disentangling the socio-spatial contexts of young people across the globalised world”, Geographies of Children, Youth and Families: An International Perspective, Routledge, London, pp. 1-8.

Information Commissioner’s Office (2018), “Call for evidence – age appropriate design code”, available at: https://ico.org.uk/about-the-ico/ico-and-stakeholder-consultations/call-for-evidence-age-appropriate-design-code/ (accessed 25 October 2018).

James, A. (2009), “Agency”, in Qvortrup, J., Corsaro, W.A. and Honig, M.-S. (Eds), The Palgrave Handbook of Childhood Studies, Palgrave Macmillan, Hampshire, pp. 34-46.

Kidron, B. and Rudkin, A. (2017), “Digital childhood. Addressing childhood development milestones in the digital environment”, available at: http://5rightsframework.com/static/Digital_Childhood_report_-_EMBARGOED.pdf (accessed 21 December 2017).

Koene, A. (2016a), “Response to Lords EU Internal Market Sub-Committee inquiry on ‘online platforms and the EU digital single market’, publication number OPL0079, 11 January 2016”, available at: http://data.parliament.uk/writtenevidence/committeeevidence.svc/evidencedocument/eu-internal-market-subcommittee/online-platforms-and-the-eu-digital-single-market/written/26033.pdf (accessed 21 January 2019).

Koene, A. (2016b), “Response to Commons Science and Technology Committee ‘digital skills’ inquiry, publication number DIG0029, 12 January 2016”, available at: http://data.parliament.uk/writtenevidence/committeeevidence.svc/evidencedocument/science-and-technology-committee/digital-skills/written/26473.pdf (accessed 21 January 2019).

Legislation.gov.uk (2015), “Consumer Rights Act 2015”, available at: www.legislation.gov.uk/ukpga/2015/15/contents/data.pdf (accessed 21 January 2019).

Lievens, E. and Verdoodt, V. (2017), “Looking for needles in a haystack: key issues affecting children’s rights in the general data protection regulation”, Computer Law and Security Review, available at: https://doi.org/10.1016/j.clsr.2017.09.007.

Livingstone, S. and Haddon, L. (2009), “EU kids online: Final report”, available at: www.lse.ac.uk/media@lse/research/EUKidsOnline/EU%20Kids%20I%20(2006-9)/EU%20Kids%20Online%20I%20Reports/EUKidsOnlineFinalReport.pdf (accessed 22 January 2018).

Livingstone, S., Haddon, L., Görzig, A. and Ólafsson, K. (2010), “Risks and safety for children on the internet: the UK report”, LSE, available at: www.lse.ac.uk/media@lse/research/EUKidsOnline/EU%20Kids%20II%20(2009-11)/National%20reports/UKReport.pdf (accessed 22 January 2018).

Lupton, D. and Williamson, B. (2017), “The datafied child: the dataveillance of children and implications for their rights”, New Media and Society, Vol. 19 No. 5, pp. 780-794.

McDool, E., Powell, P., Roberts, J. and Taylor, K. (2016), “Social media use and children’s wellbeing”, available at: http://ftp.iza.org/dp10412.pdf (accessed 23 January 2018).

Mackenzie, C. (2014), “Three dimensions of autonomy: a relational analysis”, Autonomy, Oppression, and Gender, Oxford University Press, Oxford, pp. 15-41.

Milne, G.R. and Culnan, M.J. (2004), “Strategies for reducing online privacy risks: Why consumers read (or don’t read) online privacy notices”, Journal of Interactive Marketing, Vol. 18 No. 3, pp. 15-29.

Perez Vallejos, E., Koene, A., Carter, C.J., Statache, R., Rodden, T., McAuley, D., Cano, M., et al. (2015), “Juries: acting out digital dilemmas to promote digital reflections”, presented at ETHICOMP 2015, ACM SIGCAS Computers and Society, Leicester, Vol. 45, pp. 84-90.

Plaut, V.C. and Bartlett, R.P. (2012), “Blind consent? a social psychological investigation of non-readership of click-through agreements”, Law and Human Behavior, Vol. 36 No. 4, pp. 293-311.

Porter, G., Townsend, J. and Hampshire, K. (2012), “Children and young people as producers of knowledge”, Children’s Geographies, Vol. 10 No. 2, pp. 131-134.

Royal Society for Public Health and Youth Health Movement (2017), “Status of mind: social media and young people’s mental health and wellbeing”, Royal Society for Public Health and Youth Health Movement, available at: www.rsph.org.uk/uploads/assets/uploaded/62be270a-a55f-4719-ad668c2ec7a74c2a.pdf (accessed 23 January 2018).

Third, A., Bellerose, D., Dawkins, U., Keltie, E. and Pihl, K. (2014), “Children’s rights in the digital age: a download from children around the world”, Young and Well Cooperative Research Centre, Melbourne, available at: www.unicef.org/publications/files/Childrens_Rights_in_the_Digital_Age_A_Download_from_Children_Around_the_World_FINAL.pdf (accessed 21 December 2017).

Todd, L. (2012), “Critical dialogue, critical methodology: bridging the research gap to young people’s participation in evaluating children’s services”, Children’s Geographies, Vol. 10 No. 2, pp. 187-200.

UK Council for Child Internet Safety (2015), “Child safety online: a practical guide for providers of social media and interactive services”, available at: www.gov.uk/government/uploads/system/uploads/attachment_data/file/487973/ukccis_guide-final__3_.pdf (accessed 5 January 2018).

Unicef (2010), “A summary of the UN convention on the rights of the child”, available at: https://downloads.unicef.org.uk/wp-content/uploads/2010/05/UNCRC_summary.pdf?_ga=2.170284576.1894284748.1515149992-579898435.1515149992 (accessed 5 February 2018).

Wauters, E., Donoso, V. and Lievens, E. (2014), “Optimizing transparency for users in social networking sites”, info, Vol. 16 No. 6, pp. 8-23.

Acknowledgements

The authors would like to acknowledge the thoughtful contributions of Jacob LaViolette and Caio Machado, from the Oxford Internet Institute, to a previous draft of this paper. The authors would also like to acknowledge the contribution of all project participants and all project activities to the ideas that underpin this paper.

Funding: This work was supported by EPSRC Trust, Identity, Privacy and Security grant “UnBias: Emancipating users against algorithmic biases for a trusted digital economy” (EP/N02785X/1). Elvira Perez Vallejos acknowledges the financial support of the NIHR Nottingham Biomedical Research Centre.

Corresponding author

Helen Creswick can be contacted at: Helen.Creswick@nottingham.ac.uk

About the authors

Helen Creswick is a Research Fellow at Horizon Digital Economy Research Institute at the University of Nottingham. She works on the UnBias research project and is interested in engaging with young people to find out their views and experiences when online, and in promoting digital awareness amongst this age group and others. Helen is particularly interested in data privacy, meaningful transparency and in the ethics that are associated with these and other related digital issues. She enjoys public engagement through STEM activities.

Liz Dowthwaite is a Research Fellow in Horizon at the University of Nottingham with a research background grounded in Psychology and Human Factors. Her research interests revolve around individual attitudes and behaviour in online crowd systems, including crowdfunding, social media and citizen science. She is especially interested in motivation for participation; outreach and science communication; and the effects of the online world on psychological well-being. Amongst other projects, she currently works with young people on UnBias to understand their perceptions of algorithm-driven internet services, and the ReEnTrust project on how trust in these systems might affect well-being.

Ansgar Koene is a Senior Research Fellow and Lead Researcher for policy impact at the University of Nottingham Horizon Digital Economy Research Institute. He is Co-Investigator on the EPSRC-funded ReEnTrust project on rebuilding and enhancing trust in algorithms, which starts in January 2019, and the UnBias project (running since September 2016), which uses co-design methods to investigate public perceptions and ethical concerns regarding autonomous decision-making systems. Within the project, Ansgar leads the stakeholder engagement work package, which brings together industry, civil society and regulators for multi-stakeholder dialogue on design standards and legal and regulatory recommendations for algorithmic (semi-)autonomous systems.

Elvira Perez Vallejos is an Associate Professor at the NIHR Nottingham Biomedical Research Centre for Mental Health and Digital Tech. She is interested in the ethical challenges embedded in digital solutions for mental health, including the widespread use of machine learning/AI methods in the development of new mental health interventions. She has experience in RRI (Responsible Research and Innovation) and data ethics, as well as in data privacy, online consent, user-centric design, creative practices for mutual recovery, experimental psychology, participatory research, children and young people, and older adults.

Virginia Portillo is a Researcher on the UnBias and ReEnTrust projects at the University of Nottingham Horizon Digital Economy Research Institute. She has a multi-disciplinary research background and has executed several research projects, led public engagement activities and liaised with a wide range of stakeholders. Her current research interests focus on the impact of digital technology on society, meaningful transparency and trust by design. In particular, she is interested in capturing young and older adults’ experiences when interacting with algorithm-driven online platforms, promoting digital literacy and providing educational and technological tools for a more transparent and ethical digital economy.

Monica Cano is a Research Assistant at Horizon working on the UnBias project (EPSRC funded). Monica’s role includes recruiting young people, liaising with schools and youth groups to promote the study, data collection, and interpretation. Monica enjoys youth engagement and previously worked on the CaSMa project (ESRC funded) promoting digital rights among young people through iRights Youth Juries.

Christopher Woodard is an Associate Professor of Philosophy at the University of Nottingham. He works on ethics and political philosophy, with particular interests in well-being and utilitarianism. He is the author of Reasons, Patterns, and Cooperation (New York: Routledge, 2008) and Taking Utilitarianism Seriously (Oxford: Oxford University Press, forthcoming).
