Ethical aspects of voice assistants: a critical discourse analysis of Indonesian media texts

Anisa Aini Arifin (Department of Civil and Industrial Engineering, Division of Industrial Engineering and Management, Uppsala University Uppsala Sweden)
Thomas Taro Lennerfors (Department of Civil and Industrial Engineering, Division of Industrial Engineering and Management, Uppsala University Uppsala Sweden)

Journal of Information, Communication and Ethics in Society

ISSN: 1477-996X

Article publication date: 10 October 2021

Issue publication date: 7 February 2022


Abstract

Purpose

Voice assistant (VA) technology is one of the fastest-growing artificial intelligence applications at present. However, the burgeoning scholarship argues that there are ethical challenges relating to this new technology, not least related to privacy, and these affect the technology’s acceptance. Given that the media impacts public opinion and acceptance of VAs and that there are no studies on media coverage of VAs, this study focuses on media coverage. In addition, the study focuses on media coverage in Indonesia, a country that has been underrepresented in earlier research.

Design/methodology/approach

The authors used critical discourse analysis of media texts, focusing on three levels (text, discourse practice and social practice) to study how VA technology was discussed in the Indonesian context and what power relations frame the representation. In total, 501 articles were collected from seven national media in Indonesia from 2010 to 2020 and the authors particularly focus on the 45 articles that concern ethics.

Findings

The ethical topics covered are gender issues, false marketing, ethical wrongdoing, ethically positive effects, misuse, privacy and security. More importantly, when they are discussed, they are presented as constituting no real critical problem. Regarding discursive practices, the media coverage is highly influenced by foreign media and most of the articles are directed to well-educated Indonesians. Finally, regarding social practices, the authors hold that the government ideology of technological advancement is related to this positive portrayal of VAs.

Originality/value

First, the paper provides the first media discourse study about ethical issues of VAs. Second, it provides insights from a non-Western context, namely, Indonesia, which is underrepresented in the research on ethics of VAs.


Citation

Arifin, A.A. and Lennerfors, T.T. (2022), "Ethical aspects of voice assistants: a critical discourse analysis of Indonesian media texts", Journal of Information, Communication and Ethics in Society, Vol. 20 No. 1, pp. 18-36. https://doi.org/10.1108/JICES-12-2020-0118

Publisher

Emerald Publishing Limited

Copyright © 2021, Anisa Aini Arifin and Thomas Taro Lennerfors.

License

Published by Emerald Publishing Limited. This article is published under the Creative Commons Attribution (CC BY 4.0) license. Anyone may reproduce, distribute, translate and create derivative works of this article (for both commercial and non-commercial purposes), subject to full attribution to the original publication and authors. The full terms of this license may be seen at http://creativecommons.org/licences/by/4.0/legalcode


1. Introduction

Artificial intelligence (AI) applications are becoming increasingly prevalent in our lives. As Ouchchy et al. (2020) argue, a “crucial objective in developing ethical AI is cultivating public trust and acceptance of AI technologies.” Given that media has a large impact on how issues are framed to the public (Ouchchy et al., 2020; Cave et al., 2018), it is reasonable to assume that media will have an impact on public opinion and acceptance of AI. Up to the present day, a few papers have studied the representation of AI in the media (Cave et al., 2018; Chuan et al., 2019; Ouchchy et al., 2020). Cave et al. (2018) found that portrayals of AI are either exaggeratedly optimistic or pessimistic and that there is a lack of more nuanced perspectives, while Chuan et al. (2019) held that there is a lack of depth. Ouchchy et al. (2020) conducted a study on English-language media and its representations of the ethics of AI from 2013 to 2018 and found that the tone of the articles has become increasingly balanced (rather than enthusiastic or critical). Most of the public debate concerned general AI, followed by autonomous vehicles and autonomous weapons. The central issues covered in the media representation were prejudice, privacy/data protection and transparency. Yet another example of such studies is a recent paper in the Journal of Information, Communication and Ethics in Society, which reviews ethical concerns about algorithms in leading UK national newspapers (Barn, 2019) and argues that the ethical concerns surveyed were limited in scope and unfocused.

Aligned with this previous research, our study zooms in on a particular AI technology. Voice assistant (VA) technology, installed on smartphones, smart speakers or wearable devices, is one of the fastest-growing AI applications on the market at present; a growth rate of about 25.4% is expected between 2019 and 2023, with an estimated 8 billion VAs installed in 2023 (Moar, 2019). The market leader, Google Assistant, was available on a billion devices by the end of January 2019 (Huffman, 2019), followed by Apple’s Siri (more than 500 million devices), Microsoft’s Cortana (approximately 400 million devices) and about 100 million installed Amazon Alexa devices. In line with this growth, there is a burgeoning scholarship on the reasons for adopting or rejecting this technology, as well as studies about how users engage with the technology, including ethical issues such as privacy and trust (Yang and Lee, 2018; Arnold et al., 2019; Han and Yang, 2018; Liao et al., 2019; Brill et al., 2019; Chopra and Chivukula, 2017; Lopatovska et al., 2019; Foehr and Germelmann, 2020). However, there are no papers that study media representations of VA technology, as we intend to do.

Our country of choice for studying the discussion of ethical aspects related to VAs is Indonesia. As the biggest economy among Southeast Asian countries, Indonesia has shown significant economic growth in the past two decades. It is the fourth most populous country, with 267 million inhabitants (BAPPENAS, 2019). Indonesia has hundreds of different ethnic and linguistic groups and the motto of the country is Unity in Diversity. In terms of religion, about 87% are Muslim, 10% Christian and 2% Hindu. Internet penetration is increasing, reaching 171.17 million users at the beginning of 2019, approximately 64.8% of the total population (Asosiasi Penyelenggara Jasa Internet Indonesia, 2018). Smartphone access is also growing, with smartphones used by more than half of mobile phone users (Moneythor, 2019), which makes this archipelagic country attractive among developers of digital services.

Despite the ethical issues of VA technologies, as well as the populous and developing Indonesian market, VA technology and the geographical context of Indonesia are both underrepresented in earlier research and this paper, thus, seeks to advance the discussion on ethical aspects of VAs in this particular context. This national focus is in line with a strand within information ethics, called intercultural information ethics, which has surged as a response to the a-contextual or even contextually insensitive or ethnocentric approaches of studies within information ethics (Ess, 2002). Throughout the history of the field of intercultural information ethics, we have seen a number of empirical examples of how local cultures are resisting, or translating and modifying, information and communication technologies. However, essentialist, deterministic understandings of culture within the field, underlying frameworks such as Geert Hofstede’s, have been criticized (Palm, 2016); in response, culture is construed in non-essentialist terms and the suggestion is rather to study how “culture” is used by actors in particular settings (Lennerfors and Murata, 2021). Given this, we will not view the Indonesian media representation of VAs as determined by Indonesian culture, but rather use critical discourse analysis (CDA), which combines a study of the media texts with discourse practices and social practices, all to highlight power relations and ideology, to effectuate social change. The social issue that we are interested in here is that there should be a sound discussion about the ethical issues of VA technology in Indonesian media. This leads up to the following research questions:

RQ1.

What ethical issues relating to VA technology are represented in Indonesian media texts and how are they represented?

RQ2.

How does the text resonate with discourse practices and social practices?

The remainder of the paper proceeds as follows. Part 2 surveys the research debate about VA technologies, globally and in Indonesia. Part 3 presents our analytical framework and the method. Part 4 presents the results, divided into text, discourse practices and social practices. Part 5 is a concluding discussion which answers the research questions, discusses the contribution of this paper to existing research and suggests future research avenues.

2. Voice assistant technologies globally and in Indonesia: a literature review

Recent research on VAs follows two broader methodological underpinnings: quantitative (based on surveys) and qualitative (based on interviews and diaries).

The broader, quantitative approaches seek to derive a picture of the use of VAs or the reasons for adopting the technology. Arnold et al. (2019) argued that VA users predominantly use the assistant for information retrieval online, while Yang and Lee (2018) showed that the intention to use VA technology also depended on perceived enjoyment, illustrating both the utilitarian and hedonic value of VAs. Han and Yang (2018) hold that interpersonal attraction between the user and the assistant drives user acceptance, while privacy and security risk is a threat to adoption. Liao et al. (2019) showed that respondents’ trust in the service provider was important for the adoption or non-adoption of the technology, in addition to the other above-mentioned factors. Although the studies are different, the common pattern is that different values, such as utility and hedonism, are important for the adoption of the technology, while privacy and security concerns are a barrier to adoption.

The in-depth qualitative studies add perspectives to the above picture. For example, Lopatovska et al. (2019) argued that interactions were often of a more leisurely or casual kind than related to information retrieval. Lau et al. (2018) followed 17 smart speaker users, focusing particularly on the users’ privacy, and showed that they often rely on rationalizations and incomplete understandings of privacy risks. Foehr and Germelmann (2020) showed four different mechanisms for trusting VA technology, going beyond trust in the service provider. Chopra and Chivukula (2017) discuss the need for socio-cultural understandings of VA technology adoption and a set of particularities of user preferences for VAs in India. Pradhan et al. (2019) explored how older adults anthropomorphize VAs, showing that the users move between seeing the VA as “human-like” and “object-like.” Kudina and Coeckelbergh (2021) study the meaning-making processes of users of VAs, as a part of appropriation and techno-performances. These studies show the situatedness of ethical concerns related to VAs.

Two studies do not fall into the above-mentioned methodological characterizations but are concerned with new technology development for increased privacy control. Campagna et al. (2018) propose a new VA, Almond, which can be adapted to users’ privacy preferences better than the existing applications on the market. Mhaidli et al. (2020) propose a VA that takes gaze direction and voice volume as cues for starting to “listen,” rather than always listening.

Most of the existing studies related to attitudes to VAs concern Western markets, with the exception of Chopra and Chivukula (2017). Studies about the attitudes of Indonesians toward VAs are limited. To our knowledge, only two articles have discussed attitudes toward the use of VAs. The first concerned the attitudes of students toward the use of Apple Siri for English as a foreign language learning (Haryanto and Ali, 2018) and the second focused on the effectiveness of the Indigo assistant, Lyra, to enhance speaking skills (Charisma and Suherman, 2018). In other words, they did not concern ethics.

As mentioned in the introduction, we complement earlier research on VAs by focusing on media representation, to highlight how VAs are represented and how public opinion about them is shaped. By studying Indonesian media, we also contribute to non-Western perspectives about VAs.

3. Methodology

In this article, our aim is to understand better what ethical issues are portrayed related to VAs and how they are portrayed. We are inspired by CDA, which is concerned with how power relations and ideology construct language use and vice versa. CDA was first developed in the Lancaster school of linguists and Norman Fairclough was one of its most prominent proponents. Both Fairclough and other CDA scholars such as van Dijk (1988) stress that the link between the texts that are analyzed and various forms of context is crucial. For Fairclough (2013), the text should be analyzed in itself, but so should its discourse practice, which means how media texts are produced, consumed and distributed. Furthermore, another context should be included, that of social practices (Fairclough, 2013), which concerns the dominant ideology and how it influences social relations of power within and between different groups of people, in particular historical circumstances. CDA is often used as an analytical framework, but according to Fairclough it should also be normative and aim to introduce social change (Fairclough, 2013). In this paper, we are concerned with the social issue that ethical aspects of VA technology should be discussed in Indonesian media, given that media contributes to shaping public perception of an AI technology that is undergoing significant diffusion in society.

For the analysis of the texts, we focus on the narrative, the story that is told, and use Kenneth Burke’s five key elements of a narrative or story: “the act (what is done), the scene or setting (the context in which it is done), the agent (who does it), the agency or instrument (how it is done) and the purpose or motive (why it is done)” (Mroz et al., 2021), to structure the analysis. These are similar to the “five Ws and an H” in journalism, a common heuristic for analyzing a journalistic text (Singer, 2008). The focus is then on: who (who is the agent?), what (what is the act? what has happened?), when and where, why and how. Apart from the narratives that are told, we also focus on silences, or what is not said (Fairclough, 2013, p. 4). We have also been aware of the multimodal nature of texts by taking the images accompanying the texts into account, but after examining the images in the articles about ethical issues of VAs, we have not been able to see their relevance to our analysis.
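To illustrate how Burke’s five elements could structure the coding of each article, the following minimal sketch in Python, which is not part of the original study and whose field names and example values are hypothetical, records one article along the five elements together with a note on silences:

from dataclasses import dataclass
from typing import Optional

@dataclass
class PentadCoding:
    """One article coded along Burke's five narrative elements."""
    article_id: str                  # e.g. media code and date, "KPS 2015-04-16"
    act: str                         # what is done
    scene: str                       # the setting, the context in which it is done
    agent: str                       # who does it
    agency: str                      # how it is done (the instrument)
    purpose: str                     # why it is done
    silences: Optional[str] = None   # what is notably not said

# Hypothetical example, loosely paraphrasing one of the analyzed articles
example = PentadCoding(
    article_id="KPS 2015-04-16",
    act="VA engages in ethical wrongdoing",
    scene="user-VA conversation, abroad",
    agent="VA (flawed), service provider (has the power to fix it)",
    agency="N/A",
    purpose="none attributed to the VA",
    silences="the issue is reported as already fixed; no open problem remains",
)
print(example.act)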

Our analysis of discourse practices has concerned the different media outlets where the articles are published and how they are related to news production in other social contexts. We have not studied how these articles are received by readers, although that could have been an important facet of discursive practices. As to the social practices, we draw on how the Indonesian Government views new technology, including digital technology. This is one way to capture the dominant ideology, although we are aware that much deeper studies are required to do full justice to social practices.

Seven national media outlets in Indonesia were used in this analysis, namely, CNN Indonesia (CNN), Antara News (ANT), Detikcom (DTK), Kompas (KPS), Koran Tempo (TMP), The Jakarta Post (TJP) and Tirto.id (TRT). They were chosen because they have different characteristics (Table 1) and together provide a representative view of the mainstream newspapers in Indonesia. We have chosen these as we are interested in dominant representations of ethical issues regarding VAs, which we believe is in line with the ethos of CDA.

Online open-access articles were used. Keywords such as “voice assistant,” “asisten pintar,” “intelligent assistant,” “personal assistant,” “google assistant,” “apple siri” and “amazon alexa” were used to search for articles. The gathered articles spanned from May 1, 2010, to February 28, 2020, given that there was no news related to VAs before May 1, 2010. To do the analysis, the first author read all articles and identified which ones concerned ethical issues. Both authors then read the articles that concerned ethics and analyzed them using CDA.
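As an illustration only, since the actual selection was done by manual reading, a keyword and date filter of the kind described above could be sketched as follows; the article record format and the helper name are assumptions, not part of the study:

from datetime import date

KEYWORDS = [
    "voice assistant", "asisten pintar", "intelligent assistant",
    "personal assistant", "google assistant", "apple siri", "amazon alexa",
]
START, END = date(2010, 5, 1), date(2020, 2, 28)

def matches(article: dict) -> bool:
    """Keep articles published in the study window that mention any keyword.

    The article is assumed to be a dict with a 'published' date and a 'text'
    string; this record format is hypothetical.
    """
    if not (START <= article["published"] <= END):
        return False
    text = article["text"].lower()
    return any(keyword in text for keyword in KEYWORDS)

# Example usage with a dummy record
sample = {"published": date(2019, 7, 13), "text": "Google Assistant voice data leak"}
print(matches(sample))  # True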

4. Results

Following CDA, the results section will be divided into three parts: text, discourse practice and social practice.

4.1 Text

In total, 501 articles were collected. Of these, 328 focused on VAs, while the rest were about AI in general or other viewpoints. The overall trend was increasing from 2010 to 2019 (Figure 1). KPS dominated the distribution of VA news across media publishers, with 100 publications (Figure 2).

All of the 328 articles were read by the first author. Of these, 283 articles were not directly related to ethics but were more concerned with informing the reader about the introduction of a new VA or a new feature of an existing VA, how to use the VA and how to connect the VA to other devices, to name a few topics. In total, 45 of the articles were related to ethical issues defined broadly, as relating to “how one should live one’s life, what is good, what is the right way to act and what one should do” (Lennerfors, 2019, p. 13). Topics such as gender issues of the VA, false marketing of the VA, ethical wrongdoing of the VA, ethically positive effects of the VA, misuse of VAs, privacy and security were covered. We will now describe each of the issues (the act of the narrative in Burke’s framework).

4.2 Flawed voice assistants (2012–2018)

This topic includes different aspects of the VA not living up to expectations. Some early articles (KPS 2012–03-15, KPS 2012–06-20, 2014–02-16) concern how users are not satisfied with the Siri VA, claiming that it is false marketing to portray Siri as a true VA. Furthermore, there are stories about various actions by VAs that do not meet expectations or that constitute ethical wrongdoing. An interesting example is an article (TMP 2012–05-15) where it is said that Siri, when asked which is the best smartphone, answers “Nokia Lumia 900.” Siri is of course expected to favor Apple, which indicates that Siri should be loyal to its master. Other kinds of ethical wrongdoing also appear in this category. An article (KPS 2015–04-16) discusses how Siri’s Russian version was anti-gay, saying that same-sex marriage is “a negative thing.” When Siri was asked about the existence of same-sex clubs in the nearby area, Siri answered again, in a disapproving tone: “I’d give it a red flag if I could.” The article concluded with a statement from the company that the issue had been fixed. Another article (KPS 2016–04-04) concerns how Siri is now more responsive to users’ questions about depression, rape, sexual violence and other related matters. Rather than telling a user who wants to jump off a bridge where the closest bridge is, Siri responds differently. Rather than claiming that it does not know what rape means, which happened in the past, Siri now directs the user to a support telephone number, but this only works in the USA. For a summary of the narrative structure, see Table 2.

4.3 Voice assistants as everyday saviors (2016–2017)

Four articles (2016–2017) concern cases where VAs have helped take users out of precarious situations, suggesting ethically positive effects of the VAs. For example, a man whose house had been burglarized and subsequently exploded, burning his hands, face and neck, was able to use Siri to call 911 (KPS 2017–05-18). In another article, a VA called the police when a mother and a child were threatened by a boyfriend. The boyfriend asked “did you call the police?”, which triggered the VA to call the police (KPS 2017–07-10). An article from TMP (2016–06-09) tells the story of a mother and her sick baby. When the baby suddenly stopped breathing, the mother instinctively shouted: “Hey Siri, call the ambulance.” An ambulance arrived and the baby was saved. KPS (2017–03-26) describes how a four-year-old boy, using the VA, was able to call an ambulance for his mother, who would not wake up, and her life was saved. For a summary of the narrative structure, see Table 3.

4.4 The future of voice assistants and their ethical implications (2017–2018)

Four articles are of a somewhat more futuristic character. TJP 2018–09-02 and TJP 2018–09-16 discuss how a VA-powered future is near, given the naturalness of using voice as an interface. TJP 2018–09-16 discusses how VAs will be providers, or rather curators, of news in the future and exposes the powerful role that companies such as Google or Apple will continue to play. The role of the voice, how the news is read to the user, is discussed, as well as how we would react to artificial emotional responses of the VA to tragedies in the news. An article (TJP 2017–05-08) concerns the possibility of dating VAs. Minori Takechi, the founder of a Japanese company, is reported as arguing that digital assistants are seen as too utilitarian and that one should rather emphasize the emotional aspects. In this article, it is also pointed out that the notion of a kawaii (cute) assistant can sustain adoption:

Takechi said the company is developing behavior patterns that will let their characters make mistakes without getting on your nerves. The bet is that when a virtual girlfriend fails to order an Uber, you’re more likely to forgive her than a disembodied voice from a cylinder.

These articles do not follow a narrative structure but rather describe a setting (Table 4).

4.5 Security risks of voice assistants (2014–2019)

Some articles concern security risks of using VAs, for example, that it is possible to “fool” Siri to get into a passcode-locked phone (KPS 2014–05-08, KPS 2016–11-21). Also, the idea that people can give commands to the VA without being near it was mentioned in two TMP articles. It is revealed that hackers can transmit commands to Siri from a distance of almost 5 m (TMP 2015–10-16). A similar article (TMP 2019–11-07) raised another security issue about VAs: smart speakers can be accessed and controlled without any sound, using a laser. For the moment, it seems as if there are no major security risks, but the article points to a future where new forms of hacking become possible (Table 5).

4.6 User misuse (2017–2018)

Some articles concern the misuse of VAs by users. One article concerns how children can order goods online without the consent of their parents and also tells the story of Burger King, which activated Google Assistant in its ad (KPS 2018–03-12). The Burger King story is also the topic of TMP 2017–04-13. Another article told the story of a child who did his homework by asking the VA for the right answers to all his homework questions (KPS 2018–12-29). Yet another article explains how a parrot could order items online (KPS 2018–12-18):

With the ability to imitate Marion’s voice, Rocco [the parrot] can activate Alexa and then order watermelon, biscuits, broccoli or ice cream. Rocco bought not only food but also light bulbs and a kite (Table 6).

4.7 Gender norms (2011, 2019)

Two articles concern the gender of the VA. The first (TMP 2011–10-25) explained why Siri is male in the UK while it is female in the USA. The second article (TJP 2019–03-12) brought up the issue of female VAs. It told the story of an attempt to make a voice with a gender-neutral pitch for the Pride festival. The finished voice was tested and:

About half said they could not tell the gender and the other half were roughly evenly divided between guessing it was male or female.

This indicated the potential that big tech companies have to instate new norms, rather than relying on ancient, sexist stereotypes (Table 7).

4.8 Privacy (2015–2019)

4.8.1 Privacy issues before the data leaks (2015–2019).

Privacy issues were first discussed in 2015 and many of the articles concerned Apple and Google. Apple and Siri were portrayed as protecting the privacy of the user, as the data would only be stored on the user’s device, not associated with Apple’s services, nor shared with third parties (DTK 2015–06-09, KPS 2015–09-14). Google, on the other hand, was discussed because Google Allo users were forced to choose between Google Assistant and privacy (KPS 2016–05-23, DTK 2016–09-26). Google was seen to store all data, but an article pointed out that it is easy to delete the recordings and included instructions on how to do so (KPS 2016–06-04). It was also argued in 2016 that while Apple’s Siri is seen to protect users’ privacy, it is lagging behind Google in the AI race (TJP 2016–06-11, TJP 2016–06-17). The first article that features voices from Indonesia, also concerning Google, is DTK 2018–06-01. Here, Veronica Utami, the head of marketing of Google in Indonesia, states:

“The data we record is really in the hands of the user because we really value everyone’s privacy,” “[…] So everyone can go to myaccount.google.com and do a privacy check-up and make any data settings that can be accessed by Google.”

Google’s efforts to protect privacy are related in the article to the development of the EU’s general data protection regulation (GDPR).

Amazon’s Alexa was also covered in the news. TJP 2015–07-30 brought up the privacy issue and the choice between convenience and privacy, but also mentioned that users can clearly see when Alexa is listening. The article ends with two statements from users: one saying that no one will have an interest in spying on him, and the other saying that “people [are] willing to bring spying technology into their own house if they think it will do something great for them.” Privacy concerns were also raised in an article about Alexa being used as an assistant in hotel rooms, but the article claimed that the data would be deleted daily, the devices would be muted by default and the hotel would not have access to individual recordings (TJP 2018–06-22). The discussion about privacy continued in an article on adding “amnesia” features to Alexa as a means to protect user privacy (TJP 2019–05-31). Privacy was also the key concern when Mattel scrapped its plan for a VA for kids (TJP 2017–10-06) (Table 8).

4.8.2 Privacy issues and data leaks in 2019.

There are then 10 articles on privacy following the various data leaks in July-August 2019, as these incidents highlighted the concerns that had been raised before. Voice data from Google Assistant had been leaked by a third-party language expert (TJP 2019–07-13) and the same article mentioned privacy activists who have criticized Echo Kids. More of the story unfolded in subsequent articles, where it was mentioned that sensitive recordings such as parent-child conversations, bedroom conversations and professional phone calls containing confidential information had been made by digital devices. These were recorded even though Google Assistant had not been activated. A representative of Google stated that they had fixed the problem and that “false accepts,” commands that sound like the activating command, might be the issue (KPS 2019–07-19). A similar article points out that privacy is expensive nowadays and also ends with a connection to the terms and conditions of using Google Assistant (DTK 2019–07-21). Apple was also subjected to criticism, as quality checks and efforts to improve the VA’s responses were done by humans, which raised privacy issues. Apple, therefore, suspended its response grading program (TJP 2019–08-03). The turn then came to Microsoft, which responded that third parties only listen to users’ conversations after obtaining user permission (TJP 2019–08-08). What Apple does with its data is picked up in an article (TMP 2019–08-19).

When Amazon, Apple and Microsoft had been covered, the turn came to Facebook. An article exposed that Facebook had paid contractors to listen to audio files of users’ conversations. Similar to Apple and Amazon, Facebook paused this practice (TJP 2019–08-14). The Apple case is revisited in TJP 2019–08-29, where Apple claims that only 0.2% of the total requests were reviewed. Apple also changed its policy from opt-out to opt-in, letting users actively choose whether to allow their audio to be reviewed (Table 9).

4.9 Summary

To sum up this section on the text dimension of CDA from the point of view of Burke’s five elements, the trends in reporting on the ethics of VAs show that a variety of topics are discussed throughout the years. Some acts, such as flaws of VAs, security or gender, come back over the years, while others, at least in our data set, are limited to certain years. For example, the VA-as-savior stories and the future visions are each limited to two years. The clearest interest regarding the ethics of VAs concerns privacy issues (Figure 3).

The setting is always abroad. Most of the issues concern countries in the West, but there is one article set in Japan. In only one article, a statement is taken from a local Indonesian representative of Google. This is an indicator that ethics is constructed as something foreign.

Combining the insights from the different topics, the following picture of the agents emerges. The users are depicted in a wide variety of ways. Sometimes they are seen as being at the mercy of the VA, sometimes (in the future) as an equal partner in an emotional relationship, risking their data and being snooped upon, but at the same time as rational actors who make choices about when, where and how to use the VA. An unexpected result was that many of the articles discuss the usability of VAs when the adult, rational subject is destabilized. Furthermore, there is plenty of coverage concerning children, who are either able to call an ambulance for help, are spied upon by VAs or misuse a VA, letting it do their homework. The service providers are portrayed as the agents behind the VAs, increasingly powerful but willing to help users protect their privacy. The VAs themselves are portrayed as instruments, as potentially snooping, but also as flawed, cute and somewhat stupid.

While the VA is portrayed as an agent, as described above, it is also portrayed as an instrument for the user, service provider and hacker. Depending on the act, the purpose or motive differs.

A general observation is that the articles are written in a way that portrays ethics as being under control – there are often no issues left open or problematic in the articles. Various issues are portrayed, but the articles always state that the flaws have been fixed (whether they concern malfunctions or privacy issues). Only a few articles leave ethical issues open: the future-oriented articles about news curation (the power of the curators), the future-oriented article about emotional relationships with VAs and an article about security issues.

5. Discourse practice

The various texts that have been analyzed appear predominantly in KPS, TJP and TMP. Out of these 45 articles, 19 were published in KPS, 16 in TJP, 9 in TMP, 4 in DTK, 1 in CNN and 1 in TRT. In Table 1, we described that TJP and TMP have a more explicit pro-democratic stance, which makes it understandable that issues related to ethics are discussed in such outlets. The independent KPS also produced a larger number of articles related to privacy issues.

Compared to the total number of articles published on VAs (Figure 2), there is an overrepresentation of articles related to ethics in TJP, which is an English-language newspaper. The readership of TJP consists of either well-educated Indonesians with proficiency in English or other English speakers. This suggests that ethical issues are more prevalent in media directed at a more well-educated part of the population. The readership of KPS is also well-educated (Table 1).

In many of the articles on ethics, the mode of production is strongly connected to countries outside Indonesia. Most articles in TJP are written by non-Indonesians and many other ethics-oriented pieces are based on, and edited from, articles in non-Indonesian media outlets. As the first author noticed, some Indonesian media commonly adapted content from foreign media. However, the way they translated the content sometimes resulted in confusing wording in Bahasa Indonesia (the Indonesian language) that led to difficulties in interpretation. This translated “feel” of the articles is yet another sign that ethics is something that is brought in from abroad.

Although we do not have data for comparison with other countries’ media discourse about VAs, our results indicate that there is a lack of discussion about the ethical aspects of VA technology, particularly for those who are not well-educated or anglophone. There are several possible reasons for this. One interpretation is that other, more important issues took the attention of journalists during this period of time. Another possibility is that ethical issues relating to technology are not openly discussed in the media, which will be contextualized below.

In the present Reform order (1998-), following the Indonesian independence revolution (1945–1949), the parliamentary period (1949–1959), the guided democracy period (1959–1965) and the New Order (1965–1998), Indonesia adheres to a democratic system that includes the freedom to voice opinions, including freedom of the press (Kemdikbud, 2015; Sahrasad, 2014). However, during the New Order regime, press freedom was severely restricted. If a media organization was deemed to have violated regulations, it could be banned, and special permission was required to be a journalist. During President Soeharto’s regime, several newspapers were banned. For example, Kompas and some other newspapers were banned for a few days in 1965 in relation to the G30S/PKI movement, the kidnapping and murder of six Army generals and other high-ranking officers (Suyono et al., 2014). Some online media sources mentioned that Tempo has an archive reporting that Kompas (and six other newspapers) was banned yet again in early 1978, when they reported on President Soeharto planning to seek a third presidential nomination and on demonstrations against rampant corruption. Unfortunately, the archive (Ishwara, 2001) has been taken down and the link deactivated. Moreover, Tempo was banned in 1982, as it was considered too sharp in criticizing the New Order regime and its political vehicle, the Golongan Karya party, as well as in 1994, when it criticized the government’s purchase of old German naval vessels and the high budget that the Minister of Research and Technology, BJ Habibie, had proposed. This report, whose main story discussed the military’s objections to the imports made by the minister, was deemed to endanger “state stability” (Amir, 2012).

Press freedom in 1998 resulted in the birth of a significant number of media producers. An article from the Dewan Pers (Press Council) estimated that the number of mass media outlets in Indonesia reached 47,000, of which 43,300 are online media, about 2,000–3,000 are print media and the rest are radio and television stations (Setiawan, 2020). Sen and Hill have argued that although there has been freedom of the press since 1998, “there can be no simple connection between the erosion of government censorship, the opening up of the media and the establishment of a pluralist democracy as understood in the West” (Sen and Hill, 2000, p. 218). Sen describes a much more problematic picture of press freedom than the standard narrative, in which press freedom is still not fully present and some of the former government control of the media is now in the hands of private owners (Haryanto, in Sen and Hill, 2011), who exercise control over the content. Despite the reforms, many journalists find it difficult to play the role of “watchdog” and some Indonesian journalists “struggle with conflicting challenges of being critical of institutional powers while maintaining their job” (Davies et al., 2016). Sensitive issues may then rather be covered in trivialized and sensationalized tones (Hartono, 2015).

While VAs in themselves might not be a topic that is worth controlling, the fact that the service providers of VAs are powerful new media companies could have an impact on the news coverage and framing of the news about VAs, which would then explain the lack of critical debate about the technology.

6. Social practice

In this section, we describe relevant aspects of the social practices that can have an influence on the media coverage of VAs. Indonesia is a multicultural country, the motto of which is unity in diversity. In our analysis of media texts, we have not seen any explicit cultural identifications and as our take on culture is non-essentialist, we have not focused on describing the Indonesian cultures and their impact on media representation. Rather, we will describe some aspects of Indonesia’s ideology of technology.

President Soeharto, the second president and the one with the longest period in power, had a predilection for high technology. He saw high technology as a way by which he could, apart from generating economic wealth, elevate his esteem and override the powerful image of his predecessor and the founding father of Indonesia, President Soekarno. He shared this ambition for high technology with the Minister of Research and Technology, B.J. Habibie (Amir, 2012).

The incumbent President Jokowi, in his first term (2014–2019), had the vision of making Indonesia the world’s maritime axis (KKP-RI, 2019; Setiawan, 2019), but he has also continued the legacy of former presidents of having a pro-technology stance, wanting to lead a digital transformation of Indonesia (Nugroho and Hikmat, 2017). Seen in relation to this social practice, it is understandable that VA technology is presented in a positive light.

As part of the efforts to give businesses a unified framework while protecting citizens’ data, the Indonesian Government, through the Ministry of Communication and Information Technology, began to consider personal data protection around mid-2013. The work has gradually developed and in 2015 it was revealed that the ministry was preparing the Rancangan Undang-Undang Perlindungan Data Pribadi (RUU PDP), a personal data protection bill similar to the GDPR in Europe (Kemkominfo, 2013, 2015). In early 2020, the draft was sent by the president to the legislative body (Kemkominfo, 2020). In Indonesia, there has thus been a growing interest in privacy, particularly in the later years of the 2010s.

In addition, President Jokowi has shared his ambitions for Indonesia on several occasions. For instance, at the opening of the Indonesia Digital Economy Summit 2020, he mentioned the importance of protecting user data for data security and sovereignty, and the urgent need to build data centers to support digital transformation in Indonesia, “[…] to bring many benefits to local start-ups while protecting users’ personal data” (BPMI Setpres, 2020). Again, on March 8, 2021, the incumbent President expressed the idea that the Agency for the Assessment and Application of Technology (BPPT) should take part in the development of AI and become the center of Indonesian technological intelligence (Humas BPPT, 2021).

7. Concluding discussion

When beginning our study, we noticed that there was a lack of media discourse studies on VAs, which motivated us to carry out this study. We also identified that there is generally a lack of studies of the ethics of VAs in non-Western countries. We will now discuss the research questions and relate our answers to previous research. The first research question was:

RQ1.

What ethical issues relating to VA technology are represented in Indonesian media texts and how are they represented?

We have discussed that 45 out of 328 articles concern ethical issues, related to gender issues, false marketing of the VA, ethical wrongdoing of the VA, ethically positive effects of the VA, misuse of VAs, privacy and security. While these issues are covered in different earlier studies (Yang and Lee, 2018; Arnold et al., 2019; Han and Yang, 2018; Liao et al., 2019; Brill et al., 2019; Chopra and Chivukula, 2017; Lopatovska et al., 2019; Foehr and Germelmann, 2020), we maintain that such an overarching categorization of the ethical issues could inform future studies of VA technology and create a roadmap of what ethical issues to look for when doing empirical studies. Given that the media coverage in Indonesia was strongly linked to Western media and social practices, it is likely that the issues covered in Indonesian media are similar to those covered in Western media. We have also seen that the media coverage of privacy is larger than that of other topics, which corroborates research arguing that privacy is the central issue of VA technology (Lau et al., 2018). Whether 45 out of 328 articles is a low number compared to media coverage in other countries needs to be studied by future research.

However, more qualitatively, we have described that ethical issues seem to be taken care of in the present or relegated to the future. There is no earlier research on media coverage of VAs, but if we relate our findings to the research on media representation of AI, our study shows that the articles are neither enthusiastic nor critical (Cave et al., 2018; Ouchchy et al., 2020), but rather somewhat positive toward the new technology, avoiding both extremes. Most of the articles also do not discuss the issues in depth, which is in line with earlier research (Cave et al., 2018; Chuan et al., 2019).

As a part of our analysis, we studied how different actors were represented in the media and showed that service providers are portrayed as powerful, but willing to make efforts to improve the ethics of VAs. The users are sometimes portrayed as the stereotypical adult user, but other users are also portrayed, for example, people in distress, children and even a parrot. This has not been discussed by earlier research and thus suggests that future research could examine interactions with VAs, focusing on different groups of users in different situations. From conducting interviews with VA users, we see that some users interact with the VA when they feel lonely or when they are in a new country. The situatedness of use is part of some qualitative studies on VA use (Kudina and Coeckelbergh, 2021), but could be further explored, for example, relating to users in vulnerable situations. The VAs themselves are often portrayed as potential spies, as they are always listening, but also as flawed, not fully conforming to our expectations and to some extent kawaii (cute) or stupid. Our analysis suggested that the latter representation of VAs, which is prevalent in the media, could improve their public image, as they seem less harmful and potentially less able to learn about the user. Anecdotally, this idea of stupid AI is also prevalent in popular culture in Western countries. However, the VA is but the interface and the tip of the iceberg in a relationship between users and powerful service providers. Seen in this light, the helpless representation of the VA could contribute to even greater power asymmetries between service providers and users. This stupidity of the VA has, to our knowledge, not been adequately discussed and merits further attention in future research.

The second research question was:

RQ2.

How does the text resonate with discourse practices and social practices?

Related to discourse practices, the production and consumption of media texts, we have argued that the media texts often referred to events abroad, particularly in the USA, Europe and Australia, and were also built on articles from Western contexts. Many of the texts were published in newspapers for well-educated classes and there was an overrepresentation of articles in the English-language TJP. Furthermore, this understanding of discourse practice led us to understand the somewhat strange wordings in Bahasa Indonesia (the Indonesian language), which were a result of direct translations from foreign articles. As far as we have noticed, this English-speaking, well-educated target audience signifies that ethical issues are not for all, but only of interest to a limited group. In line with our critical agenda, as described in the introduction, it is not desirable that ethical issues should only be debated within some parts of the population, as they are relevant to all.

The generally positive view on VAs and the lack of critical debate about them could also be related to the processes of media production in Indonesia. Although there is no state censorship, media has increasingly become privatized after 1998 and earlier research has argued that this also leads to censorship, albeit of another kind (Haryanto, in Sen and Hill, 2011).

This focus on the production and consumption of media texts has been absent in earlier research on the portrayal of ethical issues of AI in the media. Given that it has contributed to our understanding of the media texts, this suggests that the three levels of CDA could be valuable to future research on media representation of AI.

Regarding social practice, Indonesia is a multicultural country, the motto of which is unity in diversity. In our analysis of media texts, we have not seen any explicit cultural identifications, and therefore we have not focused on describing the Indonesian cultures and their impact on media representation. This view of culture as an underlying force determining social practice has been criticized (Palm, 2016) and there are rather voices suggesting a more non-essentialist view of culture, focusing on how “culture” is used (Lennerfors and Murata, 2021). However, we have described that there is a pro-technological worldview on the government level and President Jokowi is seen to lead a digital transformation of the country. Such an understanding of the dominant ideology could also throw light on the lack of critique in media texts.

Even this limited study of discourse practice and social practice, which fits within the scope of a paper, led to some further understanding, and we recommend such an approach for future studies about media representations of VAs or AI, rather than a sole focus on media texts (such as in Ouchchy et al., 2020). This is also important in studies of media representation in Western contexts, which are sometimes presented as neutral contexts where social practices do not need to be described. This could rather be a sign of ethnocentrism (Ess, 2002). Comparative studies of media representation are a way to derive similarities or differences that could be understood through an understanding of discourse practices or media practices and should be a promising avenue for future research endeavors.

Given our CDA approach, we can see that there is in general a lack of critical debate about VA technology in the Indonesian media and that the discussion that exists is directed to a small part of the population. This calls for further discussions in various media outlets, where a balanced view is taken and the positives and negatives of VA technology are presented and discussed. However, it might be the case that such debates are taking place in non-mainstream, national media, as well as in social media or everyday discussions. For example, it could be possible that non-mainstream, local newspapers or newspapers with a stronger religious affiliation would portray the issues in a different way, but this is a topic for further research. Further studies are needed in Indonesia and other non-Western contexts, as well as in Western contexts, not only to understand the situation but if needed to drive change toward a more critical debate about new technology and our increasing dependence on powerful companies that have more and more access to our data.

Figures

Figure 1. Number of articles about voice assistant technology 2010–2020

Figure 2. Number of articles per media producer

Figure 3. The distribution of ethical issues of voice assistants in Indonesian media

Table 1. Seven national media outlets and their main characteristics. The information is compiled from their websites, entries in the press council database and Wikipedia articles on each newspaper

Media Basic characteristics
The Jakarta Post (TJP) Owner: PT Bina Media Tenggara
Established: April 25, 1983
Coverage: national, international, printed and digital, daily news and English
Political/ideological leaning: pro-democracy stance
Circulation: 41,049 in 1998, but no data afterward. Alexa rank 9,834 (July 2021)
Additional info: Indonesia’s main English-language daily
Tempo (TMP) Owner: PT Tempo Inti Media
Established: 1995, but published a magazine in 1971
Coverage: national, daily news until Jan 2021, after which it is fully digital
Political/ideological leaning: independent, pro-democracy, investigative
Circulation: 100,000. Alexa rank 729 (July 2021)
Detik (DTK) Owner: PT Trans Corporation 
Established: online 9 July 1998 
Coverage: focus on politics, economics, information technology, entertainment and sports
Political/ideological leaning: no data
Popularity: Alexa rank 110 (July 2021) 
Kompas (KPS) Owner: PT Kompas Media Nusantara 
Established: June 28, 1965 
Coverage: national, daily news
Political/ideological leaning: neutral
Circulation: 751,000 (2012). Alexa rank 83 (July 2021) 
Additional info: most KOMPAS readers are from the upper-middle class based on their financial condition and background. In fact, as many as 60% or more of KOMPAS readers have a university education (Kompas, 2008)
Tirto (TRT) Owner: PT Tirta Adi Surya 
Established: August 3, 2016
Type: digital news and infographics site
Political/ideological leaning: independent, non-partisan 
Popularity: Alexa rank 1883 (July 2021)
CNN Owner: PT Trans Media, WarnerMedia International, part of CNN International
Established: October 20, 2014 
Type: digital
Coverage: general news, business, sport, technology and entertainment 
Popularity: Alexa rank 381 (July 2021)
Antara (ANT) Owner: Indonesian Government-owned news agency
Established: January 1996; the news agency was founded on December 13, 1937
Type: digital
Popularity: Alexa rank 2613 (July 2021)
Additional info: available in Bahasa Indonesia and English

Table 2. The narrative structure of flawed voice assistants

Act: VA engages in wrongdoing
Setting: user-VA conversation; abroad
Agent: user (either not described, or belonging to a vulnerable group or in a vulnerable situation); VA (not well enough programmed to deal with these issues, socially awkward or even stupid); service provider (has the power to fix the VA)
Agency/instrument: N/A
Purpose/motive: no purpose from the VA; it is socially awkward or stupid

Table 3. The narrative structure of voice assistants as everyday saviors

Act: VA is a savior
Setting: precarious situation; abroad (USA, UK and Australia)
Agent: user (in distress, incapacitated or a child); VA (a useful voice interface, or an anti-hero, an accidental savior); service provider (has the power to fix the VA)
Agency/instrument: the voice assistant is the instrument
Purpose/motive: N/A

Table 4. The narrative structure of the future of voice assistants

Act: N/A
Setting: the future; unspecified location, but influences from abroad (Japan and the USA)
Agent: user (subjected to power from news curators, or an equal partner to the VA); VA (kawaii, i.e. cute); service provider (powerful)
Agency/instrument: N/A
Purpose/motive: N/A

Table 5. The narrative structure of security risks of voice assistants

Act: hackers hack the VA
Setting: abroad; Europe/USA
Agent: hackers; users (at risk because of VAs); VA (vulnerable)
Agency/instrument: the VA is the instrument for hackers to hack the user’s device/data
Purpose/motive: malevolence

Table 6. The narrative structure of user misuse of voice assistants

Act: users misuse the VA
Setting: abroad; the USA
Agent: users (cleverly tricking VAs); VA (something that can be tricked)
Agency/instrument: the VA is the instrument for the user’s self-interest
Purpose/motive: self-interest

Table 7. The narrative structure of gender norms of voice assistants

Act: VAs embed gender norms
Setting: abroad; UK, USA
Agent: VA (gendered); service provider (has the potential to do societal good by reshaping gender norms)
Agency/instrument: the VA is the instrument for changing norms
Purpose/motive: emancipatory

Table 8. The narrative structure of privacy (before data leaks)

Act: privacy issues are raised, but the service providers protect users’ privacy
Setting: abroad; the USA, but one voice from Indonesia represented
Agent: users (rational actors who can control privacy settings); service provider (offers privacy protection)
Agency/instrument: N/A
Purpose/motive: for service providers, privacy is either a business advantage or a response to regulations

Table 9. The narrative structure of privacy (data leaks)

Act: users’ voice data is leaked
Setting: abroad; Europe, USA
Agent: service provider (has allowed data to leak); third-party contractors (at fault for not following the data handling policies of service providers)
Agency/instrument: the third-party contractors are the instruments of the service providers and are the cause of the leak
Purpose/motive: the service providers collect sensitive data not only when the VAs are supposed to listen but other data as well

References

Amir, S. (2012), “The technological state in Indonesia: the co-constitution of high technology and authoritarian politics”, (1st ed.). Routledge, available at: https://doi.org/10.4324/9780203084120

Arnold, R., Tas, S., Hildebrandt, C. and Schneider, A. (2019), “Any sirious concerns yet? An empirical analysis of voice assistants’ impact on consumer behavior and assessment of emerging policy challenges”, TPRC47: The 47th Research Conference on Communication, Information and Internet Policy 2019, available at: http://dx.doi.org/10.2139/ssrn.3426809

Asosiasi Penyelenggara Jasa Internet Indonesia (2018), “Laporan survey: penetrasi and profil perilaku pengguna internet Indonesia”.

BAPPENAS (2019), “Jumlah penduduk Indonesia 2019 mencapai 267 juta jiwa | databoks”, Databooks, available at: https://databoks.katadata.co.id/datapublish/2019/01/04/jumlah-penduduk-Indonesia-2019-mencapai-267-juta-jiwa (accessed 10 November 2019).

Barn, B.S. (2019), “Mapping the public debate on ethical concerns: algorithms in mainstream media”, Journal of Information, Communication and Ethics in Society (Online), Vol. 18 No. 1, pp. 38-53.

BPMI Setpres (2020), “Pengembangan pusat data di Indonesia dorong ekonomi digital dan lindungi data pribadi pengguna”, [online] presiden RI, available at: www.presidenri.go.id/siaran-pers/pengembangan-pusat-data-di-indonesia-dorong-ekonomi-digital-dan-lindungi-data-pribadi-pengguna/ (accessed 28 June 2021).

Brill, T., Munoz, L. and Miller, R. (2019), “Siri, Alexa, and other digital assistants: a study of customer satisfaction with artificial intelligence applications”, Journal of Marketing Management, Vol. 35 Nos 15/16, pp. 1401-1436.

Campagna, G., Xu, S., Ramesh, R., Fischer, M. and Lam, M.S. (2018), “Controlling fine-grain sharing in natural language with a virtual assistant”, Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies, Vol. 2 No. 3, pp. 1-28.

Cave, S., Craig, C., Dihal, K., Dillon, S., Montgomery, J., Singler, B. and Taylor, L. (2018), “Portrayals and perceptions of AI and why they matter”, available at: https://royalsociety.org/~/media/policy/projects/ai-narratives/AI-narratives-workshop-findings.pdf

Charisma, D. and Suherman, S. (2018), “The effectiveness of using lyra personal assistant in improving students’ speaking skill”, Community Concern for English Pedagogy and Teaching (CONCEPT), Vol. 11 No. 1, available at: https://e-journal.umc.ac.id/index.php/CJU/article/view/308 (accessed 12 March 2020).

Chopra, S. and Chivukula, S. (2017), “My phone assistant should know I am an Indian: influencing factors for adoption of assistive agents”, Proceedings of the 19th International Conference on Human-Computer Interaction with Mobile Devices and Services (MobileHCI ’17), pp. 1-8, available at: https://doi.org/10.1145/3098279.3122137

Chuan, C.H., Tsai, W.H. and Cho, S. (2019), “Framing artificial intelligence in American newspapers”, Proceedings of the 2019 AAAI/ACM Conference on AI, Ethics, and Society, pp. 339-344.

Davies, S., Stone, L. and Buttle, J. (2016), “Covering cops: critical reporting of Indonesian police corruption”, Pacific Journalism Review, Vol. 22 No. 2, pp. 185-201.

Ess, C. (2002), “Computer-mediated colonization, the renaissance, and educational imperatives for an intercultural global village”, Ethics and Information Technology, Vol. 4 No. 1, p. 11.

Fairclough, N. (2013), Critical Discourse Analysis: The Critical Study of Language, Routledge, New York, NY.

Foehr, J. and Germelmann, C. (2020), “Alexa, can I trust you? Exploring consumer paths to trust in smart voice-interaction technologies”, Journal of the Association for Consumer Research, Vol. 5 No. 2, pp. 181-205, doi: 10.1086/707731 (accessed 5 March 2020).

Han, S. and Yang, H. (2018), “Understanding adoption of intelligent personal assistants: a parasocial relationship perspective”, Industrial Management and Data Systems, Vol. 118 No. 3, pp. 618-636.

Hartono, H.S. (2015), “Muslim mothers and Indonesian gossip shows in everyday life”, Indonesia and the Malay World, Vol. 43 No. 126, pp. 298-316, doi: 10.1080/13639811.2014.996995.

Haryanto, E. and Ali, R. (2019), “Students’ attitudes towards the use of artificial intelligence SIRI in EFL learning at one public university”, International Seminar and Annual Meeting BKS-PTN Wilayah Barat, Vol. 1 No. 1.

Huffman, S. (2019), “Here’s how the Google Assistant became more helpful in 2018”, [online] available at: www.blog.google/products/assistant/heres-how-google-assistant-became-more-helpful-2018/ (accessed 17 February 2020).

Humas BPPT (2021), “Melalui inovasi, presiden jokowi dorong BPPT dalam pertumbuhan ekonomi nasional”, [online] Bppt.go.id, available at: https://bppt.go.id/layanan-informasi-publik/4206-melalui-inovasi-presiden-jokowi-dorong-bppt-guna-pertumbuhan-ekonomi-nasional (accessed 28 June 2021).

Ishwara, H. (2001), P.K. Ojong: Hidup Sederhana, Berpikir Mulia, Kompas, Jakarta.

Kemkominfo (2013), “Sesditjen IKP : perlu segera UU perlindungan data pribadi”, [online], available at: https://www.kominfo.go.id/content/detail/1337/sesditjen-ikp-perlu-segera-uu-perlindungan-data-pribadi/0/berita_satker (accessed 6 May 2020).

Kemkominfo (2015), “Kemkominfo siapkan RUU perlindungan data pribadi”, [online], available at: https://www.kominfo.go.id/content/detail/6142/kemkominfo-siapkan-ruu-perlindungan-data-pribadi/0/sorotan_media (accessed 6 May 2020).

Kemkominfo (2020), “Presiden serahkan naskah RUU PDP Ke DPR RI”, [online], available at: www.kominfo.go.id/content/detail/24039/siaran-pers-no-15hmkominfo012020-tentang-indonesia-akan-jadi-negara-asia-tenggara-kelima-yang-miliki-uu-pdp/0/siaran_pers (accessed 6 May 2020).

Kemdikbud (2015), “Kementerian pendidikan dan kebudayaan Indonesia”, [online], available at: www.kemdikbud.go.id/main/tentang-kemdikbud/sejarah-kemdikbud (accessed 27 June 2021).

KKP-RI (2019), “Laut Masa Depan Bangsa, Mari Jaga Bersama”, [online] available at: https://kkp.go.id/artikel/12993-laut-masa-depan-bangsa-mari-jaga-bersama (accessed 28 June 2021).

Kompas (2008), “Profil pembaca”, Angket pembaca KOMPAS, available at: https://web.archive.org/web/20111120202228/http://www.kompasiklan.com/profil (accessed 28 June 2021).

Kudina, O. and Coeckelbergh, M. (2021), “‘Alexa, define empowerment’: voice assistants at home, appropriation and technoperformances”, Journal of Information, Communication and Ethics in Society, Vol. 19 No. 2, pp. 299-312.

Lau, J., Zimmerman, B. and Schaub, F. (2018), “Alexa, are you listening?”, Proceedings of the ACM on Human-Computer Interaction, Vol. 2 No. CSCW, pp. 1-31.

Lennerfors, T.T. (2019), Ethics in Engineering, Studentlitteratur, Lund.

Lennerfors, T.T. and Murata, K. (2021), “Culture as suture: on the use of ‘culture’ in cross-cultural studies in and beyond intercultural information ethics”, The Review of Socionetwork Strategies, Vol. 15 No. 1, pp. 71-85, doi: 10.1007/s12626-021-00080-x.

Liao, Y., Vitak, J., Kumar, P., Zimmer, M. and Kritikos, K. (2019), “Understanding the role of privacy and trust in intelligent personal assistant adoption”, in Taylor, N.G., Christian-Lamb, C., Martin, M.H. and Nardi, B. (Eds), Information in Contemporary Society: 14th International Conference, iConference 2019, Washington, DC, Proceedings, Springer, Berlin, pp. 102-113.

Lopatovska, I., Rink, K., Knight, I., Raines, K., Cosenza, K., Williams, H., Sorsche, P., Hirsch, D., Li, Q. and Martinez, A. (2019), “Talk to me: exploring user interactions with the Amazon Alexa”, Journal of Librarianship and Information Science, Vol. 51 No. 4, pp. 984-997.

Mhaidli, A., Venkatesh, M.K., Zou, Y. and Schaub, F. (2020), “Listen only when spoken to: interpersonal communication cues as smart speaker privacy controls”, Proceedings on Privacy Enhancing Technologies, Vol. 2020 No. 2, pp. 251-270.

Moar, J. (2019), The Digital Assistants of Tomorrow, Juniper Research Ltd, Hampshire.

Moneythor (2019), “Unbanked potential”, available at: www.moneythor.com/2019/09/24/unbanked-potential/ (accessed 12 November 2019).

Mroz, G., Papoutsi, C. and Greenhalgh, T. (2021), “‘From disaster, miracles are wrought’: a narrative analysis of UK media depictions of remote GP consulting in the COVID-19 pandemic using Burke’s pentad”, Medical Humanities, Vol. 47 No. 3, article medhum-2020-012111.

Nugroho, Y. and Hikmat, A. (2017), “An insider’s view of e-governance under Jokowi: political promise or technocratic vision?”, in Jurriens, E. (Ed.), Digital Indonesia, ISEAS Publishing, Singapore, pp. 21-37, doi: 10.1355/9789814786003-008.

Ouchchy, L., Coin, A. and Dubljević, V. (2020), “AI in the headlines: the portrayal of the ethical issues of artificial intelligence in the media”, AI and Society, Vol. 35 No. 4, pp. 927-936.

Palm, E. (2016), “What is the critical role of intercultural information ethics?”, in Collste, G. (Ed.), Ethics and Communication: global Perspectives, Rowman and Littlefield International, pp. 181-195.

Pradhan, A., Findlater, L. and Lazar, A. (2019), “Phantom friend” or “just a box with information”, Proceedings of the ACM on Human-Computer Interaction, Vol. 3 No. CSCW, pp. 1-21.

Sahrasad, H. (2014), “Pers, demokrasi dan negara Indonesia Post-Soeharto: Sebuah perspektif”, Masyarakat, Kebudayaan Dan Politik, Vol. 27 No. 1, pp. 27-43.

Sen, K. and Hill, D. (2000), Media, Culture and Politics in Indonesia, Oxford University Press, Melbourne.

Sen, K. and Hill, D. (2011), Politics and the Media in Twenty-First Century Indonesia, Routledge, New York, NY.

Setiawan, A. (2019), “Indonesia layak jadi negara poros maritim dunia”, [online] Sekretariat Kabinet Republik Indonesia. available at: https://setkab.go.id/indonesia-layak-jadi-negara-poros-maritim-dunia/ (accessed 30 June 2021).

Setiawan, A. (2020), “Media online perlu berbenah diri”, [Blog] Dewan Pers, available at: https://dewanpers.or.id/publikasi/opini_detail/173/Media_Online_Perlu_Berbenah_Diri (accessed 28 June 2021).

Singer, J.B. (2008), “Five Ws and an H: Digital challenges in newspaper newsrooms and boardrooms”, International Journal on Media Management, Vol. 10 No. 3, pp. 122-129.

Suyono, S.J., Zulkifli, A., Setiadi, P., Soejono, D. and Yunus, S. (2014), Seri Buku Tempo: Lekra dan Geger 1965, Kepustakaan Populer Gramedia (KPG), Jakarta.

van Dijk, T. (1988), News Analysis: Case Studies of International and National News in the Press, Lawrence Erlbaum Associates, Publishers, Hillsdale, NJ.

Yang, H. and Lee, H. (2018), “Understanding user behavior of virtual personal assistant devices”, Information Systems and e-Business Management, Vol. 17 No. 1, pp. 65-87.

Corresponding author

Anisa Aini Arifin can be contacted at: anisaa.arifin@gmail.com
