Introduction

The Emerald International Handbook of Technology-Facilitated Violence and Abuse

ISBN: 978-1-83982-849-2, eISBN: 978-1-83982-848-5

Publication date: 4 June 2021

Citation

Henry, N. (2021), "Introduction", Bailey, J., Flynn, A. and Henry, N. (Ed.) The Emerald International Handbook of Technology-Facilitated Violence and Abuse (Emerald Studies In Digital Crime, Technology and Social Harms), Emerald Publishing Limited, Leeds, pp. 167-170. https://doi.org/10.1108/978-1-83982-848-520211061

Publisher: Emerald Publishing Limited

Copyright © 2021 Nicola Henry. Published by Emerald Publishing Limited.

License

This chapter is published under the Creative Commons Attribution (CC BY 4.0) licence. Anyone may reproduce, distribute, translate and create derivative works of these chapters (for both commercial and non-commercial purposes), subject to full attribution to the original publication and authors. The full terms of this licence may be seen at http://creativecommons.org/licences/by/4.0/legalcode.


The importance of language to our sense of self and our relationships with others makes us especially vulnerable to the injurious potential of speech, particularly “hate speech,” which has been the subject of much political and polarizing debate, especially concerning questions of regulation in the form of hate speech or vilification laws. Although there is no universally agreed definition of “hate speech,” critical race theorists Matsuda, Lawrence, Delgado, and Crenshaw (2018) define “assaultive speech” as “words that are used as weapons to ambush, terrorize, wound, humiliate, and degrade.” The metaphor of physical injury for the harms of speech acts – the “wounds of words” (Matsuda et al., 2018) – was powerfully explored in Toni Morrison's (1993) Nobel Lecture, in which she claimed that “Oppressive language does more than represent violence; it is violence” (emphasis added). Speech acts, then, both cause harm and constitute harm (see Barendt, 2019; Gelber & McNamara, 2016).

Speech harms can be cumulative, long-term, and generational, reinforcing and amplifying discriminatory attitudes and behaviors that treat the “other” as inferior and subordinate, and solidifying existing power relations (Matsuda et al., 2018; see also Calvert, 1997 for a discussion). Jeremy Waldron (2012) argues that hate speech undermines and compromises the dignity “of those at whom it is targeted, both in their own eyes and in the eyes of other members of society” (p. 5). The lived experiences of targets of hate speech attest to these consequential and constitutive harms, such as normalizing discriminatory attitudes and behaviors, as well as “subordination, silencing, fear, victimization, emotional symptoms, restrictions on freedom, lowering of self-esteem, maintenance of power imbalances, and undermining of human dignity” (Gelber & McNamara, 2016, p. 336).

In this section of the Handbook on “speech-based harms,” the chapters by Kim Barker and Olga Jurasz, Emma Jane, Benjamin Colliver, Briony Anderson and Mark Wood, and Elina Vaahensalo each take as their point of departure that words – in the form of online hate speech, doxxing, and other oppressive forms of online text or speech – cause wounds that are both physical and metaphysical, as well as individual and collective. Jane's chapter on “cyberhate” against women by women begins with a discussion of a “vicious attack” against a feminist colleague on Twitter. As Jane explains, online abuse leads to feelings of “hurt, fear, anger, and self-doubt,” which are compounded by “despair, disbelief, and betrayal” when targets discover that their assailants are women whom they had assumed to be “peers, friends, or allies.” One participant in Jane's study described feeling like “she had been hit in the stomach by a baseball bat.”

Barker and Jurasz focus their chapter on online text-based sexual abuse, which they define as “written, electronic communication containing threatening and/or disruptive and/or distressing content, such as … textual threats to kill, rape, or otherwise inflict harm on the recipient of such messages.” Barker and Jurasz argue that text-based sexual abuse “has silencing effects on women and girls participating online and contributes to the creation of hostile spaces for women.” They argue that the legal system has created a “hierarchy of harm … in which more credence and gravitas are given to forms of online abuse involving photographic representations of the victim than textual – and frequently very violent – abuse.” The resulting harms, ranging from psychological to physical to democratic, are frequently “shrugged off as less serious than offline,” and seen as “‘part and parcel’ of what happens online.”

Likewise, Colliver describes a number of interconnected harms relating to transphobic online hate speech. In his study, he examined transphobic discourse in comments on YouTube videos concerning public toilet access and gender-neutral toilets. He argues that these speech acts reinforce rigid gendered stereotypes and binaries, and delegitimize trans people by invoking discourses of biological essentialism, functioning to “construct transgender people as a ‘scientific absurdity.’”

Anderson and Wood focus their chapter on a newer form of speech harm, “doxxing,” defined as “the intentional public release onto the internet of personal information about an individual by a third party, often with the intent to humiliate, threaten, intimidate, or punish the identified individual” (Douglas, 2016, p. 199). They characterize the harms of doxxing as a loss of the target's anonymity, obscurity, and legitimacy, with accompanying mental health effects (e.g., depression, anxiety), and in the case of “organizational doxxing,” the loss of competitive advantage.

In addition to exploring the harms of online speech acts, a number of the authors in this section examine the underlying drivers of oppressive speech acts as a means of exerting and maintaining power over marginalized groups. Anderson and Wood explain that the act of doxxing may or may not involve malicious intention. They give the examples of journalists and activists using doxxing to serve the public interest (see also Cheung, this volume), and of doxxing that results from carelessness or negligence (e.g., failing to anonymize a source). Yet they also discuss more hostile forms of doxxing, which involve, for instance, releasing compromising information about a person to prove oneself within an online network of peers, to extort financial benefits, to force individuals or groups to remove themselves from an online forum or platform, or to exert control or punishment (e.g., in the context of domestic and family violence).

None of the authors in this section of the Handbook treat oppressive speech acts in simplistic, individualistic terms. Colliver, for instance, in his exploration of the ways in which cisgender YouTube commenters construct and position themselves as victims of oppression and political correctness, argues that those who engage in transphobic online hate speech “provide links to one another, and expressly attempt to encourage both recruitment and discussion among like-minded people” (pp. 57-8).

Similarly, Vaahensalo's chapter explores community building through “online othering discourse” by analyzing the cacophony of “antisocial” and “cruel” comments on the Finnish forum Suomi24. Vaahensalo argues that “othering” is a result of the social “mechanics of intersubjectivity” and “sociality” that work to reinforce notions of “us” and “them,” and is “not always a conscious act of harassment.” She notes, too, that “Othering, whether online or offline, is not always openly hostile or aggressive,” but rather can range from “innocent concerns and fears to genuine and open hostility.” This is an important point that highlights the insidious nature of less overt forms of discriminatory speech, which are often downplayed as harmless expressions of free speech.

Jane, too, examines internalized misogyny and lateral violence as useful concepts for analyzing woman-to-woman online abuse. She argues that “subjugated peoples may lash out with displaced fury and frustration after internalizing and embodying dominant discourses and ideologies about their own groups.” The abuse of women by women, therefore, is a product of broader, systemic, and interlocking forms of inequality, as well as patriarchal oppression. Jane acknowledges, however, that attributing the causes of cyberhate to structural factors, such as patriarchy, risks “eliding part or all of subjects' agency and responsibility for their actions.” As such, individual traits such as psychopathy, Machiavellianism, and narcissism (or the “Dark Tetrad,” as it is also known) must also be considered, alongside the “dark infrastructure” of digital platforms and their facilitation of “outrage and polarization.”

References

Barendt, E. (2019). What is the harm of hate speech? Ethical Theory and Moral Practice, 22, 539–553.

Calvert, C. (1997). Hate speech and its harms: A communication theory perspective. Journal of Communication, 47(1), 4–19.

Douglas, D. M. (2016). Doxing: A conceptual analysis. Ethics and Information Technology, 18(3), 199–210. doi:10.1007/s10676-016-9406-0

Gelber, K., & McNamara, L. (2016). Evidencing the harms of hate speech. Social Identities, 22(3), 324–341.

Matsuda, M. J., Lawrence, C. L., Delgado, R., & Crenshaw, K. W. (2018). Words that wound: Critical race theory, assaultive speech, and the first amendment. New York, NY: Routledge.

Morrison, T. (1993). Toni Morrison: Nobel lecture. Retrieved from https://www.nobelprize.org/prizes/literature/1993/morrison/lecture/

Waldron, J. (2012). The harm in hate speech. Cambridge, MA and London: Harvard University Press.
