Technology-Facilitated Violence Against Women and Girls in Public and Private Spheres: Moving from Enemy to Ally

The Emerald International Handbook of Technology-Facilitated Violence and Abuse

ISBN: 978-1-83982-849-2, eISBN: 978-1-83982-848-5

Publication date: 4 June 2021


While research on digital dangers has been growing, studies on their respective solutions and justice responses have not kept pace. The agathokakological nature of technology demands that we pay attention to not only harms associated with interconnectivity, but also the potential for technology to counter offenses and “do good.” This chapter discusses technology as both a weapon and a shield when it comes to violence against women and girls in public spaces and private places. First, we review the complex and varied manifestations of technological gender violence, ranging from the use of technology to exploit, harass, stalk, and otherwise harm women and girls in communal spaces, to offenses that occur behind closed doors. Second, we discuss justice-related responses, underscoring how women and girls have “flipped the script” when their needs are not met. By developing innovative ways to respond to the wrongs committed against them and creating alternate systems that offer a voice, victims/survivors have repurposed technology to redress harms and unite in solidarity with others in an ongoing quest for justice.



Marganski, A.J. and Melander, L.A. (2021), "Technology-Facilitated Violence Against Women and Girls in Public and Private Spheres: Moving from Enemy to Ally", Bailey, J., Flynn, A. and Henry, N. (Ed.) The Emerald International Handbook of Technology-Facilitated Violence and Abuse (Emerald Studies In Digital Crime, Technology and Social Harms), Emerald Publishing Limited, Leeds, pp. 623-641.



Emerald Publishing Limited

Copyright © 2021 Alison J. Marganski and Lisa A. Melander. Published by Emerald Publishing Limited. This chapter is published under the Creative Commons Attribution (CC BY 4.0) licence. Anyone may reproduce, distribute, translate and create derivative works of these chapters (for both commercial and non-commercial purposes), subject to full attribution to the original publication and authors. The full terms of this licence may be seen at




Literature on digital dangers has been growing, yet the research on their respective solutions and justice system responses has not kept pace. The agathokakological nature of technology demands that we pay attention to not only harms associated with interconnectivity but also the potential for technology to counter offenses and “do good.” The dichotomous nature of digital progress in relation to violence against women and girls (VAWG) suggests that much more needs to be done to address and prevent increasingly common episodes of technology-facilitated gender violence, especially for vulnerable and marginalized groups, to exact positive cultural change. Further, we must be concerned with online displays of sexism and misogyny in communal spaces and private technologized violence (i.e., when electronic means are used “behind closed doors” or in exclusive communities as a tool for surveillance) as they may co-occur, and also share commonalities with wider public violations including street harassment and nonpublic violations like family violence.

This chapter reviews VAWG in public spaces and private places, and considers technology as both a weapon against women and girls and a shield for them. We first explore the complex and varied manifestations of technological transgressions, which include the use of technology to harass (e.g., cyberbullying) or nonconsensually record (e.g., “upskirting”) women and girls (hereafter referred to as WG), to violate positions of power and authority when electronically tracking them during mediated encounters (e.g., stalking by Uber drivers), and to monitor/control “loved” ones (e.g., home surveillance systems). We then reflect on formal and informal “justice” responses, underscoring the ways by which WG have “flipped the script” when their needs have not been met, to move from victim to survivor, utilizing the very tools that have been misappropriated by others to violate them. Last, we highlight recent, innovative technological approaches to combat online VAWG, closing with comprehensive recommendations on how to better serve victims/survivors of gender-related violence.

Violence against Women and Girls

Despite years of advocacy work, legislative change, and support services, VAWG affects millions across the globe (WHO, 2017), which includes offenses ranging from daily encounters tinged with benevolent sexism (see, e.g., discussion of gender microaggressions in Capodilupo et al., 2010) to less common yet more egregious and hostile events (see, e.g., discussion of mass murder in Marganski, 2019a). One in three women will have experienced an act of physical or sexual violence in her lifetime (WHO, 2013), with rates of physical violence, sexual violence, and stalking higher among bisexual (Walters, Chen, & Breiding, 2013) and transgender (Stotzer, 2009) women, and those belonging to other marginalized groups (e.g., Plummer & Findley, 2012; Rosay, 2016). Further, approximately 50% of women experience intimate partner–perpetrated psychological aggression and/or coercive control in their lifetime (Black et al., 2011). Yet, VAWG has been challenging to study due to definitional and methodological differences that sometimes fail to capture a range of experiences, including street harassment (Vera-Gray, 2017), workplace harassment (Westmarland, 2002), and other offenses (see, e.g., discussion of groping at pubs, clubs, and concert venues in Fileborn, 2016; Fileborn, Wadds, & Tomsen, 2019) by different perpetrators in a variety of settings.

Beyond in-person victimizations, VAWG is also technology-facilitated, consisting of overt and covert behaviors (e.g., aggression and surveillance) where perpetrators need not be present to aggress against their targets. Technology-facilitated VAWG has been defined by the United Nations as “acts of gender-based violence that are committed, abetted or aggravated, in part or fully, by the use of information and communication technologies (ICTs), such as phones, the internet, social media platforms, and email” (APC, 2020, para 1). With internet access being widely available and smart-device ownership overwhelmingly common (Anderson & Jiang, 2018), various social problems seep from the real world into the digital realm.

It is now estimated that three-quarters of WG have experienced or been exposed to online violence (Tandon & Pritchard, 2015), including threats of death or rape, bullying/harassment, and stalking, among other offenses, making this type of victimization more common than any in-person form. Even when these acts are limited to those perpetrated by intimate or dating partners, research indicates most young adults in college (Marganski & Melander, 2018) and youth in schools (Zweig & Dank, 2013; Zweig, Dank, Yahner, & Lachman, 2013) report technology-facilitated relationship violence. These online experiences have been associated with depression, substance use, antisocial behavior (Melander & Marganski, 2020), and increased risk for in-person partner violence victimizations (Marganski & Melander, 2018; NNEDV, 2014).

Technology-facilitated VAWG occurs in public spaces and private places, and it varies between and within groups (e.g., adults and minors, persons of color, those with different abilities, social class, etc.). Younger women, as well as those who are sexual minorities, are at higher risk for technology-facilitated sexual violence victimization, for example, than their counterparts (Henry & Powell, 2018). While the reasons for these patterns are complex, they point to the importance of social location in the study of these contemporary issues. Such violence falls on a continuum of illegal and legal behaviors (see, e.g., Kelly, 1987), and these transgressions threaten the privacy and freedoms of those on the receiving end (Citron, 2014). Further, it is a global phenomenon (see, e.g., Bailey & Steeves, 2015; Tandon & Pritchard, 2015) extending beyond the United States (US), as researchers in Australia (Dragiewicz et al., 2019), the United Kingdom (UK) (Mendes, Ringrose, & Keller, 2019), Canada (Bailey & Mathen, 2019), and elsewhere have noted. In all, technology's abuse, along with associated harms, creates an urgency for investigating VAWG comprehensively so we may derive informed and appropriate solutions.

Technology as a Weapon

Electronically facilitated VAWG has been dubbed an “old behavior in a new guise” (Campbell, 2005). Inherent in the virtual world are larger systems and interactions within those systems that imitate those in the real world; these large-scale structural forces impact power dynamics and abuses in ICTs, including online sexual exploitation and image-based sexual abuse, cyberstalking, and other forms of cyberviolence (Marganski, 2018, 2019b). While WG dominate social media platforms and are more likely to use them than male counterparts (78% vs. 65%), they are not necessarily protected in these spaces and are far more likely to experience certain kinds of victimization, such as technology-facilitated sexual violence and cyberstalking (Flynn & Henry, 2019; Henry & Powell, 2018; Powell, Scott, Henry, & Flynn, 2020). Perpetrators of image-based sexual violence, for example, are commonly ex-husbands and ex-boyfriends who seek out and receive male peer support online that encourages violence against former female lovers (see, e.g., DeKeseredy & Schwartz, 2016; Henry & Flynn, 2019). The reality, then, is that many online harms are gendered in nature.

Some acts of technological VAWG resemble those that occur offline, although new offenses have emerged that present novel threats. Such technocrimes include nonconsensual photo/video recordings taken in public or private settings, such as “upskirting” and “down-blousing,” which sometimes are distributed widely thereafter online (Henry et al., 2020; McGlynn & Downes, 2015; Powell, Henry, Flynn, & Scott, 2019), and cases where persons in power violate positions of trust by electronically tracking victims they meet during the course of their work (see, e.g., discussion of stalking and assaults of passengers by Uber and Lyft drivers in Osterheldt, 2019). Additional examples include, but are not limited to: e-trafficking, cyber-trolling, online impersonation, doxing, swatting, cyber-mob attacks, deep-fakes, and other image-based abuse (Henry & Flynn, 2019).

Similar to in-person VAWG, perpetrators may use technology to objectify, demean, intimidate, and/or control others. In contrast to face-to-face violence, however, perpetrators of technology-facilitated VAWG may be unknown because they may be unseen. They could be strangers, classmates, coworkers, friends, or loved ones (Henry et al., 2020). This may help explain high rates of these transgressions as the veil of anonymity protects and may embolden perpetrators; however, even when perpetrators are known, high rates of aggression still exist (see, e.g., Marganski & Melander, 2018; Melander & Hughes, 2018; Zweig & Dank, 2013).

Individuals have used technologies in invasive and vituperative ways: monitoring, controlling, and punishing others. Domestic and sexual violence programs have reported that technology-facilitated VAWG is a real concern, affecting nearly all who seek their services (NNEDV, 2014). Adolescent girls are also vulnerable to cybervictimization due to the amount of time spent online while navigating through difficult developmental life stages (Chisholm, 2006). Culturally extolled femininity (Connell, 1987) collides with the normalization of VAWG (Klein, 2006), producing conditions conducive to these events. Girls more than boys experience pressure to sext (Crofts, Lee, McGovern, & Milivojevic, 2015), and those who do may be shamed later on, a by-product of contradictory cultural codes that judge girls on sex appeal while also policing their sexuality. Technology further amplifies harms, permitting perpetrators to reach a much wider audience than ever before, extending those who gaze, shame, and inflict harm. Thus, the ease, accessibility, and speed of our connections hold serious implications and present unique challenges for not only victims who may experience heightened fear, anger, depression, and/or suicidal ideation (see, e.g., Bates, 2017; Powell, Henry, & Flynn, 2018) but also justice systems whose investigations grow ever more complicated.

Gendered cyberhate has become common in digital spaces, and behaviors comprising it have grown increasingly threatening to recipients (Jane, 2017; Mantilla, 2013). Sexually charged, highly misogynistic content and other forms of vitriol have a strong presence in segments of the Manosphere (see, e.g., discussion of incels, the red pill subreddit, etc. in Center on Extremism, 2018; Ging, 2017; Scaptura & Boyle, 2019; Zimmerman, Ryan, & Duriesmith, 2018), and though there have been challenges to discriminatory language and obscene material in the US (see, e.g., Miller v California, 1973; Pope v Illinois, 1987; Smith v US, 1977), many prejudicial posts have found protection under the guise of free speech. Some online spaces are known to house hostile viewpoints. For example, thousands of insecure men and boys who fail to live up to an ideal (e.g., hegemonic masculinity) have found solace in online communities that call for the subjugation of WG (Woolf, 2014), fostering animosity, violence, and even lethal events. The end result is toxic, as unregulated messages perpetuate myths, misinformation, and hate that influence various kinds of VAWG.

In all, technology-facilitated VAWG comes in many forms and has the potential to result in physical, psychological, behavioral, social, and financial problems. It constitutes legal and moral violations, and it infringes on individuals' active engagement in the digital world, which perpetuates and maintains inequalities that, in turn, continue to breed violence (e.g., Gorman, 2019). This can have devastating consequences on not only the persons targeted but also their families, social networks, and society.

Technology as a Shield

While technology has been used to harm WG, it has also been used to counter transgressions, hold perpetrators accountable for their actions, and protect individuals from future episodes of violence. Far from ideal, justice system responses have fallen short in meeting the justice needs of victims/survivors and, in some cases, contributed to further harms via secondary victimization (Flynn, 2015). WG have been pressured by police, courts, and other social service providers to turn off or alter the settings on their electronic devices, close social media accounts, and change financial and other accounts online, ignoring how difficult, impractical, and silencing/isolating such measures can be (Dragiewicz et al., 2018; Woodlock, McKenzie, Western, & Harris, 2019). When elements fundamental to safety, healing, and recovery are not included in justice practices or otherwise absent from the lives of those who suffer transgressions, the result is untreated, prolonged, and continued suffering. To move forward, it is imperative that structures incorporate trauma-informed care into operating systems and practices to improve users' experiences and set the tone for electronic environments in ways that shape expectations and promote responsible behavior.

For far too long, the burden of stopping or preventing VAWG has disproportionately fallen onto WG who are on the receiving end of violence (and more recently, bystanders, as found in the Mentors for Violence Prevention, Green Dot, etc.), rather than the men and boys perpetrating problematic behaviors. Due to long-standing patriarchal practices, WG who are victims of gender-based violence are often blamed for perpetrator actions or bear the weight of innovating the problem away.

The failure of justice systems to address VAWG online (see, e.g., Dunn, Lalonde, & Bailey, 2017) is symptomatic of larger social failures to treat such violence and discrimination seriously. Feminists have shed light on the issues and absurdities associated with perspectives that blame victims for others' actions, which absolve the perpetrator – as well as the larger culture – of responsibility and feed into misbehavior as well as the inability to change. Satire pieces such as “The Rape of Mr. Smith” (n.d.) or “Tips Guaranteed to Prevent Rape” (Tarrant, 2009) highlight unjust practices and the burdens that WG face. Moving from chastity belts to self-defense classes, to antirape technologies that place the onus on potential targets of violence, it seems as though the more things change, the more they stay the same (Flynn, 2015). It should come as no surprise then that our reliance on technology for everyday interactions has led to the proliferation of technological strategies to combat VAWG.

Contemporary “solutions” to VAWG are geared toward potential victims, making the rounds on social media as real fixes for large, complex problems. Examples include wearable technologies designed to detect, deter, and even punish violators, including drug-detecting nail polish (Zikalala, 2017), wristbands that emit a foul odor to ward off predators (Cuddy, 2019), pepper-spray stilettos (EFE, 2017), internal barbed condoms (Karimi, 2010), and jackets/underwear that deliver powerful jolts of electricity or are nearly impossible to pull off (Euronews, 2013; Kahney, 2003). Noncorporeal technologies also exist, including straws to detect spiked drinks (Steffen, 2020), stamps to mark public transit gropers (Koizumi, 2019), and consent condoms that require four hands to open (Cuddy, 2019). When existing patriarchal structures combine with capitalism at a point in history when technology reliance is at an all-time high, we see private companies advertising, marketing, and profiting from strategies that claim to reduce risk, yet ultimately fail to target perpetration and may inadvertently result in target transference or create unexpected harms as a consequence of usage. Further, antirape technologies create a false sense of safety and security, especially considering the plethora of evidence pointing to alcohol as the leading date rape drug (Anderson, Flynn, & Pilgrim, 2017; Hindmarch & Brinkmann, 1999). These technologies, while driven by good intentions, have unanticipated and undesirable effects. They misplace responsibility by targeting victims rather than perpetrators, perpetuate and reinforce rape myths, and undermine victims'/survivors' agency (Bivens & Hasinoff, 2017; White & McMillan, 2019). We would be remiss to overlook this in discussions of effective, evidence-based solutions.

Despite its limitations, technology has been used to counteract women's perceived and actual vulnerability. Cell and mobile phones may be used to connect with others in prearranged calls to avoid “stranger danger,” reach emergency services, and document evidence of violations, while Apple Watches may activate alarms. Technology can also facilitate the sharing of experiences and access to resources. The SmartSafe+ app, for example, assists with collecting and storing evidence that can be used in criminal justice proceedings (Caneva, 2016), whereas Circle of 6 not only provides immediate access to local and national rape crisis hotlines and resources but also automatically alerts up to six preselected friends that the person needs help getting home safely, a phone call to interrupt a potentially dangerous situation, or someone to talk to immediately, all with the touch of a button (Circle of 6, 2020). EmergenSee and LiveSafe operate in similar ways.

Beyond responding to hazardous circumstances or events, victims/survivors have found support in the aftermath of trauma through digitized social networks, and some have taken action online to have their justice needs at least in part met, while others have advocated more broadly via ICTs (see, e.g., Fascendini & Fialová, 2011). Dissatisfied by formal justice (in)actions, some individuals have sought informal justice online. Referred to as “digital vigilantism,” “feminist digilantism,” or “DIY justice online” (Jane, 2016; Powell, 2017), women, girls, and allies have taken to technology to fight against online harassment, put perpetrators on notice, and hold them accountable for their actions when the criminal justice system has not (Al-Alosi, 2020). They have explicitly named their rapists online to warn others (Pryor, 2017), responded to unsolicited dick pics with the same kind of content to give perpetrators “a taste of their own medicine” (Hockaday, 2019, para 4), and shared screenshots of misbehavior with perpetrators' mothers, partners, and employers, in the hopes that informal sanctions might follow (e.g., Hawken, 2019; Payton, 2014).

Additionally, victims/survivors have used technology to gain support and organize “speak outs,” demonstrations, and “take-back-the-night” rallies (Fileborn, 2014). They have also engaged in hashtag activism through campaigns like #rapedneverreported, #whenIwas, and #NotOkay (Powell & Henry, 2017). Safe spaces, such as the internet's first women-only social network (Cox, 2014), have also been created, and victims/survivors have collaborated with law enforcement to develop apps like VictimsVoice (Ridley, 2019) to safely collect and store evidence that may later be used to press charges. In rare yet noteworthy instances, institutions have stepped up in solidarity by using positions of power to advocate for victims/survivors while sending a message to perpetrators. For example, The Bristol Post posted pictures of the adult men who threatened and harassed teen climate activist Greta Thunberg online (Staples, 2020).

In contrast to the dark corners of the web, there are bright ones dedicated to collective resistance, healing, and empowerment. Activism projects and online communities such as Project Unbreakable, Everyday Sexism, Take Back the Tech!, and Hollaback! exist to not only document transgressions but also inform and mobilize victims/survivors of gender-based violence (Mendes et al., 2019; Powell, 2017). As such, WG have become inspired online to discuss experiences in their own words and from their own perspective, which is something traditionally denied through more formal routes. These expressions can be beneficial to victims/survivors, providing them with emotional release and an opportunity to feel validated, help others in similar circumstances, and encourage others to bear the weight of intervention and advocacy (Al-Alosi, 2020). These efforts demonstrate the enormity of VAWG as a broad social problem, drawing attention to the issue in a more public way than could ever be possible through regular, in-person justice routes.


In much the same way that offline offenses have not been treated seriously or have been met with ineffective responses and solutions, online assaults too have been minimized or dismissed, leaving victims/survivors to handle these transgressions with little to no formal support. To counter the inadequacies of criminal justice (in)actions, victims/survivors have taken to technology to pursue their own justice by calling out abuses and collaborating with one another to provide information, education, and support. Although victims of technology-facilitated violence are often without legal recourse, recent news headlines – such as when a judge awarded $13 million to 22 women who were coerced into making sexually explicit videos sold on porn sites without their consent (Grant, 2020) – highlight major victories. Yet challenges remain, as inequalities are evident for groups of people in terms of technology access, ease of use, and overall engagement (Al-Alosi, 2018, 2020; Silver, 2019). Furthermore, some countries restrict or regulate user-generated content or platform access, presenting challenges for victims/survivors, along with researchers, advocates, and others who wish to have their voices heard and address VAWG. This therefore impacts victims'/survivors' options, choices, and pathways to recovery.

Additional issues exist when considering the role of culture in the kinds of transgressions we see, and the tools that render aid. The technology industry and innovations are still male-dominated, male-oriented, and male-controlled, which poses difficulties in meeting WG user needs, and the harms they experience may go unrecognized or be dismissed. Preventing and responding to VAWG therefore requires that we become more inclusive and stretch beyond traditional design to dismantle problematic norms (Bivens & Hasinoff, 2017; Jewkes, Flood, & Lang, 2015). Solutions must engage various actors – including marginalized and oppressed persons – through inclusive collaborations between not only victims/survivors and victim advocates but also tech companies, healthcare providers, legal/criminal justice personnel, and educators to integrate various voices and target the complexity of the issues.

Technology Companies

Technology companies, social networking executives, and software engineers are well poised to make important advances in victim safety and protection. When these powerful institutions and employees combine with advocacy groups, they can better meet the needs of victims/survivors. The Safety Net team at the National Network to End Domestic Violence (NNEDV, 2014) demonstrates the promise these multifaceted collaborations have through its Tech Safety app, which aids individuals in recognizing warning signs of stalking, harassment, and abuse, while offering tips for documentation and resources for support. Beyond education and victim/survivor support, they have targeted perpetration via the Coalition Against Stalkerware (NNEDV, 2019), which advocates for the eradication of spyware due to this tool being used by perpetrators to track victims' locations and activities. Tech entrepreneurs can benefit from education and training on VAWG, which may then influence innovation design in ways that protect users, improve safeguards, lessen perpetration, and provide valuable resources to users, thereby creating a friendlier, sociotechnological space.

Instead of having the ultimate goal of keeping people logged in without consideration of their experience, satisfaction, and well-being, prioritization should be given to “regenerative technology,” which refers to healthy online interactions that have empathy and kindness at their foundation (Zaki, 2019). As such, with any technological advancement, it is important to consider ethics. That is, just because it can be done, does that mean it should? What are the potential outcomes? Are there ways to offset or respond to harm? It is imperative that those at the helm of these technological innovations, such as software engineers, receive adequate ethics training while reflecting on human rights and the costly consequences of inequality and discrimination that may restrict or impede the digital inclusion and progress of all persons in this era (Beduschi, 2018). Beduschi (2018) warns that the very software engineers who are creating the algorithms behind programs that make our lives easier may also be breaching our rights without even knowing it, invading and recording private conversations, and channeling people into haphazard spaces. Thus, we need to ask technology creators to step up, listen to, and learn from users, including victims/survivors, in order to secure safe digital spaces that maximize experiences and overall connectedness.

Tech companies should have clear policies in place for violators, but they should also be willing to work on restorative practices in ways that modify behavior and produce actual change. Further, although much offensive online content enjoys protections, software engineers should be charged with preventing harmful content from being freely shared and finding user-friendly and intuitive ways in which victims/survivors can opt to filter out or limit their exposure to triggering materials. Such collaborations between technology users and designers could prove transformative and reduce initial, repeat, and vicarious victimizations.

Healthcare System

Healthcare providers can play vital roles in combating the effects of technology-facilitated VAWG. Physicians, primary care providers, counselors, and others involved in the healthcare system should be educated to recognize the signs and symptoms of tech-related violence, have screening tools available to aid in assessments, and have online and offline treatment plans in place. They may inform parents, as well as their patients, on ways to stay safe online and connect patients with other service providers, such as social workers, to identify sources of local support (Waseem et al., 2017). Much like with other illnesses and injuries, early intervention could be the key in producing more positive outcomes. Physical health, mental health, and wellness services can provide patients with vital services that promote healing and recovery in ways that work best for them. Medical interventions are often quite costly for domestic and sexual violence victims in general, which may deter some from seeking services. As such, it is critical to have online resources such as the Compensation Compass app to alert users of compensation funds to which they may be entitled, which helps alleviate financial strains (Masters, 2019). Such combined efforts signal to the public the serious nature of technological abuse.

Legal/Criminal Justice System

Increasingly, legislation has been enacted in the US to protect victims of technology-facilitated violence and abuse. There are now 45 laws related to adult cyberstalking/harassment (Working to Halt Online Abuse, 2020). At the federal level, two statutes – the Interstate Communications Act of 1994 and the Federal Interstate Stalking Punishment and Prevention Act of 1996 – along with many other underutilized statutes, can be applied to technology-facilitated violence (Cox, 2014). Recognizing that these offenses can have severe consequences, almost every state in the US has implemented varied laws that criminalize technology-facilitated violence or modified existing law to include cyber-related provisions (Cox, 2014). Nonetheless, there are many challenges and limitations to the law.

To prevent technology-facilitated violence and abuse, a clear understanding of the specific offenses comprising it (e.g., cyberstalking, online harassment, and image-based sexual abuse) and articulation of what constitutes each type of offense is needed to inform justice systems and personnel. The disseminated information should also cover the dynamics of these harms, including the co-occurrence of digital and in-person offenses (see, e.g., discussion of nonlethal events in Marganski & Melander, 2018; and lethal events in Todd, Bryce, & Franqueira, 2020). Such training has the potential to improve arrest and prosecution practices that impact victim safety and perpetrator accountability. Importantly, this means taking reports of digital harms seriously. Women's/girls' experiences of technology-facilitated violence have been minimized by law enforcement and others in power, much like in-person experiences of gender violence historically have been (Citron, 2009).

Even when recognition of the problem exists and reports are taken seriously, perpetrators of technology-facilitated crimes continue to evade detection and otherwise fall through the cracks of the justice system. Some cannot be traced; others may be dismissed. Conflicts in law further complicate matters, such as when a perpetrator resides in one jurisdiction while the target resides in another, thereby creating the need for expansion and collaboration when it comes to domestic and international law differences (Flynn & Henry, 2019; Henry, Flynn, & Powell, 2018). The multijurisdictional nature of many crimes calls for federal law enforcement and prosecutors to work in tandem with local agencies and offer resources necessary for investigations (Wilkinson, 2016). Beyond further criminalizing and responding to offenses, however, it is essential that the criminal justice system collaborates with victims' advocates and victims/survivors to outline effective remedies and protect human rights. Effective community responses, therefore, require the convergence of multiple stakeholders to share knowledge and purposively consider lived experiences in contemplating solutions.


Education on the nature and consequences of technology-facilitated VAWG is crucial and should come from a variety of sources. Teachers at all levels can raise awareness of this social problem. School-based cyberbullying prevention and intervention programs for children and university students, which include individual-level, multi-level systemic, and universal school approaches, have demonstrated effectiveness in reducing perpetration and victimization (Doane, Kelley, & Pearson, 2016; Gaffney, Farrington, Espelage, & Ttofi, 2019). Cyber harassment programs have also been implemented in the workplace, where it is recommended that education and awareness about responding to specific instances of cyber aggression be paired with attention to the role of organizational culture in shaping individual behavior (Faucher, Cassidy, & Jackson, 2015). It is imperative that prevention programming include elements of bystander intervention to reduce these modern transgressions and hold would-be offenders to a higher standard.


Although technology shows promise in addressing VAWG, it has limitations. While it can provide access to much-needed services for WG in remote locations, not all women can afford, or know how to operate, the latest technological innovations (Al-Alosi, 2018, 2020). Further, technology is not failsafe: routine maintenance, lack of internet access or connections, and other service disruptions can interfere with regular performance, isolating women from sources of support (Al-Alosi, 2020). There are also issues with the corporate model of technology that collects and sells user data, opening users up to privacy risks. As such, overreliance on these devices and platforms may contribute to erroneous conclusions about safety and security that endanger some and contribute to violence. Performance aside, technological advancements may have other unintended consequences, including victim blaming. Much as a woman might be told to switch off her phone if she does not want harassing text messages, it is not inconceivable that a victim would be asked why she did not activate her safety app or gather video evidence of her assault (Henry et al., 2018; Powell, 2017). This modern-day victim blaming may revictimize the woman as she is interrogated about why she did not alter her behavior to prevent the assault (Woodlock et al., 2019). The very technology women may use to reach out for help can also be used against them by partners, who may own and control access to their mobile phone accounts and use the same devices and programs to track and monitor activity (Al-Alosi, 2020; Freed et al., 2018). Those who reach out online for community support may also face a different kind of backlash from anonymous strangers who seek to revictimize, shame, and denigrate victims of TFVA.

Nevertheless, technology has much to offer victims of gender violence. Victims/survivors may use digital means aimed at responding to and preventing VAWG, and there are benefits of using technology, including: providing victims access to vital online and in-person resources (e.g., educational materials, service provider support, etc.), warning others of predatory behavior, reducing feelings of isolation through social connections, and giving a voice to those often silenced. More intersectional research, however, is necessary to examine the effectiveness of technology in preventing technology-facilitated VAWG, both in the short and long term, as different groups have varied experiences online (Felmlee, Rodis, & Francisco, 2018; Powell et al., 2020).

Inclusive and culturally appropriate solutions for victims/survivors are necessary. Warranted innovations include improved digital device security mechanisms, such as enhanced authentication methods that more accurately assess user patterns to verify that the primary owner is the only permitted user (Freed et al., 2018), as well as further engagement with online providers to curb the harassing activities of their users (Fascendini & Fialová, 2011; Dragiewicz et al., 2018). We should not tell victims to simply stop using social media and electronics. Instead, we should create spaces that are safe and that empower them to make decisions promoting healing, recovery, and solidarity. Gender-based hate, misogyny, and other discriminatory practices of this digital era are increasingly recognized and confronted, yet we lag in reactive and proactive solutions. Until we target violence at its source (i.e., perpetrators), technology-facilitated VAWG will continue.


Just as vulnerable communities need a continuum of care that may include mentors, social and recreational activities, and therapy, so too do vulnerable tech users. WG worldwide face real threats to their health and well-being from strangers, classmates, coworkers, intimate partners, and family members, both offline and online. VAWG is a global phenomenon in dire need of attention. Rather than normalizing and dismissing these experiences, we need to collaborate to address perpetration, challenge problematic norms, and offer support to those most marginalized and harmed. Social and structural conditions contribute to technology-facilitated VAWG and allow perpetrators to thrive, perpetuating systemic injustice and maintaining inequalities. By educating and sensitizing various persons to technology-facilitated VAWG, providing safeguards and assistance in digital spaces, and specifying sanctions for law-breaking behavior (Tandon & Pritchard, 2015), toxic behaviors and norms can be diminished. Correcting institutional and systemic responses to address misinformation and propagate promising solutions will shift norms, policies, and practices in informed and responsible ways that effect positive cultural change.

The Anti-Defamation League's Center on Extremism (2018) also articulated specific recommendations for addressing VAWG, including: building understanding among law enforcement officers and organizations about the nature of misogynist hate; taking legal and policy actions to recognize modern transgressions as harmful and promote gender equality; incorporating gender-based hate into antidiscrimination educational programming; securing support from federal and state agencies and leaders for research and services concerning victims, perpetrators, and communities; forming partnerships between the tech sector and the public; improving and enforcing Terms of Service; filtering out obscene and offensive content; and taking punitive actions. These proposed measures are an excellent start. However, alternative solutions that address the root causes of these behaviors and work in reintegrative ways are also needed.

Understanding the nature of contemporary technology-facilitated VAWG, along with the unique struggles victims/survivors face, can create more compassionate, user-friendly digital spaces (Gjika & Marganski, 2020). By recognizing diverse needs and implementing victim/survivor-centered approaches, larger structures (e.g., media and social media platforms) can help those harmed gain a sense of control and empowerment that allows them to act in their best interests. For too long, victims have been the forgotten persons in our justice systems, and we are witnessing the same in the digital world. True justice requires that we work with those who have been injured, and confront the social forces and norms that permit the continued perpetration of harm, in order to develop solutions. This requires a systemic understanding of how violence and its regulation affect the tech milieu as a whole. By prioritizing the rights, needs, and wishes of those who are most vulnerable to harm, we can make progress in digital spaces and social interactions.



These programs aim to educate and engage community members such as students, faculty, and staff in bystander intervention strategies, so that individuals may recognize and respond to social problems in constructive ways that simultaneously help create positive cultural change.


Al-Alosi, H. (2018, June 17). Technology is both a weapon and a shield for those experiencing domestic violence. The Conversation. Retrieved from

Al-Alosi, H. (2020). Fighting fire with fire: Exploring the potential of technology to help victims combat intimate partner violence. Aggression and Violent Behavior, 52, 101376.

Anderson, L., Flynn, A., & Pilgrim, L. (2017). A global epidemiological perspective on the toxicology of drug-facilitated sexual assault. Journal of Forensic and Legal Medicine, 47, 46–54.

Anderson, M., & Jiang, J. (2018, May 31). Teens, social media & technology 2018. PEW Research. Retrieved from

Association for Progressive Communications (APC). (2020). Understanding technological-related violence against women. Retrieved from Accessed on March 1, 2020.

Bailey, J., & Mathen, C. (2019). Technologically-facilitated violence against women and girls. The Canadian Bar Review, 97(3), 1–47.

Bailey, J., & Steeves, V. (2015). eGirls, eCitizens. Ottawa, ON: University of Ottawa Press.

Bates, S. (2017). Revenge porn and mental health: A qualitative analysis of the mental health effects of revenge porn on female survivors. Feminist Criminology, 12, 22–42.

Beduschi, A. (2018, September 26). Technology dominates our lives - that's why we should teach human rights law to software engineers. The Conversation. Retrieved from

Bivens, R., & Hasinoff, A. A. (2017). Rape: Is there an app for that? An empirical analysis of the features of anti-rape apps. Information, Communication & Society, 21(8), 1050–1067.

Black, M., Basile, K., Breiding, M., Smith, S., Walters, M., Merrick, M., … Stevens, M. (2011). National intimate partner and sexual violence survey: 2010 summary report. Retrieved from

Campbell, M. A. (2005). Cyber bullying: An old problem in a new guise? Journal of Psychologists & Counsellors in Schools, 15(1), 68–76.

Caneva, L. (2016, June 30). Family violence app wins inaugural premier's iAward. Pro Bono Australia. Retrieved from

Capodilupo, C. M., Nadal, K. L., Corman, L., Hamit, S., Lyons, O. B., & Weinberg, A. (2010). The manifestation of gender microaggressions. In D. W. Sue (Ed.), Microaggressions and marginality: Manifestation, dynamics, and impact (pp. 193–216). Hoboken, NJ: John Wiley & Sons.

Center on Extremism. (2018). When women are the enemy: The intersection of misogyny and white supremacy. New York, NY: Anti-Defamation League.

Chisholm, J. F. (2006). Cyberspace violence against girls and adolescent females. Annals of the New York Academy of Sciences, 1087(1), 74–89.

Circle of 6. (2020). What: Circle of 6 is for everyone. Retrieved from Accessed on January 25, 2020.

Citron, D. K. (2009). Law's expressive value in combating cyber gender harassment. Michigan Law Review, 108, 373.

Citron, D. K. (2014). Hate crimes in cyberspace. Cambridge, MA: Harvard University Press.

Connell, R. W. (1987). Gender and power: Society, the person and sexual politics. Cambridge: Polity Press.

Cox, C. (2014, September 4). Former Facebook employee launches, the Internet's first women-only social network. The Mary Sue. Retrieved from

Crofts, T., Lee, M., McGovern, A., & Milivojevic, S. (2015). Sexting and young people. New York, NY: Palgrave Macmillan.

Cuddy, A. (2019, April 12). 'Consent condoms' and 'anti-rape wristbands': Do they work? BBC News. Retrieved from

DeKeseredy, W. S., & Schwartz, M. D. (2016). Thinking sociologically about image-based sexual abuse: The contribution of male peer support theory. Sexualization, Media, & Society, 2(4). doi:10.1177/2374623816684692

Doane, A. N., Kelley, M. L., & Pearson, M. R. (2016). Reducing cyberbullying: A theory of reasoned action-based video prevention program for college students. Aggressive Behavior, 42, 136–146.

Dragiewicz, M., Burgess, J., Matamoros-Fernández, A., Salter, M., Suzor, N. P., Woodlock, D., & Harris, B. (2018). Technology facilitated coercive control: Domestic violence and the competing roles of digital media platforms. Feminist Media Studies, 18(4), 609–625.

Dragiewicz, M., Harris, B., Woodlock, D., Salter, M., Easton, H., Lynch, A., … Milne, L. (2019). Domestic violence and communication technology. Sydney, NSW: Australian Communications Consumer Action Network.

Dunn, S., Lalonde, J. S., & Bailey, J. (2017). Terms of silence: Weaknesses in corporate and law enforcement responses to cyberviolence against girls. Girlhood Studies, 10(2), 80–96.

EFE. (2017, November 13). High heels with pepper spray, new self-defense weapon for Mexican women. Retrieved from

Euronews. (2013, May 4). Indian students invent anti-rape electric shock underwear. Retrieved from

Fascendini, F., & Fialová, K. (2011). Voices from digital spaces: Technology related violence against women. Johannesburg: Association for Progressive Communications (APC).

Faucher, C., Cassidy, W., & Jackson, M. (2015). From the sandbox to the inbox: Comparing the acts, impacts, and solutions of bullying in K-12, higher education, and the workplace. Journal of Education and Training Studies, 3(6), 111–125.

Felmlee, D., Rodis, P., & Francisco, S. (2018). What a b!tch!: Cyber aggression toward women of color. Gender & the Media: Women's Places, 26, 105–123.

Fileborn, B. (2014). Online activism and street harassment: Digital justice or shouting into the ether? Griffith Journal of Law & Human Dignity, 2(1), 32–51.

Fileborn, B. (2016). Reclaiming the night-time economy: Unwanted sexual attention in pubs and clubs. London: Palgrave Macmillan.

Fileborn, B., Wadds, P., & Tomsen, S. (2019). Safety, sexual harassment and assault at Australian music festivals: Final report. Retrieved from

Flynn, A. (2015). Sexual violence and innovative responses to justice. In A. Powell, N. Henry, & A. Flynn (Eds.), Rape justice (pp. 92–111). New York, NY: Palgrave Macmillan.

Flynn, A., & Henry, N. (2019). Image-based sexual abuse: An Australian reflection. Women & Criminal Justice, online first. doi:10.1080/08974554.2019.1646190

Freed, D., Palmer, J., Michala, D., Levy, K., Ristenpart, T., & Dell, N. (2018). "A stalker's paradise": How intimate partner abusers exploit technology. In 2018 CHI conference on human factors in computing systems, Montreal, QC (pp. 1–13).

Gaffney, H., Farrington, D. P., Espelage, D. L., & Ttofi, M. M. (2019). Are cyberbullying intervention and prevention programs effective? A systematic and meta-analytical review. Aggression and Violent Behavior, 45, 134–153.

Ging, D. (2017). Alphas, betas, and incels. Men and Masculinities, 20(1), 1–20.

Gjika, A., & Marganski, A. (2020). Silent voices, hidden stories: A review of sexual assault (non)disclosure literature, emerging issues, & call to action. International Journal for Crime, Justice, & Social Democracy, online first. doi:10.5204/ijcjsd.v9i4.1439

Gorman, G. (2019). Troll hunting: Inside the world of online hate and its human fallout. Richmond, VIC: Hardie Grant Books.

Grant, H. (2020, January 3). Porn site to pay $12.7m to women who didn't know videos would be posted. The Guardian. Retrieved from

Hawken, L. (2019, August 13). Take a swipe: Scorned mum is exposing Tinder love-rats by forwarding their d**k pics to their wives and girlfriends. The Sun. Retrieved from

Henry, N., & Flynn, A. (2019). Image-based sexual abuse: Online distribution channels and illicit communities of support. Violence Against Women, online first. doi:10.1177/1077801219863881

Henry, N., Flynn, A., & Powell, A. (2018). Policing image-based sexual abuse: Stakeholder perspectives. Police Practice and Research: An International Journal, 19(6), 565–581.

Henry, N., McGlynn, C., Flynn, A., Johnston, K., Powell, A., & Scott, A. J. (2020). Image-based sexual abuse: A study on the causes and consequences of non-consensual nude or sexual imagery. London and New York, NY: Routledge.

Henry, N., & Powell, A. (2018). Technology-facilitated sexual violence: A literature review of empirical research. Trauma, Violence, & Abuse, 19(2), 195–208.

Hindmarch, I., & Brinkmann, R. (1999). Trends in the use of alcohol and other drugs in cases of sexual assault. Human Psychopharmacology: Clinical and Experimental, 14(4), 225–231.

Hockaday, J. (2019, September 27). Creep who sent d**k pic to transgender woman gets sent one right back. Metro UK. Retrieved from

Jane, E. A. (2016). Online misogyny and feminist digilantism. Continuum, 30(3), 284–297.

Jane, E. A. (2017). Feminist flight and fight responses to gendered cyberhate. In M. Segrave & L. Vitis (Eds.), Gender, technology and violence (pp. 14–27). London: Routledge.

Jewkes, R., Flood, M., & Lang, J. (2015). From work with men and boys to changes of social norms and reduction of inequities in gender relations: A conceptual shift in prevention of violence against women and girls. The Lancet, 385(9977), 1580–1589.

Kahney, L. (2003, May 22). Shocking new jacket hits street. Wired. Retrieved from

Karimi, F. (2010, June 21). South African doctor invents female condoms with 'teeth' to fight rape. CNN. Retrieved from

Kelly, L. (1987). The continuum of sexual violence. In J. Hanmer & M. Maynard (Eds.), Women, violence and social control. Explorations in sociology (British Sociological Association Conference Volume Series) (pp. 46–60). London: Palgrave Macmillan.

Klein, J. (2006). An invisible problem: Everyday violence against girls in schools. Theoretical Criminology, 10(2), 147–177.

Koizumi, M. (2019, August 27). Trial run in Japan of anti-groping UV stamps sells out within an hour. Retrieved from

Mantilla, K. (2013). Gendertrolling: Misogyny adapts to new media. Feminist Studies, 39(2), 563–570.

Marganski, A. (2018). Feminist theory and technocrime: Examining online harassment, stalking, and gender violence in contemporary society. In K. F. Steinmetz & M. R. Nobles (Eds.), Technocrime & criminological theory (pp. 11–34). New York, NY: Routledge/CRC Press.

Marganski, A. (2019a). Making a murderer: The importance of gender and violence against women in mass murder events. Sociology Compass, 13(9), 1–15.

Marganski, A. (2019b). Feminist theory and technological crimes: Examining and responding to modern day misogyny. In T. Holt & A. Bossler (Eds.), The Palgrave handbook of international cybercrime & cyberdeviance (pp. 1–29). Cham: Palgrave Macmillan.

Marganski, A., & Melander, L. (2018). Intimate partner violence victimization in the cyber and real world: Examining the extent of cyber aggression experiences and its association with in-person dating violence. Journal of Interpersonal Violence, 33(7), 1071–1095.

Masters, J. (2019, November 22). A new app helps survivors of domestic violence cover the cost of abuse. Advocate. Retrieved from

McGlynn, C., & Downes, J. (2015). Why we need a new law to combat upskirting and downblousing. Inherently Human. Retrieved from

Melander, L., & Hughes, V. (2018). College partner violence in the digital age: Explaining cyber aggression using routine activities theory. Partner Abuse, 9(2), 158–180.

Melander, L. A., & Marganski, A. J. (2020). Cyber and in-person intimate partner violence victimization: Examining maladaptive psychosocial and behavioral correlates. Cyberpsychology: Journal of Psychosocial Research on Cyberspace, 14(1), Article 1.

Mendes, K., Ringrose, J., & Keller, J. (2019). Digital feminist activism: Girls and women fight back against rape culture. New York, NY: Oxford University Press.

Miller v. California. (1973). 413 U.S. 15, 24.

National Network to End Domestic Violence (NNEDV). (2014, April 29). Technology abuse: Experiences of survivors and victim service agencies. Retrieved from

National Network to End Domestic Violence (NNEDV). (2019, November 19). Making strides to stop stalkerware. Retrieved from

Osterheldt, J. (2019, May 7). She ordered an Uber. What she got was a stalker. Boston Globe. Retrieved from

Payton, M. (2014, September 19). Guy's creepy message to girl on Tinder about her breasts completely backfires. Metro UK. Retrieved from

PEW. (2017, January 11). Social media use by gender. Retrieved from

Plummer, S. B., & Findley, P. A. (2012). Women with disabilities' experience with physical and sexual abuse. Trauma, Violence, & Abuse, 13(1), 15–29.

Pope v. Illinois. (1987). 481 U.S. 497.

Powell, A. (2017). Digital justice and feminist activism. In A. Powell & N. Henry (Eds.), Sexual violence in a digital age (pp. 271–298). Basingstoke: Palgrave Macmillan.

Powell, A., & Henry, N. (2017). Sexual violence in a digital age. London: Palgrave Macmillan.

Powell, A., Henry, N., & Flynn, A. (2018). Image-based sexual abuse. In W. S. DeKeseredy & M. Dragiewicz (Eds.), Routledge handbook of critical criminology (2nd ed., pp. 305–315). Abingdon and New York, NY: Routledge.

Powell, A., Henry, N., Flynn, A., & Scott, A. J. (2019). Image-based sexual abuse: The extent, nature, and predictors of perpetration in a community sample of Australian adults. Computers in Human Behavior, 92, 393–402.

Powell, A., Scott, A. J., Henry, N., & Flynn, A. (2020). Image-based sexual abuse: An international study of victims and perpetrators. Melbourne, VIC: RMIT University.

Pryor, L. (2017, August 10). Is naming and shaming rapists the only way to bring them to justice? The New York Times. Retrieved from

Ridley, D. (2019, August 5). New app developed by an N.J. woman will help domestic violence victims file charges against their abusers. Retrieved from

Rosay, A. B. (2016). Violence against American Indian and Alaska Native women and men: 2010 findings from the NISVS. National Institute of Justice Research Report, NCJ 249736. Washington, DC: U.S. Department of Justice, National Institute of Justice.

Scaptura, M. N., & Boyle, K. M. (2019). Masculinity threat, "incel" traits, and violent fantasies among heterosexual men in the United States. Feminist Criminology, 15(3), 278–298. doi:10.1177/1557085119896415

Silver, L. (2019, February 5). Smartphone ownership is growing rapidly around the world, but not always equally. Pew Research. Retrieved from

Smith v. United States. (1977). 431 U.S. 291.

Staples, L. (2020, March 2). Newspaper shocks readers by publishing full names and picture of men threatening Greta Thunberg on Facebook. The Independent. Retrieved from

Steffen, L. (2020, February 16). Three young girls invent straw to detect date rape drugs. Intelligent Living. Retrieved from

Stotzer, R. L. (2009). Violence against transgender people: A review of United States data. Aggression and Violent Behavior, 14(3), 170–179.

Tandon, N., & Pritchard, S. (2015). Cyber violence against women and girls: A world-wide wake-up call. New York, NY: United Nations Broadband Commission for Digital Development.

Tarrant, S. (2009, September 18). Sexual assault prevention tips guaranteed to work! Girl w/Pen!. Retrieved from

The Anti-Defamation League's Center on Extremism. (2018). Retrieved from

The rape of Mr. Smith. (n.d.). Retrieved from

Todd, C., Bryce, J., & Franqueira, V. N. (2020). Technology, cyberstalking and domestic homicide: Informing prevention and response strategies. Policing and Society, 1–18.

Vera-Gray, F. (2017). Men's intrusion, women's embodiment: A critical analysis of street harassment. Abingdon and New York, NY: Routledge.

Walters, M. L., Chen, J., & Breiding, M. J. (2013). The national intimate partner and sexual violence survey (NISVS): 2010 findings on victimization by sexual orientation. Atlanta, GA: National Center for Injury Prevention and Control, Centers for Disease Control and Prevention.

Waseem, M., Paul, A., Schwartz, G., Pauze, D., Eakin, P., Barata, I., … Joseph, M. (2017). Role of pediatric emergency physicians in identifying bullying. The Journal of Emergency Medicine, 52, 246–252.

Westmarland, L. (2002). Gender and policing. New York, NY: Routledge.

White, D., & McMillan, L. (2019). Innovating the problem away? A critical study of anti-rape technologies. Violence Against Women, 26(10), 1120–1140. doi:10.1177/1077801219856115

Wilkinson, M. (2016). Cyber misbehavior. United States Attorney's Bulletin, 64(3).

Woodlock, D., McKenzie, M., Western, D., & Harris, B. (2019). Technology as a weapon in domestic violence: Responding to digital coercive control. Australian Social Work, 1–13.

Woolf, N. (2014, May 30). 'PUAhate' and 'ForeverAlone': Inside Elliot Rodger's online life. The Guardian. Retrieved from

Working to Halt Online Abuse. (2020). US laws and other countries. Retrieved from Accessed on March 1, 2020.

World Health Organization (WHO). (2013). Global and regional estimates of violence against women: Prevalence and health effects of intimate partner violence and non-partner sexual violence. Retrieved from

World Health Organization (WHO). (2017). Violence against women. Retrieved from

Zaki, J. (2019, August 6). The technology of kindness: How social media can rebuild our empathy - and why it must. Scientific American. Retrieved from

Zikalala, Z. (2017, January 16). This high-tech nail polish can tell if your drink has been spiked. Women's Health. Retrieved from

Zimmerman, S., Ryan, L., & Duriesmith, D. (2018). Recognizing the violent extremist ideology of 'incels'. Women in International Security, 9, 1–5.

Zweig, J. M., & Dank, M. (2013). Teen dating abuse and harassment in the digital world. Washington, DC: Urban Institute.

Zweig, J. M., Dank, M., Yahner, J., & Lachman, P. (2013). The rate of cyber dating abuse among teens and how it relates to other forms of teen dating violence. Washington, DC: Urban Institute.
