The Potential of Centralized and Statutorily Empowered Bodies to Advance a Survivor-Centered Approach to Technology-Facilitated Violence Against Women

The Emerald International Handbook of Technology-Facilitated Violence and Abuse

ISBN: 978-1-83982-849-2, eISBN: 978-1-83982-848-5

Publication date: 4 June 2021

Abstract

As the means and harms of technology-facilitated violence have become more evident, some governments have taken steps to create or empower centralized bodies with statutory mandates as part of an effort to combat it. This chapter argues that these bodies have the potential to meaningfully further a survivor-centered approach to combatting technology-facilitated violence against women – one that places their experiences, rights, wishes, and needs at its core. It further argues that governments should consider integrating them into a broader holistic response to this conduct.

An overview is provided of the operations of New Zealand's Netsafe, the eSafety Commissioner in Australia, Nova Scotia's CyberScan unit, and the Canadian Centre for Child Protection in Manitoba. These types of centralized bodies have demonstrated an ability to advance survivor-centered approaches to technology-facilitated violence against women through direct involvement in resolving instances of violence, education, and research. However, these bodies are not a panacea. This chapter outlines critiques of their operations and the challenges they face in maximizing their effectiveness.

Notwithstanding these challenges and critiques, governments should consider creating such bodies or empowering existing bodies with a statutory mandate as one aspect of a broader response to combatting technology-facilitated violence against women. Some proposed best practices to maximize their effectiveness are identified.

Citation

Hrick, P. (2021), "The Potential of Centralized and Statutorily Empowered Bodies to Advance a Survivor-Centered Approach to Technology-Facilitated Violence Against Women", Bailey, J., Flynn, A. and Henry, N. (Eds.) The Emerald International Handbook of Technology-Facilitated Violence and Abuse (Emerald Studies In Digital Crime, Technology and Social Harms), Emerald Publishing Limited, Leeds, pp. 595-615. https://doi.org/10.1108/978-1-83982-848-520211043

Publisher

Emerald Publishing Limited

Copyright © 2021 Pam Hrick. Published by Emerald Publishing Limited.

License

This chapter is published under the Creative Commons Attribution (CC BY 4.0) licence. Anyone may reproduce, distribute, translate and create derivative works of these chapters (for both commercial and non-commercial purposes), subject to full attribution to the original publication and authors. The full terms of this licence may be seen at http://creativecommons.org/licences/by/4.0/legalcode.


Introduction

Gender-based violence is not a new phenomenon. However, with the proliferation of digital technology have come additional ways in which such violence can be committed. As the means and harms of technology-facilitated violence have become more evident, some governments have taken steps to create or empower centralized bodies with statutory mandates as part of an effort to combat it. This chapter argues that these bodies have the potential to meaningfully further a survivor-centered approach to combatting technology-facilitated violence against women and that governments should consider integrating them into a broader holistic response to this conduct.

This chapter begins by providing my working definition of technology-facilitated violence against women and argues that meaningfully responding to its significant harms requires a holistic and survivor-centered approach – one that places the experiences, rights, wishes, and needs of survivors at its core. Drawing on existing feminist literature and internationally adopted standards on “survivor-centered” approaches to domestic and sexual violence, several foundational elements of a “survivor-centered” approach to technology-facilitated violence against women are elucidated: intersectionality, choice, dignity and respect, prevention, and research.

The chapter then turns to examining the emerging trend of governments creating or funding centralized bodies with statutory mandates to address technology-facilitated violence. An overview is provided of the operations of New Zealand's Netsafe, the eSafety Commissioner in Australia, Nova Scotia's CyberScan unit, and the Canadian Centre for Child Protection in Manitoba. These types of centralized bodies have demonstrated an ability to advance survivor-centered approaches to technology-facilitated violence against women through education, research, and direct involvement in resolving instances of violence. However, these bodies are not a panacea. Drawing from their experiences, I outline critiques of their operations and the challenges they face in maximizing their effectiveness.

Notwithstanding these challenges and critiques, I conclude by arguing that governments should consider creating such bodies or empowering existing bodies with a statutory mandate as one aspect of a broader response to combatting technology-facilitated violence against women. Some proposed best practices to maximize their effectiveness are identified.

The Need for a Survivor-Centered Approach to Technology-Facilitated Violence Against Women

This chapter uses “technology-facilitated violence against women” to describe a broad range of conduct targeting women, defined as individuals who self-identify as such. 1 This conduct includes cyber harassment or stalking, monitoring or surveillance, image-based abuse (creating, distributing, or threatening to distribute intimate images without consent), impersonation, doxing (publishing private or identifying information online without consent), and deep fakes (digital falsification of images, video, and audio to simulate participation in pornography) (Chesney & Citron, 2019; Wong, 2019; Woodlock, 2017). This type of conduct disproportionately targets and impacts women, among other marginalized groups (Bailey, 2013; Bailey & Mathen, 2019; Powell & Henry, 2019), with women of color, women with precarious status, women with disabilities, women whose first language is not English, and Indigenous women being particularly vulnerable (e.g., Woodlock, 2015). This chapter refers to an individual who commits technology-facilitated violence as a “perpetrator” and to the target of that violence as a “survivor,” although various other terms are used in the literature, including “responsible person,” “victim,” and “victim-survivor.”

Technology-facilitated violence can lead to real harms to real women (Powell & Henry, 2017, p. 62), 2 including well-documented harms to their privacy, security, autonomy, and equality interests (e.g., Marganski & Melander, 2018; Powell & Henry, 2017). However, law and society have tended to trivialize technology-facilitated violence as a form of gender-based violence and blame women for bringing this abuse on themselves (e.g., Citron, 2014). This response is arguably linked not just to how we have historically responded to violence against women, but to the belief that harms caused by “online” conduct are less serious than those caused by “offline” conduct (Powell & Henry, 2017, p. 66; Citron, 2014, p. 102; see also Gosse, this volume). This has led to responses that minimize the former and that tend to place responsibility for mitigating or avoiding those harms on survivors.

The serious harms caused by technology-facilitated violence against women, as a form of gender-based violence rooted in misogyny, call for an effective survivor-centered approach to combatting it. “Survivor-centered” has been defined as meaning “that the survivor, not the advocate, guides the intervention, in both what needs are to be met and how to go about meeting them” (Allen, Larsen, Trotter, & Sullivan, 2013). The concept of survivor-centered approaches (sometimes referred to as “victim-centered” or “survivor-centric” responses) has been explored in feminist approaches to domestic violence and sexual violence (e.g., Goodman & Epstein, 2008; Nova Scotia Provincial Sexual Violence Prevention Committee, 2019; UN Women Virtual Knowledge Centre to End Violence Against Women and Girls, 2011; Spangler & Brandl, 2007). It has also been advocated in the context of post-conflict mechanisms of transitional justice, such as truth commissions and prosecutions (e.g., Soueid, Willhoite, & Sovcik, 2017).

In conceptualizing a survivor-centered approach to domestic violence, Goodman and Epstein (2008) emphasize the imperative to honor the differences in domestic violence survivors' particular needs by creating opportunities for them to be heard and to play an active role in shaping the assistance they receive (the principle of “voice”); to recognize the importance of their relationships and community ties (the principle of “community”); and, in expanding resources available to them, to focus on those whose socioeconomic status limits their opportunities to be safe (the principle of “economic empowerment”) (pp. 90 and 135). Survivors and their needs differ based on many factors, including mental and physical well-being; religious, ethnic, and cultural background; immigration status; sexual orientation; embeddedness in social networks; and socio-economic status (Goodman & Epstein, 2008). Important to a survivor-centered approach is ensuring the survivor is able to control the decisions that affect her life (Goodman & Epstein, 2008).

The United Nations has also encouraged a survivor-centered approach to violence against women, meaning that all those engaged in related programming should “prioritize the rights, needs, and wishes of the survivor” (UN Women Virtual Knowledge Centre to End Violence Against Women and Girls, 2011, para 1; see also UN High Commissioner for Refugees, 2016; UN Security Council, 2019). Training materials produced by the UN High Commissioner for Refugees (2016) describe “a survivor-centered approach” as “recogniz[ing] the fact that each person is unique, reacts differently to [sexual and gender-based violence] and has different needs” and as “promot[ing] respect for the survivors' rights by placing them at the centre of the support system” (Module 2, p. 16). In the context of health-care provision, survivors' rights enumerated by the UN Women Virtual Knowledge Centre to End Violence Against Women and Girls (2011) are the rights to:

  • be treated with dignity and respect instead of being exposed to victim-blaming attitudes;

  • choose the course of action in dealing with the violence instead of being powerless;

  • privacy and confidentiality instead of exposure;

  • non-discrimination instead of discrimination based on gender, age, race/ethnicity, ability, sexual orientation, HIV status, or any other characteristic; and

  • receive comprehensive information to help her make her own decision instead of being told what to do.

These principles have also been invoked in the context of providing guidance on the development of survivor-centered sexual violence policies and responses in the context of post-secondary institutions (Nova Scotia Provincial Sexual Violence Prevention Committee, 2019).

In the context of transitional justice, Soueid et al. (2017) have described some of the most important components of a survivor-centered approach: gender-sensitive mechanisms that empower women in the society; incorporating cultural sensitivities that allow ethnic, racial, and religious minorities to meaningfully participate; providing social, medical, psychological, and other rehabilitative services; and ensuring access to effective legal representation.

Drawing from this context-specific existing feminist literature and guidance, this chapter proposes that there are several foundational aspects of a holistic survivor-centered approach to technology-facilitated violence against women.

First, a survivor-centered approach is intersectional. Women are not a monolithic group. Women who also identify as racialized, Indigenous, trans, disabled, and/or immigrant (among other identities) will often experience technology-facilitated violence and its harms in diverse ways. The same holds true for women who live in predominantly English-speaking societies, but who do not speak English, as well as women who live in remote communities (see Woodlock & Harris, this volume). Women who exist at these and other intersections may also face different barriers to accessing information about technology-facilitated violence (see Woodlock & Harris, this volume). Recognizing the diversity of women and the extent to which their needs and experiences are likely to diverge based on social location is necessary to an effective survivor-centered approach.

Second, this approach permits and empowers survivors to choose their own course of action in addressing their individual experience of violence. This requires that multiple courses of action be available to survivors to address technology-facilitated violence in the manner in which they deem most appropriate. By way of example, a woman experiencing violence perpetrated by a former partner may want this conduct to stop, but may not want the perpetrator to be criminalized. Having a broad range of options available to a survivor is not only appropriate in light of the unique insight she may have into how the perpetrator might react and what is best for her safety, but it also provides an appropriate way to try to return control of the situation to the survivor. Related to this is the need to ensure survivors are actually informed about these various courses of action so that they may choose the one that is best suited to their own needs and wishes.

Third, a survivor-centered approach seeks to ensure that survivors are treated with dignity and respect, rather than blamed for the violence they have experienced. This principle should underlie the development and delivery of both services and information related to technology-facilitated violence against women. Goodman and Epstein (2008) have highlighted the need to better educate communities about ways to assist survivors of domestic violence and to “reach out to community leaders in religious institutions, health care agencies, educational institutions, workplaces, and other community settings to transform these places into supportive environments” for survivors (pp. 121 and 123).

Fourth, a survivor-centered approach incorporates prevention as a key goal. Reducing instances of technology-facilitated violence, and therefore reducing the number of survivors who need to rely on services and supports to address this conduct, should be prioritized. The burden of prevention should not be placed on survivors; rather, it should be a collective responsibility that encourages a cultural shift in attitudes toward technology-facilitated violence specifically and gender-based violence more generally, including through public education and the education of those who perpetrate this violence or may do so in the future.

Fifth, the implementation of a survivor-centered approach is informed by research, evidence, and the perspectives of survivors. Research is crucial to understanding the nature and impacts of recent and emerging forms of technology-facilitated violence against women. Understanding the experiences of survivors who are subjected to this conduct as well as their specific needs must inform the preventative, informational, and remedial aspects of a survivor-centered approach.

The Potential of Centralized and Statutorily Empowered Bodies to Advance a Survivor-Centered Approach

In recent years, several national and sub-national governments have taken steps to create centralized agencies or entrust a designated organization with a statutory mandate to address various aspects of technology-facilitated violence. This section provides an overview of several of these entities and the ways in which they have demonstrated their potential to further a survivor-centered response to technology-facilitated violence. However, this potential is not limitless. They are susceptible to a number of challenges and critiques, which are also explored.

The Proliferation of Centralized and Statutorily Empowered Bodies

In New Zealand, the idea to bestow upon an organization a statutory mandate related to technology-facilitated violence was raised in an August 2012 report of its Law Commission, recommending the government adopt a suite of reforms to address harmful digital communications. Among the recommended reforms was to designate an “approved agency” to receive and attempt to resolve complaints related to harmful online communications (Law Commission, 2012, p. 110). The impetus for this designation, which the Law Commission (2012) recommended pairing with an independent tribunal, was to enhance access to justice and respond to submissions of key stakeholders that “New Zealand users need access to a complaints body that is accessible and that has some teeth to negotiate with global entities” (Law Commission, 2012, pp. 100, 104, and 134). The Law Commission (2012) recognized that “[m]any complaints will be much better handled by less formal means: by techniques of mediation, negotiation and persuasion” (p. 128). It also identified education, research, and policy oversight as needed general functions for an “approved agency” (Law Commission, 2012, p. 130). The Law Commission (2012) recommended that Netsafe, an independent non-profit organization founded in 1998, be designated the “approved agency,” as it was already partly funded by government, performed many of these tasks, and had an established relationship with offshore operations such as Google and Facebook (Law Commission, 2012, p. 130; see also Pacheco & Melhuish, this volume).

In late 2013, the New Zealand government introduced what would eventually become the Harmful Digital Communications Act, 2015 (HDCA). The legislation's purposes are to deter, prevent, and mitigate harm caused to individuals by digital communications, and to provide survivors with a quick and efficient means of redress (HDCA, 2015, s. 3). In June 2016, the government appointed Netsafe as the “approved agency” under the HDCA (Government of New Zealand, 2016). Netsafe's legislative mandate includes receiving and assessing complaints about harm caused to individuals by digital communications; investigating complaints; using advice, negotiation, mediation, and persuasion (as appropriate) to resolve complaints; establishing and maintaining relationships with domestic and foreign service providers, online content hosts, and agencies (as appropriate) to achieve the HDCA's purposes; and providing education and advice on policies for online safety and conduct on the internet (HDCA, 2015, s. 8(1) (a)–(c), (e)). While New Zealand did not implement the Law Commission's recommendation for an independent tribunal, the HDCA does require that a complaint about a harmful digital communication first be made to Netsafe before an individual applies to the District Court for certain enumerated civil remedies such as a takedown order (HDCA, 2015, ss. 12(1), 18, and 19). Netsafe itself has no authority to order the takedown of harmful communications.

Shortly after the HDCA was introduced, the Australian government issued a consultation paper on enhancing online safety for children as part of a September 2013 election commitment to establish a “Children's e-Safety Commissioner” (Government of Australia, 2014, p. 5). This proposal was part of a larger commitment to enhance the online safety of children, with a view to ensuring that content and cyber-bullying concerns were handled faster; that children could quickly access assistance with online safety concerns; that criminal laws relating to cyberbullying were appropriate and effective; and that there was clear and expert leadership in online safety (Government of Australia, 2014). Pointing to the New Zealand example, the Australian government recognized “the need for an accessible and centralized point of contact to deal with online safety” (Government of Australia, 2014, p. 5).

Australia considered following New Zealand's model of designating a non-governmental organization to act as the Commissioner. However, it ultimately established the Office of the Children's eSafety Commissioner under the Enhancing Online Safety for Children Act 2015 in July 2015 as an independent statutory office within the Australian Communications and Media Authority (Office of the eSafety Commissioner [Commissioner], 2015). Notwithstanding the Commissioner's child-focused mandate, in 2016, it launched eSafetyWomen “to help empower and encourage women to take control of the technology in their lives” in response to an “increase in the use of technology to control, stalk and abuse Australian women” (Office of the eSafety Commissioner, 2016). The Commissioner's governing legislation was amended in 2017 to retitle the legislation the Enhancing Online Safety Act 2015 (EOS Act), to rename the Office, and to reflect that the Office's mandate extends beyond children (Reichert, 2017).

Today, the Office of the eSafety Commissioner is “the only government agency in the world solely dedicated to the online safety of its citizens” (Office of the eSafety Commissioner, 2019, p. 3). The Commissioner's legislative functions include collecting, analyzing, interpreting, and disseminating information relating to online safety for Australians; supporting, encouraging, conducting, accrediting, and evaluating educational, promotional, and community awareness programs relevant to online safety for Australians; supporting, encouraging, conducting, and evaluating research about online safety for Australians; publishing reports and papers relating to online safety for Australians; administering a complaints system for cyberbullying against children; and administering a complaints and objections system for the nonconsensual sharing of intimate images (EOS Act, ss. 15(1) (e), (f), (h), (i), 18, and 19A).

Around the same time that New Zealand and Australia took action, at the sub-national level, two Canadian provinces created legislative mandates for agencies to address some forms of technology-facilitated violence, including violence that disproportionately impacts women.

On April 25, 2013, Nova Scotia's provincial government introduced legislation intended “to better protect victims and hold cyberbullies accountable for their harmful behaviour” (Nova Scotia, n.d., p. 1). The Cyber-Safety Act was introduced in the wake of the suicide death of 17-year-old Nova Scotian Rehtaeh Parsons on April 7, 2013, after she was subjected to acts of sexual violence, image-based abuse, and cyber-harassment (CBC News, 2013). The legislation created CyberScan, a unit within the Public Safety Division of the provincial Department of Justice. The unit consisted of a director and investigators whose authority included receiving and investigating complaints about cyberbullying from anyone in the province, attempting to resolve the complaint by agreement or informal action, writing a warning letter to the perpetrator, and filing protection orders (Nova Scotia House of Assembly, 2013, p. 1483; Cyber-Safety Act, 2013, s. 26A–26G). The legislation received Royal Assent just over two weeks later on May 10, 2013.

However, on December 10, 2015, the Cyber-Safety Act was struck down in its entirety as an unconstitutional incursion on freedom of expression (the definition of “cyberbullying” was ruled to be too broad) and liberty interests (because failure to comply with a protection order under the Act could lead to imprisonment) ( Crouch v. Snell, 2015). In 2017, this legislation was replaced by the Intimate Images and Cyber-Protection Act (IICPA), a purpose of which is to “provide assistance to Nova Scotians in responding to non-consensual sharing of intimate images and cyber-bullying” (IICPA, 2017, s. 2(c)). Under the IICPA (2017), CyberScan's narrowed mandate includes providing public information and education regarding harmful online conduct; providing support and assistance to survivors of nonconsensual distribution of intimate images and cyber-bullying, including with respect to the criminal justice system and civil proceedings under the legislation; and providing voluntary dispute-resolution services, including advice, negotiation, mediation, and restorative justice approaches concerning harmful online conduct (IICPA, 2017, ss. 12(1) (a), (c)–(f); Nova Scotia, 2018).

Finally, on June 9, 2015, Manitoba's provincial government introduced The Intimate Image Protection Act (IIPA). Regarding the factors motivating the legislation, the Minister of Justice cited the death of Parsons and several other young women in similar circumstances, as well as the desire of survivors for help in getting intimate images removed without always having to go to court (Legislative Assembly of Manitoba, 2015). In January 2016, the Canadian Centre for Child Protection (C3P) was designated as the “authorized agency” to provide certain services and supports under the legislation to individuals whose intimate images have been or may be shared without their consent (Intimate Image Protection Regulation, 2016, s. 2). The C3P is “a national charity dedicated to the personal safety of all children” and its purposes relate primarily to reducing the sexual abuse and exploitation of children (Canadian Centre for Child Protection [C3P], 2019). To that end, it administers a tip line for reporting the online sexual abuse and exploitation of children and provides various intervention, prevention, and education services to the Canadian public (C3P, 2019).

Advancing a Survivor-Centered Approach to Technology-Facilitated Violence Against Women

In and of itself, establishing or recognizing a centralized and statutorily empowered body to address technology-facilitated violence has certain survivor-centered benefits. It signals that the government takes this conduct seriously and condemns it, which can contribute both to preventing this conduct and to the message that survivors ought to be treated with dignity and respect, rather than being blamed for it. It can also provide a single entry point for survivors to seek redress, thereby facilitating access to justice and available remedial options and empowering them to pursue the remedy they judge to be best suited to their circumstances. Furthermore, legislative empowerment creates a more permanent authority to address this conduct than a mere funding commitment to a non-governmental entity, meaning that a change in government is less likely to affect the availability of services and supports to survivors.

Broadly categorized, there are at least three additional ways in which centralized bodies with statutory mandates have demonstrated their potential to further a survivor-centered response to technology-facilitated violence against women: direct service-provision to resolve instances of violence; delivering education and information on technology-facilitated violence; and conducting research on the forms and harms of technology-facilitated violence against women.

Direct Involvement in Resolving Instances of Technology-Facilitated Violence

Particularly in relation to the nonconsensual distribution of intimate images, existing statutorily empowered bodies have been able to provide services related directly to assisting survivors in addressing instances of technology-facilitated violence.

Legislative amendments in 2018 empowered Australia's eSafety Commissioner to address image-based abuse, defined as nonconsensual sharing of intimate images or threatening to share intimate images without consent (Office of the eSafety Commissioner, 2019). It has implemented a “world-first government-led reporting service for victims of image-based abuse” through which it received 950 reports in 2018–2019, leading to the removal of material from over 1,700 locations online during that period (Office of the eSafety Commissioner, 2019). This represented a 90% success rate for removal, notwithstanding most material being hosted overseas (Office of the eSafety Commissioner, 2019). A civil penalties scheme, which allows the Commissioner's office to issue warnings, infringement notices, removal notices, or fines to those who post or threaten to post the content, as well as the host site, provides significant leverage in targeting and remedying this abuse on behalf of survivors (Office of the eSafety Commissioner, 2019). The Office has previously attributed its success in part to its close working relationship with social media partners and online platforms to ensure quick removal of material (Office of the eSafety Commissioner, 2018a).

The Commissioner is also mandated to administer a cyberbullying reporting regime for Australian children under 18 years of age, which endows her with powers to take remedial steps similar to those she possesses to address image-based abuse. Although the Commissioner does not have the same enforcement powers to address cyberbullying against adults, she does offer support to help resolve such concerns (Office of the eSafety Commissioner, n.d.).

In New Zealand, Netsafe provides a free and confidential online service, as well as a helpline, for reporting harmful content, online abuse and bullying, and illegal content (Netsafe, n.d.). Where a report relates to the organization's mandate under the HDCA, Netsafe is empowered to assist in resolving the report, which may include liaising with website hosts, internet service providers, and other content hosts (whether in New Zealand or abroad) to request that impugned content be taken down or moderated (Netsafe, 2019c; HDCA, 2015, s. 25(1)). In resolving reports related to harmful digital content, Netsafe does not advocate for or favor anyone involved in the incident (Netsafe, 2019a). Rather, it assesses whether the report falls within the scope of the HDCA and the extent of the serious emotional distress, then develops a resolution plan to remove or reduce the alleged harm, which may include giving advice and using persuasion, negotiation, and mediation to resolve the issue (Netsafe, 2019a).

In Manitoba, the C3P is authorized under the IIPA (2015) to assist any person targeted by the nonconsensual distribution or threatened distribution of intimate images by receiving requests for assistance; provide information or assistance to enable a person to have their intimate images returned, destroyed, deleted, or removed from the internet or any other place where they may be viewed by others; provide information or assistance that may facilitate the resolution of a dispute between a person depicted in an intimate image and a person who may be in possession of the image or who may have distributed the image; and provide information about the legal remedies and protections available [including a civil action created by the IIPA (C3P, 2016)] where there has been a nonconsensual distribution of an intimate image or where there is a concern that an intimate image is about to be distributed without consent (IIPA, 2015, ss. 3–4; Intimate Image Protection Regulation, 2016, ss. 2–3).

If the identity of a person in possession of an intimate image is known and the C3P has reason to believe the person has distributed or will distribute the image without consent, it may send the person a written notice stating that the person depicted in the image does not consent to its distribution and summarizing the legal consequences that may result from its nonconsensual distribution (IIPA, 2015, s. 8). It appears, however, that the C3P may interpret this mandate to be limited to “[a]ssist[ing] with language to reach out to the individual who shared (or may share) the intimate image” (C3P, 2016, January 18). It will also “[p]rovide instructions on getting content removed from online sites or social media” (C3P, 2016). The C3P may also assist a person who has requested assistance in making a request to police (IIPA, 2015, s. 9). The C3P reported in 2018 that since 2016, 1,300 people in Manitoba had used its online resources to seek help on this issue and 50 people (nearly half of them adults) had sought help directly from its staff (Kubinec, 2018).

In Nova Scotia, individuals who have experienced cyberbullying (which includes harassment, threats, impersonation, and revealing personal facts or confidential information using electronic communication) or nonconsensual distribution of intimate images can contact CyberScan for assistance in resolving a dispute (CyberScan, n.d.). Staff may contact the person who distributed the intimate images or engaged in cyberbullying to try to resolve the matter informally using restorative practices or other approaches (CyberScan, n.d.).

To various extents, these mandates to directly engage in dispute resolution further a survivor-centered approach to technology-facilitated violence. They contribute to increasing survivors' choices to meaningfully address at least some forms of technology-facilitated violence without needing to resort to potentially costly, complex, and emotionally draining civil or criminal legal processes. Moreover, some conduct that causes harm may fall below the thresholds for civil liability or criminal prosecution. Extra-legal remedial measures such as mediation and negotiation can also ensure that harmful digital content or conduct is addressed in a more expeditious manner than would be the case in the legal system. This type of approach has the potential to promote treating survivors with dignity and respect in addressing the violence they have encountered.

Education and Information Distribution

Centralized and statutorily empowered bodies also have a demonstrated ability to advance a survivor-centered approach by providing education and distributing information related to technology-facilitated violence to survivors, frontline workers, the broader public, and law enforcement.

To at least some extent, each of the centralized agencies examined in this chapter educates and provides information to survivors on the options available to them if they experience forms of technology-facilitated violence. By way of example, CyberScan has produced a guide on the provincial Intimate Images and Cyber-Protection Act, including the definitions of cyberbullying and non-consensual distribution of intimate images, what assistance CyberScan can provide, and how to obtain a cyber-protection order under the legislation to address this conduct (CyberScan, n.d.). It is important that survivors be educated about their potential avenues of recourse when they experience this conduct; in some cases, survivors also need to be informed that the conduct they have experienced is, in fact, abusive and often illegal (e.g., Woodlock, 2017). Netsafe also provides guidance on what constitutes image-based abuse, its illegality, and what to do if someone experiences it (Netsafe, 2018).

Although the fault for committing technology-facilitated violence always lies with the perpetrator, educating women about how best to protect themselves against technology-facilitated abuse is also important to enhancing women's safety and is responsive to their expressed needs. In the context of intimate partner violence, one study found that survivors identified their own lack of understanding of technology as compared to that of their partners as a factor that made them more vulnerable to abuse (Douglas, Harris, & Dragiewicz, 2019). In another study, survivors reported wanting to learn about technology and expressed a desire for better tools and trainings to increase their awareness and education regarding technology (Freed et al., 2017).

To this end, in 2016, the Office of the eSafety Commissioner launched eSafetyWomen as part of the Australian government's “Women's Safety Package to Stop the Violence” (Office of the eSafety Commissioner, 2018b, para 6). The program “empower[s] Australian women to manage technology risks and abuse” (Office of the eSafety Commissioner, 2019, p. 215). Through the program, the Office has developed how-to videos to provide guidance on privacy and security features of popular platforms and devices, as well as a personal technology check-up and virtual tours of technologies commonly found in homes, in cars, and on mobile devices (Office of the eSafety Commissioner, 2019). A range of guides have been released in 12 community languages, responding to research that demonstrated women from culturally and linguistically diverse communities face barriers in seeking support (Office of the eSafety Commissioner, 2019; see also Louie, this volume). Netsafe has similarly developed guidance on how to “stay safe online,” though this is not specifically targeted to women (Netsafe, 2020).

Centralized bodies have also demonstrated a potential to educate and provide information to frontline service providers, who have expressed a desire for more and better technology-focused training (Freed et al., 2017). In their study, Freed et al. (2017) concluded that there is “a deep and urgent need for better information and training when it comes to technology and abuse – both for clients and professionals” (p. 18). In an example of filling this need, Australia's Office of the eSafety Commissioner delivered face-to-face training through eSafety Women on how technology-facilitated violence manifests and what action can be taken to more than 3,400 frontline workers in 2018–2019, while more than 1,900 frontline workers became registered users of the Office's online training (Office of the eSafety Commissioner, 2019).

Education of the broader public is also an important aspect of operations of these centralized bodies. In the context of image-based abuse, Flynn and Henry (2019) have emphasized the importance of educational campaigns to raise awareness of the causes, harms, and impacts of this conduct; to promote proactive and safe bystander interventions to challenge problematic behaviors and attitudes; and to address cultures of nonconsensual dissemination of intimate images and victim-blaming that excuse perpetrator behavior and prevent survivors from seeking help.

These centralized bodies have demonstrated their capacity to engage in this type of productive public education. In New Zealand, the Ministry of Education has an agreement with Netsafe to provide online safety services to schools (Netsafe, 2019a). Australia's Office of the eSafety Commissioner also provides online safety education for youth (Office of the eSafety Commissioner, 2019). In addition, it considers its specific responsibilities to include supporting, encouraging, and conducting educational, promotional, training, and community awareness programs that are relevant to online safety for people at risk of family or domestic violence (Office of the eSafety Commissioner, 2019). As part of its own efforts to address image-based abuse, the C3P has collaborated with the Winnipeg Police Service on a campaign to inform youth that help is available if their intimate image is being shared (C3P, 2019). It has also run a public awareness campaign “on the consequences of sharing someone else's nudes without their consent” which “reached hundreds of thousands of Canadians through bus stop ads, in-mall videos, as well as a pre-show video on 16 Landmark Cinema movie screens across Manitoba” (C3P, 2019, p. 52).

Finally, at least one of the aforementioned centralized agencies engages with law enforcement to provide tailored training on issues related to technology-facilitated violence against women. The Office of the eSafety Commissioner offers evidence-based, targeted advice to law enforcement on issues including cyber abuse, image-based abuse, and other technology-facilitated abuse (Office of the eSafety Commissioner, n.d.; Office of the eSafety Commissioner, 2019). The importance of training law enforcement relates both to effective enforcement of existing laws as well as ensuring survivors see the justice system as a forum in which they can seek redress. Ultimately, perceptions of law enforcement attitudes impact survivors' willingness to report abuse, as “they fear being blamed for having taken or shared an intimate photo” or perhaps for sharing their cell phone password with the perpetrator of the violence (Powell & Henry, 2017, p. 203).

To the extent that centralized and statutorily empowered bodies engage in these types of education and information-distribution initiatives, they have a demonstrated potential to advance a survivor-centered approach to technology-facilitated violence against women. These efforts can arm survivors with information about what constitutes technology-facilitated violence, how to safeguard themselves against it, and how to address it when it occurs. This can empower survivors with more choices about how to confront this conduct, while also giving them certain tools they have indicated they would like to assist in preventing this sort of abuse while recognizing that responsibility for this conduct always lies with the perpetrator. Education of frontline service providers further contributes to ensuring survivors are aware of the choices available to them and are supported in accessing them with dignity and respect. In the case of the eSafety Commissioner, providing guides in multiple languages is an important measure to render information more accessible to a broader range of survivors, embodying an intersectional approach that recognizes survivors at certain social locations may experience linguistic barriers to accessing information. Public education plays an important role in encouraging a culture shift that condemns, rather than trivializes or normalizes, technology-facilitated violence, thereby contributing to preventing this conduct and ensuring survivors are treated with dignity and respect in its wake. Finally, educating and partnering with law enforcement can help ensure survivors are treated with dignity and respect when they determine that engaging with the criminal justice system is the right choice for them.

Conducting Research on Issues Related to Technology-Facilitated Violence

Centralized agencies have also demonstrated they can play an important role in advancing a survivor-centered approach through conducting and commissioning research related to technology-facilitated violence against women. This can inform a broader understanding of its prevalence and impact, as well as specific educational and remedial responses.

Australia's eSafety Commissioner “produce[s] world-leading research into online safety issues” which “provide valuable insights for key stakeholders working in this space, while also boosting the evidence base that informs [its] service and program delivery and targeted communities” (Office of the eSafety Commissioner, 2019, p. 195). In 2017–2018, the Office released numerous research reports, including a national survey summary report on image-based abuse and a qualitative research summary report on image-based abuse (Office of the eSafety Commissioner, 2019, p. 133). In September 2019, it published new research aimed at understanding the beliefs, attitudes, and motivations of adults who commit image-based abuse. This research suggests that image-based abuse is normalized and that few perpetrators are aware their behavior is illegal, while recommending possible strategies aimed at helping to improve the visibility of image-based abuse (Mortreux, Kellard, Henry, & Flynn, 2019). Qualitative research published in October 2019 on online safety for Aboriginal and Torres Strait Islander women living in urban areas identified social and system barriers to seeking support for technology-facilitated abuse, as well as service provider recommendations for addressing those barriers (Office of the eSafety Commissioner, 2019). Similar research has been conducted about the experiences of women from culturally and linguistically diverse backgrounds with technology-facilitated violence (Office of the eSafety Commissioner, 2019).

Netsafe also produces and funds this type of research. In 2019, it released a report called “Image-based sexual abuse: A snapshot of New Zealand adults' experiences” (Netsafe, 2019b). Among other things, the organization has funded a research project exploring public attitudes toward image-based abuse and documentary shorts telling stories about cyber-bullying, internet safety, and image-based abuse (Netsafe, 2019a). Netsafe representatives Edgar Pacheco and Neil Melhuish have also contributed Chapter 6 to this volume, which focuses on recent findings relating to adult perpetration of technology-facilitated violence.

A well-developed body of research and evidence in which to ground services and initiatives is essential to a survivor-centered approach. Research on the experiences and perspectives of diverse communities can ensure that the development of services is intersectional and responsive to the needs of survivors at a range of social locations. It can also inform effective approaches to prevention. For example, the above-mentioned research on perpetrators can inform improved education efforts aimed at preventing individuals from becoming perpetrators.

Critiques and Challenges

Notwithstanding centralized agencies' demonstrated potential to advance a survivor-centered response to technology-facilitated violence against women, this potential is not limitless. They are both susceptible to legitimate critiques and must be prepared to confront certain inherent challenges.

Centralized agencies are not a panacea, nor can they be expected to be ubiquitous in their activities and services. The independent review of Australia's EOS Act noted the need for the eSafety Commissioner to work across sectors, including with non-governmental organizations, to collaborate on education initiatives, as well as to avoid overlap and duplication of efforts (Briggs, 2018). The existence of a centralized agency does not negate the vital role that community-based frontline service providers, for example, play in addressing technology-facilitated violence and providing ongoing support to survivors. This role is particularly important for members of racialized or Indigenous communities, who have been frequently victimized by the state (e.g., Bobo & Thompson, 2006; Fast & Collin-Vézina, 2010) and may be skeptical of state-based responses. In this regard, centralized agencies should be seen as a complementary component of a broader survivor-centered approach to effectively addressing technology-facilitated violence against women.

The effectiveness of these bodies in combatting this problem is also linked to the extent to which their mandates expressly include technology-facilitated violence against women and to the funding they receive to address it. For example, it is not a coincidence that the depth and breadth of work done in this field by the eSafety Commissioner (whose statutory mandate and funding are expressly linked in many ways to violence against women) far outstrip that done by the C3P (whose statutory mandate is much narrower and whose primary corporate objectives relate to child protection).

Concerning inherent challenges, tensions between the proposed mandate of a centralized agency and freedom of expression were raised in the lead-up to adopting enabling legislation in Australia (Government of Australia, 2014) and New Zealand (Law Commission, 2012). Nova Scotia's statute was struck down for unconstitutionally infringing rights to free speech ( Crouch v. Snell, 2015). This highlights the extent to which legislators and agencies exercising legislative authority in this area must be conscious of acting in ways that respect applicable speech protections while effectively addressing instances of technology-facilitated violence.

The critiques and challenges considered here are not exhaustive. Further research and analysis about both the benefits and limitations of these bodies (particularly from the perspectives of the survivors whom they are supposed to serve) is warranted. However, the work of the bodies discussed in this chapter offers promising evidence of their potential to advance, even if imperfectly, a survivor-centered approach.

Best Practices for Centralized and Statutorily Empowered Bodies

Centralized and statutorily empowered bodies merit careful consideration by governments as one aspect of a broader survivor-centered approach to technology-facilitated violence against women. The experiences of these bodies suggest a number of best practices to consider in creating or empowering such a body.

  • An express mandate to address technology-facilitated violence against women: At least initially, many of the bodies discussed in this chapter were designed for and had as their primary focus combatting the abuse of children online. An explicit mandate that addresses the abuse of women and specifies the ways in which the body is empowered to respond to it (e.g., through education, research, and intervention in resolving disputes) will help ensure the body's focus remains on this work.

  • Survivor-centered by design: While this chapter identifies and evaluates these bodies against indicia of a survivor-centered approach to technology-facilitated violence against women, none appear to have been designed with this explicitly in mind. Integrating a survivor-centered approach in the design of such bodies and/or their mandates will help ensure their effectiveness in contributing to this project is maximized. For example, in its submissions on Australia's current review of online safety laws, the Australian Women Against Violence Alliance recommends that an intersectional gender lens be embedded in policy and legislation aimed at responding to online abuse (Andrew, 2020, p. 4). Governments can also ensure women with experience in the field of gender-based technology-facilitated violence are involved in designing the body and/or its mandate.

  • Adequate funding relative to agency mandate: In multiple contexts, the need has been expressed to adequately fund centralized agencies to properly carry out the statutory mandates given to them (Briggs, 2018; Law Commission, 2012; Legislative Assembly of Manitoba, 2015). Ensuring that agencies are properly funded is relevant to ensuring they are able to advance a survivor-centered approach.

  • Regular review of statutory mandate and operations: Most of these bodies' enabling statutes provide for a review a set period of time after the statute comes into force (EOS Act, s. 107; IIPA, 2015, s. 17; IICPA, 2017, s. 14). Conducting a review of the statutory scheme, as well as ensuring formal or informal review of the agency's operations informed by survivors' experiences with it, provides a mechanism to measure the extent to which an agency is advancing a survivor-centered approach. The statutory review conducted under the EOS Act has led the government to propose an expanded role for the Commissioner.

  • Ensuring relationships with external organizations: As stated above, centralized and statutorily empowered bodies are not a panacea. Where such bodies exist, it is important that they maintain relationships and collaborate with, among others, expert frontline service organizations for the benefit of survivors. An instructive example is the eSafety Commissioner's collaboration with WESNET (Australia's peak women's advocacy body working on behalf of women and children who have experienced or are experiencing domestic or family violence) on training materials rolled out at the time of the launch of eSafety Women (Office of the eSafety Commissioner, 2016).

Maintaining and leveraging relationships with technology and social media organizations is also essential to advancing survivors' interests. For example, the Office of the eSafety Commissioner has cited its productive working relationship with these companies as a key reason for its ability to get harmful material quickly removed from certain platforms (Office of the eSafety Commissioner, 2018a). At the same time, corporate mandates must be kept in check so as not to overshadow input from grassroots women's organizations or to effectively usurp the body's own authority.

These best practices are not exhaustive, but are intended to provide some foundational guidance for governments to consider in potentially designing or empowering a body with a statutory mandate to address technology-facilitated violence against women.

Conclusion

Technology-facilitated violence against women is a form of gender-based violence that causes significant and varied harms. These harms necessitate a survivor-centered approach to this conduct whose foundational aspects include intersectionality, choice, dignity and respect, prevention, and research.

Centralized bodies with legislative mandates in several jurisdictions around the world have shown a promising potential to advance this approach to the benefit of survivors, women, and society more broadly, through direct engagement in resolving incidents of violence, education, and research. While not a cure-all for this scourge of gender-based violence, their demonstrated benefits suggest that they merit careful consideration by governments as part of a holistic approach to effectively combat technology-facilitated violence against women.

Notes

1

Many women exist at the intersection of multiple identities, including trans women, women of color, and immigrant women. This chapter acknowledges that these intersecting identities impact the extent to which women may be targeted by this conduct and the severity of the harms it causes.

2

While Powell and Henry (2017, 2019) write specifically of “technology-facilitated sexual violence,” the same observation applies to conduct falling within the scope of “technology-facilitated violence against women,” as that term is used in this chapter.

References

Allen, N., Larsen, S., Trotter, J., & Sullivan, C. (2013). Exploring the core service delivery processes of an evidence-based community advocacy program for women with abusive partners. Journal of Community Psychology, 41(1), 1–18. doi:10.1002/jcop.21502

Andrew, M. (2020, February 19). Submission on the review of online safety laws. Australian Women Against Violence Alliance. Retrieved from https://www.communications.gov.au/sites/default/files/submissions/consultation_on_a_new_online_safety_act_-_submission_-_australian_women_against_violence_alliance.pdf

Bailey, J. (2013). ‘Sexualized online bullying’ through an equality lens: Missed opportunity in AB v. Bragg? McGill Law Journal, 59(3), 709–737. doi:10.7202/1025142ar

Bailey, J., & Mathen, C. (2019). Technology-facilitated violence against women and girls: Assessing the Canadian criminal law response. The Canadian Bar Review, 97(3), 664–696. Retrieved from https://cbr.cba.org/index.php/cbr/article/view/4562

Bobo, L. D., & Thompson, V. (2006). Unfair by design: The war on drugs, race, and the legitimacy of the criminal justice system. Social Research: International Quarterly, 73(2), 445–472. Retrieved from https://muse.jhu.edu/article/527464/pdf

Briggs, L. (2018). Report of the statutory review of the Enhancing Online Safety Act 2015 and the review of schedules 5 and 7 to the Broadcasting Services Act 1992 (online content scheme). Commonwealth of Australia. Retrieved from https://www.communications.gov.au/publications/report-statutory-review-enhancing-online-safety-act-2015-and-review-schedules-5-and-7-broadcasting

Canadian Centre for Child Protection. (2016, January 18). Statement: Canadian Centre working with the Manitoba government to help Manitobans take back control of their image. Retrieved from https://www.cybertip.ca/app/en/media_release_201601_statement_ncdii. Accessed on May 7, 2020.

Canadian Centre for Child Protection. (2019). The power of eleven: 2018–19 social value report. Retrieved from https://www.protectchildren.ca/pdfs/C3P_SocialValueReport_2018-2019_en.pdf

CBC News. (2013, August 7). N.S. cyberbullying legislation allows victims to sue. Retrieved from https://www.cbc.ca/news/canada/nova-scotia/n-s-cyberbullying-legislation-allows-victims-to-sue-1.1307338. Accessed on June 8, 2020.

Chesney, B., & Citron, D. (2019). Deep fakes: A looming challenge for privacy, democracy, and national security. California Law Review, 107(6), 1753–1820. doi:10.15779/Z38RV0D15J

Citron, D. (2014). Hate crimes in cyberspace. Cambridge, MA: Harvard University Press.

Crouch v. Snell, 2015 NSSC 340.

Cyber-safety Act, S.N.S. 2013, c. 2.

CyberScan. (n.d.). What you need to know about the Intimate Images & Cyber-Protection Act. Retrieved from https://novascotia.ca/cyberscan/documents/What%20You%20Need%20To%20Know%20about%20the%20Intimate%20Images%20and%20Cyber-Protection%20Act.pdf

Douglas, H., Harris, B., & Dragiewicz, M. (2019). Technology-facilitated domestic and family violence: Women's experiences. The British Journal of Criminology, 59(3), 551–570. doi:10.1093/bjc/azy068

Enhancing Online Safety Act 2015 (Cth).

Fast, E., & Collin-Vézina, D. (2010). Historical trauma, race-based trauma, and resilience of Indigenous peoples: A literature review. First Peoples Child & Family Review, 5(1), 126–136.

Flynn, A., & Henry, N. (2019). Image-based sexual abuse: An Australian reflection. Women & Criminal Justice, 1–14. doi:10.1080/08974454.2019.1646190

Freed, D., Palmer, J., Minchala, D., Levy, K., Ristenpart, T., & Dell, N. (2017). Digital technologies and intimate partner violence: A qualitative analysis with multiple stakeholders. PACM: Human-Computer Interaction: Computer-Supported Cooperative Work and Social Computing (CSCW), 1(2), Article 46. doi:10.1145/3134681

Goodman, L., & Epstein, D. (2008). Listening to battered women: A survivor-centered approach to advocacy. Washington, DC: American Psychological Association.

Government of Australia. (2014). Enhancing online safety for children: Public consultation on key election commitments, January 2014.

Government of New Zealand. (2016, June 1). NetSafe appointed to cyberbullying role. Retrieved from https://www.beehive.govt.nz/release/netsafe-appointed-cyberbullying-role

Harmful Digital Communications Act 2015.

Intimate Images and Cyber-Protection Act, S.N.S. 2017, c. 7.

Intimate Image Protection Regulation, Man. Reg. 3/2016, ss. 2–3.

Kubinec, V. (2018, April 27). More than 1,300 Manitobans seek help after intimate images shared. CBC News. Retrieved from https://www.cbc.ca/news/canada/manitoba/revenge-porn-help-online-1.4637615

Law Commission. (2012, August). Harmful digital communications: The adequacy of the current sanctions and remedies. Wellington: Ministerial Briefing Paper. Retrieved from https://www.lawcom.govt.nz/sites/default/files/projectAvailableFormats/NZLC%20MB3.pdf

Legislative Assembly of Manitoba. (2015, March 26). Hansard debates and proceedings of the Legislative Assembly of Manitoba. Retrieved from https://www.gov.mb.ca/legislature/hansard/40th_4th/hansardpdf/71.pdf

Marganski, A., & Melander, L. (2018). Intimate partner violence victimization in the cyber and real world: Examining the extent of cyber aggression experiences and its association with in-person dating violence. Journal of Interpersonal Violence, 33(7), 1071–1095. doi:10.1177/0886260515614283

Mortreux, C., Kellard, M., Henry, N., & Flynn, A. (2019). Understanding the attitudes and motivations of adults who engage in image-based abuse. eSafety Commissioner. Retrieved from https://www.esafety.gov.au/sites/default/files/2019-10/Research_Report_IBA_Perp_Motivations.pdf

Netsafe. (2018, March 20). What is image based abuse? Retrieved from https://www.netsafe.org.nz/image-based-abuse/

Netsafe. (2019a). Annual report 2018/2019. Retrieved from https://www.netsafe.org.nz/wp-content/uploads/2019/12/2019-Annual-Report-R174WEB-1.pdf

Netsafe. (2019b). Image-based sexual abuse: A snapshot of New Zealand adults' experiences. Retrieved from https://www.netsafe.org.nz/wp-content/uploads/2019/01/IBSA-report-2019_Final.pdf

Netsafe. (2019c, July 1). What is the HDCA? Retrieved from https://www.netsafe.org.nz/what-is-the-hdca/. Accessed on May 7, 2020.

Netsafe. (2020, April 17). Staying safe online: Quick reference guide. Retrieved from https://www.netsafe.org.nz/staying-safe-online/

Netsafe. (n.d.). Make a report. Retrieved from https://report.netsafe.org.nz/hc/en-au/requests/new. Accessed on May 7, 2020.

Nova Scotia House of Assembly. (2013, April 26). Hansard debates and proceedings of the House of Assembly. Retrieved from https://nslegislature.ca/legislative-business/hansard-debates/assembly-61-session-5/house_13apr26

Nova Scotia. (2018, July 5). Intimate images and cyber-protection legislation proclaimed. Retrieved from https://novascotia.ca/news/release/?id=20180705004

Nova Scotia. (n.d.). Cyber-safety Act. Retrieved from https://novascotia.ca/just/global_docs/Cyberbullying_EN.pdf. Accessed on May 7, 2020.

Nova Scotia Provincial Sexual Violence Prevention Committee. (2019). Guidelines and recommendations for Nova Scotia universities and the Nova Scotia Community College: Development of survivor-centric sexual violence policies and responses. Retrieved from https://novascotia.ca/lae/pubs/docs/development-of-survivor-centric-sexual-violence-policies-guidelines-for-universities-nscc.pdf

Office of the eSafety Commissioner. (2015, January 7). Children's eSafety Commissioner launches cyberbullying complaints scheme. Retrieved from https://www.esafety.gov.au/about-us/newsroom/childrens-esafety-commissioner-launches-cyberbullying-complaints-scheme. Accessed on May 7, 2020.

Office of the eSafety Commissioner. (2016). Giving women tech tools to take control. Retrieved from https://www.esafety.gov.au/about-us/newsroom/giving-women-tech-tools-take-control

Office of the eSafety Commissioner. (2018a). Annual report 2017–2018. Retrieved from https://www.esafety.gov.au/sites/default/files/2019-07/ACMA_OeSC_AR2017_18.pdf

Office of the eSafety Commissioner. (2018b, October 10). Acclaimed program empowers and protects victims of domestic violence. Retrieved from https://www.esafety.gov.au/about-us/newsroom/acclaimed-program-empowers-and-protects-victims-domestic-violence

Office of the eSafety Commissioner. (2019). Annual report 2018–2019. Retrieved from https://www.esafety.gov.au/sites/default/files/2019-10/ACMA_and_eSafety_annual_reports_2018_19.pdf

Office of the eSafety Commissioner. (n.d.). Law enforcement: Presentations about online safety tailored for law enforcement workers. Retrieved from https://www.esafety.gov.au/educators/training-for-professionals/law-enforcement

Powell, A., & Henry, N. (2017). Sexual violence in a digital age. Melbourne, VIC: Macmillan Publishers.

Powell, A., & Henry, N. (2019). Technology-facilitated sexual violence victimization: Results from an online survey of Australian adults. Journal of Interpersonal Violence, 34(17), 3637–3665. doi:10.1177/0886260516672055

Reichert, C. (2017, June 20). Australian Parliament passes eSafety expansion Bill. ZDNet. Retrieved from https://www.zdnet.com/

Soueid, M., Willhoite, A., & Sovcik, A. E. (2017). The survivor-centered approach to transitional justice: Why trauma-informed handling of witness testimony is a necessary component. George Washington International Law Review, 50(1), 125–180.

Spangler, D., & Brandl, B. (2007). Abuse in later life: Power and control dynamics and a victim-centered response. Journal of the American Psychiatric Nurses Association, 12(6), 322–331. doi:10.1177/1078390306298878

The Intimate Image Protection Act, C.C.S.M. c. I87 (2015).

UN High Commissioner for Refugees. (2016). SGBV prevention and response: Training package. Retrieved from https://www.unhcr.org/publications/manuals/583577ed4/sgbv-prevention-response-training-package.html

UN Security Council. (2019). Security Council Resolution 2467. Adopted on April 23, 2019. Retrieved from https://www.securitycouncilreport.org/atf/cf/%7B65BFCF9B-6D27-4E9C-8CD3-CF6E4FF96FF9%7D/s_res_2467.pdf

UN Women Virtual Knowledge Centre to End Violence Against Women and Girls. (2011, February 25). Survivor-centered approach. Retrieved from http://www.endvawnow.org/en/articles/652-survivor-centred-approach.html

Wong, R. (2019). A guide for Canadian women experiencing technology-facilitated violence: Strategies for enhancing safety. BC Society of Transition Houses. Retrieved from https://bcsth.ca/wp-content/uploads/2019/03/BCSTH-A-guide-for-Canadian-women-experiencing-technology-facilitated-violence-2019-1.pdf

Woodlock, D. (2015). ReCharge: Women's technology safety, legal resources, research and training. Women's Legal Service NSW, Domestic Violence Resource Centre Victoria and WESNET, Collingwood. Retrieved from https://www.dvrcv.org.au/sites/default/files/ReCharge_0.pdf

Woodlock, D. (2017). The abuse of technology in domestic violence and stalking. Violence Against Women, 23(5), 584–602. doi:10.1177/1077801216646277

Acknowledgments

I would like to thank Jane Bailey and Suzanne Dunn, both of the University of Ottawa, for reviewing and providing helpful comments on drafts of this chapter.
