Search results

1 – 10 of over 1000
Open Access
Article
Publication date: 1 April 2024

Basmah Almekhled and Helen Petrie

This study investigated the attitudes and concerns of Saudi higher educational institution (HEI) academics about privacy and security in online teaching during the COVID-19…

Abstract

Purpose

This study investigated the attitudes and concerns of Saudi higher educational institution (HEI) academics about privacy and security in online teaching during the COVID-19 pandemic.

Design/methodology/approach

An online questionnaire was designed to explore Saudi HEI academics’ attitudes and concerns about privacy and security issues in online teaching. The questionnaire asked about attitudes and concerns held before the pandemic and since the pandemic, and included four sections. At the beginning of the questionnaire, participants were asked what the phrase “online privacy and security” meant to them, to gain an initial understanding of what it meant to academics. A definition of what we intended by the phrase for the purposes of the survey was then provided: “that a person’s data, including their identity, is not accessible to anyone other than themselves and others whom they have authorised and that their computing devices work properly and are free from unauthorised interference” (based on our reading of a range of sources, e.g. Schatz et al., 2017; Steinberg, 2019; NCS; Windley, 2005). This was to ensure that participants understood what we were asking about in subsequent sections.

Findings

The findings provide several key insights into Saudi HEI academics’ attitudes and concerns about privacy and security in online teaching during the COVID-19 pandemic.

Key aspects of online privacy and security for Saudi HEI academics: Saudi HEI academics’ notion of online privacy and security centres on the protection of personal data, preventing unauthorized access to data and ensuring the confidentiality and integrity of data. This underscores the significance of robust measures to safeguard sensitive information in online teaching, but also the need to make academics aware of other aspects of online privacy and security.

Potential to improve policies and training about online privacy and security in Saudi HEIs: Although many participants were aware of their HEI’s online privacy and security policies, only a small percentage had received training in this area. Thus, there is a need to improve the development and dissemination of policies, to provide academics with appropriate training in this area and to encourage them to take the training available.

Use of videoconferencing and chat technologies and cultural sensitivities: The study highlighted moderate levels of concern among Saudi HEI academics about the use of videoconferencing and online chat technologies, as well as concerns about cultural factors around the use of these technologies. This emphasizes the need for online teaching, and the growing use of technologies in such teaching, to respect cultural norms and preferences, underlining the importance of fostering a culturally sensitive approach to technology deployment and use.

Surprisingly low webcam use: An unexpected finding is the low use of webcams by both academics and students during online teaching sessions, prompting a need for a deeper understanding of the dynamics of webcam engagement in such sessions. This calls for a re-evaluation of the effectiveness of webcam use in the teaching process and underscores the importance of exploring methods for enhancing engagement and interaction in online teaching.

In summary, this paper investigated Saudi HEI academics’ attitudes and concerns about privacy and security in online teaching during the COVID-19 pandemic. The study reveals areas where further research and policy development can enhance the online teaching experience. As the education landscape continues to evolve, institutions must remain proactive in addressing the concerns of their academics while fostering a culturally sensitive approach to technology deployment.

Research limitations/implications

One limitation of this study is the relatively small qualitative data sample, although the sample of 36 academics from various Saudi Arabian HEIs was adequate for quantitative analysis. It was necessary to make most of the open-ended questions optional – participants did not have to answer about concerns if they did not want to, as we did not want to make the questionnaire too long and onerous to complete. Consequently, the number of academics responding to the open-ended questions was limited, emphasizing the need for additional data and alternative research methods to investigate these issues further. The study focused on the concerns of Saudi HEI academics, recognizing that the attitudes and concerns of academics in other countries may differ. Furthermore, the research also includes an exploration of the changes in academics’ attitudes and concerns before and since the COVID-19 pandemic, which will be the subject of further data analysis.

Originality/value

This research delves into Saudi HEI academics' perceptions and concerns regarding privacy and security in online education during the COVID-19 pandemic. Notably, it highlights the moderate priority placed on online privacy and security, the unexpectedly low usage of webcams and the potential for enhancing policies and training. The study emphasizes the necessity for comprehensive measures to protect sensitive data and the importance of tailored policies for educators. It also underscores the need for a more nuanced understanding of webcam usage dynamics, offering valuable insights for institutions aiming to improve online education and address educators' concerns amidst evolving educational landscapes.

Open Access
Article
Publication date: 1 February 2024

Meenakshi Handa, Ronika Bhalla and Parul Ahuja

Increasing incidents of privacy invasion on social networking sites (SNS) are intensifying the concerns among stakeholders about the misuse of personal data. However, there seems…

Abstract

Purpose

Increasing incidents of privacy invasion on social networking sites (SNS) are intensifying the concerns among stakeholders about the misuse of personal data. However, there seems to be limited research on exploring the impact of specific privacy concerns on users’ intention to engage in various privacy protection behaviors. This study aims to examine the role of social privacy concerns, institutional privacy concerns and privacy self-efficacy as antecedents of privacy protection–related control activities intention among young adults active on SNS.

Design/methodology/approach

Data collected from 284 young adults active on SNS were analyzed through partial least squares structural equation modeling (PLS-SEM) using SmartPLS.
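The abstract names the analysis technique (PLS-SEM in SmartPLS) and the constructs, but not the item-level model or any code. As a rough, non-authoritative sketch of the structural part of such a model, the snippet below builds composite scores from synthetic Likert items and estimates the paths with ordinary least squares; this composite-score shortcut is a simplification of PLS-SEM, and all item names and data are hypothetical.

```python
# Simplified sketch of the structural model described in the abstract (not actual PLS-SEM):
# composite scores stand in for latent-variable scores, and OLS estimates the paths.
# All item names and the synthetic data are hypothetical.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(42)
n = 284  # sample size reported in the abstract

def likert_items(prefix, k=3):
    """Generate k synthetic 7-point Likert items for one construct."""
    return {f"{prefix}{i}": rng.integers(1, 8, n) for i in range(1, k + 1)}

df = pd.DataFrame({**likert_items("soc_pc"),    # social privacy concerns
                   **likert_items("inst_pc"),   # institutional privacy concerns
                   **likert_items("self_eff"),  # privacy self-efficacy
                   **likert_items("intent")})   # control activities intention

# Composite (mean) scores as stand-ins for latent-variable scores
for construct in ["soc_pc", "inst_pc", "self_eff", "intent"]:
    cols = [c for c in df.columns if c.startswith(construct)]
    df[construct] = df[cols].mean(axis=1)

# Structural paths: the three antecedents predicting control activities intention
paths = smf.ols("intent ~ soc_pc + inst_pc + self_eff", data=df).fit()
print(paths.summary())
```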

Findings

The results indicate that institutional privacy concerns, social privacy concerns and privacy self-efficacy positively influence the control activities intention of SNS users. The extent of privacy self-efficacy and privacy protection-related control activities intention differs among users based on gender.

Research limitations/implications

This study is limited to a population of young adults in the age group of 18–25 years.

Practical implications

The findings of this study form the basis for specific recommendations addressing the different types of privacy concerns experienced by social media users, promoting responsible privacy control behaviors on online platforms and discouraging the possible misuse of information by third parties.

Originality/value

This study validates a theoretical framework that can contribute to future investigations concerning the use of SNS. The study findings form the basis for a set of practical recommendations for policymakers, SNS platforms and users.

Details

Vilakshan - XIMB Journal of Management, vol. 21 no. 1
Type: Research Article
ISSN: 0973-1954

Keywords

Open Access
Article
Publication date: 14 February 2024

Chao Lu and Xiaohai Xin

The promotion of autonomous vehicles introduces privacy and security risks, underscoring the pressing need for responsible innovation implementation. To more effectively address…

Abstract

Purpose

The promotion of autonomous vehicles introduces privacy and security risks, underscoring the pressing need for responsible innovation implementation. To more effectively address the societal risks posed by autonomous vehicles, considering collaborative engagement of key stakeholders is essential. This study aims to provide insights into the governance of potential privacy and security issues in the innovation of autonomous driving technology by analyzing the micro-level decision-making processes of various stakeholders.

Design/methodology/approach

For this study, the authors use a nuanced approach, integrating key stakeholder theory, perceived value theory and prospect theory. The study constructs an evolutionary game model of the privacy and security governance mechanism of autonomous vehicles, involving enterprises, governments and consumers.
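The abstract specifies an evolutionary game among enterprises, governments and consumers but does not reproduce the payoff matrices. The sketch below is only a generic illustration of how such a three-population game is commonly simulated with replicator dynamics; the strategy labels and all payoff parameters are placeholder assumptions, not the authors' model.

```python
# Generic three-population evolutionary game sketch (enterprises, governments, consumers),
# each mixing between two strategies; replicator dynamics with placeholder payoffs.
import numpy as np

# Placeholder payoff parameters (illustrative only)
R_e, C_e = 3.0, 1.5   # enterprise: reputational gain vs cost of privacy protection
S_g, C_g = 2.5, 1.0   # government: social benefit vs cost of active regulation
B_c, C_c = 2.0, 0.8   # consumer: benefit vs cost of actively demanding privacy

def replicator_step(x, y, z, dt=0.01):
    """One Euler step of replicator dynamics.
    x, y, z = shares of enterprises protecting privacy, governments regulating,
    and consumers actively demanding privacy."""
    # Payoff advantage of the 'cooperative' strategy in each population
    # (simple linear cross-population interactions, purely illustrative)
    adv_e = R_e * (y + z) / 2 - C_e   # enterprises respond to government + consumer pressure
    adv_g = S_g * (1 - x) - C_g       # regulation pays off more when firms do not protect
    adv_c = B_c * (1 - x) - C_c       # demanding privacy pays off more when firms do not protect
    x += dt * x * (1 - x) * adv_e
    y += dt * y * (1 - y) * adv_g
    z += dt * z * (1 - z) * adv_c
    return x, y, z

x, y, z = 0.2, 0.5, 0.5  # initial strategy shares
for _ in range(20000):
    x, y, z = replicator_step(x, y, z)
print(f"strategy shares after simulation: enterprises={x:.2f}, governments={y:.2f}, consumers={z:.2f}")
```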

Findings

The governance of privacy and security in autonomous driving technology is influenced by key stakeholders’ decision-making behaviors and by pivotal factors such as perceived value. The study finds that the government is influenced to a lesser extent by the decisions of other stakeholders, and that factors contributing to perceived value, such as the risk preference coefficient, have a more significant influence than appearance factors such as participation costs.

Research limitations/implications

This study lacks an investigation into the risk sensitivity of various stakeholders in different scenarios.

Originality/value

The study delineates the roles and behaviors of key stakeholders and contributes valuable insights toward addressing pertinent risk concerns within the governance of autonomous vehicles. Through the study, the practical application of Responsible Innovation theory has been enriched, addressing shortcomings in the analysis of micro-level processes within the evolutionary game framework.

Details

Asia Pacific Journal of Innovation and Entrepreneurship, vol. ahead-of-print no. ahead-of-print
Type: Research Article
ISSN: 2071-1395

Keywords

Open Access
Article
Publication date: 15 January 2024

Christine Prince, Nessrine Omrani and Francesco Schiavone

Research on online user privacy shows that empirical evidence on how privacy literacy relates to users' information privacy empowerment is missing. To fill this gap, this paper…


Abstract

Purpose

Research on online user privacy shows that empirical evidence on how privacy literacy relates to users' information privacy empowerment is missing. To fill this gap, this paper investigated the respective influence of two primary dimensions of online privacy literacy – namely declarative and procedural knowledge – on online users' information privacy empowerment.

Design/methodology/approach

An empirical analysis is conducted using a dataset collected in Europe. This survey was conducted in 2019 among 27,524 respondents representative of the European population.

Findings

The main results show that users' procedural knowledge is positively linked to users' privacy empowerment. The relationship between users' declarative knowledge and users' privacy empowerment is partially supported. While greater awareness of firms' and organizations' practices regarding data collection and further use was found to be significantly associated with increased users' privacy empowerment, the results unexpectedly revealed that awareness of the GDPR and users' privacy empowerment are negatively associated. The empirical findings also reveal that greater online privacy literacy is associated with heightened users' information privacy empowerment.

Originality/value

While a few studies have made systematic efforts to measure changes that have occurred on websites since GDPR enforcement, it remains unclear how individuals perceive, understand and apply the GDPR rights/guarantees, and how likely these are to strengthen users' information privacy control. Therefore, this paper contributes empirically to understanding how online users' privacy literacy, shaped by both declarative and procedural knowledge, is likely to affect users' information privacy empowerment. The study empirically investigates the effectiveness of the GDPR in raising users' information privacy empowerment from a user-based perspective. Results stress the importance of greater transparency of the data tracking and processing decisions made by online businesses and services to strengthen users' control over information privacy. Study findings also put emphasis on the crucial need for more educational efforts to raise users' awareness of the GDPR rights/guarantees related to data protection. Empirical findings also show that users who are more likely to adopt self-protective approaches to reinforce personal data privacy are more likely to perceive greater control over personal data. A broad implication of this finding for practitioners and e-businesses is the need to empower users with adequate privacy protection tools to ensure more confidential transactions.

Details

Information Technology & People, vol. 37 no. 8
Type: Research Article
ISSN: 0959-3845

Keywords

Open Access
Book part
Publication date: 9 December 2021

Paul Spicker

The received wisdom underlying many guides to ethical research is that information is private, and research is consequently seen as a trespass on the private sphere. Privacy…

Abstract

The received wisdom underlying many guides to ethical research is that information is private, and research is consequently seen as a trespass on the private sphere. Privacy demands control; control requires consent; consent protects privacy. This is not wrong in every case, but it is over-generalised. The distorted perspective leads to some striking misinterpretations of the rights of research participants, and the duties of researchers. Privacy is not the same thing as data protection; consent is not adequate as a defence of privacy; seeking consent is not always required or appropriate. Beyond that, the misinterpretation can lead to conduct which is unethical, limiting the scope of research activity, obstructing the flow of information in a free society, and failing to recognise what researchers’ real duties are.

Details

Ethical Issues in Covert, Security and Surveillance Research
Type: Book
ISBN: 978-1-80262-414-4

Keywords

Open Access
Book part
Publication date: 4 June 2021

Kristen Thomasen and Suzie Dunn

Perpetrators of technology-facilitated gender-based violence are taking advantage of increasingly automated and sophisticated privacy-invasive tools to carry out their abuse…

Abstract

Perpetrators of technology-facilitated gender-based violence are taking advantage of increasingly automated and sophisticated privacy-invasive tools to carry out their abuse. Whether this be monitoring movements through stalkerware, using drones to nonconsensually film or harass, or manipulating and distributing intimate images online such as deepfakes and creepshots, invasions of privacy have become a significant form of gender-based violence. Accordingly, our normative and legal concepts of privacy must evolve to counter the harms arising from this misuse of new technology. Canada's Supreme Court recently addressed technology-facilitated violations of privacy in the context of voyeurism in R v Jarvis (2019). The discussion of privacy in this decision appears to be a good first step toward a more equitable conceptualization of privacy protection. Building on existing privacy theories, this chapter examines what the reasoning in Jarvis might mean for “reasonable expectations of privacy” in other areas of law, and how this concept might be interpreted in response to gender-based technology-facilitated violence. The authors argue the courts in Canada and elsewhere must take the analysis in Jarvis further to fully realize a notion of privacy that protects the autonomy, dignity, and liberty of all.

Details

The Emerald International Handbook of Technology-Facilitated Violence and Abuse
Type: Book
ISBN: 978-1-83982-849-2

Keywords

Open Access
Article
Publication date: 29 December 2023

Priya C. Kumar

This article advocates that privacy literacy research and praxis mobilize people toward changing the technological and social conditions that discipline subjects toward advancing…

Abstract

Purpose

This article advocates that privacy literacy research and praxis mobilize people toward changing the technological and social conditions that discipline subjects toward advancing institutional, rather than community, goals.

Design/methodology/approach

This article analyzes theory and prior work on datafication, privacy, data literacy, privacy literacy and critical literacy to provide a vision for future privacy literacy research and praxis.

Findings

This article (1) explains why privacy is a valuable rallying point around which people can resist datafication, (2) locates privacy literacy within data literacy, (3) identifies three ways that current research and praxis have conceptualized privacy literacy (i.e. as knowledge, as a process of critical thinking and as a practice of enacting information flows) and offers a shared purpose to animate privacy literacy research and praxis toward social change and (4) explains how critical literacy can help privacy literacy scholars and practitioners orient their research and praxis toward changing the conditions that create privacy concerns.

Originality/value

This article uniquely synthesizes existing scholarship on data literacy, privacy literacy and critical literacy to provide a vision for how privacy literacy research and praxis can go beyond improving individual understanding and toward enacting social change.

Details

Information and Learning Sciences, vol. ahead-of-print no. ahead-of-print
Type: Research Article
ISSN: 2398-5348

Keywords

Open Access
Book part
Publication date: 4 June 2021

Ari Ezra Waldman

Mobile dating apps are widely used in the queer community. Whether for sexual exploration or dating, mobile and geosocial dating apps facilitate connection. But they also bring…

Abstract

Mobile dating apps are widely used in the queer community. Whether for sexual exploration or dating, mobile and geosocial dating apps facilitate connection. But they also bring attendant privacy risks. This chapter is based on original research about the ways gay and bisexual men navigate their privacy on geosocial dating apps geared toward the LGBTQI community. It argues that, contrary to the conventional wisdom that people who share semi-nude or nude photos do not care about their privacy, gay and bisexual users of geosocial dating apps care very much about their privacy and engage in complex, overlapping privacy navigation techniques when sharing photos. They share semi-nude and nude photos for a variety of reasons, but generally do so only after building organic trust with another person. Because trust can easily break down without supportive institutions, this chapter argues that law and design must help individuals protect their privacy on geosocial dating apps.

Details

The Emerald International Handbook of Technology-Facilitated Violence and Abuse
Type: Book
ISBN: 978-1-83982-849-2

Keywords

Open Access
Article
Publication date: 2 January 2024

Renata Monteiro Martins, Sofia Batista Ferraz and André Francisco Alcântara Fagundes

This study aims to propose an innovative model that integrates variables and examines the influence of internet usage expertise, perceived risk and attitude toward information…

Abstract

Purpose

This study aims to propose an innovative model that integrates variables and examines the influence of internet usage expertise, perceived risk and attitude toward information control on privacy concerns (PC) and, consequently, on consumers’ willingness to disclose personal information online. The authors also propose to test the mediating role of trust between PC and willingness to disclose information. Trust is treated not as a predictor of PC but as a causal mechanism, considering that the focus is on understanding consumers’ attitudes and behavior regarding the virtual environment in general (not context-specific) (Martin, 2018).

Design/methodology/approach

The authors developed a survey questionnaire based on the constructs that compose the proposed model to collect data from 864 respondents. The survey questionnaire included the following scales: internet usage expertise from Ohanian (1990); perceived risk, attitude toward information control, trust and willingness to disclose personal information online from Malhotra et al. (2004); and PC from Castañeda and Montoro (2007). All items were measured on a seven-point Likert scale (1 = totally disagree; 7 = totally agree). To obtain Westin’s attitudinal categories toward privacy, respondents answered Westin’s three-item privacy index. For data analysis, the authors applied covariance-based structural equation modeling.
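The abstract does not state which SEM software the authors used. As a hedged illustration of a covariance-based SEM workflow, the sketch below specifies a simplified measurement and structural model (PC, trust, willingness to disclose) in lavaan-style syntax using the open-source semopy package; the items, synthetic data and path specification are illustrative only, not the authors' exact model.

```python
# Minimal covariance-based SEM sketch (semopy, lavaan-style syntax).
# Constructs follow the abstract (privacy concerns, trust, willingness to disclose),
# but the items, the data and the path specification are illustrative only.
import numpy as np
import pandas as pd
import semopy

rng = np.random.default_rng(0)
n = 864  # sample size reported in the abstract

# Hypothetical 7-point Likert items (three per construct), synthetic data
pc = rng.integers(1, 8, (n, 3))
tr = rng.integers(1, 8, (n, 3))
wd = rng.integers(1, 8, (n, 3))
df = pd.DataFrame(np.hstack([pc, tr, wd]),
                  columns=["pc1", "pc2", "pc3", "tr1", "tr2", "tr3", "wd1", "wd2", "wd3"])

model_desc = """
PC =~ pc1 + pc2 + pc3
TRUST =~ tr1 + tr2 + tr3
WILLING =~ wd1 + wd2 + wd3
TRUST ~ PC
WILLING ~ TRUST + PC
"""

model = semopy.Model(model_desc)
model.fit(df)
print(model.inspect())           # parameter estimates
print(semopy.calc_stats(model))  # fit indices (CFI, RMSEA, etc.)
```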

Findings

First, the proposed model explains the drivers of consumers’ disposition to provide personal information at a level that surpasses specific contexts (Martin, 2018), bringing the analysis to the consumer level and considering their general perceptions toward data privacy. Second, the findings provide inputs to propose a better definition of Westin’s attitudinal categories toward privacy, which used to be defined only by individuals’ information privacy perception. Consumers’ perceptions about their abilities in using the internet, the risks, their beliefs toward information control and trust also help to delimit and distinguish the fundamentalists, the pragmatics and the unconcerned.

Research limitations/implications

Some limitations weigh on the theoretical and practical implications of this study. The sample size of pragmatic and unconcerned respondents was substantially smaller than that of fundamentalists. This might be explained by the application of Westin’s self-report index to classify the groups according to their scores on PC. Most individuals affirm having a great concern for their data privacy but still provide online information for the benefit of personalization – known as the privacy paradox (Zeng et al., 2021). This leads to another limitation of this research: the lack of measures that classify respondents by their actual behavior toward privacy.

Practical implications

PC emerges as an important predictor of consumer trust and willingness to disclose their data online, and trust also influences this disposition. Managers need to implement actions that effectively reduce consumers’ concerns about privacy and increase their trust in the company – e.g. adopting a clear and transparent policy on how the data collected is stored, treated, protected and used to benefit the consumer. Regarding the perception of risk, if managers convince consumers that the data collected on the internet is protected, they tend to be less concerned about privacy.

Social implications

The results suggest different aspects influencing the willingness to disclose personal information online, including different responses considering consumers’ PC. The authors understand that governments, through their policies and legislation, must be attentive to this aspect, establishing regulations that protect consumers’ data in the virtual environment. In addition to regulatory policies, education campaigns can be carried out for both consumers and managers to raise the discussion about privacy and the availability of information in the online environment, demonstrating the importance of protecting personal data to benefit the government, consumers and organizations.

Originality/value

Although there is increasing research on consumers’ privacy, studies have not considered their attitudinal classifications – high, moderate and low concern – as moderators of willingness to disclose information online. Researchers have also increased attention to the antecedents of PCs and disclosure of information but overlooked possible mechanisms that explain the relationship between them.

Details

RAUSP Management Journal, vol. 59 no. 1
Type: Research Article
ISSN: 2531-0488

Keywords

Open Access
Article
Publication date: 19 July 2023

Magnus Söderlund

Service robots are expected to become increasingly common, but the ways in which they can move around in an environment with humans, collect and store data about humans and share…


Abstract

Purpose

Service robots are expected to become increasingly common, but the ways in which they can move around in an environment with humans, collect and store data about humans and share such data produce a potential for privacy violations. In human-to-human contexts, such violations are transgressions of norms to which humans typically react negatively. This study examines if similar reactions occur when the transgressor is a robot. The main dependent variable was the overall evaluation of the robot.

Design/methodology/approach

Service robot privacy violations were manipulated in a between-subjects experiment in which a human user interacted with an embodied humanoid robot in an office environment.

Findings

The results show that the robot's violations of human privacy attenuated the overall evaluation of the robot and that this effect was sequentially mediated by perceived robot morality and perceived robot humanness. Given that a similar reaction pattern would be expected when humans violate other humans' privacy, the present study offers evidence in support of the notion that humanlike non-humans can elicit responses similar to those elicited by real humans.
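The sequential mediation reported here (privacy violation → perceived robot morality → perceived robot humanness → overall evaluation) is typically estimated as a product of regression coefficients with a bootstrap confidence interval. The sketch below shows that generic procedure on synthetic data; the variable names, effect sizes and data are placeholders, not the study's materials.

```python
# Generic serial-mediation sketch (violation -> perceived morality -> perceived humanness
# -> overall evaluation), estimated with OLS and a percentile bootstrap.
# Variable names and the synthetic data are placeholders, not the study's actual data.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n = 200
violation = rng.integers(0, 2, n)                        # 0 = no violation, 1 = privacy violation
morality = 5 - 1.2 * violation + rng.normal(0, 1, n)     # perceived robot morality
humanness = 4 + 0.6 * morality + rng.normal(0, 1, n)     # perceived robot humanness
evaluation = 3 + 0.5 * humanness + rng.normal(0, 1, n)   # overall evaluation of the robot
df = pd.DataFrame(dict(violation=violation, morality=morality,
                       humanness=humanness, evaluation=evaluation))

def serial_indirect(d):
    """Product of the three path coefficients along the serial mediation chain."""
    a = smf.ols("morality ~ violation", data=d).fit().params["violation"]
    b = smf.ols("humanness ~ morality + violation", data=d).fit().params["morality"]
    c = smf.ols("evaluation ~ humanness + morality + violation", data=d).fit().params["humanness"]
    return a * b * c

# Percentile bootstrap for the serial indirect effect
boot = [serial_indirect(df.sample(frac=1, replace=True)) for _ in range(2000)]
lo, hi = np.percentile(boot, [2.5, 97.5])
print(f"serial indirect effect = {serial_indirect(df):.3f}, 95% CI [{lo:.3f}, {hi:.3f}]")
```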

Practical implications

The results imply that designers of service robots and managers in firms using such robots for providing service to employees should be concerned with restricting the potential for robots' privacy violation activities if the goal is to increase the acceptance of service robots in the habitat of humans.

Originality/value

To date, few empirical studies have examined reactions to service robots that violate privacy norms.

Details

Journal of Service Theory and Practice, vol. 33 no. 7
Type: Research Article
ISSN: 2055-6225

Keywords
