Organizational insiders play a critical role in protecting sensitive information. Prior research finds that moral beliefs influence compliance decisions. Yet, it is less clear what factors influence moral beliefs and the conditions under which those factors have stronger/weaker effects. Using an ethical decision-making model and value congruence theory, this study aims to investigate how moral intensity and organizational criticality influence moral beliefs and intentions to perform information protection behaviors.
The hypotheses were tested using a scenario-based survey of 216 organizational insiders. Two of the scenarios depict low criticality information security protection behaviors and two depict high criticality behaviors.
A major finding is that users rely more on perceived social consensus and magnitude of consequences when organizational criticality is low and on temporal immediacy and proximity when criticality is high. In addition, the moral intensity dimensions explain more variance in moral beliefs when organizational criticality is low.
The study is limited by its sample, which is organizational insiders at a mid-size university. It is also limited in that it only examined four of the six moral intensity dimensions.
The findings can guide management about which moral intensity dimensions are more important to focus on when remediating tone at the top and other leadership weaknesses relating to information security.
This study adds value by investigating the separate dimensions of moral intensity on information protection behaviors. It also is the first to examine moral intensity under conditions of low and high organizational criticality.
Lankton, N.K., Stivason, C. and Gurung, A. (2019), "Information protection behaviors: morality and organizational criticality", Information and Computer Security, Vol. 27 No. 3, pp. 468-488. https://doi.org/10.1108/ICS-07-2018-0092
Copyright © 2019, Emerald Publishing Limited
A recent survey finds that 90 per cent of organizations feel vulnerable to insider security breaches, with a majority reporting at least one insider breach in the previous 12 months (Schulze, 2018). Many insider incidents result from accidents, negligence or from not complying with policies (Heimer, 2018). In fact, while employees know that protecting data is important, they may not do so if it hinders their work (Masters, 2018). Insider incidents can cause a loss of competitive data or intellectual property, and can lead to decreases in productivity, damages to equipment and other assets, and additional costs to remediate systems and core business processes (Ponemon Institute, 2016). Some estimate that insider breaches cost from $100,000 to $500,000 or higher (Schulze, 2018). This makes understanding the information protection behaviors (IPBs) of organizational insiders a priority.
Researchers have used a diverse set of theories to investigate IPBs including the theory of planned behavior, rational choice theory, protection motivation theory and deterrence theory (Moody et al., 2018). Some studies using these theories also explore the concept of morality (Sommestad et al., 2015 for a review). Morality is relevant to IPBs because employees are often faced with situations where taking the wrong action can put sensitive information at risk. Hence, researchers have studied how factors such as moral beliefs, attitudes, disengagement and intensity can affect the intention to engage in IPBs (D’Arcy and Devaraj, 2012; D’Arcy et al., 2014; Chatterjee et al., 2015).
This study builds on prior work by exploring the impact of moral intensity and organizational criticality on moral beliefs. Moral intensity is a theoretical construct that influences the ethical decision-making process (Jones, 1991). It represents the characteristics of an issue that influence whether decisions about the issue are perceived as ethically significant (Barnett, 2001; Jones, 1991). Researchers find that moral intensity can influence information security policy compliance (Crossler et al., 2017; Peslak, 2008) and other information technology behaviors (Chatterjee et al., 2015; Goles et al., 2006; Shaw, 2003).
This study contributes in two ways. First, while moral intensity is theorized to consist of multiple dimensions (Jones, 1991), information security research often treats moral intensity as one construct (Chatterjee et al., 2015; Crossler et al., 2017; Goles et al., 2006). Studies that examine the dimensions separately have limited samples (Shaw, 2003) or do not relate the dimensions to moral judgments about IPBs (Peslak, 2008). Given the number and cost of insider security incidents, additional work is needed to more fully understand how the separate dimensions of moral intensity work in an ethical decision-making context involving IPBs.
A second contribution of this work is using value congruence theory to examine how the influence of the moral intensity dimensions varies depending on the IPB’s organizational criticality or its importance to the organization (Posey et al., 2013). Aligning employee and organizational values can result in behavior that is principled, honest and upright (Meglino et al., 1989; Newman et al., 2017). When there is misalignment or incongruence, employees may not act in the best interests of the organization (Liedtka, 1989). This suggests that the moral intensity dimensions will have stronger effects on moral beliefs when there is a match between moral intensity and organizational criticality. While prior research has examined these constructs separately, none to our knowledge has investigated their interactive effects.
The hypotheses are tested using an online scenario-based survey. Two scenarios depict less critical IPBs, and two scenarios represent more critical IPBs. By comparing data from the two scenario types, this study examines the nuances of organizational criticality on individuals’ beliefs about doing the right thing. The findings offer researchers a more detailed look at moral intensity and the situations where certain dimensions of intensity are more or less influential. Practitioners can gain from the results by knowing which areas to focus on to increase security policy compliance.
This research uses the theoretical lenses of ethical decision-making theory and value congruence theory to examine how moral intensity and organizational criticality influence moral beliefs. Researchers have used other theories to examine information security issues. However, much of this work focuses on explaining intention (Moody et al., 2018). The current research examines only one predictor of intention: moral beliefs. It does so because its focus is on understanding how notions of principled behavior and the fit between employee and organizational values affect individual judgments about right and wrong, rather than on explaining intention. The following sections describe the theories used in this research.
Ethical decision-making theory
Ethical decision-making theory describes a process that involves four distinct psychological stages. Moral recognition is the first stage and means “the recognition that an ethical problem exists” (Johnson, 2007, p. 60). It involves becoming aware of how a behavior might harm or benefit others and identifying possible courses of action and the consequences of each action (Johnson, 2007). The next stage, moral judgment, refers to “formulating and evaluating which possible solutions to the moral issue have moral justification” (Lincoln and Holmes, 2011, p. 57), that is, determining whether the solutions are justly right or wrong (Johnson, 2007; Lincoln and Holmes, 2011). Moral intention is the next stage and “refers to the intention to choose the moral decision over another solution representing a different value” (Lincoln and Holmes, 2011, p. 57). Moral intention can lead to moral behavior, which is the last stage and refers to an individual’s “action in the situation” (Lincoln and Holmes, 2011, p. 57).
This study examines the moral judgment and moral intention stages of the ethical decision-making process. Consistent with other information security research, the moral judgment stage is called moral beliefs, and is defined in this study as the extent that one believes not performing an IPB is virtuous (D’Arcy and Devaraj, 2012; Hovav and D’Arcy, 2012; Vance and Siponen, 2012). The moral intention stage is referred to in this study as negative intent or the intention to not perform an IPB. This is consistent with other behavioral security researchers who examine the effects of moral beliefs and moral intensity on negative intentions such as misuse intentions (Hovav and D’Arcy, 2012; D’Arcy and Devaraj, 2012).
This study also examines moral intensity, which Jones (1991) added as an antecedent to the ethical decision-making process. He defined moral intensity as the characteristics of the issue that influence ethical decision-making. Jones (1991) theorized moral intensity as consisting of six dimensions including magnitude of consequences, temporal immediacy, proximity, social consensus, concentration of effect, and probability of effect. The strength of the moral intensity dimensions captures the imperative or significance of the issue’s principled nature (Barnett, 2001; Jones, 1991). When individuals perceive that an issue has higher moral intensity they will be more likely to recognize the issue’s moral component, judge the issue as being principled, and form intentions and take actions to do the right thing (Barnett, 2001; Jones, 1991). Likewise, if individuals perceive an issue has less moral intensity, they will be less likely to recognize the moral component, judge the issue as being right or wrong and form intentions and behaviors to do the right thing (Barnett, 2001; Jones, 1991).
Moral intensity has been examined in the information systems and security literatures. Some researchers examine moral intensity as one construct consisting of multiple items that represent all or a subset of the dimensions. For example, Chatterjee et al. (2015) use a five-item moral intensity construct and find that moral intensity influences attitudes about unethical information technology use. Similarly, Crossler et al. (2017) use a four-item scale and find that moral intensity significantly influences BYOD policy compliance intentions, and Goles et al. (2006) use a six-item scale and find that moral intensity influences perceptions of an ethical problem and intentions to act in a principled manner. Peslak (2008) also investigates moral intensity by asking subjects to agree or disagree with both a positive and a negative statement for each of five moral intensity dimensions (all but probability of effect). The author finds the significance of the positive and negative dimensions depends on the information technology use situation. Finally, surveying webmasters, Shaw (2003) finds that magnitude of consequences significantly influences attitudes about whether violating fair information practices is right or wrong, and that proximity significantly moderates the influence of social consensus on these attitudes. Based on this research, the current study is one of the first to use and analyze multiple-item scales for the separate moral intensity dimensions of IPBs.
Organizational criticality and value congruence theory
Organizations use a variety of IPBs to protect their information assets. Posey et al. (2013) developed a taxonomy that classifies IPBs along the dimensions of criticality, promotion difficulty and common sense. The current study examines criticality, which refers to how important the behavior is for employees to perform on an ongoing basis to protect an organization’s information assets (Posey et al., 2013). Critical IPBs may involve those that are generally accepted standards or protocols, are vital to the organization’s security practices, and should be practiced by all employees—not just information technology employees (Posey et al., 2013). A statistical analysis to classify low and high criticality behaviors found not discussing sensitive information in public, for example, to be a high criticality behavior and documenting system changes to be a low criticality behavior (Posey et al., 2013).
Value congruence theory explains the relationship between morality and criticality. It suggests that employees have more favorable behavioral outcomes when their values are aligned with those of other persons like a supervisor or the organization itself (Meglino et al., 1989). Value congruence is similar to person–organization fit that refers to the fit between employees’ personal values and organizational values (Chatman, 1989; Judge and Bretz, 1992). Congruence allows employees to rely on pre-determined scripts they develop based on organizational cues (Liedtka, 1989). When congruence is absent, employees must think on their own, and conflicts can easily arise (Liedtka, 1989). Prior research finds that value congruence relates to higher job satisfaction and organizational commitment (Vigoda and Cohen, 2002), and lower intentions to leave (Christiansen et al., 1997). Some studies also suggest that employees who feel their values are consistent with organizational values (i.e. person–organization fit) may engage in organizational citizenship behaviors such as protecting and defending the organization (Han et al., 2015; Judge and Bretz, 1992; Van Dyne and Pierce, 2004). This suggests that value congruence may be related to IPBs.
Value congruence theory also examines ethical decision-making among employees. Researchers discuss how organizational policies and rules can convey organizational values, which if consistent with employee values, can promote ethical awareness, intentions and behaviors (Newman et al., 2017). Research finds that value incongruence among individuals and the organization negatively impacts decision-making about doing the right thing (Liedtka, 1989). Based on these prior research findings, the current study uses value congruence theory for explaining the moderating role of criticality on ethical decision-making about IPBs.
The research model is shown in Figure 1. The first hypothesis is based on ethical decision-making theory. It predicts that moral beliefs will influence intention to not perform the protective behavior (Jones, 1991). The more individuals believe that not performing the IPB is the right (wrong) thing to do, the more (less) likely they will be to form intentions to not perform the behavior. Beliefs that an action is not honest or right can reduce intention because the beliefs can produce feelings of guilt and shame (D’Arcy and Devaraj, 2012). Likewise, beliefs that justify doing the right thing can produce self-approval, virtue, or pride. Researchers find that individuals who perceive that information systems misuse is wrong have lower misuse intentions, whereas those who feel misuse is acceptable will have higher misuse intentions (D’Arcy and Devaraj, 2012; Hovav and D’Arcy, 2012). Researchers find a similar relationship for noncompliance with information security policies, in which the more individuals believe noncompliance is acceptable and honest behavior, the higher their noncompliance intentions (Crossler et al., 2017; Vance and Siponen, 2012):
H1. Individuals who believe that not performing the IPB is ethical (unethical) will be more likely (less likely) to intend to not perform the IPB.
The next set of hypotheses predicts that the moral intensity dimensions will influence moral beliefs. We follow Barnett (2001) who, based on the work by Barnett et al. (1999), used only four of the six moral intensity dimensions (magnitude of consequences, temporal immediacy, proximity, social consensus). Barnett et al. (1999) reviewed the literature to develop approximately 75 items representing moral intensity. Exploratory and confirmatory factor analyses resulted in four dimensions with three items each that demonstrated internal consistency, unidimensionality and nomological validity. Consistent with this work and other researchers (Valentine and Hollingworth, 2012), we adopted these four dimensions to represent the main moral intensity dimensions.
Magnitude of consequences means “the sum of the harms (or benefits) done to victims (or beneficiaries) of the moral act in question” (Jones, 1991, p. 374). Some actions have minor consequences, whereas others have more significant economic or physical consequences (Barnett and Valentine, 2004). In the information security literature, the construct magnitude of consequences is similar to other concepts like severity that means the “degree of harm associated with the threat” (Herath and Rao, 2009, p. 111), and awareness of consequences that means “the awareness that an employee has regarding how their ISP-related behavior affects the welfare of their coworkers or the organization as a whole” (Yazdanmehr and Wang, 2016, p. 38). Yet these constructs according to the theoretical bases in those studies are predicted to have a direct influence on intention rather than on moral beliefs.
In the current study, magnitude of consequences is predicted to impact moral beliefs (i.e. moral judgments) as suggested by the theory of reasoned action, which describes how evaluations about a behavior are based on outcome beliefs (Fishbein and Ajzen, 1975). Applied to ethical decision-making, this suggests that judgments about whether an action is right or wrong are based on its consequences (Dubinsky and Loken, 1989). If a behavior is thought to result in harm (benefits), the behavior will be judged to be less (more) virtuous. Other researchers suggest that individuals may not realize they are faced with a decision involving right or wrong principles if they are unaware of its consequences (Van Liere and Dunlap, 1978; Yazdanmehr and Wang, 2016). Being aware of the consequences can help them evaluate the principles of the action (Yazdanmehr and Wang, 2016). Prior research finds that the greater the beliefs that the consequences of not following information security policies are severe, the more individuals feel obligated by a sense of right and wrong to comply with the protection policies (Yazdanmehr and Wang, 2016):
H2. Individuals who believe the magnitude of consequences from not performing the IPB is more severe will be less likely to believe that not performing the IPB is ethical.
Temporal immediacy is defined as the “length of time between the present and the onset of consequences of the moral act in question” (Jones, 1991, p. 376). It is the length of time between when the action occurs and when the consequences occur (Peslak, 2008). Individuals will typically pay more attention to short-term consequences than long-term consequences (Barnett, 2001), especially when evaluating situational issues like security violations in the workplace (Guo et al., 2011). This will increase moral intensity (Barnett, 2001). Time can allow situations to change and become less of a dilemma between doing the right and wrong thing, thereby decreasing moral intensity (Jones, 1991). Temporal immediacy is an important component of moral intensity that leads to less favorable attitudes toward information technology misuse (Chatterjee et al., 2015). It also significantly affects whether or not users will engage in wrongful copying of software, distributing unauthorized videos and not correcting inaccurate customer information (Peslak, 2008):
H3. Individuals who believe that the consequences from not performing the IPB are more immediate will be less likely to believe that not performing the IPB is ethical.
Proximity is defined as “the feeling of nearness (social, cultural, psychological, or physical) that the moral agent has for victims (beneficiaries) of the evil (beneficial) act in question” (Jones, 1991, p. 376). The closer individuals feel to those affected by their actions, the greater the action’s moral intensity because people care more about those that are close (Jones, 1991). For example, inappropriately accessing a customer’s personal information may have less proximity and moral intensity if the employee does not know the customer and feels far removed from them. Employees who feel closer to their organization or other parties in terms of membership, a sense of belonging, common identity, or shared values will be more likely to view behaviors that could harm the organization or other party, as unprincipled (Shaw, 2003):
H4. Individuals who believe that the consequences from not performing the IPB are more proximate will be less likely to believe that not performing the IPB is ethical.
Social consensus is defined as the “degree of social agreement that a proposed act is evil (or good)” (Jones, 1991, p. 375). People are more likely to do the right thing in a situation if they know what is right for the situation (Jones, 1991). Society agreeing that an action is honest and principled can provide reassurance, and increase moral intensity and moral beliefs (Barnett, 2001; Hofmann et al., 2007).
Similarly, information security researchers study subjective norms because people often consult others to define social reality, and to decide on appropriate behaviors (Herath and Rao, 2009; Guo et al., 2011). Employees interact with their coworkers and immediate supervisors on a daily basis and may use them or significant others as role models to decide on principled beliefs, attitudes and behaviors (Guo et al., 2011). Further, end users may need to rely on others due to a lack of information security knowledge and skills (Herath and Rao, 2009; Guo et al., 2011). Researchers find subjective norms influence attitudes and intentions toward non-malicious security violations (Guo et al., 2011), intentions to comply (Herath and Rao, 2009; Bulgurcu et al., 2010) and moral beliefs (referred to as personal norms) (Yazdanmehr and Wang, 2016). While the construct subjective norms refers to seeking agreement to perform a behavior and social consensus refers to seeking agreement that a behavior is the right thing to do, the two variables may have similar effects on moral beliefs.
H5. Individuals who believe there is social consensus that not performing the IPB is ethical are more likely to believe that not performing the IPB is ethical.
The final hypothesis predicts that criticality moderates the effects of the moral intensity dimensions on moral beliefs such that moral intensity will have stronger effects on moral beliefs if the IPB is critical to the organization. For example, the influence of magnitude of consequences on moral beliefs will be stronger if the employee perceives that performing the behavior is critical to the organization. Consistent with value congruence theory, high levels of both moral intensity and organizational criticality will provide person-organization fit. With fit, not only does the individual value the protective behavior by believing it is principled, but they also perceive the organization values the behavior. Organizational criticality will then strengthen the influence of moral intensity on moral beliefs. In other words, the interaction of these variables will increase the effects of the moral intensity dimensions on beliefs. This prediction is consistent with findings that organizational importance (i.e. criticality) of public service moderates the influence of motivation levels on behavioral evaluations (Coursey et al., 2012).
H6. The moral intensity dimensions ((a) magnitude of consequences, (b) temporal immediacy, (c) proximity and (d) social consensus) will have a stronger effect on moral beliefs for high criticality IPBs than for low criticality IPBs.
The hypotheses were tested using an online scenario-based survey. Scenarios are commonly used in information systems security research and are beneficial because they help control what respondents are exposed to, and they provide a nonintrusive and unintimidating way to study noncompliant behaviors (D’Arcy and Devaraj, 2012). The authors created four scenarios, each about a different IPB (Appendix 1). Two behaviors depict low criticality behaviors (reminding others about security protocols and notifying sender of wrongly sent e-mails that contain sensitive information), and two depict high criticality behaviors (backing up data, and properly destroying and disposing of sensitive documents). The choice of these behaviors and their level of criticality were based on a taxonomy developed by Posey et al. (2013) that classifies IPBs as having either high or low organizational criticality. Each scenario described the setting (i.e. an employee working at an insurance company that handles sensitive data) and the company policy or guideline for performing the IPB. It also discussed how the employee was not always performing the IPB, which could be interpreted by respondents as an unethical decision. The scenarios were pilot tested in various phases with approximately 40 faculty and students. Adjustments in wording were made after each test to clarify meaning. Manipulation checks for the final survey show that respondents found the low criticality IPBs depicted in scenarios one and four significantly lower in criticality than the high criticality IPBs depicted in scenarios two and three (Appendix 1).
The sample consists of faculty, staff and students mostly from a mid-Atlantic university in the USA. To obtain the faculty and staff responses, the authors received permission to e-mail the survey link to all current faculty and staff via university communications. For the student responses, extra credit was offered in eight classes, two of which were taught by the authors and seven of which were graduate classes. We did not gather data on whether the respondents were students or employees (faculty or staff), but instead used work experience and information security awareness as proxies for whether the respondents had more or less knowledge about data security issues. The amount of real-world knowledge is usually a concern when using student samples. Using work experience and information security awareness, rather than student status, as proxies for this knowledge seems appropriate for our study because the majority of the student sample were graduate students who probably had at least some, and perhaps considerable, work experience. In addition, employees, especially staff, could conceivably have had less work experience than the students. By measuring work experience and information security awareness and including them in the model as control variables, we were able to determine any effects they might have on moral beliefs and intention. The resulting sample was 68 per cent female, 38 years old on average and had average work experience (4.03/7.00) and high information security awareness (5.82/7.00) (Table I).
Upon clicking on the survey link, respondents saw a consent form. After agreeing to take the survey by clicking on the consent button, each respondent saw one of the four scenarios. After reading the scenario, respondents were asked to answer questions pertaining to it. The questionnaire items were mostly adapted from prior research, including items for moral intensity (Barnett, 2001), intention to not perform the protective behavior (D’Arcy et al., 2009) and moral beliefs (Barnett, 2001; D’Arcy et al., 2009). Item wording was changed to reflect the particular IPB for each scenario. For example, for the first intention item we changed the D’Arcy et al. (2009) item: “If you were Taylor, what is the likelihood that you would have sent the e-mail?” to “If you were Jamie, what is the likelihood that you wouldn’t bother to back up important data and documents on a regular basis?” While criticality was analyzed based on the scenario respondents received, the authors developed a four-item measure of perceived criticality as a manipulation check. The items were based on the definition of organizational criticality from Posey et al. (2013), which refers to how critical a behavior is for protecting an organization’s information assets. Four semantic endpoints based on the term critical and its synonyms were used for the response scales. Data were also collected for three control variables, with items adapted for information security awareness (Bulgurcu et al., 2010) and one-item scales used to measure gender and work experience. The survey items were pilot tested along with the scenarios, and all constructs demonstrated adequate reliability.
In all, 241 responses were received; 25 responses were eliminated for being incomplete, for having the same numbered response for all items, or for having short completion times. The resulting sample size is 216, with 108 respondents completing the survey for the high criticality scenarios and 108 for the low criticality scenarios. An a priori power analysis resulted in a minimum required sample size of 103, meaning the sample sizes of 108 for each criticality group are large enough to detect small to medium effects.
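As a rough illustration of how such an a priori power analysis can be reproduced, the sketch below computes the power of the overall F test in a multiple regression and searches for the smallest sample size that reaches a target power. The inputs (f² = 0.15, a conventional “medium” effect; α = 0.05; power = 0.80; a four-predictor model) are illustrative assumptions, not figures reported by the study.

```python
# Hypothetical a priori power analysis for a k-predictor regression.
# All parameter values below are assumptions for illustration only.
from scipy import stats

def regression_power(n, k, f2, alpha=0.05):
    """Power of the overall F test with k predictors and effect size f2."""
    df1, df2 = k, n - k - 1
    crit = stats.f.ppf(1 - alpha, df1, df2)   # critical F under H0
    nc = f2 * n                               # noncentrality parameter
    return stats.ncf.sf(crit, df1, df2, nc)   # P(F > crit | H1)

def min_sample_size(k, f2, alpha=0.05, target=0.80):
    """Smallest n whose power meets the target."""
    n = k + 2
    while regression_power(n, k, f2, alpha) < target:
        n += 1
    return n

print(min_sample_size(k=4, f2=0.15))
```

With different (smaller) assumed effect sizes, the same search yields larger minimum samples, which is consistent with the study's reported minimum of 103.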
A principal components analysis was run with all survey items using SPSS and direct oblimin rotation to allow for correlated factors. Results showed that all items loaded at 0.70 or above on their respective constructs, and no items cross-loaded at more than 0.30. This supported the factor structure of the study and provided an initial test that the constructs, including those newly developed for this study, had convergent and discriminant validity.
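The loading and cross-loading thresholds just described amount to a simple item-screening rule. The sketch below applies that rule to a fabricated pattern matrix (the study's own loadings are not reproduced here):

```python
# Screening rule: each item should load >= 0.70 on its own construct
# and <= 0.30 on every other construct. Matrix values are fabricated.
import numpy as np

def screen_items(loadings, assignment, min_load=0.70, max_cross=0.30):
    """Return indices of items that fail either loading rule.

    loadings   : (items x factors) pattern matrix
    assignment : intended factor index for each item
    """
    flagged = []
    for i, fac in enumerate(assignment):
        own = abs(loadings[i, fac])
        cross = max(abs(loadings[i, f])
                    for f in range(loadings.shape[1]) if f != fac)
        if own < min_load or cross > max_cross:
            flagged.append(i)
    return flagged

L = np.array([[0.82, 0.10],
              [0.78, 0.25],
              [0.15, 0.88],
              [0.40, 0.65]])   # last item under-loads and cross-loads
print(screen_items(L, [0, 0, 1, 1]))  # → [3]
```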
The means were also tabulated for each variable for the complete data set and for the low and high criticality groups (Table I). Three of the four moral intensity dimensions (magnitude of consequences, proximity and social consensus) have means consistent with identifying a moral component in the scenarios. Further, the means for moral beliefs show that respondents generally thought that not performing the activity was unethical. Table I also shows, as expected, that the means for magnitude of consequences, temporal immediacy and criticality are significantly higher in the high criticality group, and the means for intention, moral beliefs and social consensus are significantly higher in the low criticality group.
Measurement model analysis
The measurement model was run using SmartPLS. Supporting convergent validity, the PLS item loadings for each variable exceeded 0.70 in all data sets. Also, the Cronbach’s alphas, composite reliabilities and average variance extracted (AVE) exceeded the recommended minimums (Fornell and Larcker, 1981; Nunnally and Bernstein, 1994) (Table I). Supporting discriminant validity, in all data sets each correlation is lower than the square root of the AVEs of the two correlated variables (Fornell and Larcker, 1981) (Table I), and the PLS cross-loadings are lower than each item loading.
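The Fornell and Larcker (1981) criterion applied here can be checked mechanically: every inter-construct correlation must be smaller than the square root of the AVE of both constructs it involves. A minimal sketch with fabricated values (not the study's Table I figures):

```python
# Fornell-Larcker discriminant validity check. Correlation matrix and
# AVE values below are fabricated for demonstration.
import numpy as np

def fornell_larcker_ok(corr, ave):
    """corr: (k x k) construct correlation matrix; ave: length-k AVEs."""
    sqrt_ave = np.sqrt(np.asarray(ave))
    k = len(ave)
    return all(abs(corr[i, j]) < min(sqrt_ave[i], sqrt_ave[j])
               for i in range(k) for j in range(i + 1, k))

corr = np.array([[1.00, 0.42, 0.31],
                 [0.42, 1.00, 0.55],
                 [0.31, 0.55, 1.00]])
print(fornell_larcker_ok(corr, [0.62, 0.71, 0.58]))  # → True
```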
Multicollinearity was not a problem, as variance inflation factors ranged from 1.01 to 1.59, well below the suggested cutoff of 5.00 (Menard, 1995). We used a method from Polites and Karahanna (2012) to test for common method bias. We found the lowest correlation is 0.008, which represents the upper limit for common method bias in the data (Lindell and Whitney, 2001; Malhotra et al., 2006). Further, there is a large percentage of nonsignificant correlations between items (162, or 46 per cent), which shows common method bias is not widespread. We also tried to preemptively reduce common method bias in the survey design. To accomplish this, the survey was developed using both Likert and semantic differential scales, scale midpoint labels and varying scale labels, all of which are suggested procedural methods to reduce bias (Podsakoff et al., 2012).
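The marker-variable logic behind this test (Lindell and Whitney, 2001) treats the smallest observed correlation as an estimate of shared method variance and partials it out of substantive correlations. A minimal sketch, with a fabricated substantive correlation and the 0.008 marker value from the text:

```python
# Lindell-Whitney common-method-bias adjustment: partial the marker
# correlation out of an observed correlation. The 0.45 substantive
# correlation is fabricated; 0.008 is the lowest correlation reported
# in the text.
def cmb_adjust(r, r_marker):
    """Method-bias-adjusted correlation."""
    return (r - r_marker) / (1 - r_marker)

print(round(cmb_adjust(0.45, 0.008), 3))  # → 0.446
```

Because the marker correlation is tiny, adjusted correlations barely move, which is why such a small lower-bound value supports the conclusion that method bias is not a serious threat.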
Structural model analysis
The hypotheses were then tested with results shown in Figures 2 and 3. Using the complete data set, H1 is supported because moral beliefs significantly influence intention (Figure 2). H2 is also supported as magnitude of consequences significantly influences moral beliefs. There was no support for H3 and H4 as the relationships between temporal immediacy and moral beliefs, and proximity and moral beliefs were not significant. Social consensus had a significant influence on moral beliefs, providing support for H5.
A PLS multigroup analysis was then used to test H6. The multigroup analysis tests the model for each group separately and then performs significance tests on the differences between path coefficients. Results show that in the high criticality group temporal immediacy, proximity and social consensus have significant relationships with moral beliefs (Figure 3). In the low criticality group, the results are much different. While social consensus has a significant effect on moral beliefs, temporal immediacy and proximity do not. In addition, magnitude of consequences has a significant effect on moral beliefs in the low criticality group, whereas it does not have a significant effect in the high criticality group. The group difference tests show that three of the four differences are significant. The path coefficients for the relationships between temporal immediacy and moral beliefs, and between proximity and moral beliefs, are significantly different (p < 0.014 and p < 0.013, respectively). This means these factors have a significantly greater influence on moral beliefs in the high criticality group than in the low criticality group, so H6b and H6c are supported. We also find a significant difference in the path coefficients from social consensus to moral beliefs (p < 0.009). However, opposite to what we predicted, the coefficient is stronger in the low criticality group than in the high criticality group, so H6d is not supported. H6a is also not supported because the difference between the coefficients for the magnitude of consequences to moral beliefs paths is not significant.
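One common parametric approach to comparing path coefficients across groups pools the bootstrap standard errors from each group into a t-statistic. The sketch below uses hypothetical coefficients, standard errors and an assumed even 108/108 split of the 216 respondents, none of which are the study's actual values:

```python
import math

def path_difference_t(b1, se1, n1, b2, se2, n2):
    """Parametric multigroup test: t-statistic for the difference between
    two PLS path coefficients, using each group's bootstrap standard error
    (a common approach when group variances are assumed roughly equal)."""
    pooled = math.sqrt(((n1 - 1) ** 2 * se1 ** 2 + (n2 - 1) ** 2 * se2 ** 2)
                       / (n1 + n2 - 2))
    t = (b1 - b2) / (pooled * math.sqrt(1 / n1 + 1 / n2))
    return t, n1 + n2 - 2  # t-statistic and degrees of freedom

# Hypothetical coefficients for temporal immediacy -> moral beliefs
t_stat, df = path_difference_t(0.34, 0.10, 108, 0.05, 0.09, 108)
print(round(t_stat, 2), df)  # |t| above ~1.97 suggests p < 0.05 at df = 214
```

The coefficient difference in the numerator is what the group difference tests above evaluate; a larger |t| corresponds to a smaller p-value for the difference.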
While not hypothesized, the ethical decision-making model predicts that the moral intensity dimensions might impact intention both directly and indirectly (Jones, 1991). Further, as explained earlier, some dimensions of moral intensity (magnitude of consequences and social consensus) are similar to constructs in other theories that typically predict intentions (Chen et al., 2009; Herath and Rao, 2009; Yoon, 2011). Therefore, we analyzed the direct effects of the moral intensity dimensions on intention in addition to their indirect effects via moral beliefs. Findings show that only social consensus has a significant direct effect on intention (β = 0.12, p < 0.05). This result is found only in the complete and low criticality data sets. Further, the previously reported results do not change.
IPBs are important for reducing insider security incidents and keeping sensitive information safe. This study examines the role of morality and organizational criticality in individuals’ intentions to not perform four specific IPBs. The majority of hypotheses are supported, explaining 16-48 per cent of the variance in moral beliefs and 24-33 per cent of the variance in intentions. While the focus of the current study is on explaining how the moral intensity dimensions affect moral beliefs rather than on explaining variance in intentions, we find a strong influence of moral beliefs on intention in both data sets. Because previous research finds that moral beliefs have a significant effect on intention even when other predictor variables are added to the model, the results from this study may transfer to future work that uses other theoretical models to explain IPB intention (Chen et al., 2009; Vance and Siponen, 2012).
The first contribution of this study is examining the effects of the separate dimensions of moral intensity on moral beliefs about IPBs. The study uses multiple-item scales to measure four moral intensity dimensions. Prior information system security studies examine moral intensity as one construct (Crossler et al., 2017) or use single-item measures for the dimensions (Peslak, 2008). Using separate dimensions allows for a finer-grained look at moral intensity. For example, results show that social consensus and magnitude of consequences have the strongest effects on moral beliefs. These two dimensions were found to be the most significant in prior research (Shaw, 2003). However, rather than social consensus, Shaw (2003) uses a construct called organizational consensus that refers to co-workers’ opinions rather than the opinions of society as a whole. In addition, Shaw’s sample consists of webmasters, whereas the current study uses a more general sample of organizational insiders. Thus, the current research extends prior work by showing that a variety of organizational insiders rely on opinions from a wider group than just organizational members to decide whether IPBs are the right thing to do.
Findings also show that proximity and temporal immediacy have no significant influence on moral beliefs in the complete data set. This contradicts some research showing these factors are important in an information security context (Shaw, 2003; Goles et al., 2006; Peslak, 2008). These differences could arise because of context or model differences. Because Shaw (2003) finds an interactive effect of proximity on social consensus, a supplementary analysis was run to see if proximity or temporal immediacy had interactive effects with the other dimensions on moral beliefs. Findings show that proximity has no significant interactive effects, but the interaction between temporal immediacy and magnitude of consequences is significant (β = −0.15, p < 0.014). This means that magnitude of consequences has an even greater effect on moral beliefs when the harm is more immediate. Future researchers should consider this when predicting how moral intensity impacts intentions and behavior. Overall, this study finds that examining the separate dimensions provides a greater understanding of moral intensity and moral beliefs about IPBs. It helps generalize some early findings to wider audiences, and it provides some novel understanding about how the moral intensity dimensions might interact.
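The supplementary moderation analysis above tests a product term between two dimensions. A minimal sketch of how such a term is typically constructed (with made-up scores, not the study's data) mean-centers each predictor before multiplying, which reduces collinearity between the product and its components:

```python
def centered_interaction(x, m):
    """Mean-center each variable, then form the product term used to test
    a two-way interaction (e.g. temporal immediacy x magnitude of
    consequences) in a regression or PLS model."""
    mx = sum(x) / len(x)
    mm = sum(m) / len(m)
    return [(a - mx) * (b - mm) for a, b in zip(x, m)]

# Hypothetical scores: temporal immediacy (x) and magnitude of consequences (m)
print(centered_interaction([1, 2, 3], [4, 6, 8]))  # [2.0, 0.0, 2.0]
```

The product column then enters the model alongside the centered main effects; its coefficient (here β = −0.15 in the study) captures how the effect of one dimension shifts across levels of the other.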
The second contribution is investigating the fit between moral intensity and organizational criticality by analyzing differences in path coefficients between the high and low criticality groups. The low and high criticality scenarios are based on prior work that finds IPBs cluster around criticality, promotion difficulty, and common sense (Posey et al., 2013). The current study is one of the first to examine these factors. While it only examines criticality, future research can investigate how promotion difficulty and common sense might interact with moral intensity and IPBs.
Magnitude of consequences and social consensus have stronger effects in the low criticality model, although the difference is only significant for social consensus. This contradicts the value congruence predictions that the moral intensity dimensions would have stronger effects when criticality is high. Instead, these two factors have stronger influences on moral beliefs when organizational criticality is lower. The strong influence of these factors largely contributes to the low criticality model explaining 48 per cent of the variance in moral beliefs and the high criticality model explaining only 24 per cent of the variance.
That social consensus has a stronger impact on moral beliefs when IPBs are less critical to the organization could mean that determining whether the behaviors are right or wrong is less straightforward when they are less critical. Individuals may need to look beyond the organizational boundaries to get a sense of right or wrong. They may feel society is knowledgeable about the behaviors because there have been so many highly publicized information security incidents.
Similar to results in the complete data set, magnitude of consequences has the second strongest impact on moral beliefs in the low criticality group. However, surprisingly when the behavior is more critical to the organization and the potential harm of not performing the behavior is high, it is not the deciding factor in making an honest and principled judgment. Prior research has shown that formal sanctions like certainty and severity of punishment to the individual can influence moral beliefs (D’Arcy and Devaraj, 2012). It could be that when organizational criticality is high, certainty and severity of punishment have more of an influence on moral beliefs, overriding the impact of moral intensity and value congruence. In addition, prior research shows that policies and training programs influence moral beliefs (Hovav and D’Arcy, 2012). Organizations may emphasize these countermeasures more when criticality is high, resulting in their influence outweighing that of the moral intensity factors. Future research can investigate these issues further to try to tease out which effects are working under which conditions.
Proximity and temporal immediacy have no direct influence on moral beliefs in the low criticality group, but they do have significant effects on moral beliefs in the high criticality group. The effects are also significantly different between groups. This means that the more immediately the harm occurs and the more it affects people similar to oneself, the more not performing the behavior is perceived as wrong. The moral intensity of these factors plays a significant and larger role when an individual’s values coincide with organizational criticality.
Overall, these findings make a theoretical contribution about the role “fit” plays between individual morality and the organizational importance of IPBs. They reveal situations where fit between criticality and morality works to increase beliefs that not performing IPBs is unprincipled. They also reveal that fit may not be the only factor in increasing these beliefs. Future research should explore other factors that can increase moral beliefs about IPBs as well as factors that can strengthen fit between individual values and organizational criticality.
Practical implications and limitations
This research also has practical implications. Unethical information security behavior is widespread and can have damaging effects on organizations. Because findings in this study show that individuals rely on their moral beliefs when forming intentions to not perform protective behaviors, it will be important for organizations to understand how to help individuals make appropriate moral judgments. Management will need to provide strong leadership that signals the magnitude of consequences, social consensus, proximity and temporal immediacy of behaviors congruent with the criticality of the protective behavior. Reports suggest that tone at the top is a continuing weakness for leadership in information security practices (Benjamin, 2014; Roboff, 2016). Researchers propose that ethics can be improved through leadership actions such as how resources are allocated, what behavior is rewarded, and hiring practices (Hu et al., 2012; Schein, 2004). Our findings can guide management about which moral intensity dimensions are more important to focus on when remediating tone at the top weaknesses.
Another practical implication arises from integrating our findings with research that investigates ethics programs. Prior research has examined the role of both formal ethics programs and informal ethical cultures in influencing individuals’ moral beliefs. Formal ethics programs consist of codes of ethics, training sessions, whistle blowing policies and monitoring systems, whereas the informal ethical culture consists of ethical role modeling by management and supervisors, the capability and commitment of employees to behave ethically and openness to discuss ethical issues (Kaptein, 2011). The informal culture can help make the formal programs seem less like mere window dressing and reduce inappropriate and unprincipled behavior (Kaptein, 2011). Our findings suggest that there may be differences in the effectiveness of these programs based on the types of moral intensity they affect, and the organizational criticality of the information security issues.
Some researchers suggest in fact that compliance-oriented programs focusing on rules and laws may be less effective at influencing moral intensity dimensions than values-based programs that focus on the fundamental beliefs, concepts and principles that underlie the culture of an organization (O’Leary-Kelly and Bowes-Sperry, 2001). The primary objective of values-based training programs would be to increase the moral intensity of security issues, especially those issues with higher organizational criticality (O’Leary-Kelly and Bowes-Sperry, 2001). The findings of the current study show that increasing certain moral intensity dimensions will have stronger effects when organizational criticality is lower, and increasing others will have stronger effects when criticality is higher. Activities could include coming up with a common definition of information protection and security, emphasizing the harm of even a single security incident and encouraging discussions about common security behavior violations. These types of practices are all aimed at helping individuals make more appropriate judgments about what is right and what is wrong (O’Leary-Kelly and Bowes-Sperry, 2001). They could also lead to transformative organizational change by strengthening the effects of the moral intensity dimensions on moral beliefs.
One limitation of this study is that it is based on a convenience sample of faculty, administrators, staff and students at a medium-sized mid-Atlantic university. While other studies use similar samples (Peslak, 2008), it could make the results less generalizable to other populations. However, universities maintain large amounts of sensitive data, making IPBs important in this context. Privacyrights.org shows that 30 educational institutions experienced data breaches in 2014, and five of the 30 schools actually had larger data breaches than the Sony hacking case (McCarthy, 2015). Also, omitting two of the six moral intensity dimensions could have affected the results, even though prior research has done the same (Barnett, 2001). Finally, while the sample size afforded enough power to test the high/low criticality models, it was not large enough to test the four scenarios separately. However, mean differences by scenario show similar differences to those reported earlier.
This study’s main purpose was to examine morality, which is important because employees are often faced with deciding whether or not to follow company-prescribed IPBs. Using data collected from organizational insiders, we discovered that organizational criticality is an important factor in shaping how moral intensity impacts moral beliefs about protective behaviors. While social consensus and magnitude of consequences have the strongest effects on moral beliefs, this is true only under conditions of low criticality. Under high criticality, proximity and temporal immediacy of consequences have stronger effects. In conclusion, these findings can stimulate research and provide new directions in understanding IPBs in organizations. This study provides interesting insights and outlines important implications for organizations that can contribute to decreasing the threats caused knowingly and unknowingly by organizational insiders.
Correlations and square root of average variance extracted on diagonal
Complete data set

| Construct | Mean | Cronbach's α | CR | AVE | 1 | 2 | 3 | 4 | 5 | 6 | 7 | 8 |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| Intention to not perform protection behavior | 2.80 | 0.86 | 0.92 | 0.78 | 0.89 | | | | | | | |
| Magnitude of consequences | 4.87 | 0.95 | 0.97 | 0.91 | −0.32 | −0.46 | 0.96 | | | | | |
| Information security awareness | 5.82 | 0.88 | 0.93 | 0.81 | −0.17 | −0.20 | 0.13 | 0.05 | −0.05 | −0.06 | 0.12 | 0.90 |

Low criticality group

| Construct | Mean | Cronbach's α | CR | AVE | 1 | 2 | 3 | 4 | 5 | 6 | 7 | 8 |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| Intention to not perform protection behavior | 3.33 | 0.83 | 0.90 | 0.74 | 0.86 | | | | | | | |
| Magnitude of consequences | 4.35 | 0.95 | 0.97 | 0.91 | −0.25 | −0.40 | 0.96 | | | | | |
| Information security awareness | 5.97 | 0.85 | 0.91 | 0.77 | −0.23 | −0.32 | 0.27 | 0.02 | −0.14 | −0.17 | 0.15 | 0.88 |

High criticality group

| Construct | Mean | Cronbach's α | CR | AVE | 1 | 2 | 3 | 4 | 5 | 6 | 7 | 8 |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| Intention to not perform protection behavior | 2.27 | 0.87 | 0.92 | 0.80 | 0.89 | | | | | | | |
| Magnitude of consequences | 5.39 | 0.92 | 0.95 | 0.86 | −0.20 | −0.19 | 0.93 | | | | | |
| Information security awareness | 5.67 | 0.91 | 0.94 | 0.84 | −0.23 | −0.25 | 0.11 | 0.13 | 0.05 | −0.05 | 0.22 | 0.92 |
Consequences can be related to oneself and/or to others such as peers and others in the organization. We follow Barnett (2001) and conceptualize magnitude of consequences without specifying the target of the consequences.
The Posey et al. (2013) taxonomy also classifies IPBs based on promotion difficulty and degree of common sense. Because these factors were not considered in this research, we picked criticality items that were classified similarly on these other two factors.
Approximately 20 responses were obtained from a master-level class at a different university.
To reduce any positive response bias from using students from courses the authors taught, we administered the survey instrument anonymously online and we worded the scenarios and survey items in the third person. These practices can make respondents feel like they can answer more accurately and truthfully (Lord and Melvin, 1997; SurveyMethods, 2014).
While the mean criticality for the low criticality group is significantly lower than that for the high criticality group, it is still above average (5.30/7.00). This means this study is actually examining “lower” versus “higher” criticality groups.
Scenario 1: notify senders of unintended receipt of sensitive personal information (low criticality)
Jamie is employed at an insurance company. Many employees working at the company handle sensitive personal information including demographic information, medical histories, tests and laboratory results, credit and financial information and other data. While most of the e-mails sent by employees do not contain this sensitive personal information, some employees may have a need to transmit this information via e-mail as part of their duties.
From time to time company e-mail may be accidentally sent to a recipient who is not intended to receive the message. The company provides guidelines about using disclaimers appended to the bottom of e-mails to ensure sensitive personal information is protected. Guidelines suggest e-mails sent by company employees should include the following unintended recipient disclaimer:
This email is intended for the addressee and may contain privileged information. If you are not the addressee you are not permitted to use or copy this email or its attachments nor may you disclose the same to any third party. If this has been sent to you in error please delete the email and notify us by replying to this email immediately.
Jamie sometimes receives e-mails that contain sensitive personal information that are not intended for her. This happens when e-mails have been forwarded to her or she has been inadvertently copied on a previous chain of e-mails. Even though the e-mails have the above disclaimer, most of the time she does not delete the e-mail or notify the sender that she received it.
Scenario 2: properly destroy and dispose of sensitive information (high criticality)
Jamie is employed at an insurance company. She handles sensitive personal information including demographic information, medical histories, tests and laboratory results, credit and financial information and other data, some of which ends up unneeded.
Her company has a policy for properly destroying and disposing of all unneeded sensitive personal information. The policy states that sensitive personal information provided to the company must be destroyed when no longer needed for the specific purpose for which it was provided. The sensitive personal information may be non-electronic (paper) documents or may reside electronically on one’s computer, and in system backups, temporary files, or other storage media. For non-electronic documents, the company policy requires employees to render the documents unreadable and safe for disposal or recycling using a crosscut shredder. For electronic data, the company policy requires employees to render it irretrievable using physical destruction or an approved wiping method.
Even though Jamie is required to properly destroy and dispose of all unneeded sensitive personal information, she seldom does so. In fact, sometimes unneeded paper documents containing sensitive personal information lay around on her desk so others can see and access the information. There are also numerous electronic files on her computer that are unneeded but have not been destroyed and disposed of according to policy.
Scenario 3: backup sensitive information (high criticality)
Jamie is employed at an insurance company. She handles sensitive personal information including demographic information, medical histories, tests and laboratory results, credit and financial information and other data.
Jamie is in charge of backing up important data and documents on a regular basis. In fact, her company has a backup policy that includes the identification of critical systems and data, frequency of incremental and full backups, storage of backups, offsite rotation and restoration procedures. This policy represents the company’s last line of defense against data loss stemming from a hardware failure, data corruption or a security incident.
While Jamie’s company has network-based backup meaning they have a few backup servers with tape devices that read data across the network from all selected servers, all users are required to back up data and documents that are stored on their individual computers and tablet devices. This must be performed on a weekly basis.
Even though Jamie is required to back up her computer files on a regular basis, she seldom does so. In fact, sometimes she goes as long as two months without backing up her computer and tablet device. There are also numerous electronic files on her computer that do not get backed up during the network backup and therefore are vulnerable to being lost or destroyed.
Scenario 4: promote security guidelines (low criticality)
Jamie is employed at an insurance company. Employees at the company handle sensitive personal information including demographic information, medical histories, tests and laboratory results, credit and financial information and other data.
Jamie’s company has a detailed set of information security guidelines and protocols that provide management direction and support for information security in accordance with business requirements and relevant laws and regulations. These guidelines and protocols help the organization maintain appropriate protection of its information assets and ensure that information receives an appropriate level of protection. Many of them relate to mechanisms and procedures that are primarily implemented by people rather than systems. They include guidelines related to acceptable use of computing and information resources, desktop hoaxes, file sharing, managing sensitive data, phishing attacks, social engineering, desktop security, passwords and security awareness.
Jamie’s organization has adopted these information security guidelines and protocols, and communicated them to all employees. However, her organization also hopes that employees will continue to promote these practices by reminding each other of them as needed.
Despite this, Jamie seldom reminds her coworkers about the information security guidelines and protocols.
This shows an example of the survey instrument used for collecting data for Scenario 3. The questions were reworded for each scenario. The respondents were randomized to see only one of the four scenarios and answered only one set of questions based on their scenario.
Intention to not perform protection behavior (D’Arcy et al., 2009)
If you were Jamie, what is the likelihood that you would not bother to back up important data and documents on a regular basis? (1 = Very Unlikely to 7 = Very Likely).
What is the probability that you would not bother to back up important data and documents on a regular basis if you were Jamie? (1 = Improbable to 7 = Probable).
If I were like Jamie, I could see myself not bothering to back up important data and documents on a regular basis. (1 = Strongly Disagree to 7 = Strongly Agree).
Moral beliefs (D’Arcy et al., 2009, item 1: Barnett, 2001, items 2 and 3)
It is morally acceptable for Jamie to not back up important data and documents on a regular basis. (1 = Strongly Disagree to 7 = Strongly Agree).
Jamie not backing up important data and documents on a regular basis is: (1 = Unethical to 7 = Ethical).
Jamie not backing up important data and documents on a regular basis is: (1 = Not Principled to 7 = Principled).
Magnitude of consequences (Barnett, 2001)
Do you believe any harm resulting from Jamie not backing up important data and documents on a regular basis will be______________:
1 = Minor to 7 = Severe.
1 = Insignificant to 7 = Significant.
1 = Slight to 7 = Great.
Temporal immediacy (Barnett, 2001)
Do you anticipate that any consequences of Jamie not backing up important data and documents on a regular basis are likely to occur______________:
1 = After a long time to 7 = Immediately.
1 = Slowly to 7 = Quickly.
1 = Gradually to 7 = Rapidly.
Proximity (Barnett, 2001)
Compared to yourself, do you believe those potentially affected by Jamie not backing up important data and documents on a regular basis are:
1 = Dissimilar to you to 7 = Similar to you.
1 = Not like you to 7 = Like you.
1 = Different from you to 7 = Same as you.
Social consensus (Barnett, 2001)
Please indicate the degree to which you believe society as a whole views Jamie’s actions of not backing up important data and documents on a regular basis as:
1 = Unethical to 7 = Ethical.
1 = Wrong to 7 = Right.
1 = Inappropriate to 7 = Appropriate.
Criticality (author developed)
Employees backing up important data and documents on a regular basis is _____________ to protect an organization’s information assets.
1 = Not Very Critical to 7 = Very Critical.
1 = Insignificant to 7 = Significant.
1 = Trivial to 7 = Vital.
1 = Unimportant to 7 = Important.
Information security awareness (Bulgurcu et al., 2010)
(1 = Strongly Disagree to 7 = Strongly Agree).
Overall, I am aware of the potential security threats and their negative consequences.
I have sufficient knowledge about the cost of potential security problems.
I understand the concerns regarding information security and the risks they pose in general.
Please indicate the amount of work experience you have in years. (0, <1 year, 1-2 years, 4-5 years, 5-10 years, 10-20 years, 20+years).
Barnett, T. (2001), “Dimensions of moral intensity and ethical decision making: an empirical study”, Journal of Applied Social Psychology, Vol. 31 No. 5, pp. 1038-1057.
Barnett, T. and Valentine, S. (2004), “Issue contingencies and marketers: recognition of ethical judgments and behavioral intentions”, Journal of Business Research, Vol. 57 No. 4, pp. 338-346.
Barnett, T., Brown, G., Bass, K. and Hebert, F.J. (1999), “New measures for proposed dimensions of the moral intensity of ethical issues”, Paper presented at the Academy of Management, Chicago.
Benjamin, R. (2014), “Tone at the top: today’s biggest cyber-security weakness”, eForensics Magazine, September 4, 2014, available at https://eforensicsmag.com/tone-at-the-top-todays-biggest-cyber-security-weakness-by-rob-benjamin/ (accessed 20 October 2018).
Bulgurcu, B., Cavusoglu, H. and Benbasat, I. (2010), “Information security policy compliance: an empirical study of rationality-based beliefs and information security awareness”, MIS Quarterly, Vol. 34 No. 3, pp. 523-548.
Chatman, J.A. (1989), “Improving interactional organizational research: a model of person-organization fit”, Academy of Management Review, Vol. 14 No. 3, pp. 333-349.
Chatterjee, S., Sarker, S. and Valacich, J.S. (2015), “The behavioral roots of information systems security: exploring key factors related to unethical IT use”, Journal of Management Information Systems, Vol. 31 No. 4, pp. 49-87.
Chen, M.-F., Pan, C.-T. and Pan, M.-C. (2009), “The joint moderating impact of moral intensity and moral judgment on consumer’s use intention of pirated software”, Journal of Business Ethics, Vol. 90 No. 3, pp. 361-373.
Christiansen, N.D., Villanova, P. and Mikulay, S. (1997), “Political influence compatibility: fitting the person to the climate”, Journal of Organizational Behavior, Vol. 18 No. 6, pp. 709-730.
Coursey, D., Yang, K. and Pandey, S.K. (2012), “Public service motivation (PSM) and support for citizen participation: a test of Perry and Vandenabeele’s reformulation of PSM theory”, Public Administration Review, Vol. 72 No. 4, pp. 572-582.
Crossler, R.E., Long, J.H., Loraas, T.M. and Trinkle, B.S. (2017), “The impact of moral intensity and ethical tone consistency on policy compliance”, Journal of Information Systems, Vol. 31 No. 2, pp. 49-64.
D’Arcy, J. and Devaraj, S. (2012), “Employee misuse of information technology resources: testing a contemporary deterrence model”, Decision Sciences, Vol. 43 No. 6, pp. 1091-1124.
D’Arcy, J., Herath, T. and Shoss, M.K. (2014), “Understanding employee responses to stressful information security requirements: a coping perspective”, Journal of Management Information Systems, Vol. 31 No. 2, pp. 285-318.
D’Arcy, J., Hovav, A. and Galletta, D. (2009), “User awareness of security countermeasures and its impact on information systems misuse”, Information Systems Research, Vol. 20 No. 1, pp. 79-98.
Dubinsky, A.J. and Loken, B. (1989), “Analyzing ethical decision making in marketing”, Journal of Business Research, Vol. 19 No. 2, pp. 83-107.
Fishbein, M. and Ajzen, I. (1975), Belief, Attitude, Intention, and Behavior: An Introduction to Theory and Research, Addison-Wesley, Reading.
Fornell, C. and Larcker, D.F. (1981), “Evaluating structural equations with unobservable variables and measurement error”, Journal of Marketing Research, Vol. 18 No. 1, pp. 39-50.
Goles, T., White, G.B., Beebe, N., Dorantes, C.A. and Hewitt, B. (2006), “Moral intensity and ethical decision-making: a contextual extension”, ACM SIGMIS Database, Vol. 37 Nos 2/3, pp. 86-95.
Guo, K.H., Yuan, Y., Archer, N.P. and Connelly, C. (2011), “Understanding nonmalicious security violations in the workplace: a composite behavior model”, Journal of Management Information Systems, Vol. 28 No. 2, pp. 203-236.
Han, T.-S., Chiang, H.-H., McConville, D. and Chiang, C.-L. (2015), “A longitudinal investigation of person–organization fit, person–job fit, and contextual performance: the mediating role of psychological ownership”, Human Performance, Vol. 28 No. 5, pp. 425-439.
Heimer, J.-L. (2018), “Insider threats are very real – and they’re in your organization”, available at: https://insight.nttsecurity.com/post/102elw8/insider-threats-are-very-real-and-theyre-in-your-organization (accessed 22 April 2018).
Herath, T. and Rao, H.R. (2009), “Protection motivation and deterrence: a framework for security policy compliance in organizations”, European Journal of Information Systems, Vol. 18 No. 2, pp. 106-125.
Hovav, A. and D’Arcy, J. (2012), “Applying an extended model of deterrence across cultures: an investigation of information systems misuse in the US and South Korea”, Information and Management, Vol. 49 No. 2, pp. 99-110.
Hu, Q., Dinev, T., Hart, P. and Cooke, D. (2012), “Managing employee compliance with information security policies: the critical role of top management and organizational culture”, Decision Sciences, Vol. 43 No. 4, pp. 615-660.
Johnson, C.E. (2007), Ethics in the Workplace: Tools and Tactics for Organizational Transformation, Sage Publications, Thousand Oaks, CA.
Jones, T.M. (1991), “Ethical decision making by individuals in organizations: an issue-contingent model”, Academy of Management Review, Vol. 16 No. 2, pp. 366-395.
Judge, T.A. and Bretz, R.D. (1992), “Effects of work values on job choice decisions”, Journal of Applied Psychology, Vol. 77 No. 3, pp. 261-271.
Kaptein, M. (2011), “Understanding unethical behavior by unraveling ethical culture”, Human Relations, Vol. 64 No. 6, pp. 843-869.
Liedtka, J.M. (1989), “Value congruence: the interplay of individual and organizational value systems”, Journal of Business Ethics, Vol. 8 No. 10, pp. 805-815.
Lincoln, S.H. and Holmes, E.K. (2011), “Ethical decision making: a process influenced by moral intensity”, Journal of Healthcare, Science and the Humanities, Vol. 1 No. 1, pp. 55-69.
Lindell, M.K. and Whitney, D.J. (2001), “Accounting for common method variance in cross-sectional research designs”, Journal of Applied Psychology, Vol. 86 No. 1, pp. 114-121.
Lord, A.T. and Melvin, K.B. (1997), “The attitudes of accounting students, faculty and employers towards cheating”, Research on Accounting Ethics, Vol. 3, pp. 1-20.
Malhotra, N.K., Kim, S.S. and Patil, A. (2006), “Common method variance in IS research: a comparison of alternative approaches and a reanalysis of past research”, Management Science, Vol. 52 No. 12, pp. 1865-1883.
Masters, G. (2018), “Report: insider threat more dangerous than external risks”, available at: www.scmagazine.com/report-insider-threat-more-dangerous-than-external-risks/article/533061/ (accessed 22 April 2018).
McCarthy, K. (2015), “5 Colleges with data breaches larger than Sony’s in 2014”, available at: www.huffingtonpost.com/kyle-mccarthy/five-colleges-with-data-b_b_6474800.html (accessed 19 July 2018).
Meglino, B.M., Ravlin, E.C. and Adkins, C.L. (1989), “A work values approach to corporate culture: a field test of the value congruence process and its relationship to individual outcomes”, Journal of Applied Psychology, Vol. 74 No. 3, pp. 424-432.
Menard, S. (1995), Applied Logistic Regression Analysis, Sage University Paper Series on Quantitative Applications in the Social Sciences, Sage, Thousand Oaks, CA.
Moody, G.D., Siponen, M. and Pahnila, S. (2018), “Toward a unified model of information security policy compliance”, MIS Quarterly, Vol. 42 No. 1, pp. 285-311.
Newman, A., Round, H., Bhattacharya, S. and Roy, A. (2017), “Ethical climates in organizations: a review and research agenda”, Business Ethics Quarterly, Vol. 27 No. 4, pp. 475-512.
Nunnally, J.C. and Bernstein, I.H. (1994), Psychometric Theory, 3rd Ed., McGraw-Hill, New York, NY.
O’Leary-Kelly, A.M. and Bowes-Sperry, L. (2001), “Sexual harassment as unethical behavior: the role of moral intensity”, Human Resource Management Review, Vol. 11, pp. 73-92.
Peslak, A.R. (2008), “Current information technology issues and moral intensity influences”, Journal of Computer Information Systems, Vol. 48 No. 4, pp. 77-86.
Podsakoff, P.M., Mackenzie, S.B. and Podsakoff, N.P. (2012), “Sources of method bias in social science research and recommendations on how to control it”, Annual Review of Psychology, Vol. 63, pp. 539-569.
Polites, G.L. and Karahanna, E. (2012), “Shackled to the status quo: the inhibiting effects of incumbent system habit, switching costs, and inertia on new system acceptance”, MIS Quarterly, Vol. 36 No. 1, pp. 21-42.
Ponemon Institute (2016), “Cost of insider threats: benchmark study of organizations in the United States”, Ponemon Institute Research Report.
Posey, C., Roberts, T.L., Lowry, P.B., Bennett, R.J. and Courtney, J.F. (2013), “Insiders’ protection of organizational information assets: development of a systematics-based taxonomy and theory of diversity for protection-motivated behaviors”, MIS Quarterly, Vol. 37 No. 4, pp. 1189-1210.
Roboff, G. (2016), “The tone at the top: assessing the board’s effectiveness”, ISACA Journal, Vol. 6, pp. 1-8.
Schein, E. (2004), Organizational Culture and Leadership, 3rd Ed., Wiley, San Francisco.
Schulze, H. (2018), “Insider threats: 2018 results”, available at: www.ca.com/content/dam/ca/us/files/ebook/insider-threat-report.pdf (accessed 30 July 2018).
Shaw, T.R. (2003), “The moral intensity of privacy: an empirical study of Webmasters’ attitudes”, Journal of Business Ethics, Vol. 46 No. 4, pp. 301-318.
Sommestad, T., Hallberg, J., Lundholm, K. and Bengtsson, J. (2015), “Variables influencing information security policy compliance: a systematic review of quantitative studies”, Information Management and Computer Security, Vol. 22 No. 1, pp. 42-75.
SurveyMethods (2014), “What is extreme response bias?”, available at: www.surveymethods.com/blog/what-is-extreme-response-bias/ (accessed 21 December 2018).
Valentine, S. and Hollingworth, D. (2012), “Moral intensity, issue importance, and ethical reasoning in operations situations”, Journal of Business Ethics, Vol. 108 No. 4, pp. 509-523.
Van Dyne, L. and Pierce, J.L. (2004), “Psychological ownership and feelings of possession: three field studies predicting employee attitudes and organizational citizenship behavior”, Journal of Organizational Behavior, Vol. 25 No. 4, pp. 439-459.
Van Liere, K.D. and Dunlap, R.E. (1978), “Moral norms and environmental behavior: an application of Schwartz’s norm-activation model to yard burning”, Journal of Applied Social Psychology, Vol. 8 No. 2, pp. 174-188.
Vance, A. and Siponen, M. (2012), “IS security policy violations: a rational choice perspective”, Journal of Organizational and End User Computing, Vol. 24 No. 1, pp. 21-41.
Vigoda, E. and Cohen, A. (2002), “Influence tactics and perceptions of organizational politics: a longitudinal study”, Journal of Business Research, Vol. 55 No. 4, pp. 311-324.
Yazdanmehr, A. and Wang, J. (2016), “Employee’s information security policy compliance: a norms activation perspective”, Decision Support Systems, Vol. 92 No. 12, pp. 36-46.
Yoon, C. (2011), “Ethical decision-making in the internet context: development and test of an initial model based on moral philosophy”, Computers in Human Behavior, Vol. 27 No. 6, pp. 2401-2409.