Optimism bias in susceptibility to phishing attacks: an empirical study

Purpose – Researchers looking for ways to change the insecure behaviour that results in phishing have considered multiple possible reasons for such behaviour. The purpose of this paper is therefore to understand the role of optimism bias (OB), a cognitive bias that characterises overly optimistic or unrealistic individuals, in secure behaviour. Research that focused on issues such as personality traits, trust, attitude and Security, Education, Training and Awareness (SETA) was considered. Design/methodology/approach – This study built on a recontextualised version of the theory of planned behaviour to evaluate the influence that optimism bias has on phishing susceptibility. To model the data, an analysis was performed on 226 survey responses from a South African financial services organisation using partial least squares (PLS) path modelling. Findings – This study found that overly optimistic employees were inclined to behave insecurely, while factors such as attitude and trust significantly influenced the intention to behave securely. Practical implications – Our contribution to practice seeks to enhance the effectiveness of SETA by identifying and addressing the optimism bias weakness to deliver a more successful training outcome. Originality/value – Our study enriches the Information Systems literature by evaluating the effect of a cognitive bias on phishing susceptibility and offers a contextual explanation of the resultant behaviour.


Introduction
According to the Cyber Security Breaches Survey (Ell and Gallucci, 2022), 83% of attacks were phishing based. The 2021 State of the Phish report (Proofpoint, 2021) states that more than 75% of organisations experienced a phishing attack in 2020, and 57% of these fell victim to a phishing attack.
This study considers the impact of optimism bias. Unrealistic optimism, or optimism bias (OB), is a cognitive bias where a person believes "that negative events are less likely to happen to them than to others, and they believe that positive events are more likely to happen to them than to others" (Weinstein, 1980, p. 807). Individuals characterised by optimism bias underestimate the likelihood of bad events happening (e.g. a car accident, losing their jobs or suffering from a terminal illness) whilst overestimating the likelihood of good events such as individual longevity or job aptitude (Sharot, 2011). It is therefore likely that an individual who is unrealistically optimistic could behave insecurely, as they do not think they will be part of a phishing scam.
Prior phishing studies have attempted to understand why certain people are more susceptible to falling victim to a phishing attack (Chen et al., 2020; Parsons et al., 2019; Williams et al., 2017). A logical place to start is by understanding human behaviour, which is the central theme for many studies that focus on persuasion techniques.
Based on the phishing statistics mentioned above, it appears that the implementation of technical security controls and Security, Education, Training and Awareness (SETA) is ineffective in eradicating phishing attacks completely. To make matters worse, some people consider themselves well versed in cyber security awareness and, owing to their confidence or optimistic outlook, do not think a breach will happen to them (Boddy, 2018).
Thus, the following research question was formulated: To what extent does optimism bias influence phishing susceptibility?
Accordingly, the aim of this study was: to propose alternative methods for effective SETA specifically considering optimism bias; to understand the influence of optimism bias on secure behaviour; and to understand how the relationship between attitude, trust and awareness influences the intention to behave securely (i.e. not falling victim to a phishing attack).
The remainder of this paper is structured as follows: We review previous literature on optimism bias, phishing, the theory of planned behaviour (TPB) and SETA. These findings are used to develop our research model and hypotheses. The research methodology is then described and the results presented. To conclude, we discuss how our findings can be used for research and practice, as well as the limitations of this study and future directions for research.

Theoretical background

Optimism bias overview
Optimism bias is a cognitive bias in the social sciences and is referred to in various studies linked to social reasoning and decision-making (Sweldens et al., 2014). A cognitive bias is defined as "an inaccurate view of the world" that might produce a rational behaviour or result in an outcome bias (Marshall et al., 2013, p. 469). There is evidence that people believe they have a better than average chance of experiencing a variety of desirable future outcomes (Weinstein, 1980). Houston et al. (2012, p. 173) define optimism as an "overestimation of the expected gains from future outcomes". People therefore believe that they are at less risk compared to other people. Unrealistic optimism can be classified into one of two broad categories: (1) unrealistic absolute optimism; and (2) unrealistic comparative optimism (Shepperd et al., 2017).
Both types of unrealistic optimism can be applied on both an individual and a group level.
The belief that a personal outcome will be more favourable than it should be according to some quantitative objective standard is referred to as unrealistic absolute optimism (Shepperd et al., 2017). Researchers have compared people's predictions about an event to the actual outcomes to assess unrealistic absolute optimism at the individual level (Shepperd et al., 2017). Previous studies using this approach in comparing predictions to actual behaviour have shown that people are unrealistically optimistic about, for example, the time it will take them to complete a task, the starting salary for their first job and their grades for college exams (Buehler et al., 1994). Another method to determine unrealistic absolute optimism at the individual level is to compare an individual's personal risk estimate to a risk calculator, which is defined as a "validated individualised risk-assessment algorithm" (Shepperd et al., 2017). An example would be a woman rating herself as having a 5% chance of getting breast cancer when the risk calculator suggests 21%.
The tendency for people to report that they are less likely to experience a negative event compared to others is known as unrealistic comparative optimism (Baek et al., 2014). Previous explanations of comparative optimism have included risk rejection, the belief that people disregard the likelihood of experiencing negative events (Arnett, 2000); ego safety, the belief that people want to defend themselves against a negative self-image (Helweg-Larsen et al., 2002); and a sense of control, the belief that people are overconfident in their ability to control future events (Weinstein, 1980). Researchers contend that a group of people exhibits unrealistic comparative optimism if the group's mean comparative risk estimate for an unfavourable outcome is less than "average". The logic is that if a group of people accurately estimate their risk, their estimates should balance out overall, with below-average and above-average estimates offsetting each other (Shepperd et al., 2017).
2.1.1 The influence of optimism bias on attitude. Optimism bias could lead to a careless attitude of 'it won't happen to me', thus placing the responsibility for secure behaviour on other people or on technology. Not paying attention to possible cyberthreats owing to a lack of conscious awareness can result in risky behaviour. For example, a phishing email may be seen as merely another request to be complied with, motivated either through fear or reward.
The current findings indicate that optimism bias and the illusion of control are widespread phenomena (Rhee et al., 2011). A person's attitude (belief) towards an object influences the overall pattern of their responses to the object, as demonstrated in numerous studies that tested the contention of strong attitude (belief) and subsequent behaviour relations (Ajzen and Fishbein, 1977). As a result of this strong inclination to underestimate their own risk and overrate their controllability observed in the domain of cyber security, we may argue that people may see little point in changing their current behaviour and have little motivation to be attentive to potential threats (Rhee et al., 2011).
2.1.2 The influence of optimism bias on trust. Since trust is based on the positive expectations of others, this can increase the outlook for trust in the good disposition of others (Andersson, 2012). Marsh (1994) argues that an optimist is likely to be one whose trust in others is high and who is inflexible in a downward direction. Thus, even if an optimist is abused by another, their trust in that other will not reduce by much. In contrast, the amount of trust a pessimist has in others will be relatively uncompromising in an upward direction, and a small manipulation by another will result in an extreme loss of trust (Marsh, 1994). Hence, if an unrealistic optimist's trust is not reduced by a previous security breach, it could lead to further insecure behaviour.

Theory of planned behaviour
TPB posits that behaviour is influenced by attitude, subjective norms and perceived behavioural control (Ajzen, 1991). Attitude is described as a learned inclination to judge things in a certain way and refers to positive or negative feelings about a specific behaviour. If the inclination changes, then attitude will change, leading to changed behaviour. A positive feeling about a certain behaviour will result in more motivation to perform the behaviour in question. 'Subjective norms' refers to the assumed social pressure involved in performing a specific behaviour. Perceived behavioural control is one's perception of the effort, considering one's skills and abilities, required to implement the behaviour of interest. Undertaking a specific behaviour may favour perceptions that are related to either a desirable or an undesirable outcome. An individual's intention to undertake a certain behaviour is a key aspect of the TPB, as intentions are thought to capture the motivating variables that influence behaviour; they are indicators of how hard someone is willing to try and how much work they intend to put in to complete the behaviour. In general, the stronger the desire to engage in a behaviour, the more likely it is that it will be carried out.
According to the TPB, behaviour is a function of salient beliefs about the behaviour in question. People may have various beliefs about an activity, but they can only concentrate on a few of them at a time. These salient beliefs may have an impact on a person's intentions and actions. By tying each belief to a specific outcome, behavioural beliefs influence attitude. We develop favourable attitudes towards beliefs that lead to desirable results and negative attitudes towards activities that lead to negative consequences. Normative beliefs are linked to social norms, as they are concerned with the approval or disapproval of a specific behaviour shown by influential persons. Control beliefs are linked to perceived behavioural control, as people believe they have more control over a behaviour when they possess the skills or confidence to perform it.
Other studies using the TPB to understand secure behaviour when using information systems include the evaluation of the rank order of employees, addressing careless cyber security behaviour, how to create training and awareness methods for better SETA, and investigating why SETA campaigns fail (Aurigemma and Mattson, 2017; Bada and Sasse, 2014; Safa et al., 2015).
The TPB was deemed suitable in the context of our study to guide the development of the research model and the associated hypotheses.As salient beliefs play a crucial role in the intention to perform a certain behaviour, and unrealistically optimistic people believe bad things will not happen to them, we are of the opinion that TPB is more suitable than any other theory examined.

Research model and hypotheses
Attitude towards secure behaviour
Attitude has been defined as a way of thinking, feeling or acting that represents a mental or emotional condition (McLeod, 2018). In the context of this study, attitude is defined in relation to secure employee behaviour and thus not falling for phishing attacks.
Attitude is used in many studies relating to cyber security. Herath et al. (2014) found that attitude had a significant effect on the use of email identification services in behaving securely, and Lowry and Moody (2015) found that attitude influenced compliance with organisational cyber security policies. Other examples include the study by Turkanović and Polančič (2013), which explained that attitude towards privacy and security is moving from an unaware state to greater awareness, where people try to protect themselves as best they can. In addition, a study done by Tsohou et al. (2015) confirms that biases affecting personal beliefs and attitudes influence the intention to behave securely.
We therefore hypothesise that:

H1. An employee's attitude towards cyber security is positively associated with the intention to behave securely. In other words, the more the employee values cyber security, the more they intend to behave securely.

The behavioural influence of trust
Trust in the context of this study is more accurately described as "benevolent trust"; the employees are placing significant trust in the security systems provided by the employer to protect their mutual interests (Mayer et al., 1995). Musuva et al. (2019) found that a person's perception of trustworthiness was a strong predictor of their behaviour. People who are trusting are more susceptible to becoming victims of social engineering than those who distrust.
Trust spans multiple areas, for example the use of systems, trusting that electronic communications received are authentic and, specifically in our study, the trust placed in the security controls implemented by the organisation's IT department. We adapted the TPB by replacing subjective norms with "trust", because our population consists of employees from one organisation, thus creating a semi-controlled environment. In the context of our study, we did not perceive social pressure in relation to the performance or non-performance of a certain behaviour as significant. The employees trust that the security controls implemented by the IT department would protect them regardless of what other employees were doing (Butavicius et al., 2020). This attitude could, however, lead to insecure behaviour.
Therefore, SETA must clearly state that the security tools that have been implemented only serve as a first line of defence and should not be trusted blindly. Employees need to understand that the tools cannot cater for all threats, especially zero-day vulnerabilities (Butavicius et al., 2020). Accountability for system use, whether that use is secure or insecure, cannot be abdicated. SETA should provide examples of scenarios where the security tools could fail to prevent a breach. We accordingly hypothesise that:

H2. An employee's trust in the technical cyber security controls in use is positively associated with the intention to behave securely. In other words, the more the employee trusts the implemented controls, the more they intend to behave securely.

The behavioural influence of awareness
Awareness is another factor influencing decision-making. Several studies have focused on computer users' inability to identify cyber security threats as a result of their lack of technical skills. Although SETA is believed to be the best solution for combating security attacks involving people, the desired outcome is not always achieved. Aloul's (2012) phishing audit resulted in 9% of users falling victim to the attack. SETA was subsequently conducted, and a second audit revealed a decrease from 9% to 2% of users. Abbasi et al. (2012) found that 15% of users ignored browser security toolbars warning them of phishing sites. Therefore, awareness is not the only factor that needs to be considered when dealing with people and cyber security; optimism bias should be considered as well.

A study by Li et al. (2019) showed that employees who were more aware of their company's information security policy and procedures were better equipped to behave securely. Tschakert and Ngamsuriyaroj (2019) performed SETA using various techniques, from instructor-led training to gamification, video and text-based material, and found them all to be effective in raising awareness and changing behaviour.
As discussed, although SETA does not completely mitigate insecure behaviour for all employees, it does reduce the risk of insecure behaviour. We therefore hypothesise that:

H3. An employee's awareness of cyber security is positively associated with the intention to behave securely.

The behavioural influence of optimism bias
Importantly, empirical evidence links optimism bias to behaviour. For example, White et al. (2011) found that young drivers with optimism bias rated themselves as "somewhat more" skilled than a typical young driver in terms of perceived overall and specific driving skills, while rating themselves as "somewhat less" likely to be in an accident, even though road accidents are the leading cause of death and injury among those under the age of 25 in First World countries. Optimism bias therefore has a direct influence on decision-making.
Previous research has found that some forms of optimism and self-enhancement can be reduced or at least controlled by providing people with more information (awareness) and making them more accountable for the accuracy of their predictions (Barberia et al., 2013).
A study performed by Cho et al. (2010) considered the influence of optimism bias on online privacy risks and found that unrealistically optimistic people do not respond to mere warning messages about privacy concerns, nor do they personalise the risk, seeing it rather as a risk to "others". Min and Kim (2015) also studied the effect of optimism bias on online privacy concerns and came to the same conclusion; that is, that unrealistically optimistic people engage in risky behaviour. People underestimate the risk because they believe they are immune to cyberattacks, even when others have been shown to be vulnerable.
In the TPB, perceived behavioural control refers to "people's perception of the ease or difficulty of performing the behaviour of interest" (Ajzen, 1991). In this study, we replaced perceived behavioural control with awareness as our control for secure behaviour. Furthermore, we believe that there is a significant difference in behaviour between the unrealistically optimistic group and the average group. Unrealistically optimistic people frequently believe that behaving securely requires no effort on their part, since they will not be placed in a threat situation. Thus, we hypothesise that:

H4a. Employees' attitudes towards cyber security differ significantly between the unrealistically optimistic group and the average group.

H4b. Employees' trust is positively associated with cyber security and differs significantly between the unrealistically optimistic group and the average group.

H4c. Employees' awareness of cyber security is positively associated with the intention to behave securely and differs significantly between the unrealistically optimistic group and the average group.

Methodology
We used a cross-sectional survey to inform this study (Saunders et al., 2016). Ethical clearance was obtained from both the university ethical standards committee and the financial services organisation where the primary data was collected. Respondents were informed that their results would be used in a research study and that they could opt out at any stage.

Respondent population and sample
The respondents were representative of the general population of employees (n = 775) in the financial services sector. This company was selected because an audit finding had identified insufficient SETA, placing the company at risk. The company's IT is managed as an internal function consisting of infrastructure maintenance, end-user computing, custom software development and BI, and network and security functions. SETA is offered through the company's learner management system (LMS), and weekly topical security emails remind staff to remain vigilant. Respondents' qualifications and experience differ significantly from each other, as the roles included call centre agents, accountants, IT professionals, attorneys and general admin staff. Since this is a financial company, the risk resulting from a security breach would be significant, as it could impact millions of customers. Systematic random sampling based on employee email addresses was used in the selection process. The email addresses were sorted alphabetically and every third employee was selected. Respondents included employees from all levels within the organisation, such as call centre consultants, administration clerks, staff employed in the legal, IT and human capital departments, and senior management.
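The sampling step described above can be sketched as follows. This is a minimal illustration only; the helper name and the example addresses are hypothetical, and only the "sort alphabetically, then take every third entry" logic comes from the text.

```python
# Sketch of the systematic sampling procedure: sort the email addresses
# alphabetically, then select every third one. All addresses are made up.

def systematic_sample(emails, interval=3):
    """Sort the sampling frame alphabetically and return every
    `interval`-th address (i.e. positions interval, 2*interval, ...)."""
    ordered = sorted(emails)
    return ordered[interval - 1::interval]

staff = ["zoe@example.com", "amy@example.com", "ben@example.com",
         "cara@example.com", "dan@example.com", "eli@example.com"]
print(systematic_sample(staff))
```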

Data collection and screening
The questionnaire was distributed using the organisation's web-based learning management system (LMS) and completion of the questionnaire was voluntary. Data was collected from 257 respondents. Employees were encouraged by management to complete the survey in light of a previous audit finding highlighting the importance of SETA. Owing to a combination of incomplete data input and validity concerns, 31 responses were dropped. Subsequently, the data obtained from the remaining 226 respondents was used for further analysis. Our research instrument was largely adapted from previous research. We used certain items from the Human Aspects of Information Security Questionnaire (HAIS-Q). HAIS-Q was developed to measure information security threats triggered by employees. The questionnaire focuses on individuals' knowledge and attitudes towards policies and procedures relating to their behaviour when using a work computer (Parsons et al., 2017). Appendix Table A1 provides a complete outline of the items and the associated descriptive statistics that comprised the research instrument for this study.

Classifying employees as unrealistic optimistic or average
We used 11 optimism bias items (see Appendix Table A2) to determine which respondents were unrealistic optimists and which were average, based on the study performed by Weinstein (1980). The life events that were included, both positive and negative, had to be relevant to all respondents. Respondents were asked to rate themselves in comparison to their peers in terms of the optimism bias questions. For positive events, optimism is defined as believing that one's chances are better than average, whereas pessimism is defined as believing that one's chances are worse than average. These definitions of optimistic and pessimistic responses are swapped for negative events (Weinstein, 1980). Accordingly, respondents had to complete all the optimism bias questions to be categorised as either unrealistic optimists or average. In our sample, 49% of respondents were classified as unrealistic optimists.
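The scoring logic just described can be sketched as below. This is a hedged illustration of Weinstein-style comparative scoring, not the study's exact coding: the rating scale, the event names and the classification threshold are all assumptions made for demonstration.

```python
# Illustrative Weinstein-style scoring. Ratings are comparative judgements
# relative to peers (assumed scale: -2 = much below average ... +2 = much
# above average). The sign is reversed for negative events so that a higher
# total always indicates a more optimistic outlook. Events, scale and
# threshold are hypothetical.

def optimism_score(ratings, negative_events):
    """Sum comparative ratings, reversing the sign for negative life events."""
    return sum(-r if event in negative_events else r
               for event, r in ratings.items())

def classify(ratings, negative_events, threshold=0):
    """Label a respondent an unrealistic optimist if the total comparative
    score leans optimistic beyond the (assumed) threshold."""
    score = optimism_score(ratings, negative_events)
    return "unrealistic optimist" if score > threshold else "average"

ratings = {"living past 80": 2, "job aptitude": 1,
           "car accident": -2, "terminal illness": -1}
negative = {"car accident", "terminal illness"}
print(classify(ratings, negative))
```

Note how rating one's chance of a car accident as well below average counts towards optimism, mirroring the swapped definitions for negative events.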

Data analysis and results
The primary data was analysed by way of PLS path modelling, which is a suitable approach when the data violates distributional assumptions (Hair et al., 2019).All the items with outer loadings in excess of 0.7 were kept and formed part of the resultant multivariate analysis.
4.4.1 Evaluating the measurement model. As part of this evaluation, we assessed both convergent and discriminant validity. To assess convergent validity, all the constructs and their respective items were investigated to establish whether their factor loadings exceeded 0.5. We also inspected the significance of the outer loadings of the items using their associated t-statistic values. Accordingly, the outer loading of each item was found to be in excess of 0.5. Additionally, all the constructs exhibited average variance extracted (AVE) values in excess of 0.5. Together, these findings satisfy the criteria for convergent validity.
Discriminant validity was assessed in three ways. First, we used the Fornell-Larcker criterion to evaluate whether the square root of the AVE value of each construct was greater than all the correlation coefficients among the measurement model constructs (Fornell and Larcker, 1981). Appendix Table A3 presents these square root values on the diagonal. We also inspected the cross-loadings to ensure that the items of each construct loaded highest on their own construct. As a third means of assessing discriminant validity, we inspected the mean heterotrait-monotrait (HTMT) ratio values, which were all below the theoretical threshold of 0.9 (Henseler et al., 2014; Kline, 2011). Together, these tests confirm that the measurement model is valid from both a convergent and a discriminant perspective. Because we performed a PLS-based analysis, we also assessed the measurement model for any signs of multicollinearity (Hair et al., 2019; Lowry and Gaskin, 2014). All the variance inflation factor (VIF) values were below 3, thus ruling out multicollinearity (Craney and Surles, 2002; García et al., 2015; Hair et al., 2010). See Appendix Table A1 for a complete outline of these VIF values. As a final means of evaluating the measurement model, we also assessed the reliability of the questionnaire by calculating values for both composite reliability (CR) and Cronbach's alpha (CA). In general, the reliability of a questionnaire attests to whether it would produce the same results if it were to be administered again (Cronbach, 1951). Both the CA and CR values were in excess of the accepted threshold of 0.7 (Tavakol and Dennick, 2011), as presented in Appendix Table A3.
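As a sketch of the reliability check, Cronbach's alpha follows the standard formula α = k/(k−1) · (1 − Σ item variances / variance of respondent totals). The item scores below are fabricated Likert responses for illustration and are not the study's data.

```python
# Cronbach's alpha for one construct:
# alpha = k/(k-1) * (1 - sum(item variances) / variance(per-respondent totals)).
# The three items' scores are made-up 5-point responses from five respondents.
import statistics

def cronbach_alpha(items):
    """items: one list of scores per questionnaire item (equal lengths)."""
    k = len(items)
    item_var_sum = sum(statistics.pvariance(scores) for scores in items)
    totals = [sum(resp) for resp in zip(*items)]  # per-respondent totals
    return k / (k - 1) * (1 - item_var_sum / statistics.pvariance(totals))

items = [[4, 5, 3, 4, 5],
         [4, 4, 3, 5, 5],
         [5, 5, 2, 4, 4]]
print(round(cronbach_alpha(items), 3))  # here approximately 0.81, above the 0.7 threshold
```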
4.4.2 Evaluating the structural models. To evaluate the models from a structural perspective, we made use of PLS path modelling, in particular the path coefficients, predictive power (R²), effect sizes (f²) and the out-of-sample predictive relevance (Stone-Geisser's Q²). A summarised version of the values relating to the global model is illustrated in Figure 1. From the results illustrated in Figure 1, it is clear that there is support for H1, H2 and H3. Additionally, the global model (both the unrealistically optimistic and the average individuals) accounts for 67.7% (adjusted R² = 0.677) of the variance in the target construct (intention to behave securely). The out-of-sample predictive relevance equals 56.8% (Q² = 0.568) (Geisser and Eddy, 1979; Hair et al., 2017; Stone, 1974). In the bootstrapped significance testing (with 100 subsamples), attitude was shown to exert a significant influence on intention to behave securely (β = 0.452, p < 0.01), which supports the first hypothesis (H1). The results also indicate that trust significantly influences intention to behave securely (β = 0.099, p < 0.05), which provides support for the second hypothesis (H2).
In addition to the significance testing that was conducted, we also assessed the relative impact of each independent variable by inspecting its effect size using Cohen's guidelines for f² (Cohen, 1988). Of all the latent constructs in the global model, attitude exhibited the largest effect size (0.363, a large effect) on intention to behave securely. This was followed by ICS awareness (0.103, a small effect) and finally trust (0.023, the smallest effect). This indicates that an individual's attitude towards cyber security is pivotal when explaining their intentions to act securely within the workplace.
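The f² effect sizes reported above follow Cohen's formula f² = (R²_included − R²_excluded) / (1 − R²_included). The sketch below applies it with hypothetical R² values; only the formula and the conventional 0.02/0.15/0.35 cut-offs come from Cohen (1988).

```python
# Cohen's f-squared for one predictor: compare model R^2 with the predictor
# included versus excluded. The example R^2 values are hypothetical.

def f_squared(r2_included, r2_excluded):
    return (r2_included - r2_excluded) / (1 - r2_included)

def cohen_label(f2):
    """Conventional f-squared interpretation bands (Cohen, 1988)."""
    if f2 >= 0.35:
        return "large"
    if f2 >= 0.15:
        return "medium"
    if f2 >= 0.02:
        return "small"
    return "negligible"

f2 = f_squared(0.677, 0.560)  # e.g. model R^2 with vs without one predictor
print(round(f2, 3), cohen_label(f2))
```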
4.4.3 PLS multigroup analysis (unrealistically optimistic vs average group). Because the results of the global model did not allow us to infer that a significant difference exists between those individuals who are optimistically biased and those who are not, we also conducted a PLS-based multigroup analysis. Although we could have employed a repeated application of unpaired t-tests (or a permutation test, for that matter), it is important to note the shortcomings of these techniques. First, these tests do not take the whole model into account (Klesel et al., 2019). This is especially pertinent within exploratory contexts, where researchers are not just interested in significant differences between path coefficients. Second, unpaired t-tests are particularly sensitive to the distribution of the sample, something that does not hamper PLS path modelling.
As with the significance tests conducted for the global model, we also performed a bootstrapped significance test. Note that PLS-based multigroup analyses are nonparametric and are largely based on specific group parameter estimates. These include model aspects such as outer weights and loadings, as well as path coefficients (Sarstedt et al., 2011). From the results presented in Table 1 below, it is apparent that all the relationships illustrated in Figure 1 show significant differences when the "unrealistic optimists" model is compared with the "average" model. In other words, there are significant differences in attitude, trust and level of awareness when comparing those individuals classified as unrealistic optimists with those classified as average. Importantly, these multigroup results provide support for the fourth hypothesis (H4a, b and c). Refer to Appendix Table A4 for a summary of all models that comprise this study.
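The core logic of such a nonparametric group comparison can be illustrated with a toy permutation test. Here a simple least-squares slope stands in for a PLS path coefficient, and the data and group labels are fabricated; this is only a sketch of the resampling idea, not the actual PLS multigroup procedure used in the study.

```python
# Toy permutation test for a group difference in a "path coefficient".
# A regression slope stands in for the PLS estimate; each permutation
# reshuffles the (x, y) pairs across the two groups and recomputes the
# slope difference, building a reference distribution for the observed one.
import random

def slope(points):
    """Ordinary least-squares slope through a list of (x, y) pairs."""
    xs, ys = zip(*points)
    mx, my = sum(xs) / len(xs), sum(ys) / len(ys)
    num = sum((x - mx) * (y - my) for x, y in points)
    den = sum((x - mx) ** 2 for x in xs)
    return num / den

def permutation_p(group_a, group_b, n_perm=1000, seed=0):
    """Proportion of label reshuffles whose slope difference is at least
    as extreme as the observed difference (an approximate p-value)."""
    observed = abs(slope(group_a) - slope(group_b))
    pooled = list(group_a) + list(group_b)
    rng = random.Random(seed)
    extreme = 0
    for _ in range(n_perm):
        rng.shuffle(pooled)
        a, b = pooled[:len(group_a)], pooled[len(group_a):]
        if abs(slope(a) - slope(b)) >= observed:
            extreme += 1
    return extreme / n_perm

optimists = [(1, 1.1), (2, 2.0), (3, 3.2), (4, 3.9), (5, 5.1)]  # steep slope
average = [(1, 0.4), (2, 0.3), (3, 0.9), (4, 0.7), (5, 1.2)]    # shallow slope
p = permutation_p(optimists, average)
print(p)  # a small p suggests the two groups' coefficients genuinely differ
```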

Discussion
The focus of this study was to gain a better understanding of why certain employees are more susceptible to phishing than others. Specifically, we examined (1) how attitude, trust and awareness influenced the intention to behave securely, and (2) the effect of unrealistic optimism on attitude, trust, awareness and secure behaviour. From the results, the following key findings are noteworthy. First, attitude had a significant influence on the intention to behave securely in the workplace. Second, trust plays a significant role in how employees behave in the workplace, as they rely on the security controls implemented to protect them. Third, awareness leads to more secure behaviour but does not completely eradicate insecure behaviour. These findings give us a better understanding of which employees are more likely to behave insecurely.

Contributions
Our theoretical contribution enhances the understanding of secure behaviour when confronted with phishing attacks by considering a cognitive bias, namely optimism bias. Our model (Figure 2) proposes that optimism bias influences the intention to behave securely. Our model contributes to the IS literature by providing a better understanding of why certain employees are more susceptible to insecure behaviour.
Our contribution to practice is that SETA programmes need to be adapted to cater for unrealistically optimistic people.SETA should contain a section intended to make employees aware of whether they are unrealistically optimistic and therefore what the likely outcome of their behaviour could look like.
The effectiveness of a SETA programme is influenced by how it is presented.The use of interactive methods to engage participation during training and being able to test the subject matter after the training are critical.SETA is not a once-off event but an ongoing programme, focusing on current threats and scenarios so that employees can relate to and correctly respond to these threats (Bauer et al., 2017).
A false sense of security is created in organisations with a dedicated IT department that manages the end-user environment by implementing restrictive administration controls and demonstrating to users that "big brother" is watching. This approach encourages employees to blindly trust that these controls will protect them regardless of their behaviour (Butavicius et al., 2020). Instead, there is a need to educate employees through SETA programmes, so that they will understand that the controls which have been implemented could fail, and that the responsibility for behaving securely can never be placed solely on systems.
We compared the unrealistically optimistic group with the average group and found that attitude had the largest effect on the intention to behave securely. The "it won't happen to me" attitude among the unrealistic optimists significantly influenced their trust and, ultimately, their behaviour (see Appendix Table A4). This discrepancy sheds light on the psychological underpinnings of secure behaviour. Unrealistically optimistic individuals, driven by the belief that security incidents will not affect them personally, exhibit a distinct attitude. Their perception that they are immune to threats influences their intention to behave securely. Importantly, this is not a mere statistical observation but a robust indication that the attitude-intention relationship differs significantly between the two groups. For cyber security practitioners, addressing this attitude gap is imperative. Security awareness programmes should be tailored to challenge this overly optimistic outlook, emphasising that anyone, regardless of perceived invulnerability, is susceptible to threats. These results underscore the need for interventions that reshape the optimistic group's attitude by debunking the notion that security breaches are improbable for them.
As indicated in Table 1, our multigroup analysis also found a significant difference in trust towards the intention to behave securely between the two groups. This outcome implies that trust in security controls varies between unrealistically optimistic and average individuals. Unrealistically optimistic individuals may disproportionately rely on security measures, fostering a (potentially) misguided trust in their effectiveness. The nuanced statistical significance emphasises that this distinction is not arbitrary but indicative of a meaningful disparity in how trust influences the intention to behave securely. For security practitioners, clarifying the limitations of such security controls is therefore paramount. While trust is generally positive, an unwarranted trust that disregards personal responsibility (e.g. to act securely) might lead to negligent security practices. For this reason, we argue that education efforts should emphasise that security controls are part of a broader strategy where individuals need to actively contribute to safeguarding sensitive company information.
Our multigroup results also indicate that there is a significant difference between these groups when it comes to the relationship between awareness and their intention to behave securely. This highlights that awareness plays a role in shaping intentions differently for unrealistically optimistic and average individuals. While both groups benefit from awareness, the optimistic bias might influence how effectively security knowledge is integrated into behaviour. This result underscores the need for nuanced interventions in awareness programmes. Unrealistically optimistic individuals may benefit from tailored content that not only imparts security knowledge but also addresses their specific cognitive biases. Fostering a realistic understanding of security threats is essential for translating awareness into proactive and secure behaviour.
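The multigroup comparisons above rest on testing whether a path coefficient differs significantly between the unrealistically optimistic and average groups. As a minimal illustrative sketch only (a simple permutation test on an ordinary least-squares slope, not the PLS-MGA procedure applied in this study), the logic of such a between-group comparison can be expressed as follows; the data, group labels and parameters are hypothetical:

```python
import numpy as np


def slope(x, y):
    """Ordinary least-squares slope of y on x (single predictor)."""
    xc = x - x.mean()
    return float(np.dot(xc, y - y.mean()) / np.dot(xc, xc))


def permutation_group_diff(x, y, groups, n_perm=2000, seed=0):
    """Permutation test for a difference in the x -> y slope between two groups.

    `groups` is a boolean array marking membership of one group. Group
    labels are repeatedly shuffled to build a null distribution of the
    between-group slope difference. Returns (observed_diff, p_value).
    """
    rng = np.random.default_rng(seed)
    g = np.asarray(groups, dtype=bool)
    observed = slope(x[g], y[g]) - slope(x[~g], y[~g])
    extreme = 0
    for _ in range(n_perm):
        p = rng.permutation(g)  # reassign group labels at random
        diff = slope(x[p], y[p]) - slope(x[~p], y[~p])
        if abs(diff) >= abs(observed):
            extreme += 1
    # add-one correction keeps the p-value strictly positive
    return observed, (extreme + 1) / (n_perm + 1)
```

A small p-value would indicate that the strength of, say, the attitude-intention relationship genuinely differs between the two groups rather than varying by chance, which mirrors the interpretation of the multigroup results reported above.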

Limitations and future research
Since data was collected from a single organisation, the effect of different cultures on optimism bias needs to be studied further. Chang et al. (2001) compared optimism bias in American and Japanese people, finding that while both groups had an optimistic bias towards negative life experiences, the Japanese had a pessimistic bias for favourable life occurrences. In addition, variances in organisational culture influence the behaviour of users as they follow the example of top management (Hu et al., 2012). Culture plays a significant role in human behaviour, and future research is needed to address this vital theme.
The organisation in which the data was collected has implemented strong security controls.Employees are aware of these controls and have seen in the past how they have successfully stopped certain threats.However, this could lead to a false sense of benevolence trust, creating a careless attitude towards behaving securely.Future research should be done in an organisation with fewer internal controls to see if this would alter the behaviour of unrealistic optimists.

Conclusion
The objective of this study was to understand how a cognitive bias called optimism bias makes certain individuals more susceptible to phishing attacks. Humans remain vulnerable to cyber security breaches, which is clearly seen in the increasing number of successful phishing attacks. We found that one's attitude, security awareness and trust are critical in terms of the intention to behave securely.
Interestingly, there was a significant difference between the average and unrealistically optimistic employee groups in relation to attitude, awareness and trust in terms of the intention to behave securely (Figure 2). A crucial difference in the attitude of an unrealistically optimistic employee is the perception that a security breach won't happen to them. As a result, we have found that unrealistic optimists are more susceptible to security threats. These employees need to be reminded that they cannot blindly trust systems, that not all requests are legitimate, and above all, that they are not immune to security threats.

Appendix: Optimism bias questionnaire

Response scale (multiple choice): 100% less (no chance), 80% less, 60% less, 40% less, 20% less, 10% less, average, 10% more, 20% more, 40% more, 60% more, 80% more, 100% more, 3 times average, 5 times average.

Compared to other employees, what do you think are the chances that the following events will happen to you?
1. Your work will be recognised with an award.
2. You expect to live past the age of 80 years old.
3. You may have contemplated suicide.
4. You expect to read about your achievements in the newspaper.
5. You don't expect to spend a night in hospital in the next five years.
6. You are likely to contract a venereal disease.
7. Your weight will remain constant for the next five years.
8. You will probably have a heart attack before the age of 60 years old.
9. You won't fall ill next winter.
10. You are likely to take an unattractive job.
11. You are likely to be fired from a job.

Source: Created by the author
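For illustration, the comparative-likelihood scale above can be scored numerically. The mapping of response options to values, the set of negatively framed items, and the averaging rule below are our own assumptions for this sketch, not the study's published scoring procedure:

```python
# Hypothetical scoring of a Weinstein-style comparative optimism scale.
# The numeric mapping below is an assumption: "average" scores 0, and the
# "less"/"more" options scale symmetrically around it.
SCALE = {
    "100% less": -1.0, "80% less": -0.8, "60% less": -0.6, "40% less": -0.4,
    "20% less": -0.2, "10% less": -0.1, "average": 0.0, "10% more": 0.1,
    "20% more": 0.2, "40% more": 0.4, "60% more": 0.6, "80% more": 0.8,
    "100% more": 1.0, "3 times average": 2.0, "5 times average": 4.0,
}

# Items describing negative events (assumed: 3, 6, 8, 10, 11) are
# reverse-scored, so rating a bad outcome as less likely than average
# counts toward comparative optimism.
NEGATIVE_ITEMS = {3, 6, 8, 10, 11}


def optimism_score(responses):
    """Mean comparative-likelihood score across the questionnaire items.

    `responses` maps item number (1-11) to one of the SCALE keys.
    Positive scores indicate comparative optimism; zero means the
    respondent rates themselves as average on every item.
    """
    total = 0.0
    for item, answer in responses.items():
        value = SCALE[answer]
        total += -value if item in NEGATIVE_ITEMS else value
    return total / len(responses)
```

A respondent who answers "average" throughout scores 0, whereas rating positive events as more likely and negative events as less likely pushes the score upward, which is the pattern this study treats as unrealistic optimism.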

About the authors
Morné Owen holds a N.Dip: IT, B.Tech: IT, a Master's in Business Information Systems and a PhD degree. His research interests lie in behavioural cyber security. Morné is the Chief Information Officer for a financial services company. Morné Owen is the corresponding author and can be contacted at: morne.owen@gmail.com
Stephen V. Flowerday is a Professor in the School of Cyber Studies at the University of Tulsa. His research interests lie in cybersecurity, behavioural information security, and information security management. Over the last sixteen years, he has authored and co-authored more than 120 refereed publications. His work has been funded by IBM, THRIP, NRF, SASUF, ERASMUS, GMRDC, and others.
Karl van der Schyff holds a BSc, MSc, and PhD degree. His fields of interest lie in behavioural information security, information privacy, and cyberpsychology. He has authored and co-authored several refereed publications, in addition to reviewing for publications within the senior scholars' basket of IS journals, such as the Journal of the Association for Information Systems (JAIS).

Table A4. Summary of all models