Search results

1 – 10 of over 6000
Article
Publication date: 15 September 2023

Richard G. Mathieu and Alan E. Turovlin

Cyber risk has significantly increased over the past twenty years. In many organizations, data and operations are managed through a complex technology stack underpinned by an…

Abstract

Purpose

Cyber risk has significantly increased over the past twenty years. In many organizations, data and operations are managed through a complex technology stack underpinned by an Enterprise Resource Planning (ERP) system such as SAP (Systemanalyse Programmentwicklung). The ERP environment alone can be overwhelming for a typical ERP Manager; coupled with the cybersecurity issues that arise, it creates periods of intense time pressure, stress and workload, increasing risk to the organization. This paper aims to identify a pragmatic approach to prioritizing vulnerabilities for the ERP Manager.

Design/methodology/approach

Applying attention-based theory, a pragmatic approach is developed to prioritize an organization’s response to the National Institute of Standards and Technology (NIST) National Vulnerability Database (NVD) vulnerabilities using a Classification and Regression Tree (CART).
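As a rough illustration of the kind of CART split this approach relies on (a minimal sketch, not the authors' actual model; the CVSS scores and priority labels below are hypothetical), a single Gini-impurity split can already separate vulnerabilities into priority groups:

```python
# Minimal sketch of a CART-style split on CVSS-like scores (hypothetical data).
# A real analysis would use full NVD records and a complete tree.

def gini(labels):
    """Gini impurity of a list of binary class labels."""
    n = len(labels)
    if n == 0:
        return 0.0
    p1 = sum(labels) / n
    return 1.0 - p1 ** 2 - (1.0 - p1) ** 2

def best_split(scores, labels):
    """Find the score threshold that minimises weighted Gini impurity."""
    best = (None, float("inf"))
    for t in sorted(set(scores)):
        left = [l for s, l in zip(scores, labels) if s <= t]
        right = [l for s, l in zip(scores, labels) if s > t]
        w = (len(left) * gini(left) + len(right) * gini(right)) / len(labels)
        if w < best[1]:
            best = (t, w)
    return best

# Hypothetical (CVSS base score, needs-priority-attention) pairs
scores = [9.8, 7.5, 5.3, 3.1, 8.8, 4.3]
labels = [1, 1, 0, 0, 1, 0]
threshold, impurity = best_split(scores, labels)
print(threshold, impurity)  # → 5.3 0.0
```

A full CART repeats this split recursively on each resulting subset, which is what yields the finer-grained prioritization the abstract refers to.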

Findings

The application of classification and regression trees (CART) to the National Institute of Standards and Technology's National Vulnerability Database identifies a prioritization that is unavailable within NIST's categorization.

Practical implications

The ERP Manager's role sits at the intersection of technology, functionality, centralized control and organizational data. Without CART, vulnerability handling is left to a reactive approach, subject to overwhelming situations caused by intense time pressure, stress and workload.

Originality/value

To the best of the authors’ knowledge, this work is original and has not been published elsewhere, nor is it currently under consideration for publication elsewhere. CART has not previously been applied to the prioritization of cybersecurity vulnerabilities.

Details

Information & Computer Security, vol. 31 no. 5
Type: Research Article
ISSN: 2056-4961


Book part
Publication date: 30 January 2023

Anastasija Nikiforova, Artjoms Daskevics and Otmane Azeroual

Nowadays, there are billions of interconnected devices forming Cyber-Physical Systems (CPS), Internet of Things (IoT) and Industrial Internet of Things (IIoT) ecosystems. With an…

Abstract

Nowadays, there are billions of interconnected devices forming Cyber-Physical Systems (CPS), Internet of Things (IoT) and Industrial Internet of Things (IIoT) ecosystems. With the increasing number of devices and systems in use, and the growing amount and value of data, the risk of security breaches increases. One such risk is posed by open data sources: databases that are not properly protected. These poorly protected databases are accessible to external actors, which poses a serious risk to the data holder and to the results of data-related activities such as analysis, forecasting, monitoring, decision-making and policy development, and to contemporary society as a whole. This chapter examines the state of the security of open data databases, covering both relational and NoSQL databases, with a particular focus on the latter category.

Details

Big Data and Decision-Making: Applications and Uses in the Public and Private Sector
Type: Book
ISBN: 978-1-80382-552-6


Article
Publication date: 5 March 2018

Baidyanath Biswas and Arunabha Mukhopadhyay

Malicious attackers frequently breach information systems by exploiting disclosed software vulnerabilities. Knowledge of these vulnerabilities over time is essential to decide the…

Abstract

Purpose

Malicious attackers frequently breach information systems by exploiting disclosed software vulnerabilities. Knowledge of these vulnerabilities over time is essential to decide the use of software products by organisations. The purpose of this paper is to propose a novel G-RAM framework for business organisations to assess and mitigate risks arising out of software vulnerabilities.

Design/methodology/approach

The G-RAM risk assessment module uses GARCH to model vulnerability growth. Using 16 years of data spanning 1999-2016 from the National Vulnerability Database, the authors estimate the model parameters and validate the prediction accuracy. Next, the G-RAM risk mitigation module designs an optimal software portfolio using Markowitz’s mean-variance optimisation for a given IT budget and preference.

Findings

Based on an empirical analysis, this study establishes that vulnerability follows a non-linear, time-dependent, heteroskedastic growth pattern. Further, efficient software combinations are proposed that optimise correlated risk. The study also reports the empirical evidence of a shift in efficient frontier of software configurations with time.

Research limitations/implications

The existing assumption of independent and identically distributed residuals after vulnerability-function fitting is incorrect. This study applies the GARCH technique to measure volatility clustering and mean reversion. The risk (or volatility), represented by the instantaneous variance, depends both on the immediately preceding variance and on the unconditional variance of the entire vulnerability growth process.
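The variance dependence described above is the GARCH(1,1) recursion, which can be sketched in a few lines (the parameters below are hypothetical, not the paper's fitted values):

```python
# Sketch of the GARCH(1,1) conditional-variance recursion:
#   sigma2[t] = omega + alpha * eps[t-1]**2 + beta * sigma2[t-1]
# i.e. today's variance depends on yesterday's shock and yesterday's variance.

def garch_variance(residuals, omega=0.1, alpha=0.2, beta=0.7):
    """Return the conditional variance series for a residual series."""
    # Start at the long-run (unconditional) variance: omega / (1 - alpha - beta)
    sigma2 = [omega / (1.0 - alpha - beta)]
    for eps in residuals[:-1]:
        sigma2.append(omega + alpha * eps ** 2 + beta * sigma2[-1])
    return sigma2

eps = [0.5, -1.2, 0.3, 2.0, -0.4]
print([round(v, 3) for v in garch_variance(eps)])
# → [1.0, 0.85, 0.983, 0.806, 1.464]
```

The large shock at t = 3 raises the variance at t = 4, which then decays back toward the long-run level: that is the volatility clustering and mean reversion the study measures.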

Practical implications

The volatility-based estimation of vulnerability growth is a risk assessment mechanism, while the portfolio analysis acts as a risk mitigation activity. Results from this study can inform the patch management cycle needed for each software product – individual or group patching. G-RAM also ranks software into a 2×2 risk-return matrix to ensure that correlated risk is diversified. Finally, the paper helps business firms decide what to purchase and what to avoid.

Originality/value

Contrary to the existing techniques which either analyse with statistical distributions or linear econometric methods, this study establishes that vulnerability growth follows a non-linear, time-dependent, heteroskedastic pattern. The paper also links software risk assessment to IT governance and strategic business objectives. To the authors’ knowledge, this is the first study in IT security to examine and forecast volatility, and further design risk-optimal software portfolios.

Details

Journal of Enterprise Information Management, vol. 31 no. 2
Type: Research Article
ISSN: 1741-0398


Article
Publication date: 11 September 2017

Brenden Kuerbis and Farzaneh Badiei

There is growing contestation between states and private actors over cybersecurity responsibilities, and its governance is ever more susceptible to nationalization. The authors…


Abstract

Purpose

There is growing contestation between states and private actors over cybersecurity responsibilities, and its governance is ever more susceptible to nationalization. The authors believe these developments are based on an incomplete picture of how cybersecurity is actually governed in practice and theory. Given this disconnect, this paper aims to provide a cohesive understanding of the cybersecurity institutional landscape.

Design/methodology/approach

Drawing from institutional economics and using extensive desk research, the authors develop a conceptual model and broadly sketch the activities and contributions of market, networked and hierarchical governance structures and analyze how they interact to produce and govern cybersecurity.

Findings

Analysis shows robust market and networked governance structures and a more limited role for hierarchical structures. Ex ante efforts to produce cybersecurity using purely hierarchical governance structures, even when buttressed with support from networked governance structures, struggle without market demand, as in the case of secure internet identifiers. By contrast, ex post efforts like botnet mitigation, route monitoring and other activities involving information sharing seem to work under a variety of combinations of governance structures.

Originality/value

The authors’ conceptual framework and observations offer a useful starting point for unpacking how cybersecurity is produced and governed; ultimately, we need to understand if and how these governance structure arrangements actually impact variation in observed levels of cybersecurity.

Details

Digital Policy, Regulation and Governance, vol. 19 no. 6
Type: Research Article
ISSN: 2398-5038


Article
Publication date: 6 February 2007

Theodore Tryfonas, Iain Sutherland and Ioannis Pompogiatzis

The purpose of this paper is to discuss and amalgamate information security principles, and legal and ethical concerns that surround security testing and components of generic…


Abstract

Purpose

The purpose of this paper is to discuss and amalgamate information security principles, and legal and ethical concerns that surround security testing and components of generic security testing methodologies that can be applied to Voice over Internet Protocol (VoIP), in order to form an audit methodology that specifically addresses the needs of this technology.

Design/methodology/approach

Information security principles, the legal and ethical concerns that surround security testing, and components of generic security testing methodologies applicable to VoIP are amalgamated. A simple model of a business infrastructure (core network) for the delivery of enterprise VoIP services is created, and the selected tests are applied through a methodically structured action plan.

Findings

The main output of this paper is a detailed, documented testing plan (audit programme) for the security review of a core VoIP enterprise network infrastructure, along with a list of recommendations for good testing practice derived from the testing experience during the methodology evaluation stage.

Research limitations/implications

The methodology does not currently extend to testing the business operation issues of VoIP telephony, such as revenue assurance or toll fraud detection.

Practical implications

This approach facilitates the conduct of security reviews and auditing in a VoIP infrastructure.

Originality/value

VoIP requires appropriate security testing before its deployment in a commercial environment. A key factor is the security of the underlying data network. If the business value of adopting VoIP is considered, the potential impact of a related security incident becomes clear. This highlights the need for a coherent security framework that includes means for security reviews, risk assessments, and influencing design and deployment. The approach presented here can meet that requirement.

Details

Internet Research, vol. 17 no. 1
Type: Research Article
ISSN: 1066-2243


Article
Publication date: 11 October 2011

Hannes Holm, Teodor Sommestad, Jonas Almroth and Mats Persson

The purpose of this paper is to evaluate if automated vulnerability scanning accurately identifies vulnerabilities in computer networks and if this accuracy is contingent on the…


Abstract

Purpose

The purpose of this paper is to evaluate if automated vulnerability scanning accurately identifies vulnerabilities in computer networks and if this accuracy is contingent on the platforms used.

Design/methodology/approach

Both qualitative comparisons of functionality and quantitative comparisons of false positives and false negatives are made for seven different scanners. The quantitative assessment includes data from both authenticated and unauthenticated scans. Experiments were conducted on a computer network of 28 hosts with various operating systems, services and vulnerabilities. This network was set up by a team of security researchers and professionals.
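The quantitative part of such a comparison reduces to set arithmetic between each scanner's findings and the known ground-truth vulnerabilities of the test network. A minimal sketch (the CVE identifiers and counts below are hypothetical, not the study's data):

```python
# Compare a scanner's reported findings against a ground-truth vulnerability set.

def accuracy_metrics(reported, actual):
    """Compute true/false positives and false negatives against ground truth."""
    tp = len(reported & actual)   # correctly identified vulnerabilities
    fp = len(reported - actual)   # false alarms
    fn = len(actual - reported)   # missed vulnerabilities
    detection_rate = tp / len(actual)
    return {"tp": tp, "fp": fp, "fn": fn, "detection_rate": detection_rate}

actual = {"CVE-2010-0001", "CVE-2010-0002", "CVE-2010-0003", "CVE-2010-0004"}
reported = {"CVE-2010-0001", "CVE-2010-0002", "CVE-2010-0005"}  # one false alarm

print(accuracy_metrics(reported, actual))
# → {'tp': 2, 'fp': 1, 'fn': 2, 'detection_rate': 0.5}
```

Running this per scanner, for both authenticated and unauthenticated scans, gives the kind of table the study bases its conclusions on.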

Findings

The data collected in this study show that authenticated vulnerability scanning is usable. However, automated scanning is not able to accurately identify all vulnerabilities present in computer networks. Also, scans of hosts running Windows are more accurate than scans of hosts running Linux.

Research limitations/implications

This paper focuses on the direct output of automated scans with respect to the vulnerabilities they identify. Areas such as how to interpret the results assessed by each scanner (e.g. regarding remediation guidelines) or aggregating information about individual vulnerabilities into risk measures are out of scope.

Practical implications

This paper describes how well automated vulnerability scanners perform when it comes to identifying security issues in a network. The findings suggest that a vulnerability scanner is a usable tool to have in your security toolbox, provided that user credentials are available for the hosts in your network. However, manual effort is needed to complement automated scanning in order to achieve satisfactory accuracy regarding network security problems.

Originality/value

Previous studies have focused on the qualitative aspects of vulnerability assessment. This study presents a quantitative evaluation of seven of the most popular vulnerability scanners available on the market.

Details

Information Management & Computer Security, vol. 19 no. 4
Type: Research Article
ISSN: 0968-5227


Article
Publication date: 13 March 2017

Hannes Holm and Teodor Sommestad

It is often argued that the increased automation and availability of offensive cyber tools has decreased the skill and knowledge required by attackers. Some say that all it takes…

Abstract

Purpose

It is often argued that the increased automation and availability of offensive cyber tools has decreased the skill and knowledge required by attackers. Some say that all it takes to succeed with an attack is to follow some instructions and push some buttons. This paper aims to test this idea empirically through live exploits and vulnerable machines in a cyber range.

Design/methodology/approach

The experiment involved 204 vulnerable machines in a cyber range. Exploits were chosen based on the results of automated vulnerability scanning. Each exploit was executed following a set of carefully planned actions that enabled reliable tests. A total of 1,223 exploitation attempts were performed.

Findings

A mere eight exploitation attempts succeeded, all involving the same exploit module (ms08_067_netapi). It is concluded that server-side attacks are still too complicated for novices who lack the skill or knowledge to tune their attacks.

Originality/value

This paper presents the largest conducted test of exploit effectiveness to date. It also presents a sound method for reliable tests of exploit effectiveness (or system vulnerability).

Details

Information & Computer Security, vol. 25 no. 1
Type: Research Article
ISSN: 2056-4961


Article
Publication date: 22 October 2019

Navneet Bhatt, Adarsh Anand and Deepti Aggrawal

The purpose of this paper is to provide a mathematical framework to optimally allocate resources required for the discovery of vulnerabilities pertaining to different severity…

Abstract

Purpose

The purpose of this paper is to provide a mathematical framework to optimally allocate resources required for the discovery of vulnerabilities pertaining to different severity risk levels.

Design/methodology/approach

Different sets of optimization problems have been formulated and, using a dynamic programming approach, a sequence of recursive functions has been constructed for the optimal allocation of resources used for discovering vulnerabilities of different severity scores. A Mozilla Thunderbird data set has been used for the empirical evaluation, working with vulnerabilities of different severities.
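In the spirit of the recursive formulation described above, the allocation can be sketched as a small dynamic program that distributes discrete units of effort across severity levels (the discovery-gain values below are hypothetical, not the paper's estimated discovery functions):

```python
# Dynamic-programming sketch: split a fixed budget across severity levels so
# that the expected number of discovered vulnerabilities is maximised.

def allocate(gains, budget):
    """gains[i][b] = vulnerabilities expected at severity level i with b units.
    Returns (best total, allocation per level) for the given budget."""
    best = {0: (0.0, [])}                      # budget used -> (total, alloc)
    for level_gain in gains:
        new_best = {}
        for used, (total, alloc) in best.items():
            for b, g in enumerate(level_gain):
                if used + b <= budget:
                    cand = (total + g, alloc + [b])
                    if used + b not in new_best or cand[0] > new_best[used + b][0]:
                        new_best[used + b] = cand
        best = new_best
    return max(best.values())

# Units of effort 0..3 for each of three severity levels (critical, high, medium)
gains = [
    [0.0, 5.0, 8.0, 9.0],   # critical: strong early returns
    [0.0, 3.0, 5.5, 7.0],   # high
    [0.0, 1.0, 2.0, 2.5],   # medium
]
total, alloc = allocate(gains, budget=4)
print(total, alloc)  # → 13.5 [2, 2, 0]
```

Note how the diminishing returns at the critical level push some of the budget toward high-severity discovery, which is the diversification effect the abstract describes.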

Findings

Given the impact associated with a vulnerability, critical and high severity vulnerabilities need to be patched promptly, and hence a larger amount of funds has to be allocated to their discovery. Nevertheless, a low or medium risk vulnerability might also get exploited, so its discovery is crucial as well. The current framework provides a diversified allocation of funds according to the requirements of a software manager and aims at significantly improving vulnerability discovery.

Practical implications

The findings of this research may enable software managers to adequately assign resources for managing the discovery of vulnerabilities. They may also help in estimating the funds required for bug bounty programs to compensate security reporters, based on the potential number of vulnerabilities present in the software.

Originality/value

Much attention has been focused on vulnerability discovery modeling and the risk associated with security flaws. But, to the best of the authors’ knowledge, no study incorporates the optimal allocation of resources with respect to vulnerabilities of different severity scores. Hence, this paper serves as a building block for future research.

Details

International Journal of Quality & Reliability Management, vol. 37 no. 6/7
Type: Research Article
ISSN: 0265-671X


Article
Publication date: 30 November 2020

Bharadwaj R.K. Mantha and Borja García de Soto

The aim of this study is to examine the advantages and disadvantages of different existing scoring systems in the cybersecurity domain and their applicability to the AEC industry…

Abstract

Purpose

The aim of this study is to examine the advantages and disadvantages of different existing scoring systems in the cybersecurity domain and their applicability to the architecture, engineering and construction (AEC) industry, and to systematically apply a scoring system to determine scores for some of the most significant construction participants.

Design/methodology/approach

This study proposes a methodology that uses the Common Vulnerability Scoring System (CVSS) to calculate scores and the likelihood of occurrence based on communication frequencies to ultimately determine risk categories for different paths in a construction network. As a proof of concept, the proposed methodology is implemented in a construction network from a real project found in the literature.
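A minimal sketch of the severity-times-likelihood idea described above (the thresholds and normalisation below are assumptions for illustration, not the paper's calibrated values):

```python
# Combine a CVSS-style severity score with a communication-frequency-based
# likelihood to bucket a path in the construction network into a risk category.

def risk_category(cvss_score, comm_frequency, max_frequency):
    """Map (severity, likelihood) to a coarse risk category."""
    likelihood = comm_frequency / max_frequency      # normalised to 0..1
    risk = (cvss_score / 10.0) * likelihood          # normalised to 0..1
    if risk >= 0.6:
        return "high"
    if risk >= 0.3:
        return "medium"
    return "low"

# Hypothetical path: owner -> contractor, CVSS 8.0, 9 of at most 10 exchanges
print(risk_category(8.0, 9, 10))  # → high
```

Computing this for every communication path in the project network yields the per-path risk categories the methodology produces.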

Findings

Results show that the proposed methodology could provide valuable information to help project participants assess the overall cybersecurity vulnerability of construction projects and assist during the vulnerability-management processes. For example, a project owner can use this information to better understand what to do to limit its vulnerability, leading to the overall improvement of the security of the construction network.

Research limitations/implications

It has to be noted that the scoring systems, the scores and categories adopted in the study need not necessarily be an exact representation of all the construction participants or networks. Therefore, caution should be exercised to avoid generalizing the results of this study.

Practical implications

The proposed methodology can provide valuable information and assist project participants to assess the overall cyber-vulnerability of construction projects and support the vulnerability-management processes. For example, a project owner can use this approach to get a better understanding of what to do to limit its cyber-vulnerability exposure, which will ultimately lead to the overall improvement of the construction network's security. This study will also help raise more awareness about the cybersecurity implications of the digitalization and automation of the AEC industry among practitioners and construction researchers.

Social implications

Given the amount of digitized services and tools used in the AEC industry, cybersecurity is increasingly becoming critical for society in general. In some cases, (e.g. critical infrastructure) incidents could have significant economic and societal or public safety implications. Therefore, proper consideration and action from the AEC research community and industry are needed.

Originality/value

To the authors' knowledge, this is the first attempt to measure and assess the cybersecurity of individual participants and the construction network as a whole by using the Common Vulnerability Scoring System.

Details

Engineering, Construction and Architectural Management, vol. 28 no. 10
Type: Research Article
ISSN: 0969-9988


Article
Publication date: 1 June 2012

Teodor Sommestad, Hannes Holm and Mathias Ekstedt

The purpose of this paper is to identify the importance of the factors that influence the success rate of remote arbitrary code execution attacks. In other words, attacks which…

Abstract

Purpose

The purpose of this paper is to identify the importance of the factors that influence the success rate of remote arbitrary code execution attacks. In other words, attacks which use software vulnerabilities to execute the attacker's own code on targeted machines. Both attacks against servers and attacks against clients are studied.

Design/methodology/approach

The success rates of attacks are assessed for 24 scenarios: 16 scenarios for server‐side attacks and eight for client‐side attacks. The assessment is made through domain experts and is synthesized using Cooke's classical method, an established method for weighting experts' judgments. The variables included in the study were selected based on the literature, a pilot study, and interviews with domain experts.
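The synthesis step can be pictured as a performance-weighted average of expert estimates. In Cooke's classical method the weights are derived from the experts' performance on calibration questions; in the sketch below they are simply illustrative inputs:

```python
# Performance-weighted combination of expert estimates (weights illustrative,
# not computed from calibration questions as the full classical method requires).

def weighted_estimate(estimates, weights):
    """Weighted average of expert estimates for one attack scenario."""
    total_w = sum(weights)
    return sum(e * w for e, w in zip(estimates, weights)) / total_w

# Three experts' estimated success rates (%) for one server-side scenario
combined = weighted_estimate([40.0, 60.0, 55.0], [0.5, 0.3, 0.2])
print(combined)
```

Repeating this per scenario yields the per-scenario success rates reported in the findings.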

Findings

Depending on the scenario in question, the expected success rate varies between 15 and 67 percent for server‐side attacks and between 43 and 67 percent for client‐side attacks. Based on these scenarios, the influence of different protective measures is identified.

Practical implications

The results of this study offer guidance to decision makers on how to best secure their assets against remote code execution attacks. These results also indicate the overall risk posed by this type of attack.

Originality/value

Attacks that use software vulnerabilities to execute code on targeted machines are common and pose a serious risk to most enterprises. However, there are no quantitative data on how difficult such attacks are to execute or on how effective security measures are against them. The paper provides such data using a structured technique to combine expert judgments.
