Search results

1–10 of over 28,000
Article
Publication date: 16 October 2023

Miguel Calvo and Marta Beltrán

Abstract

Purpose

This paper aims to propose a new method to derive custom dynamic cyber risk metrics based on the well-known Goal, Question, Metric (GQM) approach. A framework that complements the method and makes it much easier to use is proposed as well. Both the method and the framework have been validated within two challenging application domains: continuous risk assessment within a smart farm and risk-based adaptive security to reconfigure a web application firewall.

Design/methodology/approach

The authors have identified a problem and provided motivation. They have developed their theory and engineered a new method and a framework to complement it. They have demonstrated that the proposed method and framework work by validating them in two real use cases.

Findings

The GQM method, often applied within the software quality field, is a good basis for a method to define new tailored cyber risk metrics that meet the requirements of current application domains. A comprehensive framework that formalises possible goals and questions and translates them into potential measurements can greatly facilitate the use of this method.

Originality/value

The proposed method enables the application of the GQM approach to cyber risk measurement. The proposed framework allows new cyber risk metrics to be inferred by choosing between suggested goals and questions and measuring the relevant elements of probability and impact. The authors’ approach proves to be generic and flexible enough to allow very different organisations with heterogeneous requirements to derive tailored metrics useful for their particular risk management processes.
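As an illustration of how a GQM-style derivation of a risk metric can be encoded, consider the sketch below. The goal, question and metric shown are hypothetical examples chosen for illustration; they are not taken from the paper or its framework.

```python
from dataclasses import dataclass, field

@dataclass
class Metric:
    name: str
    compute: callable  # maps raw measurements to a metric value

@dataclass
class Question:
    text: str
    metrics: list = field(default_factory=list)

@dataclass
class Goal:
    purpose: str
    questions: list = field(default_factory=list)

# Hypothetical example: a dynamic cyber risk metric for a web application firewall
goal = Goal(purpose="Assess exposure of the WAF to injection attacks")
question = Question(text="How often are injection payloads blocked per hour?")
question.metrics.append(Metric(
    name="blocked_injections_per_hour",
    compute=lambda events, hours: len(events) / hours,
))
goal.questions.append(question)

metric = goal.questions[0].metrics[0]
print(metric.compute(events=[1] * 120, hours=24))  # → 5.0
```

The point of the structure is traceability: every metric value can be traced back to the question it answers and the goal that motivated it.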

Details

Information & Computer Security, vol. 32 no. 2
Type: Research Article
ISSN: 2056-4961

Article
Publication date: 18 May 2020

Eleni-Laskarina Makri, Zafeiroula Georgiopoulou and Costas Lambrinoudakis

Abstract

Purpose

This study aims to assist organizations in protecting the privacy of their users and securing the data that they store and process. Users may be the customers of the organization (people using the offered services) or the employees (users who operate the systems of the organization). To be more specific, this paper proposes a privacy impact assessment (PIA) method that explicitly takes into account the organizational characteristics and employs a list of well-defined metrics as input, demonstrating its applicability to two hospital information systems with different characteristics.

Design/methodology/approach

This paper presents a PIA method that employs metrics and takes into account the peculiarities and other characteristics of the organization. The applicability of the method has been demonstrated on two hospital information systems with different characteristics. The aim is to assist organizations in estimating the criticality of potential privacy breaches and, thus, in selecting the appropriate security measures for the protection of the data that they collect, process and store.

Findings

The results of the proposed PIA method highlight the criticality of each privacy principle for every data set maintained by the organization. The method employed for the calculation of the criticality level takes into account the consequences that the organization may experience in case of a security or privacy violation incident on a specific data set, the weighting of each privacy principle and the unique characteristics of each organization. The results of the proposed PIA method therefore offer a strong indication of the security measures and privacy enforcement mechanisms that the organization should adopt to effectively protect its data.
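A minimal sketch of a weighted criticality calculation in the spirit described above. The consequence-times-weight scoring, the scales and the data set and principle names are illustrative assumptions, not the authors' exact method.

```python
# Illustrative criticality scoring (assumed formula, not the paper's):
# criticality of a privacy principle for a data set, modelled as the
# consequence level scaled by the organisation-specific principle weight.

def criticality(consequence: float, weight: float) -> float:
    """consequence in [0, 10], weight in [0, 1]."""
    return consequence * weight

# Hypothetical data set and principle weightings for one organisation
data_set = "patient_records"
principles = {"data_minimisation": (8.0, 0.75), "transparency": (5.0, 0.5)}
scores = {p: criticality(c, w) for p, (c, w) in principles.items()}
print(scores)  # → {'data_minimisation': 6.0, 'transparency': 2.5}
```

Ranking the resulting scores per data set indicates where security measures and privacy enforcement mechanisms are most urgently needed.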

Originality/value

The novelty of the method is that it handles security and privacy requirements simultaneously, as it uses the results of risk analysis together with those of a PIA. A further novelty of the method is that it introduces metrics for the quantification of the requirements and also that it takes into account the specific characteristics of the organization.

Details

Information & Computer Security, vol. 28 no. 4
Type: Research Article
ISSN: 2056-4961

Article
Publication date: 11 March 2021

Abroon Qazi and Mecit Can Emre Simsekler

Abstract

Purpose

The purpose of this paper is to develop and operationalize a process for prioritizing supply chain risks that is capable of capturing the value at risk (VaR), that is, the maximum loss expected at a given confidence level over a specified timeframe, associated with risks within a network setting.

Design/methodology/approach

The proposed “Worst Expected Best” method is theoretically grounded in the framework of Bayesian Belief Networks (BBNs), which is considered an effective technique for modeling interdependency across uncertain variables. An algorithm is developed to operationalize the proposed method, which is demonstrated using a simulation model.

Findings

Point estimate-based methods used for aggregating the network expected loss for a given supply chain risk network are unable to project the realistic risk exposure associated with a supply chain. The proposed method helps in establishing the expected network-wide loss for a given confidence level. The vulnerability and resilience-based risk prioritization schemes for the model considered in this paper have a very weak correlation.
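The notion of VaR at a confidence level can be sketched generically via an empirical quantile of simulated losses. The following is a conceptual illustration only; it is not the authors' "Worst Expected Best" method or their Bayesian Belief Network model, and the loss distribution is hypothetical.

```python
import random

# Generic empirical VaR: the loss at a given confidence quantile of a
# simulated loss distribution.

def value_at_risk(losses, confidence=0.95):
    """Return the smallest loss not exceeded with probability `confidence`."""
    ordered = sorted(losses)
    index = int(confidence * len(ordered)) - 1
    return ordered[max(index, 0)]

random.seed(7)
# Hypothetical simulated network-wide losses for a supply chain
simulated = [random.expovariate(1 / 100) for _ in range(10_000)]
var_95 = value_at_risk(simulated, confidence=0.95)
print(f"95% VaR: {var_95:.1f}")
```

Unlike a point estimate of expected loss, the quantile reflects the tail of the loss distribution, which is the paper's motivation for moving beyond expected-loss aggregation.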

Originality/value

This paper introduces a new “Worst Expected Best” method to the literature on supply chain risk management that helps in assessing the probabilistic network expected VaR for a given supply chain risk network. Further, new risk metrics are proposed to prioritize risks relative to a specific VaR that reflects the decision-maker's risk appetite.

Details

International Journal of Quality & Reliability Management, vol. 39 no. 1
Type: Research Article
ISSN: 0265-671X

Article
Publication date: 29 April 2014

Anthony J. Perrenoud, Brian C. Lines and Kenneth T. Sullivan

Abstract

Purpose

The purpose of this study is to describe how the University of Minnesota's capital program implemented risk management metrics on 266 construction projects and to present the results of the risk metrics.

Design/methodology/approach

The implementation of Weekly Risk Reports (WRR) on the university construction projects captured information on the internal and external efforts related to minimizing project risks. The implemented report captured project risks, management plans, cost changes and schedule delays.

Findings

Findings reveal that the university was able to effectively capture project risk metrics through the WRR. The risk metrics identified the risk categories that impacted the costs and schedules of the 266 projects. Through these findings, the university has a better understanding of how its internal stakeholders create the greatest risk to project cost and schedule. This paper presents the risk impacts collected from the 266 projects.

Research limitations/implications

A complete analysis of the risk metrics was limited in this research due to the extensive measurements collected. Future analysis will provide additional findings from the risk information.

Originality/value

The paper presents both the implementation and the risk management measurements used within a capital program of a major university to provide understanding of the common risks that are involved with capital projects.

Details

Journal of Facilities Management, vol. 12 no. 2
Type: Research Article
ISSN: 1472-5967

Article
Publication date: 6 August 2018

Shihong Li

Abstract

Purpose

This paper aims to investigate whether Section 404 of the Sarbanes–Oxley Act (SOX 404) changed the way banks use accounting information to price corporate loans.

Design/methodology/approach

The study uses a sample of 1,173 US-listed firms that issued syndicated loans both before and after their compliance with SOX 404 to analyze the changes in loan spread’s sensitivity to some key accounting metrics such as ROA, interest coverage, leverage and net worth.

Findings

The study finds that the interest spread’s sensitivity to key accounting metrics, most noticeably ROA, declined following the borrower’s compliance with the requirements of SOX 404. The decline was not explained by borrowers that disclosed internal control weaknesses but was concentrated among borrowers suspected of real earnings management (REM).

Originality/value

By examining the effects of SOX 404 on banks’ pricing process, this study augments the literature on SOX’s economic consequences. The findings suggest that lenders perceive little new information from SOX 404 disclosures of internal control deficiencies and are cautious about the accounting information provided by REM borrowers. It also extends the research on the use of accounting information in debt contracting. By examining loan interest’s sensitivity to accounting metrics, it broadens the concept of debt contracting value of accounting information to include accounting’s usefulness for assessing credit risk at loan inception.

Details

International Journal of Accounting & Information Management, vol. 26 no. 3
Type: Research Article
ISSN: 1834-7649

Article
Publication date: 8 January 2019

Vikas Goyal and Prashant Mishra

Abstract

Purpose

The purpose of this paper is to develop a nuanced framework for evaluating a channel partner’s performance in distribution channel relationships. Given a channel partner’s task environment characteristics (high/low munificence, dynamism and complexity), the study examines which performance metrics (output, activity or capability) are most relevant for evaluating its performance levels effectively.

Design/methodology/approach

The study adopts a self-administered, cross-sectional, survey-based research design. Matched data were collected from 252 channel partner–manager relationship dyads. The latent change score (LCS) model within the SEM framework provides the mean paired differences of the relevance ratings for each metric. These were used to assess the empirical validity of the hypothesized relationships.

Findings

The study demonstrates the importance of calibrating performance evaluation metrics to a channel partner’s task environment state, made possible by its holistic approach to performance evaluation. Based on an extensive analysis, it shows that no single metric is relevant within all environmental states; rather, relying on a single metric can even be dysfunctional, a result that differs from the vast majority of the literature.

Research limitations/implications

The study investigates individual linkages between task environment dimensions and performance metrics to provide a fuller understanding of these relationships, and it provides a theoretical framework to support further research on the topic.

Practical implications

The study provides managerial guidelines (and extensive graphical analysis) for nuanced and dynamic evaluation of channel partners’ performance that can enable firms to identify and promote their most valuable channel partners and prevent the deterioration of others.

Originality/value

This is the first study to develop and empirically validate a nuanced framework for evaluating the performance of exchange partners that operate under diverse task environment states.

Details

Journal of Business & Industrial Marketing, vol. 34 no. 2
Type: Research Article
ISSN: 0885-8624

Article
Publication date: 29 December 2017

Syrus Islam, Ralph Adler and Deryl Northcott

Abstract

Purpose

Performance measurement systems (PMSs) are at the heart of most organisations. The aim of this study is to examine the attitudes of top-level managers towards the incompleteness of PMSs.

Design/methodology/approach

This paper draws on an in-depth field study conducted in an energy and environmental services provider based in New Zealand. The data, which were obtained from 20 semi-structured interviews, were triangulated against on-site observations and company documents.

Findings

The findings suggest that whether the incompleteness of a PMS is considered problematic or non-problematic depends on the role that the PMS plays in implementing a firm’s strategy. The authors show that when the PMS is mainly used to trigger improvement activities on and around strategic objectives and managers perceive adequate improvement activities to exist, then they consider the incompleteness of the PMS in relation to these strategic objectives to be non-problematic.

Originality/value

This study contributes to the nascent literature on managerial attitudes towards the incompleteness of PMSs by identifying conditions under which the incompleteness is considered problematic or non-problematic. The authors also contribute to the literature on the association between design qualities of PMSs and firm performance by suggesting that poor design qualities of a PMS (such as incompleteness) may not always translate into poor firm performance.

Details

Qualitative Research in Accounting & Management, vol. 15 no. 1
Type: Research Article
ISSN: 1176-6093

Article
Publication date: 11 March 2021

Camelia Delcea, Liviu-Adrian Cotfas, R. John Milne, Naiming Xie and Rafał Mierzwiak

Abstract

Purpose

The airline industry has been hit hard by the new coronavirus SARS-CoV-2, facing one of the worst crises in its history. In this context, the present paper analyses one of the well-known boarding methods used in practice by airlines before and during the coronavirus outbreak, namely back-to-front, and suggests which variations of this method to use when three passenger boarding groups are considered and a jet bridge connects the airport terminal to the airplane.

Design/methodology/approach

Based on the importance accorded by airlines to operational performance, health risks and passengers' comfort, the variations of back-to-front boarding with three passenger groups are divided into three clusters using the grey clustering approach offered by grey systems theory.

Findings

Having clustered the variations based on the selected metrics and considering the social distance among passengers, airlines can better understand how the variations of back-to-front boarding perform under the new conditions imposed by the novel coronavirus and choose the boarding approach that best fits their policy and goals.

Originality/value

The paper combines the advantages of grey clustering and agent-based modelling to determine which configurations offer a reduced boarding time while accounting both for a reduced passenger health risk, measured through three indicators (aisle risk, seat risk and type-3 seat interferences), and for increased passenger comfort, manifested through a continuous walking flow while boarding.

Details

Grey Systems: Theory and Application, vol. 12 no. 1
Type: Research Article
ISSN: 2043-9377

Article
Publication date: 3 October 2016

Jeremy King and Gary Wayne van Vuuren

Abstract

Purpose

This paper aims to investigate the use of the bias ratio as a possible early indicator of financial fraud – specifically in the reporting of hedge fund returns. In the wake of the 2008-2009 financial crisis, numerous hedge funds were liquidated and several cases of financial fraud exposed.

Design/methodology/approach

Risk-adjusted return metrics such as the Sharpe ratio and value at risk (VaR) were used to raise suspicion of fraud. These metrics, however, assume distributional normality and have thus had limited success with hedge fund returns, which are characterised by highly skewed, non-normal return distributions.

Findings

Results indicate that potential fraud would have been detected in the early stages of the scheme’s life. Having demonstrated the credibility of the bias ratio, it was then applied to several indices and (anonymous) South African hedge funds. The results were used to demonstrate the ratio’s scope and robustness and draw attention to other metrics which could be used in conjunction with it. Results from these multiple sources could be used to justify further investigation.

Research limitations/implications

The traditional metrics for performance evaluation (such as the Sharpe ratio) assume distributional normality and have thus had limited success with hedge fund returns, which exhibit highly skewed, non-normal distributions. The bias ratio, which does not rely on normally distributed returns, was applied to a known fraud case (Madoff’s Ponzi scheme).
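The bias ratio can be sketched as a count of returns just above zero against returns just below it. The one-sigma window below follows the ratio's commonly published formulation rather than any detail from the paper, and the return series is hypothetical.

```python
import statistics

# Bias ratio sketch (assumed one-sigma formulation): the count of returns in
# [0, +1 sigma] divided by one plus the count in [-1 sigma, 0). A ratio far
# higher than the strategy's return distribution justifies can flag smoothed
# or fabricated reporting.

def bias_ratio(returns):
    sigma = statistics.pstdev(returns)
    above = sum(1 for r in returns if 0 <= r <= sigma)
    below = sum(1 for r in returns if -sigma <= r < 0)
    return above / (1 + below)

# Hypothetical, suspiciously smooth return series: almost never negative
smooth = [0.002, 0.001, 0.003, 0.002, 0.001, 0.004,
          -0.0005, 0.002, 0.003, 0.001]
print(bias_ratio(smooth))  # → 1.5
```

Because only counts near zero enter the calculation, the ratio makes no normality assumption, which is why it can complement Sharpe-ratio- or VaR-based screening.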

Practical implications

The bias ratio has been shown to be effective in flagging potentially suspicious financial activity.

Originality/value

The financial market has come under heightened scrutiny in the past decade (2005–2015) as a result of the fragile and uncertain economic milieu that still (2015) persists. Numerous risk and return measures have been used to evaluate hedge funds’ risk-adjusted performance, but many fail to account for the non-normal return distributions exhibited by hedge funds. The bias ratio, however, has been demonstrated to effectively flag potentially fraudulent funds.

Details

Journal of Financial Crime, vol. 23 no. 4
Type: Research Article
ISSN: 1359-0790

Details

Tools and Techniques for Financial Stability Analysis
Type: Book
ISBN: 978-1-78756-846-4
