Search results

1 – 10 of over 9000
Open Access
Article
Publication date: 25 February 2020

Zsolt Tibor Kosztyán, Tibor Csizmadia, Zoltán Kovács and István Mihálcz

Abstract

Purpose

The purpose of this paper is to generalize the traditional risk evaluation methods and to specify a multi-level risk evaluation framework, in order to support customized risk evaluation and to enable the effective integration of the elements of risk evaluation.

Design/methodology/approach

A real case study of an electric motor manufacturing company is presented to illustrate the advantages of this new framework compared to the traditional and fuzzy failure mode and effect analysis (FMEA) approaches.

Findings

The essence of the proposed total risk evaluation framework (TREF) is its flexible approach that enables the effective integration of firms’ individual requirements by developing tailor-made organizational risk evaluation.

Originality/value

Increasing product/service complexity has led to increasingly complex yet unique organizational operations; as a result, their risk evaluation is a very challenging task. Distinct structures, characteristics and processes within and between organizations require a flexible yet robust approach to evaluating risks efficiently. Most recent risk evaluation approaches are considered inadequate due to their lack of flexibility and an inappropriate structure for addressing unique organizational demands and contextual factors. To address this challenge effectively, this paper takes a crucial step toward the customization of risk evaluation.

Details

International Journal of Quality & Reliability Management, vol. 37 no. 4
Type: Research Article
ISSN: 0265-671X

Article
Publication date: 5 May 2015

Ron Weber, Wilm Fecke, Imke Moeller and Oliver Musshoff

Abstract

Purpose

Using cotton yield and rainfall data from Tajikistan, the purpose of this paper is to investigate the magnitude of weather-induced revenue losses in cotton production. In doing so, the authors look at different risk aggregation levels across political regions (meso-level). The authors then design weather index insurance products able to compensate for the identified revenue losses and analyze their risk reduction potential.

Design/methodology/approach

The authors design different weather insurance products based on put-options on a cumulated precipitation index. The insurance products are modeled for different inter-regional and intra-regional risk aggregation and risk coverage scenarios. In doing so, the authors deal with a problem common in developing countries: yield data are often available only at an aggregate level, and weather data are accessible for only a small number of weather stations.
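The put-option payoff at the heart of this design can be sketched in a few lines; the strike, tick value and cap below are illustrative assumptions, not parameters from the paper:

```python
def index_put_payout(cum_precip_mm, strike_mm, tick_value, max_payout=None):
    """Payout of a put-option on a cumulated precipitation index:
    the insured is compensated when rainfall falls short of the strike."""
    shortfall = max(strike_mm - cum_precip_mm, 0.0)
    payout = shortfall * tick_value
    if max_payout is not None:
        payout = min(payout, max_payout)  # cap the maximum indemnity
    return payout

# Illustrative numbers only: 300 mm strike, 50 currency units per missing mm.
print(index_put_payout(220.0, 300.0, 50.0))  # dry season -> positive payout
print(index_put_payout(340.0, 300.0, 50.0))  # wet season -> no payout
```

Because the payout depends only on measured rainfall, not on the individual farm's yield, such a contract avoids moral hazard but carries basis risk, which is why the aggregation level matters.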

Findings

The authors find that it is feasible to design index-based weather insurance products on the meso-level with a considerable risk reduction potential against weather-induced revenue losses in cotton production. Furthermore, the authors find that risk reduction potential increases on the national level the more subregions are considered for the insurance product design. Moreover, risk reduction potential increases if the index insurance product applied is designed to compensate extreme weather events.

Practical implications

The findings suggest that index-based weather insurance products bear a large risk mitigation potential on an aggregate level. Hence, meso-level insurance should be recognized by institutions with a regional exposure to cost-related weather risks as part of their risk-management strategy.

Originality/value

The authors are the first to investigate the potential of weather index insurance for different risk aggregation levels in developing countries.

Details

Agricultural Finance Review, vol. 75 no. 1
Type: Research Article
ISSN: 0002-1466

Article
Publication date: 2 March 2010

Stan Uryasev, Ursula A. Theiler and Gaia Serraino

Abstract

Purpose

New methods of integrated risk modeling play an important role in determining the efficiency of bank portfolio management. The purpose of this paper is to suggest a systematic approach to risk strategy formulation based on risk‐return optimized portfolios, which applies different methodologies of risk measurement in the context of actual regulatory requirements.

Design/methodology/approach

Optimization problems are set up to illustrate different levels of integrated bank portfolio management, constraining economic capital allocation using different risk aggregation methodologies. Novel methods of financial engineering are applied to relate actual bank capital regulations to the recently developed risk measures value‐at‐risk (VaR) and conditional value‐at‐risk (CVaR) deviation. The optimization problems are run with the Portfolio Safeguard package by American Optimal Decisions (web site: www.AOrDA.com).
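The two risk measures underlying this setup can be illustrated on a historical loss sample; this is only a sketch of how VaR and CVaR are read off scenario data with invented numbers, not the Portfolio Safeguard optimization the authors run:

```python
import numpy as np

def var_cvar(losses, alpha=0.9):
    """Historical-scenario value-at-risk and conditional value-at-risk:
    VaR is the alpha-quantile of the loss sample; CVaR averages the
    losses at or beyond VaR (so CVaR >= VaR always holds)."""
    losses = np.asarray(losses, dtype=float)
    var = np.quantile(losses, alpha)     # linear-interpolated quantile
    cvar = losses[losses >= var].mean()  # mean loss in the tail
    return var, cvar

# Ten illustrative scenario losses (invented figures, e.g. in mln EUR):
sample = [-2.0, -1.0, 0.5, 1.0, 1.5, 2.0, 3.0, 4.0, 6.0, 10.0]
v, c = var_cvar(sample, alpha=0.9)
```

CVaR averages over the tail beyond VaR, which is why the paper stresses that the choice between the two should follow from an analysis of the tail model of the dataset rather than from a blanket preference.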

Findings

This paper finds evidence that risk aggregation in the Internal Capital Adequacy Assessment Process (ICAAP) should be based on risk‐adjusted aggregation approaches, resulting in an efficient use of economic capital. By using different values of the confidence level α in VaR and CVaR deviation, it is possible to obtain optimal portfolios with similar properties. Before deciding to impose constraints on VaR or CVaR, one should analyze the properties of the dataset on which the computations are based, with particular focus on the model for the tails of the distribution, as neither measure is “better” than the other.

Research limitations/implications

This study should further be extended by an inclusion of simulation‐based scenarios and copula approaches for integrated risk measurements.

Originality/value

The suggested optimization models support a systematic generation of risk‐return efficient target portfolios under the ICAAP. However, issues of practical implementation in risk aggregation and capital allocation still remain unsolved and require heuristic implementations.

Details

The Journal of Risk Finance, vol. 11 no. 2
Type: Research Article
ISSN: 1526-5943

Article
Publication date: 2 November 2015

Lukas Prorokowski and Hubert Prorokowski

Abstract

Purpose

BCBS 239 sets out a challenging standard for risk data processing and reporting. Any bank striving to comply with the principles will be keen to introspect how risk data is organized and what execution capabilities are at their disposal. With this in mind, the current paper advises banks on the growing number of solutions, tools and techniques that can be used to support risk data management frameworks under BCBS 239.

Design/methodology/approach

This paper, based on a survey with 29 major financial institutions, including G-SIBs and D-SIBs from diversified geographical regions such as North America, Europe and APAC, aims to advise banks and other financial services firms on what is needed to become ready and compliant with BCBS 239. This paper discusses best practice solutions for master data management, data lineage and end user implementations.

Findings

The primary conclusion of this paper is that banks should not treat BCBS 239 as yet another compliance exercise. The BCBS 239 principles constitute a driving force to restore viability and improve risk governance. In light of the new standards, banks can benefit from making significant progress towards risk data management transformation. This report argues that banks need to invest in a solution that empowers those who use the data to manage risk data. Operational complexities are thus lifted, and no data operations team is needed for proprietary coding of the data. Only then will banks stay abreast of the competition, while becoming fully compliant with the BCBS 239 principles.

Practical implications

As noted by Prorokowski (2014), “Increasingly zero accountability, imposed, leveraged omnipresent vast endeavors, yielding ongoing understanding […] of the impact of the global financial crisis on the ways data should be treated” sparked off international debates addressing the need for an effective solution to risk data management and reporting.

Originality/value

This paper discusses a forthcoming regulatory change that will have a significant impact on the banking industry. The Basel Committee on Banking Supervision published its Principles for effective risk data aggregation and risk reporting (BCBS 239) in January 2013. The document contains 11 principles that Global Systemically Important Banks (G-SIBs) will need to comply with by January 2016. The BCBS 239 principles are regarded as among the least known components of the new regulatory reforms. As it transpires, the principles require many banks to undertake a significant amount of technical work and investment in IT infrastructure. Furthermore, BCBS 239 urges financial services firms to review their definitions of the completeness of risk data.

Details

Journal of Investment Compliance, vol. 16 no. 4
Type: Research Article
ISSN: 1528-5812

Article
Publication date: 11 May 2015

Jong Ho Hwang

Abstract

Purpose

This paper aims to present a recent history of developments and innovations that, along with advances in information technology, have caused fundamental changes in the way that financial risk is created, transformed, transported and extinguished in modern financial intermediation systems. A review and critique of the global supervisory response to these developments is presented.

Design/methodology/approach

A bottom-up approach to the capture, recording, disaggregation, re-composition and measurement of new, standardized, basic elements of risk that the authors refer to as risk quanta is proposed.

Findings

This approach provides a clearer understanding of the financial world that people live in today and creates a robust information platform on which to build innovations, advancements and economic growth in the future.

Practical implications

This approach provides decision-makers with a clearer understanding of the financial world that people live in today and creates a robust information platform on which to build innovations, advancements and economic growth in the future.

Social implications

This approach provides financial market participants and the public with a clearer understanding of the financial system and creates a robust information platform on which to build innovations, advancements and economic growth in the future.

Originality/value

This approach is more comprehensive than current international proposals for a global financial risk framework.

Details

Journal of Financial Regulation and Compliance, vol. 23 no. 2
Type: Research Article
ISSN: 1358-1988

Article
Publication date: 7 November 2019

Werner Gleißner

Abstract

Purpose

This paper aims to present the combination of enterprise risk management (ERM) and value-based management as especially suitable methods for companies with a shareholder value imperative. Among their major benefits, these methods make the contribution of risk management to business decisions more effective.

Design/methodology/approach

Any possible inconsistencies between ERM, which generates value precisely because capital markets are imperfect, and the capital asset pricing model (CAPM) used to calculate the cost of capital, which assumes perfect markets, must be avoided. Therefore, it is imperative that the valuation methods used are based on risk analysis and thus do not require perfect capital markets.

Findings

Value-based risk management requires calculating the impact of changes in risk on enterprise value, aggregating the opportunities and risks related to planning into a measure of total risk (using Monte Carlo simulation), and applying valuation techniques that reflect the effects of changes in risk on the probability of default, the cost of capital and enterprise value (and that do not assume perfect capital markets). It is recommended that all relevant risks be quantified and described using adequate probability distributions derived from the best available information.
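The Monte Carlo aggregation step described here can be sketched as follows; the three planning risks and their distributions are invented for illustration and are not taken from the paper:

```python
import numpy as np

rng = np.random.default_rng(42)
n = 100_000  # number of simulated planning scenarios

# Three illustrative planning risks (assumed distributions, e.g. in mln EUR):
demand = rng.normal(0.0, 2.0, n)                   # symmetric demand deviation
cost = rng.lognormal(mean=0.0, sigma=0.5, size=n)  # skewed input-cost overrun
legal = rng.binomial(1, 0.05, n) * 5.0             # 5% chance of a 5 mln loss

# Aggregation happens simply by adding the simulated effects per scenario;
# the total-risk distribution then yields risk-adjusted planning figures.
total = demand + cost + legal

expected_loss = total.mean()
var_95 = np.quantile(total, 0.95)  # capital covering 95% of simulated outcomes
```

The point of simulating rather than adding standalone risk figures is that the quantile of the aggregate distribution reflects diversification and skewness, which is exactly what the valuation step then translates into cost of capital and enterprise value.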

Practical implications

This approach can help to improve the use of risk analysis in decision-making by improving existing risk-management systems.

Originality/value

This extension of ERM is outlined to provide risk-adequate evaluation methods for business decisions, using Monte Carlo simulation and recently developed methods for risk–fair valuation with incomplete replication in combination with the probability of default. It is shown that quantification of all risk using available information should be accepted for the linking of risk analysis and business decisions.

Article
Publication date: 15 September 2021

Wuyi Ye, Yiqi Wang and Jinhai Zhao

Abstract

Purpose

The purpose of this paper is to compare the changes in the risk spillover effects between the copper spot and futures markets before and after the issuance of copper options, and to analyze the risk spillover effects among the three markets after the issuance of the options, thereby providing effective suggestions for regulators and for investors who hedge risks.

Design/methodology/approach

The MV-CAViaR model extends the vector autoregressive (VAR) model to quantiles and is a special case of the MVMQ-CAViaR model. It builds on the VAR quantile framework through successive extensions of the Conditional Autoregressive Value-at-Risk (CAViaR) model and the Multi-Quantile Conditional Autoregressive Value-at-Risk (MQ-CAViaR) model, which led to the current form of the model.
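The underlying CAViaR idea can be sketched for a single market using the standard symmetric-absolute-value specification; the coefficients below are illustrative, not estimated as in the paper, and the multivariate MV-CAViaR version additionally feeds each market's lagged quantile into the other markets' recursions (which is how spillovers are captured):

```python
def caviar_sav(returns, beta0, beta1, beta2, q0):
    """Symmetric-absolute-value CAViaR recursion for a VaR quantile path:
    q_t = beta0 + beta1 * q_{t-1} + beta2 * |r_{t-1}|.
    The quantile is autoregressive in itself and reacts to the magnitude
    of the previous return."""
    q = [q0]
    for r in returns[:-1]:
        q.append(beta0 + beta1 * q[-1] + beta2 * abs(r))
    return q

# Illustrative daily returns (%); negative beta2 pushes the lower-tail
# quantile further down when volatility rises.
returns = [0.5, -1.2, 0.3, 2.0, -0.7]
quantiles = caviar_sav(returns, beta0=-0.05, beta1=0.9, beta2=-0.3, q0=-1.0)
```

In estimation the betas are fitted by minimizing the quantile (pinball) loss rather than least squares, since the target is a conditional quantile, not a conditional mean.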

Findings

The issuance of options has led to certain changes in the risk spillover effects between the copper spot market and its derivative markets, and the risk aggregation effect in the futures market has remained significant throughout. Therefore, regulators supervising the copper product market, as well as investors using copper derivatives to hedge market risks, need to pay attention to the impact of futures on the spot market, the impact of options on the futures market and the risk spillover effects of the spot and futures markets on the options market.

Practical implications

The empirical results of this paper can inform investment strategies for hedging market risk, and the changes in market relationships also provide an effective basis for the supervisory authority's oversight of the copper product market.

Originality/value

This is the first study to discuss the risk spillover effects of copper options on the Chinese copper market and its derivative markets. The MV-CAViaR model can capture the mutual risk influence between markets by modeling multiple markets simultaneously.

Details

Journal of Modelling in Management, vol. 17 no. 4
Type: Research Article
ISSN: 1746-5664

Article
Publication date: 12 March 2019

Håkan Jankensgård

Abstract

Purpose

The purpose of this paper is to develop a theory of enterprise risk management (ERM).

Design/methodology/approach

The method is to develop a theory for ERM based on identifying the general risk management problems that it is supposed to solve and to apply the principle of deduction based on these premises.

Findings

ERM consists of risk governance, a set of mechanisms that deals with the agency problem of risk management, and risk aggregation, a set of mechanisms that deals with the information problem of risk management.

Research limitations/implications

The theory, by identifying the central role of the Board of Directors, encourages further research into the capabilities and incentives of directors as determinants of ERM adoption. It also encourages research into how ERM adoption depends on proxies for agency problems of risk management, such as a decentralized company structure.

Practical implications

The theory encourages Boards of Directors to focus on understanding where the under- and over-management of risk are likely to be greatest, as opposed to the current practice of mapping a large number of risk factors.

Originality/value

The theory complements existing theory on corporate risk management, which revolves around the role of external frictions, by focusing on internal frictions in the firm that prevent effective risk management. It is the first work to delineate ERM vis-a-vis existing risk theory.

Details

Corporate Governance: The International Journal of Business in Society, vol. 19 no. 3
Type: Research Article
ISSN: 1472-0701

Content available
Article
Publication date: 14 April 2022

Son Nguyen, Peggy Shu-Ling Chen and Yuquan Du

Abstract

Purpose

Container shipping is a crucial component of the global supply chain that is affected by a large range of operational risks with high uncertainty, threatening the stability of service, manufacture, distribution and profitability of involved parties. However, quantitative risk analysis (QRA) of container shipping operational risk (CSOR) is being obstructed by the lack of a well-established theoretical structure to guide deeper research efforts. This paper proposes a methodological framework to strengthen the quality and reliability of CSOR analysis (CSORA).

Design/methodology/approach

Focusing on addressing uncertainties, the framework establishes a solid, overarching and updated basis for quantitative CSORA. The framework consists of clearly defined elements and processes: establishing knowledge, gathering information, aggregating multiple sources of data (social/deliberative and mathematical/statistical), calculating risk and uncertainty levels, and presenting and interpreting quantified results. The framework is applied in a case study of three container shipping companies in Vietnam.

Findings

Various methodological contributions were made regarding CSOR characteristics, settings of analysis models, handling of uncertainties and result interpretation. The empirical study also generated valuable managerial implications regarding CSOR management policies.

Originality/value

This paper fills the gap of an updated framework for CSORA considering the recent advancements of container shipping operations and risk management. The framework can be used by both practitioners as a tool for CSORA and scholars as a test bench to facilitate the comparison and development of QRA models.

Details

Maritime Business Review, vol. 8 no. 2
Type: Research Article
ISSN: 2397-3757

Article
Publication date: 31 May 2021

Sebastian Schlütter

Abstract

Purpose

This paper aims to propose a scenario-based approach for measuring interest rate risks. Many regulatory capital standards in banking and insurance make use of similar approaches. The authors provide a theoretical justification and extensive backtesting of their approach.

Design/methodology/approach

The authors theoretically derive a scenario-based value-at-risk for interest rate risks based on a principal component analysis. The authors calibrate their approach based on the Nelson–Siegel model, which is modified to account for lower bounds for interest rates. The authors backtest the model outcomes against historical yield curve changes for a large number of generated asset–liability portfolios. In addition, the authors backtest the scenario-based value-at-risk against the stochastic model.
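The Nelson–Siegel model referenced above can be sketched as follows; this is the standard three-factor form (level, slope, curvature), without the authors' lower-bound modification:

```python
import math

def nelson_siegel(tau, beta0, beta1, beta2, lam):
    """Nelson-Siegel yield at maturity tau:
    beta0 is the long-run level, beta1 the slope (short-end deviation),
    beta2 the curvature, and lam the decay scale of the loadings."""
    x = tau / lam
    loading = (1.0 - math.exp(-x)) / x           # slope factor loading
    return beta0 + beta1 * loading + beta2 * (loading - math.exp(-x))

# Illustrative parameters: 3% long rate, upward-sloping curve.
short_end = nelson_siegel(0.001, 0.03, -0.02, 0.01, 2.0)   # -> about beta0+beta1
long_end = nelson_siegel(1000.0, 0.03, -0.02, 0.01, 2.0)   # -> about beta0
```

Fitting such a parametric curve to observed yields, and then shocking its factors along principal components, is one common way to generate the interest rate scenarios the paper's value-at-risk is built on.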

Findings

The backtesting results of the adjusted Nelson–Siegel model (accounting for a lower bound) are similar to those of the traditional Nelson–Siegel model. The suitability of the scenario-based value-at-risk can be substantially improved by allowing for correlation parameters in the aggregation of the scenario outcomes. Implementing those parameters is straightforward with the replacement of Pearson correlations by value-at-risk-implied tail correlations in situations where risk factors are not elliptically distributed.
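The correlation-weighted aggregation of scenario outcomes can be sketched with the familiar square-root formula; whether Pearson correlations or the VaR-implied tail correlations favored by the authors are supplied is up to the caller, and the numbers below are purely illustrative:

```python
import math

def aggregate_scenarios(losses, corr):
    """Aggregate per-scenario (per-risk-factor) losses into a total via
    total = sqrt(sum_i sum_j rho_ij * L_i * L_j),
    the standard variance-covariance square-root aggregation."""
    n = len(losses)
    total_sq = 0.0
    for i in range(n):
        for j in range(n):
            total_sq += corr[i][j] * losses[i] * losses[j]
    return math.sqrt(total_sq)

losses = [3.0, 4.0]
print(aggregate_scenarios(losses, [[1.0, 0.0], [0.0, 1.0]]))  # uncorrelated: 5.0
print(aggregate_scenarios(losses, [[1.0, 1.0], [1.0, 1.0]]))  # fully correlated: 7.0
```

Replacing the Pearson entries of `corr` with tail correlations implied by the stochastic model's value-at-risk is the adjustment the paper proposes for risk factors that are not elliptically distributed.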

Research limitations/implications

The paper assumes deterministic cash flow patterns. The authors discuss the applicability of their approach, e.g. for insurance companies.

Practical implications

The authors’ approach can be used to communicate interest rate risks more clearly using scenarios. Discussing risk measurement results with decision makers can also help to backtest stochastic term-structure models.

Originality/value

The authors’ adjustment of the Nelson–Siegel model to account for lower bounds makes the model more useful in the current low-yield environment when unjustifiably high negative interest rates need to be avoided. The proposed scenario-based value-at-risk allows for a pragmatic measurement of interest rate risks, which nevertheless closely approximates the value-at-risk according to the stochastic model.

Details

The Journal of Risk Finance, vol. 22 no. 1
Type: Research Article
ISSN: 1526-5943
