Search results

1 – 10 of over 1000
Book part
Publication date: 12 April 2012

Bartosz T. Sawik

Abstract

This chapter presents a multi-criteria portfolio model with the expected return as a performance measure and the expected worst-case return as a risk measure. The problems are formulated as a single-objective linear program, as a bi-objective linear program, and as a triple-objective mixed integer program. The objective is to allocate wealth across different securities to optimize the portfolio return. The portfolio approach allows the two popular financial engineering percentile measures of risk, value-at-risk (VaR) and conditional value-at-risk (CVaR), to be applied. The decision-maker can assess the portfolio return, the risk level, and the number of assets, and can decide how to invest in a real-life situation by comparing with ideal (optimal) portfolio solutions. The concave efficient frontiers illustrate the trade-off between the conditional value-at-risk and the expected return of the portfolio. Numerical examples based on historical daily input data from the Warsaw Stock Exchange are presented and selected computational results are provided. The computational experiments show that both the proposed linear and mixed integer programming approaches provide the decision-maker with a simple tool for evaluating the relationship between the expected and the worst-case portfolio return.
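As a rough illustration of the single-objective linear formulation, here is a generic Rockafellar–Uryasev CVaR linearization on synthetic scenario data (an assumption-laden sketch, not the chapter's actual model or the Warsaw Stock Exchange data):

```python
import numpy as np
from scipy.optimize import linprog

# Synthetic daily return scenarios (stand-in for historical exchange data)
rng = np.random.default_rng(0)
T, n = 250, 4                      # scenarios, assets
R = rng.normal(0.0005, 0.01, (T, n))
mu = R.mean(axis=0)
alpha = 0.95                       # CVaR confidence level
target = mu.mean()                 # required expected return (always feasible here)

# Decision vector z = [x (weights), zeta (VaR proxy), u (scenario excess losses)]
# Minimize CVaR = zeta + 1/((1-alpha)T) * sum(u)
c = np.concatenate([np.zeros(n), [1.0], np.full(T, 1.0 / ((1 - alpha) * T))])

# u_t >= -R_t @ x - zeta   rewritten as   -R_t @ x - zeta - u_t <= 0
A_ub = np.hstack([-R, -np.ones((T, 1)), -np.eye(T)])
b_ub = np.zeros(T)
# expected-return constraint: mu @ x >= target
A_ub = np.vstack([A_ub, np.concatenate([-mu, [0.0], np.zeros(T)])])
b_ub = np.append(b_ub, -target)

A_eq = np.concatenate([np.ones(n), [0.0], np.zeros(T)])[None, :]   # sum x = 1
bounds = [(0, None)] * n + [(None, None)] + [(0, None)] * T        # no shorting

res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=[1.0], bounds=bounds)
weights, cvar = res.x[:n], res.fun
```

The bi- and triple-objective variants trade the return constraint for additional objectives, but the CVaR linearization above is the shared building block.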

Details

Applications of Management Science
Type: Book
ISBN: 978-1-78052-100-8

Article
Publication date: 1 March 2002

CLAUDIO ALBANESE, KEN JACKSON and PETTER WIBERG

Abstract

Regulators require banks to employ value‐at‐risk (VaR) to estimate the exposure of their trading portfolios to market risk, in order to establish capital requirements. However, portfolio‐level VaR analysis is a high‐dimensional problem and hence computationally intensive. This article presents two new portfolio‐based approaches to reducing the dimensionality of the VaR analysis.

Details

The Journal of Risk Finance, vol. 3 no. 4
Type: Research Article
ISSN: 1526-5943

Article
Publication date: 1 January 2003

CHRIS BROOKS and GITA PERSAND

Abstract

It is widely accepted that equity return volatility increases more following negative shocks than following positive shocks. However, much of value‐at‐risk (VaR) analysis relies on the assumption that returns are normally distributed (a symmetric distribution). This article considers the effect of asymmetries on the evaluation and accuracy of VaR by comparing estimates based on various models.
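The practical gap between a symmetric (normal) VaR and the empirical tail can be illustrated with a simple simulation; the mixture distribution below is invented for illustration and is not one of the article's models:

```python
import numpy as np
from scipy.stats import norm

# Made-up asymmetric returns: occasional large negative shocks
rng = np.random.default_rng(42)
n = 10_000
crash = rng.random(n) < 0.05
r = np.where(crash, rng.normal(-0.03, 0.02, n), rng.normal(0.001, 0.01, n))

alpha = 0.99
# Variance-covariance VaR: fits a (symmetric) normal to the sample
var_normal = -(r.mean() + norm.ppf(1 - alpha) * r.std())
# Historical-simulation VaR: empirical quantile, no symmetry assumption
var_hist = -np.quantile(r, 1 - alpha)
```

On data like this the normal assumption materially understates the 99% loss quantile, which is the kind of inaccuracy the article's model comparison is designed to detect.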

Details

The Journal of Risk Finance, vol. 4 no. 2
Type: Research Article
ISSN: 1526-5943

Article
Publication date: 1 February 2001

LEO M. TILMAN and PAVEL BRUSILOVSKIY

Abstract

Value‐at‐Risk (VaR) has become a mainstream risk management technique employed by a large proportion of financial institutions. A substantial body of research deals with validating VaR models against realized profits and losses, a task most commonly referred to as VaR backtesting. A new generation of “self‐learning” VaR models (Conditional Autoregressive Value‐at‐Risk, or CAViaR) combines backtesting results with ex ante VaR estimates in an ARIMA framework in order to forecast P/L distributions more accurately. In this commentary, the authors present a systematic overview of several classes of applied statistical techniques that can make VaR backtesting more comprehensive and provide valuable insights into the analytical properties of VaR models in various market environments. In addition, they discuss the challenges associated with extending traditional backtesting approaches to VaR horizons longer than one day and propose solutions to this important problem.
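One standard ingredient of such backtesting is Kupiec's proportion-of-failures (POF) test, which asks whether the observed number of VaR exceptions is consistent with the nominal coverage. A minimal sketch with illustrative numbers (not taken from the commentary):

```python
import numpy as np
from scipy.stats import chi2

def kupiec_pof(exceptions, n_obs, alpha):
    """Kupiec proportion-of-failures likelihood-ratio test.

    H0: the probability of a VaR exception equals 1 - alpha."""
    p, x = 1 - alpha, exceptions
    phat = x / n_obs
    if x == 0:
        lr = -2 * n_obs * np.log(1 - p)          # guard against log(0)
    else:
        lr = -2 * ((n_obs - x) * np.log((1 - p) / (1 - phat))
                   + x * np.log(p / phat))
    return lr, chi2.sf(lr, df=1)                 # LR statistic and p-value

# 12 exceptions in 250 days at 99% VaR: far above the expected 2.5
lr_bad, p_bad = kupiec_pof(12, 250, 0.99)
# 3 exceptions: consistent with correct coverage
lr_ok, p_ok = kupiec_pof(3, 250, 0.99)
```

Tests of this kind use only the exception count; the techniques the commentary surveys go further by exploiting the magnitude and timing of exceptions.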

Details

The Journal of Risk Finance, vol. 2 no. 3
Type: Research Article
ISSN: 1526-5943

Article
Publication date: 31 December 2002

Martin Odening and Jan Hinrichs

Abstract

This study examines problems that may occur when conventional Value‐at‐Risk (VaR) estimators are used to quantify market risks in an agricultural context. For example, standard VaR methods, such as the variance‐covariance method or historical simulation, can fail when the return distribution is fat tailed. This problem is aggravated when long‐term VaR forecasts are desired. Extreme Value Theory (EVT) is proposed to overcome these problems. The application of EVT is illustrated by an example from the German hog market. Multi‐period VaR forecasts derived by EVT are found to deviate considerably from standard forecasts. We conclude that EVT is a useful complement to traditional VaR methods.
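The peaks-over-threshold step behind such EVT estimates can be sketched as follows; the fat-tailed sample is synthetic (Student-t), not the German hog-market data:

```python
import numpy as np
from scipy.stats import genpareto, t as student_t

# Synthetic fat-tailed daily losses (Student-t with 3 degrees of freedom)
rng = np.random.default_rng(1)
losses = student_t.rvs(df=3, size=20_000, random_state=rng) * 0.01

u = np.quantile(losses, 0.95)                 # threshold: 95th loss percentile
exceed = losses[losses > u] - u               # peaks over threshold
xi, _, beta = genpareto.fit(exceed, floc=0)   # GPD shape and scale (loc fixed)

def evt_var(alpha, n=losses.size, n_u=exceed.size):
    # Tail quantile from the fitted GPD (Pickands-Balkema-de Haan estimator)
    return u + beta / xi * ((n / n_u * (1 - alpha)) ** (-xi) - 1)

var_99, var_999 = evt_var(0.99), evt_var(0.999)
```

A positive fitted shape parameter signals a fat tail, and the GPD quantile formula extrapolates to levels such as 99.9% where the empirical quantile is unreliable, which is what makes EVT attractive for long-horizon forecasts.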

Details

Agricultural Finance Review, vol. 63 no. 1
Type: Research Article
ISSN: 0002-1466

Article
Publication date: 31 May 2021

Sebastian Schlütter

Abstract

Purpose

This paper aims to propose a scenario-based approach for measuring interest rate risks. Many regulatory capital standards in banking and insurance make use of similar approaches. The authors provide a theoretical justification and extensive backtesting of their approach.

Design/methodology/approach

The authors theoretically derive a scenario-based value-at-risk for interest rate risks based on a principal component analysis. The authors calibrate their approach based on the Nelson–Siegel model, which is modified to account for lower bounds for interest rates. The authors backtest the model outcomes against historical yield curve changes for a large number of generated asset–liability portfolios. In addition, the authors backtest the scenario-based value-at-risk against the stochastic model.
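A minimal sketch of the principal component step (with an invented two-factor yield-curve simulation and invented portfolio sensitivities, not the authors' Nelson–Siegel calibration). When all components are retained, square-root aggregation of the per-component scenario losses reproduces the delta-normal value-at-risk exactly:

```python
import numpy as np

# Simulated daily yield-curve changes (bp) from a made-up level/slope model
rng = np.random.default_rng(7)
mats = np.array([1, 2, 5, 10, 20, 30])         # maturities in years
T = 1000
level = rng.normal(0, 5, (T, 1)) * np.ones(mats.size)
slope = rng.normal(0, 2, (T, 1)) * (mats / 30)
dy = level + slope + rng.normal(0, 0.5, (T, mats.size))

cov = np.cov(dy, rowvar=False)
eigval, eigvec = np.linalg.eigh(cov)
order = np.argsort(eigval)[::-1]               # sort components by variance
eigval, eigvec = eigval[order], eigvec[:, order]
explained = eigval / eigval.sum()              # level/slope dominate

s = np.array([0.3, 0.5, -1.0, 0.8, -0.2, 0.4]) # invented PV01-style sensitivities
z = 2.326                                      # 99% normal quantile

# P/L of a z-sigma scenario along each principal component
pc_pl = z * np.sqrt(eigval) * (s @ eigvec)
scen_var = np.sqrt((pc_pl ** 2).sum())         # square-root aggregation
full_var = z * np.sqrt(s @ cov @ s)            # delta-normal VaR
```

In practice only the leading components are kept as scenarios, and the paper's correlation parameters correct the aggregation for what the truncation and non-ellipticity leave out.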

Findings

The backtesting results of the adjusted Nelson–Siegel model (accounting for a lower bound) are similar to those of the traditional Nelson–Siegel model. The suitability of the scenario-based value-at-risk can be substantially improved by allowing for correlation parameters in the aggregation of the scenario outcomes. Implementing those parameters is straightforward: in situations where risk factors are not elliptically distributed, Pearson correlations are replaced by value-at-risk-implied tail correlations.

Research limitations/implications

The paper assumes deterministic cash flow patterns. The authors discuss the applicability of their approach, e.g. for insurance companies.

Practical implications

The authors’ approach can be used to better communicate interest rate risks using scenarios. Discussing risk measurement results with decision makers can help to backtest stochastic term-structure models.

Originality/value

The authors’ adjustment of the Nelson–Siegel model to account for lower bounds makes the model more useful in the current low-yield environment when unjustifiably high negative interest rates need to be avoided. The proposed scenario-based value-at-risk allows for a pragmatic measurement of interest rate risks, which nevertheless closely approximates the value-at-risk according to the stochastic model.

Details

The Journal of Risk Finance, vol. 22 no. 1
Type: Research Article
ISSN: 1526-5943

Article
Publication date: 1 November 2002

Alan Blankley, Reinhold Lamb and Richard Schroeder

Abstract

In 1997, the Securities and Exchange Commission (SEC) issued new disclosure rules in an amendment to Regulation S‐X. This release requires the disclosure of both qualitative and quantitative information about market risk by all companies registered with the SEC for annual periods ending after 15 June 1998. Larger companies with market capitalizations in excess of $2.5 billion, banks, and thrifts were required to apply the regulation’s provisions for annual periods ending after 15 June 1997. This paper presents the results of an analysis of the market risk disclosures by the Dow 30 companies for 1997. The provisions of the amendment requiring the disclosure of qualitative information about market risk were generally followed by all of the companies in the Dow 30. Compliance with the other aspects of the amendment was mixed. These compliance failures might be attributed to confusion over the provisions of the amendment. The results of this study indicate that further evidence is needed on the ability of companies to follow the provisions of the amendment.

Details

Managerial Auditing Journal, vol. 17 no. 8
Type: Research Article
ISSN: 0268-6902

Article
Publication date: 1 September 2006

Thomas Breuer

Abstract

Purpose

Maximum Loss was one of the risk measures proposed as alternatives to Value at Risk following criticism that Value at Risk is not coherent. Although the power of Maximum Loss is recognised for non‐linear portfolios, there are arguments that for linear portfolios Maximum Loss does not convey more information than Value at Risk. This paper argues for the usefulness of Maximum Loss for both linear and non‐linear portfolios.

Design/methodology/approach

This is a synthesis of existing theorems. Results are established by means of counterexamples. The worst case based risk‐return management strategy is presented as a case study.

Findings

For linear portfolios under elliptic distributions, Maximum Loss is proportional to Value at Risk, and to Expected Shortfall, with the proportionality constant not depending on the portfolio composition. The paper gives a new example of Value at Risk violating subadditivity, using a portfolio of simple European options. For non‐linear portfolios, Maximum Loss need not even approximately be explained by the sum of the Maximum Loss contributions of the individual risk factors. Finally, a strategy of risk‐return management with Maximum Loss is proposed.
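The proportionality claim for linear portfolios can be checked numerically in the multivariate normal special case of the elliptic class: the Maximum Loss of a linear portfolio w over the elliptical trust region {d : d'Σ⁻¹d ≤ k} equals √k·√(w'Σw), so its ratio to Value at Risk does not depend on the composition. A sketch with an invented covariance matrix:

```python
import numpy as np
from scipy.stats import chi2, norm

# Invented positive-definite covariance of three risk factors
rng = np.random.default_rng(3)
A = rng.normal(size=(3, 3))
Sigma = A @ A.T + np.eye(3)
alpha = 0.99

k = chi2.ppf(alpha, df=3)     # squared radius of the elliptical trust region
z = norm.ppf(alpha)           # normal VaR quantile

def max_loss(w):
    # Worst case of the linear P/L w @ d over {d : d @ inv(Sigma) @ d <= k}
    return np.sqrt(k) * np.sqrt(w @ Sigma @ w)

def var(w):
    return z * np.sqrt(w @ Sigma @ w)

w1 = np.array([1.0, 2.0, -0.5])
w2 = np.array([0.2, -1.0, 3.0])
ratio1, ratio2 = max_loss(w1) / var(w1), max_loss(w2) / var(w2)
```

The common ratio is √k/z, independent of w, which is exactly why Maximum Loss was argued to add no information for linear portfolios; for non-linear payoffs this collapse no longer holds.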

Research limitations/implications

The paper is restricted to elliptically distributed risk factors. Although Maximum Loss can be defined for more general continuous and even discrete distributions of risk factor changes, the paper does not address this matter.

Practical implications

The paper proposes an intuitive, computationally easy way to improve the average returns of linear portfolios while reducing worst‐case losses.

Originality/value

The paper is a synthesis of existing theorems. The counterexample establishing the subadditivity result is the first example of a portfolio of plain vanilla options violating Value at Risk subadditivity, an effect hitherto known only for portfolios of exotic options. Furthermore, the strategy of risk‐return management with Maximum Loss is original.

Details

Managerial Finance, vol. 32 no. 9
Type: Research Article
ISSN: 0307-4358

Article
Publication date: 12 October 2020

Xue Deng and Weimin Li

Abstract

Purpose

This paper aims to propose two portfolio selection models with hesitant value-at-risk (HVaR) – the HVaR fuzzy portfolio selection model (HVaR-FPSM) and the HVaR-score fuzzy portfolio selection model (HVaR-S-FPSM) – to help investors assess how bad a portfolio can be under a probabilistic hesitant fuzzy environment.

Design/methodology/approach

It is strictly proved that the higher the probability threshold, the higher the HVaR in the HVaR-S-FPSM. Numerical examples and a case study are used to illustrate the steps of building the proposed models and the importance of the HVaR and score constraints. In the case study, the authors conduct a sensitivity analysis and compare the proposed models with decision-making models and hesitant fuzzy portfolio models.

Findings

The score constraint ensures that the selected portfolio is profitable, without causing the HVaR to decrease dramatically. The investment proportions of stocks are mainly affected by their HVaRs, which is consistent with the fact that a stock with good performance is usually desirable in portfolio selection. The HVaR-S-FPSM can find portfolios with a higher HVaR than any single stock, with little sacrifice of extreme returns.

Originality/value

This paper fulfills a need to construct portfolio selection models with HVaR under a probabilistic hesitant fuzzy environment. As a downside risk measure, the HVaR is more consistent with investors’ intuitions about risk. Moreover, the score constraint ensures that undesirable portfolios will not be selected.

Details

Engineering Computations, vol. 38 no. 5
Type: Research Article
ISSN: 0264-4401

Article
Publication date: 16 January 2017

Ngoc Quynh Anh Nguyen and Thi Ngoc Trang Nguyen

Abstract

Purpose

The purpose of this paper is to present a method for the efficient computation of risk measures using the Fourier transform technique. Another objective is to demonstrate that this technique enables the efficient computation of risk measures beyond value-at-risk and expected shortfall. Finally, this paper highlights the importance of validating the assumptions behind the risk model and describes its application in the affine model framework.

Design/methodology/approach

The method proposed is based on Fourier transform methods for computing risk measures. The authors obtain the loss distribution by fitting a cubic spline through the points where Fourier inversion of the characteristic function is applied. From the loss distribution, the authors calculate value-at-risk and expected shortfall. As for the calculation of the entropic value-at-risk, it involves the moment generating function, which is closely related to the characteristic function. The expectile risk measure is calculated based on call and put option prices, which are available in semi-closed form by Fourier inversion of the characteristic function. The authors also consider mean loss, standard deviation and semivariance, which are calculated in a similar manner.
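The characteristic-function inversion at the heart of this approach can be sketched with the Gil-Pelaez formula, F(x) = 1/2 - (1/π) ∫₀^∞ Im[e^(-itx) φ(t)]/t dt. The sketch below uses direct quadrature rather than the paper's spline scheme, and a normal loss distribution purely so the result can be checked against the closed form:

```python
import numpy as np
from scipy.integrate import quad
from scipy.optimize import brentq
from scipy.stats import norm

mu, sigma = -0.001, 0.02    # normal loss distribution used as a test case
phi = lambda t: np.exp(1j * mu * t - 0.5 * (sigma * t) ** 2)  # characteristic fn

def cdf(x):
    """Loss CDF via Gil-Pelaez inversion of the characteristic function."""
    integrand = lambda t: np.imag(np.exp(-1j * t * x) * phi(t)) / t
    integral, _ = quad(integrand, 1e-10, 500.0, limit=500)
    return 0.5 - integral / np.pi

alpha = 0.99
# VaR is the alpha-quantile of the loss distribution
var_ft = brentq(lambda x: cdf(x) - alpha, mu, mu + 10 * sigma)
var_exact = norm.ppf(alpha, loc=mu, scale=sigma)
```

Once the distribution is recovered this way, expected shortfall and the other measures follow from the same object, which is what makes the Fourier route efficient when only the characteristic function is available in closed form.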

Findings

The study offers practical insights into the efficient computation of risk measures as well as validation of the risk models. It also provides a detailed description of algorithms to compute each of the risk measures considered. While the main focus of the paper is on portfolio-level risk metrics, all algorithms are also applicable to single instruments.

Practical implications

The algorithms presented in this paper require little computational effort which makes them very suitable for real-world applications. In addition, the mathematical setup adopted in this paper provides a natural framework for risk model validation which makes the approach presented in this paper particularly appealing in practice.

Originality/value

This is the first study to consider the computation of entropic value-at-risk, semivariance as well as expectile risk measure using Fourier transform method.

Details

The Journal of Risk Finance, vol. 18 no. 1
Type: Research Article
ISSN: 1526-5943
