Search results

1 – 10 of over 1000
Book part
Publication date: 15 April 2020

Cindy S. H. Wang and Shui Ki Wan

Abstract

This chapter extends the univariate forecasting method proposed by Wang, Bauwens, and Hsiao (2013) to forecast the multivariate long memory model subject to structural breaks. The approach requires neither estimating the parameters of this multivariate system nor detecting the structural breaks. The only step is to employ a VAR(k) model to approximate the multivariate long memory model subject to structural breaks. The approach therefore reduces the computational burden substantially and also avoids estimating the parameters of the multivariate long memory model, where estimation error can lead to poor forecasting performance. Moreover, when there are multiple breaks, when the breaks occur close to the end of the sample, or when the breaks occur at different locations across the time series in the system, our VAR approximation approach resolves the issue of spurious breaks in finite samples, even though the exact orders of the multivariate long memory process are unknown. Insights from our theoretical analysis are confirmed by a set of Monte Carlo experiments, which demonstrate that our approach provides a substantial improvement over existing multivariate prediction methods. Finally, an empirical application to multivariate realized volatility illustrates the usefulness of our forecasting procedure.
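
As a rough illustration of the core idea in this abstract, the following numpy sketch fits a VAR(k) by ordinary least squares and iterates it forward, with no long-memory parameter estimation and no break detection. The function name, lag order and horizon are placeholders, not the chapter's actual specification.

```python
import numpy as np

def var_approx_forecast(y, k, h):
    """Fit a VAR(k) by OLS and return the h-step-ahead forecast.

    y : (T, n) array of observations, k : lag order, h : horizon.
    The VAR is used only as an approximating model, so neither the
    long-memory orders nor the break dates are ever estimated.
    """
    T, n = y.shape
    # Regressors for t = k..T-1: an intercept and the k most recent lags.
    X = np.column_stack([np.ones(T - k)] +
                        [y[k - j - 1:T - j - 1] for j in range(k)])
    B, *_ = np.linalg.lstsq(X, y[k:], rcond=None)
    # Iterate the fitted system forward h steps.
    hist = [row for row in y[-k:]]
    for _ in range(h):
        x = np.concatenate([[1.0]] + [hist[-j - 1] for j in range(k)])
        hist.append(x @ B)
    return hist[-1]
```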

Book part
Publication date: 28 October 2019

Angelo Corelli

Details

Understanding Financial Risk Management, Second Edition
Type: Book
ISBN: 978-1-78973-794-3

Article
Publication date: 3 May 2016

William E. Balson and Gordon Rausser

Abstract

Purpose

Risk-based clearing has been proposed by Rausser et al. (2010) for over-the-counter (OTC) derivatives. This paper aims to illustrate the application of risk-based margins in a case study of the mortgage-backed securities derivative portfolio of the American International Group (AIG) during the period 2005-2008. Sufficient publicly available information exists to examine AIG’s derivative portfolio and how that portfolio would have responded to conjectural changes in the margin requirements imposed on its OTC derivative positions. Such data on OTC derivative portfolio positions are generally unavailable in the public domain, and thus the AIG data provide a unique opportunity for an objective evaluation.

Design/methodology/approach

This paper uses modern financial methodology to evaluate risk-based margining and collateralization for AIG's major OTC derivative portfolio.
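
The margin mechanics are not spelled out in this abstract; as a generic, hedged sketch of what a risk-based (VaR-style) margin means, the snippet below sets the margin to a high loss quantile of simulated position P&L over the margin period. All numbers are invented and do not represent AIG's positions.

```python
import numpy as np

def risk_based_margin(pnl_scenarios, confidence=0.99):
    """Risk-based margin = high loss quantile of the position's simulated
    P&L over the margin period (gains are positive in pnl_scenarios)."""
    losses = -np.asarray(pnl_scenarios, dtype=float)
    return max(0.0, float(np.quantile(losses, confidence)))

# Toy example with a heavy-tailed P&L distribution (invented numbers).
rng = np.random.default_rng(0)
pnl = 1e5 * rng.standard_t(df=4, size=100_000)
print(f"99% risk-based margin: {risk_based_margin(pnl):,.0f}")
```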

Findings

This analysis reveals that a risk-based margin procedure would have led to earlier and initially larger margin calls than the collateral calls actually faced by AIG Financial Products (AIGFP). The total margin ultimately required by the risk-based procedure, however, is similar in magnitude to the collateral calls faced by AIGFP by August 2008. A risk-based clearing procedure applied to AIG's OTC contracts would likely have led AIG to undertake significant hedging and liquidation of its OTC positions well before the losses built up to the point they did, perhaps avoiding the restructuring orchestrated by the federal government in September 2008.

Originality/value

There have been no published risk-based evaluations of a major OTC derivatives portfolio for any company, let alone AIG.

Details

Journal of Financial Economic Policy, vol. 8 no. 2
Type: Research Article
ISSN: 1757-6385

Article
Publication date: 1 January 2001

Jorge Mina

Abstract

VaR calculations often require the valuation of complex payoffs over a large set of scenarios. Since pricing complex derivatives is computationally expensive, there is a direct tradeoff between accuracy and computational cost (i.e. time). Hence, full valuation of these instruments over the set of all feasible scenarios is rarely viable. This article describes a method for approximating expensive pricing functions that allows for fast and accurate VaR calculations. The author discusses general applications of the model to the risk management of portfolios composed of complex instruments.
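
The article's specific approximation method is not reproduced in this abstract; as one common way to avoid full revaluation in every scenario, the sketch below replaces an expensive pricer with a second-order (delta-gamma) Taylor expansion calibrated once from a full valuation. The sensitivities and scenario distribution here are made up for illustration.

```python
import numpy as np

def delta_gamma_var(delta, gamma, shocks, confidence=0.99):
    """VaR from a quadratic approximation of the pricing function:
    dV ~= delta . dS + 0.5 * dS' gamma dS, evaluated over all scenarios
    instead of calling the full pricer scenario by scenario."""
    dv = shocks @ delta + 0.5 * np.einsum("mi,ij,mj->m", shocks, gamma, shocks)
    return float(-np.quantile(dv, 1.0 - confidence))

# Toy portfolio with two risk factors (sensitivities are invented).
rng = np.random.default_rng(1)
delta = np.array([120.0, -40.0])
gamma = np.array([[8.0, 1.0], [1.0, 3.0]])
shocks = rng.multivariate_normal([0.0, 0.0],
                                 [[0.010, 0.002], [0.002, 0.040]], 50_000)
print(f"99% VaR (delta-gamma): {delta_gamma_var(delta, gamma, shocks):.2f}")
```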

Details

The Journal of Risk Finance, vol. 2 no. 2
Type: Research Article
ISSN: 1526-5943

Article
Publication date: 17 March 2014

Joël Wagner

Abstract

Purpose

The concept of value at risk is used in the risk-based calculation of solvency capital requirements in the Basel II/III banking regulations and in the Solvency II insurance regulation framework planned in the European Union. While this measure controls the ruin probability of a financial institution, the expected policyholder deficit (EPD) and expected shortfall (ES) measures, which are relevant from the customer's perspective because they quantify the size of the shortfall, are not controlled at the same time. Hence, if there are variations in or changes to the asset-liability situation, financial companies may still comply with the capital requirement while the EPD or ES reach unsatisfactory levels. This is a significant drawback of the solvency frameworks. The paper aims to discuss these issues.

Design/methodology/approach

The author develops a model framework in which the relevant risk measures are evaluated using the distribution-free approach of the normal power approximation. This allows analytical approximations of the risk measures to be derived solely from the first three central moments of the underlying distributions. For the case of a reference insurance company, the author calculates the required capital using the ruin probability and EPD approaches, performing sensitivity analyses across different asset allocations and different liability characteristics.
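
A minimal sketch of the normal power (NP2) quantile approximation the abstract refers to, using only the first three central moments; the moment values below are invented for illustration, and shortfall measures such as the EPD would be built on top of this quantile in the same spirit.

```python
from statistics import NormalDist

def np2_quantile(mean, std, skew, p):
    """Normal power (NP2) approximation of the p-quantile of a loss
    distribution, using only mean, standard deviation and skewness:
    x_p ~= mean + std * (z_p + skew/6 * (z_p**2 - 1))."""
    z = NormalDist().inv_cdf(p)
    return mean + std * (z + skew / 6.0 * (z * z - 1.0))

# Invented liability moments; required capital as a high loss quantile.
mean, std, skew = 100.0, 15.0, 0.8
for p in (0.99, 0.995, 0.999):
    print(f"{p:.3f}-quantile (NP2): {np2_quantile(mean, std, skew, p):.1f}")
```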

Findings

The author concludes that only a simultaneous monitoring of the ruin probability and EPD can lead to satisfactory results guaranteeing a constant level of customer protection. For the reference firm, the author evaluates the relative changes in the capital requirement when applying the EPD approach next to the ruin probability approach. Depending on the development of the assets and liabilities, and in the cases the author illustrates, the reference company would need to provide substantial amounts of additional equity capital.

Originality/value

A comparative assessment of alternative risk measures is relevant given the debate among regulators, industry representatives and academics about how adequately they are used. The author borrows the approach in part from the work of Barth, who compares the ruin probability and EPD approaches in discussing the risk-based capital (RBC) formulas introduced by the US National Association of Insurance Commissioners in the 1990s. The author reconsiders several of these findings and discusses them in the light of the new regulatory frameworks. More precisely, the author first performs sensitivity analyses of the risk measures under different parameter configurations. Such analyses are relevant because, in practice, parameter values may differ from the estimates used in the model and can have a significant impact on the values of the risk measures. Second, the author goes beyond a simple discussion of the outcomes for each risk measure by deriving the firm conclusion that both the frequency and the magnitude of shortfalls need to be controlled.

Details

The Journal of Risk Finance, vol. 15 no. 2
Type: Research Article
ISSN: 1526-5943

Article
Publication date: 1 December 2005

John H.J. Einmahl, Walter N. Foppen, Olivier W. Laseroms and Casper G. de Vries

Abstract

Purpose

The purpose of this article is to improve existing methods for risk management, in particular stress testing, for derivative portfolios. The method is explained and compared with other methods using hypothetical portfolios.

Design/methodology/approach

Closed-form option pricing formulas are used for valuation. To assess the risk, future price movements are modeled by an empirical distribution in conjunction with a semi-parametrically estimated tail. This approach captures the non-linearity of the portfolio risk and makes it possible to estimate the extreme risk adequately.
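
The exact estimator is not given in this abstract; a standard semi-parametric recipe in this spirit uses the empirical distribution in the body and a Hill/Weissman Pareto extrapolation in the tail, sketched below with invented data standing in for non-linear portfolio losses.

```python
import numpy as np

def semiparametric_quantile(losses, p, k=100):
    """Quantile of a loss sample: empirical distribution in the body and a
    Pareto tail fitted above the (k+1)-th largest observation.
    Assumes the upper-tail losses are positive."""
    x = np.sort(np.asarray(losses, dtype=float))
    n = x.size
    if p <= 1.0 - k / n:                      # body: plain empirical quantile
        return float(np.quantile(x, p))
    u = x[n - k - 1]                          # tail threshold
    gamma = np.mean(np.log(x[n - k:] / u))    # Hill estimate of the tail index
    return float(u * (k / (n * (1.0 - p))) ** gamma)  # Weissman extrapolation

# Toy heavy-tailed losses (invented, Student-t with 3 degrees of freedom).
rng = np.random.default_rng(2)
losses = np.abs(rng.standard_t(df=3, size=20_000))
print(f"99.97% stress loss estimate: {semiparametric_quantile(losses, 0.9997):.2f}")
```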

Findings

It is found that this method gives excellent results and that it clearly outperforms the standard approach based on a quadratic approximation and the normal distribution. Especially for very high confidence levels, the improvement is dramatic.

Practical implications

In applications of this type the present method is highly preferable to the classical Delta‐Gamma cum normal distribution approach.

Originality/value

This paper uses a “statistics of extremes” approach to stress testing. With this approach it is possible to estimate the far tail of a derivative portfolio adequately.

Details

The Journal of Risk Finance, vol. 6 no. 5
Type: Research Article
ISSN: 1526-5943

Book part
Publication date: 12 December 2003

Timothy J. Vogelsang

Abstract

This paper proposes a new approach to testing in the generalized method of moments (GMM) framework. The new tests are constructed using heteroskedasticity autocorrelation (HAC) robust standard errors computed using nonparametric spectral density estimators without truncation. While such standard errors are not consistent, a new asymptotic theory shows that they lead to valid tests nonetheless. In an over-identified linear instrumental variables model, simulations suggest that the new tests and the associated limiting distribution theory provide a more accurate first order asymptotic null approximation than both standard nonparametric HAC robust tests and VAR based parametric HAC robust tests. Finite sample power of the new tests is shown to be comparable to standard tests.
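
A minimal numpy sketch of the key ingredient as the abstract describes it: a Bartlett-kernel HAC variance with bandwidth equal to the sample size, i.e. no truncation. Note that the resulting t-statistic must be compared with the paper's nonstandard fixed-b critical values, which are not reproduced here; the data-generating process below is invented.

```python
import numpy as np

def hac_variance_no_truncation(v):
    """Bartlett-kernel HAC long-run variance with bandwidth = sample size,
    i.e. no truncation of the autocovariances. Inconsistent as a variance
    estimator, but it still yields valid tests under the new asymptotics."""
    v = np.asarray(v, dtype=float)
    v = v - v.mean()
    T = v.size
    acov = np.array([v[j:] @ v[:T - j] / T for j in range(T)])
    w = 1.0 - np.arange(T) / T                # Bartlett weights with M = T
    return acov[0] + 2.0 * np.sum(w[1:] * acov[1:])

# Toy t-statistic for a mean with autocorrelated data (invented DGP).
rng = np.random.default_rng(3)
x = np.convolve(rng.normal(size=500), [1.0, 0.7, 0.4], mode="same") + 0.1
t_stat = np.sqrt(x.size) * x.mean() / np.sqrt(hac_variance_no_truncation(x))
print(f"t-statistic: {t_stat:.2f}")  # compare with fixed-b critical values
```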

Details

Maximum Likelihood Estimation of Misspecified Models: Twenty Years Later
Type: Book
ISBN: 978-1-84950-253-5

Article
Publication date: 31 May 2021

Sebastian Schlütter

Abstract

Purpose

This paper aims to propose a scenario-based approach for measuring interest rate risks. Many regulatory capital standards in banking and insurance make use of similar approaches. The authors provide a theoretical justification and extensive backtesting of their approach.

Design/methodology/approach

The authors theoretically derive a scenario-based value-at-risk for interest rate risks based on a principal component analysis. The authors calibrate their approach based on the Nelson–Siegel model, which is modified to account for lower bounds for interest rates. The authors backtest the model outcomes against historical yield curve changes for a large number of generated asset–liability portfolios. In addition, the authors backtest the scenario-based value-at-risk against the stochastic model.
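
As a hedged sketch of the PCA-driven scenario construction described here: each leading principal component of historical curve moves defines a plus/minus scenario, and scenario losses are aggregated. This toy version assumes independent components and linear pillar sensitivities; the paper's Nelson–Siegel calibration, lower-bound adjustment and correlation parameters are not reproduced, and all data are invented.

```python
import numpy as np

def pca_scenario_var(curve_changes, sensitivities, n_pc=3, z=2.58):
    """Scenario-based interest-rate VaR: derive z-sigma scenarios from the
    leading principal components of historical yield-curve changes, take
    the worse of each +/- pair, and aggregate assuming independent
    components.

    curve_changes : (T, m) history of yield changes at m maturity pillars.
    sensitivities : (m,) portfolio value change per unit move at each pillar.
    """
    cov = np.cov(curve_changes, rowvar=False)
    eigval, eigvec = np.linalg.eigh(cov)
    order = np.argsort(eigval)[::-1][:n_pc]
    losses = [abs(sensitivities @ (z * np.sqrt(lam) * eigvec[:, i]))
              for i, lam in zip(order, eigval[order])]
    return float(np.sqrt(np.sum(np.square(losses))))  # square-root aggregation

# Invented curve history and pillar sensitivities, four maturities.
rng = np.random.default_rng(4)
cov = 1e-6 * 0.9 ** np.abs(np.subtract.outer(np.arange(4), np.arange(4)))
dy = rng.multivariate_normal(np.zeros(4), cov, size=1000)
sens = np.array([1e4, 3e4, 5e4, 2e4])
print(f"scenario-based VaR: {pca_scenario_var(dy, sens):,.0f}")
```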

Findings

The backtesting results of the adjusted Nelson–Siegel model (accounting for a lower bound) are similar to those of the traditional Nelson–Siegel model. The suitability of the scenario-based value-at-risk can be substantially improved by allowing for correlation parameters in the aggregation of the scenario outcomes. Implementing those parameters is straightforward: Pearson correlations are replaced by value-at-risk-implied tail correlations in situations where risk factors are not elliptically distributed.

Research limitations/implications

The paper assumes deterministic cash flow patterns. The authors discuss the applicability of their approach, e.g. for insurance companies.

Practical implications

The authors’ approach can be used to communicate interest rate risks more clearly using scenarios. Discussing risk measurement results with decision makers can also help to backtest stochastic term-structure models.

Originality/value

The authors’ adjustment of the Nelson–Siegel model to account for lower bounds makes the model more useful in the current low-yield environment when unjustifiably high negative interest rates need to be avoided. The proposed scenario-based value-at-risk allows for a pragmatic measurement of interest rate risks, which nevertheless closely approximates the value-at-risk according to the stochastic model.

Details

The Journal of Risk Finance, vol. 22 no. 1
Type: Research Article
ISSN: 1526-5943

Article
Publication date: 1 March 2002

Claudio Albanese, Ken Jackson and Petter Wiberg

Abstract

Regulators require banks to employ value‐at‐risk (VaR) to estimate the exposure of their trading portfolios to market risk, in order to establish capital requirements. However, portfolio‐level VaR analysis is a high‐dimensional problem and hence computationally intensive. This article presents two new portfolio‐based approaches to reducing the dimensionality of the VaR analysis.
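
The article's two approaches are not detailed in this abstract; one textbook way to cut the dimensionality of a portfolio VaR calculation, shown below as a hedged sketch, is to simulate only the leading principal components of the risk factors and map them back through linear exposures. The factor history and exposures are invented.

```python
import numpy as np

def reduced_factor_var(factor_history, exposures, n_pc=5, n_sims=100_000,
                       confidence=0.99, seed=0):
    """Monte Carlo VaR after projecting the risk factors onto their n_pc
    leading principal components, so the simulation runs in a
    low-dimensional space instead of the full factor space."""
    rng = np.random.default_rng(seed)
    cov = np.cov(factor_history, rowvar=False)
    eigval, eigvec = np.linalg.eigh(cov)
    idx = np.argsort(eigval)[::-1][:n_pc]
    pcs = rng.normal(size=(n_sims, n_pc)) * np.sqrt(eigval[idx])
    pnl = (pcs @ eigvec[:, idx].T) @ exposures   # back to factor space
    return float(-np.quantile(pnl, 1.0 - confidence))

# Invented history: 100 equicorrelated factors, equal exposures.
rng = np.random.default_rng(5)
hist = 0.01 * rng.multivariate_normal(np.zeros(100),
                                      0.5 * np.eye(100) + 0.5, size=750)
var = reduced_factor_var(hist, np.ones(100) / 100)
print(f"99% VaR, 5-factor approximation: {var:.4f}")
```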

Details

The Journal of Risk Finance, vol. 3 no. 4
Type: Research Article
ISSN: 1526-5943
