Search results

1 – 10 of over 9000

Details

Applying Maximum Entropy to Econometric Problems
Type: Book
ISBN: 978-0-76230-187-4

Article
Publication date: 1 February 2001

NISSO BUCAY and DAN ROSEN

Abstract

In recent years, several methodologies for measuring portfolio credit risk have been introduced that demonstrate the benefits of using internal models to measure credit risk in the loan book. These models measure economic credit capital and are specifically designed to capture portfolio effects and account for obligor default correlations. An example of an integrated market and credit risk model that overcomes this limitation is given in Iscoe et al. [1999], which is equally applicable to commercial and retail credit portfolios. However, the measurement of portfolio credit risk in retail loan portfolios has received much less attention than the commercial credit markets. This article proposes a methodology for measuring the credit risk of a retail portfolio, based on the general portfolio credit risk framework of Iscoe et al. The authors discuss the practical estimation and implementation of the model. They demonstrate its applicability with a case study based on the credit card portfolio of a North American financial institution. They also analyze the sensitivity of the results to various assumptions.
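
The abstract does not reproduce the model's equations. As a rough illustration of the kind of conditional-independence simulation such frameworks rely on (a sketch in the spirit of, not taken from, Iscoe et al.), with made-up portfolio parameters:

```python
import numpy as np
from scipy.stats import norm

# Hypothetical one-factor sketch: conditional on a systematic factor Z,
# obligor defaults are independent. All parameters are invented.
rng = np.random.default_rng(0)
n_obligors, n_scenarios = 2_000, 2_000
pd_uncond = 0.03      # unconditional default probability per obligor
rho = 0.10            # correlation with the systematic factor
exposure = 1.0        # loss per default (uniform, for simplicity)

threshold = norm.ppf(pd_uncond)
z = rng.standard_normal(n_scenarios)                    # systematic factor
eps = rng.standard_normal((n_scenarios, n_obligors))    # idiosyncratic terms
assets = np.sqrt(rho) * z[:, None] + np.sqrt(1 - rho) * eps
losses = exposure * (assets < threshold).sum(axis=1)

expected_loss = losses.mean()
var_999 = np.quantile(losses, 0.999)
print(f"EL={expected_loss:.0f}  VaR(99.9%)={var_999:.0f}  EC={var_999 - expected_loss:.0f}")
```

Economic credit capital is read off the simulated loss distribution as the gap between a high quantile and the expected loss; the default correlation induced by the common factor is what makes this a portfolio measure rather than a sum of stand-alone loan risks.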

Details

The Journal of Risk Finance, vol. 2 no. 3
Type: Research Article
ISSN: 1526-5943

Article
Publication date: 29 April 2021

Chaochao Liu, Zhanwen Niu and Qinglin Li

Abstract

Purpose

Existing studies have suggested that there is a nonlinear relationship between lean production adoption and organizational performance. Lean production adoption is a gradual process, and the application status of lean tools affects enterprise performance. The existing literature has not sufficiently explored, within a single theoretical framework, the nonlinear relationship between lean tool application status and both operational and environmental performance. This paper proposes a combined approach of interpretative structural modeling (ISM) and Bayesian networks, which is used to analyze the complex relationship between lean tool application status and operational and environmental performance.

Design/methodology/approach

ISM was used to analyze the inter-relationships among 17 lean tools identified from the lean literature and to construct a structural model of the lean tools, which served as a reference for building the Bayesian network. By calculating the prior and conditional probabilities among the lean tools and between the lean tools and operational and environmental performance, a Bayesian simulation model was constructed and used to analyze the performance outcomes under different lean tool application statuses.
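
The network structure and probabilities are not given in the abstract; a toy version of the simulation step, with two hypothetical lean tools and invented prior and conditional probabilities, might look like:

```python
import itertools

# Toy Bayesian network in the spirit of the paper's simulation step.
# Tool names, priors and conditional probabilities are hypothetical.
prior = {"5S": 0.7, "VSM": 0.5}          # P(tool applied well)
# P(operational performance = good | 5S status, VSM status)
cpt_good = {(True, True): 0.8, (True, False): 0.6,
            (False, True): 0.5, (False, False): 0.2}

def p_good(evidence=None):
    """P(performance good), optionally fixing a tool's application status."""
    evidence = evidence or {}
    num = den = 0.0
    for s5, vsm in itertools.product([True, False], repeat=2):
        if evidence.get("5S", s5) != s5 or evidence.get("VSM", vsm) != vsm:
            continue
        w = ((prior["5S"] if s5 else 1 - prior["5S"]) *
             (prior["VSM"] if vsm else 1 - prior["VSM"]))
        num += w * cpt_good[(s5, vsm)]
        den += w
    return num / den

print(p_good())                  # baseline probability of good performance
print(p_good({"VSM": True}))     # performance if VSM is fully applied
```

Setting a tool's application status to certain and re-reading the performance probability mimics, at toy scale, the paper's comparison of performance outcomes under different lean tool application statuses.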

Findings

The performance simulation results, represented by the probabilities of three performance levels (good, average and poor), change inconsistently as the application status of the lean tools changes. Comparing the changes in operational and environmental performance shows that environmental performance is less sensitive to changes in lean tool application status than operational performance is.

Originality/value

Using the integrated ISM–Bayesian network approach, the results indicate a nonlinear relationship between lean tools and both operational and environmental performance and provide a reference for exploring the nonlinear relationship between lean tools and performance. The research further calls for exploring the S-curve relationship between lean tools and environmental performance.

Article
Publication date: 28 October 2014

Carlos Castro and Karen Garcia

Abstract

Purpose

Commodity price volatility and small variations in climate conditions may have an important impact on the creditworthiness of any agricultural project. The evolution of such risk factors is vital for the credit risk analysis of a rural bank. The purpose of this paper is to determine the importance of price volatility and climate factors within a default risk model.

Design/methodology/approach

The authors estimate a generalized linear model (GLM) based on a structural default risk model. With the estimated factor loadings, the authors simulate the loss distribution of the portfolio and perform stress tests to determine the impact of the relevant risk factors on economic capital.
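
The paper's estimated specification is not reproduced here; a rough sketch of such a pipeline (a logit GLM for default, a simulated loss distribution, and a stressed factor), using synthetic data and invented factor names, could be:

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
n = 2_000
# Hypothetical risk factors: price volatility, rainfall anomaly,
# agricultural-sector growth and input price index (all synthetic here).
X = rng.standard_normal((n, 4))
true_beta = np.array([0.6, 0.4, -0.8, 0.5])
p = 1 / (1 + np.exp(-(-2.0 + X @ true_beta)))
default = rng.binomial(1, p)

# 1. Estimate the GLM (logit link) for the default probability.
glm = sm.GLM(default, sm.add_constant(X), family=sm.families.Binomial()).fit()

# 2. Simulate the loss distribution with the fitted loadings.
def loss_distribution(X, n_sims=10_000, shock=0.0):
    Xs = X.copy()
    Xs[:, 0] += shock                      # stress the price-volatility factor
    p_hat = glm.predict(sm.add_constant(Xs))
    draws = rng.binomial(1, p_hat, size=(n_sims, len(p_hat)))
    return draws.sum(axis=1)               # unit exposure per loan

base, stressed = loss_distribution(X), loss_distribution(X, shock=1.0)
ec = lambda L: np.quantile(L, 0.999) - L.mean()   # economic capital
print(f"EC base={ec(base):.0f}  EC stressed={ec(stressed):.0f}")
```

Comparing economic capital before and after shifting a factor is the stress-test step; the size of the shift in capital is what signals a factor's economic (as opposed to merely statistical) significance.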

Findings

The results indicate that both the price volatility and climate factors are statistically significant; however, their economic significance is smaller compared to other factors that the authors control for: macroeconomic conditions for the agricultural sector and intermediate input prices.

Research limitations/implications

The analysis of non-systemic risk factors such as price volatility and climate conditions requires statistical methods focused on measuring causal effects at higher quantiles, not just at the conditional mean; this, however, is a current limitation of GLMs.

Practical implications

The authors provide a design for a portfolio credit risk model that is better suited to the special characteristics of a rural bank than commercial credit risk models are.

Originality/value

The paper incorporates agriculture-specific risk factors into a default risk model and a portfolio credit risk model.

Details

Agricultural Finance Review, vol. 74 no. 4
Type: Research Article
ISSN: 0002-1466

Book part
Publication date: 6 July 2007

Paul D. Thistle

Abstract

For over 60 years, Lerner's (1944) probabilistic approach to the welfare evaluation of income distributions has aroused controversy. Lerner's famous theorem is that, under ignorance regarding who has which utility function, the optimal distribution of income is completely equal. However, Lerner's probabilistic approach can only be applied to compare distributions with equal means when the number of possible utility functions equals the number of individuals in the population. Lerner's most controversial assumption is that each assignment of utility functions to individuals is equally likely. This paper generalizes Lerner's probabilistic approach to the welfare analysis of income distributions by weakening the restrictions of utilitarian welfare, equal means, equal numbers, equal probabilities, and a homogeneous population. We show that there is a tradeoff between invariance (measurability and comparability) and the information about the assignment of utility functions to individuals required to evaluate expected social welfare.
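
As a concrete illustration of the probabilistic setup being generalized (not of the chapter's own weaker conditions), expected utilitarian welfare under equally likely assignments of utility functions can be computed directly for a tiny example with hypothetical concave utility functions:

```python
import itertools

# Lerner's setup in miniature: n individuals, n possible utility functions,
# each assignment (permutation) of functions to individuals equally likely.
# The utility functions below are hypothetical and concave.
utilities = [lambda y: y ** 0.5, lambda y: y ** 0.3, lambda y: y ** 0.7]

def expected_welfare(incomes):
    perms = list(itertools.permutations(utilities))
    return sum(sum(u(y) for u, y in zip(p, incomes)) for p in perms) / len(perms)

# With a fixed total income, the equal split maximizes expected welfare.
print(expected_welfare([100, 100, 100]))   # equal distribution
print(expected_welfare([40, 100, 160]))    # unequal distribution, lower value
```

Because averaging over equally likely assignments amounts to giving every individual the mean of the utility functions, which is itself concave, the equal split yields the higher expected welfare; that is the content of Lerner's theorem in this special case.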

Details

Equity
Type: Book
ISBN: 978-0-7623-1450-8

Article
Publication date: 1 February 2000

CHRISTOPHER C. FINGER

Abstract

In counterparty credit risk management for swaps, forwards, and other derivative contracts, it is recognized that the most common applications of credit exposure measurement suffer from the bias of assuming that counterparty default is independent of the amount of exposure. Stress tests are often proposed to compensate for this bias, but these measures tend to be arbitrary and cannot be uniformly applied to setting prices and limits as readily as more standardized approaches. The author proposes a framework in which standard measures of counterparty exposure are conditioned on default probabilities. These conditional measures thus account for “wrong way” exposures, but fit naturally into current applications.
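
The conditioning formulas themselves are not given in the abstract; the underlying idea, that expected exposure computed given default can differ materially from the unconditional figure when the exposure driver and the default driver are correlated, can be illustrated with a crude Monte Carlo sketch (all parameters hypothetical):

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(2)
n = 200_000
pd_cpty = 0.02        # counterparty default probability (invented)
corr = 0.5            # exposure/default correlation; positive = wrong-way

# Default driver and market driver are correlated standard normals.
z_default = rng.standard_normal(n)
z_market = corr * z_default + np.sqrt(1 - corr**2) * rng.standard_normal(n)
defaulted = z_default > norm.ppf(1 - pd_cpty)      # default in adverse states

# Exposure of a swap-like position that grows with the market driver.
exposure = np.maximum(z_market, 0.0)

print("unconditional expected exposure:", exposure.mean())
print("expected exposure given default: ", exposure[defaulted].mean())
```

With a positive correlation, exposure measured given default comes out higher than the unconditional expected exposure, which is the wrong-way effect that conditional measures are designed to capture.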

Details

The Journal of Risk Finance, vol. 1 no. 3
Type: Research Article
ISSN: 1526-5943

Article
Publication date: 5 November 2021

Libiao Bai, Huijing Shi, Shuyun Kang and Bingbing Zhang

Abstract

Purpose

Comprehensive project portfolio risk (PPR) analysis is essential for the success and sustainable development of project portfolios (PPs). However, project interdependency creates complexity for PPR analysis. In this study, considering the interdependency effect among projects, the authors develop a quantitative evaluation model to analyze PPR based on a fuzzy Bayesian network.

Design/methodology/approach

The primary purpose of this paper is to comprehensively evaluate project portfolio risk, taking the interdependency effect into account, with a systematic model. Accordingly, a fuzzy Bayesian network (FBN) is developed based on the existing studies. Specifically, first, the risks in project portfolios are identified from the perspective of project interdependencies. Second, a fuzzy Bayesian network is adopted to model and quantify the interaction relationships among the risks. Finally, the model is implemented to analyze the occurrence and characteristics of the risks.
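
The FBN construction is not detailed in the abstract; one typical ingredient of such models, converting expert judgements expressed as triangular fuzzy numbers into crisp probabilities for a network node, might be sketched as follows (node name and numbers invented):

```python
# Hypothetical sketch: expert judgements given as triangular fuzzy numbers
# (low, mode, high) are defuzzified (centroid) and normalized into crisp
# probabilities that can populate a Bayesian-network node for a portfolio risk.
def centroid(low, mode, high):
    return (low + mode + high) / 3.0

# Fuzzy assessments of the states of a "financial liquidity" risk node.
fuzzy_states = {"weak": (0.5, 0.6, 0.8),
                "adequate": (0.2, 0.3, 0.4),
                "strong": (0.0, 0.1, 0.2)}
crisp = {state: centroid(*tfn) for state, tfn in fuzzy_states.items()}
total = sum(crisp.values())
probabilities = {state: value / total for state, value in crisp.items()}
print(probabilities)   # normalized crisp probabilities for the node
```

The resulting tables feed the same kind of Bayesian-network inference used in any discrete network; the fuzzy layer only changes how the probabilities are elicited.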

Findings

The interdependency effect can lead to high-stakes risks, including weak financial liquidity, a lack of cross-project members and project priority imbalance. Furthermore, project schedule risks and inconsistency between product supply and market demand are relatively sensitive and should also be prioritized. The validity of the risk evaluation model is also demonstrated.

Originality/value

The findings identify the most sensitive risks for safeguarding portfolio implementation and reveal that the interdependency effect can trigger some specific risks more often. This study is the first to propose measuring and analyzing project portfolio risk with a systematic model. It can help systematically assess and manage the complicated and interdependent risks associated with project portfolios.

Details

Engineering, Construction and Architectural Management, vol. 30 no. 2
Type: Research Article
ISSN: 0969-9988

Book part
Publication date: 13 May 2017

Zhuan Pei and Yi Shen

Abstract

Identification in a regression discontinuity (RD) design hinges on the discontinuity in the probability of treatment when a covariate (assignment variable) exceeds a known threshold. If the assignment variable is measured with error, however, the discontinuity in the relationship between the probability of treatment and the observed mismeasured assignment variable may disappear. Therefore, the presence of measurement error in the assignment variable poses a challenge to treatment effect identification. This chapter provides sufficient conditions to identify the RD treatment effect using the mismeasured assignment variable, the treatment status and the outcome variable. We prove identification separately for discrete and continuous assignment variables and study the properties of various estimation procedures. We illustrate the proposed methods in an empirical application, where we estimate Medicaid takeup and its crowdout effect on private health insurance coverage.
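
The chapter's identification conditions are not restated here; to fix ideas, a simulated sharp RD in which the observed assignment variable carries classical measurement error shows how the discontinuity at the observed cutoff is attenuated (hypothetical data, crude difference-in-local-means estimator):

```python
import numpy as np

rng = np.random.default_rng(3)
n = 50_000
x = rng.uniform(-1, 1, n)                  # true assignment variable
treat = (x >= 0).astype(float)             # sharp RD at cutoff 0
y = 1.0 * treat + 0.5 * x + rng.normal(0, 0.3, n)   # true effect tau = 1
x_obs = x + rng.normal(0, 0.3, n)          # mismeasured assignment variable

def rd_estimate(running, outcome, bw=0.2):
    """Crude RD estimate: difference in mean outcomes just above/below 0."""
    above = outcome[(running >= 0) & (running < bw)].mean()
    below = outcome[(running < 0) & (running > -bw)].mean()
    return above - below

print("RD estimate, true running variable:", rd_estimate(x, y))
print("RD estimate, mismeasured variable: ", rd_estimate(x_obs, y))
```

With the mismeasured running variable, units near the observed cutoff are partly misclassified relative to their true treatment status, so the estimated jump shrinks toward zero; this is the identification problem the chapter's conditions are meant to address.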

Details

Regression Discontinuity Designs
Type: Book
ISBN: 978-1-78714-390-6

Article
Publication date: 1 April 2001

PHILIPP J. SCHÖNBUCHER

Abstract

This article discusses factor models for portfolio credit risk. In these models, correlations between individual defaults are driven by a few systematic factors. Conditional on these factors, individual defaults are independent, which allows a greater degree of analytical tractability while retaining a realistic dependency structure.
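
The article's exact formulation is not given in the abstract; the standard one-factor Gaussian example of such a model, shown here purely as a textbook illustration, conditions obligor i's default probability on a systematic factor Z as

```latex
% One-factor Gaussian example: conditional on the systematic factor Z,
% obligor defaults are independent with conditional probability
p_i(Z) \;=\; \Pr\bigl(\text{default}_i \mid Z\bigr)
       \;=\; \Phi\!\left(\frac{\Phi^{-1}(p_i) - \sqrt{\rho}\,Z}{\sqrt{1-\rho}}\right)
```

where p_i is the unconditional default probability, rho the factor loading (asset correlation) and Phi the standard normal distribution function. Because defaults are conditionally independent, the portfolio loss distribution follows by combining these conditional probabilities and integrating over the distribution of Z.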

Details

The Journal of Risk Finance, vol. 3 no. 1
Type: Research Article
ISSN: 1526-5943

Article
Publication date: 1 January 1995

F. CRESTANI and C.J. VAN RIJSBERGEN

Abstract

The evaluation of an implication by Imaging is a logical technique developed in the framework of modal logic. Its interpretation in the context of a ‘possible worlds’ semantics is very appealing for information retrieval (IR). In 1989, Van Rijsbergen suggested its use for solving one of the fundamental problems of logical models of IR: the evaluation of the implication d → q (where d and q are, respectively, a document and a query representation). Since then, others have tried to follow that suggestion, proposing models and applications, though without much success. Most of these approaches took as their basic assumption that ‘a document is a possible world’. We propose instead an approach based on a completely different assumption: ‘a term is a possible world’. This approach enables the exploitation of term-term relationships, which are estimated using an information-theoretic measure.
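
The retrieval model itself is not reproduced here; the term-term relationship estimation the abstract alludes to can be illustrated with an expected-mutual-information computation over a binary term-document matrix (a simplified sketch with invented data, not necessarily the authors' exact estimator):

```python
import numpy as np

# Binary term-document matrix: rows are terms, columns are documents.
# The data are invented; in practice the co-occurrence statistics come
# from the collection being indexed.
td = np.array([[1, 1, 0, 1, 0],
               [1, 0, 0, 1, 0],
               [0, 1, 1, 0, 1]], dtype=float)

def emim(a, b):
    """Expected mutual information between two binary term-occurrence vectors."""
    score = 0.0
    for va in (0, 1):
        for vb in (0, 1):
            p_ab = np.mean((a == va) & (b == vb))
            p_a, p_b = np.mean(a == va), np.mean(b == vb)
            if p_ab > 0:
                score += p_ab * np.log(p_ab / (p_a * p_b))
    return score

sim = np.array([[emim(td[i], td[j]) for j in range(len(td))]
                for i in range(len(td))])
print(sim)   # larger values indicate more strongly related terms
```

A term-term relationship matrix of this kind is what allows probability to be transferred between ‘worlds’ (terms) when the implication is evaluated by imaging.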

Details

Journal of Documentation, vol. 51 no. 1
Type: Research Article
ISSN: 0022-0418
