Jonathan B. Dressler and Jeffrey R. Stokes
Abstract
Purpose
This paper aims to identify factors that affect agricultural mortgage default and prepayment.
Design/methodology/approach
Using a sample of Farm Credit System loans, prepayment and default are modeled as competing risks with potentially non-stationary covariates using a statistical/econometric technique called survival analysis (SA).
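As an illustrative sketch of the competing-risks machinery (not the paper's own code or data), the Aalen–Johansen-style estimator below computes the cumulative incidence of one termination cause, say default, in the presence of the other, prepayment; the event coding is an assumption for the example.

```python
import numpy as np

def cumulative_incidence(times, events, cause):
    """Nonparametric cumulative incidence for one cause among competing risks.
    events: 0 = censored, 1 = prepayment, 2 = default (illustrative coding)."""
    order = np.argsort(times)
    t, e = times[order], events[order]
    n = len(t)
    at_risk = n
    surv = 1.0   # overall survival just before the current event time
    cif = 0.0
    curve = []
    for i in range(n):
        if e[i] == cause:
            # increment by P(survive to t-) * cause-specific hazard at t
            cif += surv * (1.0 / at_risk)
        if e[i] != 0:
            # any termination (either cause) reduces overall survival
            surv *= 1.0 - 1.0 / at_risk
        at_risk -= 1
        curve.append((t[i], cif))
    return curve
```

Summing the two cause-specific incidence curves with the overall survival gives one at every event time, which is the accounting identity that makes the two risks "competing".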
Findings
The analysis suggests that the primary drivers of prepayment and default are the rate of interest charged by the lender at origination and the borrower's current ratio at origination. Tests of the existence of a geographic effect indicate that, despite bank management's belief to the contrary, branches may not be homogeneous.
Research limitations/implications
The analysis would be improved if more data were readily available to control for unobserved heterogeneity. Unobserved heterogeneity, or incomplete specification within a model, can be problematic: inferences among regression coefficients suffer because the estimates have inflated variances and unreliable test statistics. In addition, more frequent measures of the time-varying covariates could be obtained to improve upon the SA models presented. Future analyses could also incorporate other sections of the agricultural credit association portfolio, as well as a comparison to variable rate notes. Another logical next step would be to obtain loan collateral values in order to estimate the exposure at default and the loss given default, the estimates needed for the advanced internal ratings-based approach described in the Basel Accords.
Originality/value
This paper provides a method for lenders to measure and model mortgage termination, an important consideration for risk managers when determining capital adequacy described in the Basel Accords.
Details
Keywords
Patrick Mair, Horst Treiblmaier and Paul Benjamin Lowry
Abstract
Purpose
The purpose of this paper is to present competing risks models and show how dwell times can be applied to predict users’ online behavior. This information enables real-time personalization of web content.
Design/methodology/approach
This paper models transitions between pages based upon the dwell time of the initial state and then analyzes data from a web shop, illustrating how pages that are linked “compete” against each other. Relative risks for web page transitions are estimated based on the dwell time within a clickstream and survival analysis is used to predict clickstreams.
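A minimal sketch of the "competing pages" idea, with hypothetical data: bucket clickstream transitions by the dwell time on the source page and compute the share of transitions "won" by each linked destination.

```python
import numpy as np

def transition_risks_by_dwell(dwell, dest, bins):
    """Share of transitions to each linked page ('competing' destinations)
    within dwell-time buckets. dwell: seconds spent on the source page per
    transition; dest: destination label per transition; bins: bucket edges."""
    idx = np.digitize(dwell, bins)
    out = {}
    for b in np.unique(idx):
        d = dest[idx == b]
        labels, counts = np.unique(d, return_counts=True)
        out[int(b)] = dict(zip(labels.tolist(),
                               (counts / counts.sum()).tolist()))
    return out
```

Comparing these shares across buckets (or across subgroups such as buyers and non-buyers) shows how the relative risk of each transition shifts as dwell time grows.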
Findings
Using survival analysis and user dwell times allows for a detailed examination of transition behavior over time for different subgroups of internet users. Differences between buyers and non-buyers are shown.
Research limitations/implications
As opposed to other academic fields, survival analysis has only infrequently been used in internet-related research. This paper illustrates how a novel application of this method yields interesting insights into internet users’ online behavior.
Practical implications
A key goal of any online retailer is to increase its customer conversion rate. Using survival analysis, this paper shows how dwell-time information, which can be easily extracted from any server log file, can be used to predict user behavior in real time. Companies can apply this information to design websites that dynamically adjust to assumed user behavior.
Originality/value
The paper demonstrates a novel form of clickstream analysis not previously shown. Importantly, this can support the move of web analytics and "big data" from hype to reality.
Details
Keywords
Bruce L. Dixon, Bruce L. Ahrendsen, Brandon R. McFadden, Diana M. Danforth, Monica Foianini and Sandra J. Hamm
Abstract
Purpose
The purpose of this paper is to apply duration methods to a sample of Farm Service Agency (FSA) direct, seven-year operating loans to identify the variables that influence the time to loan termination and the type of termination. The variables include both those known at loan origination and those that characterize the changing economic environment over the life of the loan. The paper also examines the impact of various FSA programs promoting policy objectives.
Design/methodology/approach
A systematic sample of 877 seven‐year, FSA direct loans originated between October 1, 1993 and September 30, 1996 was collected. Cox regression, competing risks models are estimated as a function of borrower and loan characteristics observable at loan origination. Economic indicator variables emphasizing the farm economy and observed quarterly over the life of the loan are also included as explanatory variables.
Findings
Loan characteristics, borrower financial characteristics and the degree of borrower interaction with FSA observable at origination are significant variables in determining the type of loan outcome (default or paid-in-full) and the time to outcome. Changes in the economic environment and farm economy during the life of the loan are also significant.
Research limitations/implications
The sample consists only of FSA direct loans, which implies the borrowers are at the financial margin. Applying the method to agricultural loans from conventional commercial lenders could identify different significant factors.
Practical implications
Using length of time to loan termination instead of just type of outcome provides for a richer analysis of loan performance. Loan performance over time is influenced by the larger economy and should be incorporated into loan performance modeling.
Originality/value
The study described in the paper demonstrates the use of competing risks models on intermediate agricultural loans and shows how this technique can be used to learn about dynamic aspects of loan performance. The sample consists of observations on individual FSA direct loan borrowers. The FSA direct loan program is the major source of credit for agricultural borrowers at the financial margin.
Details
Keywords
Xunfa Lu, Kang Sheng and Zhengjun Zhang
Abstract
Purpose
This paper aims to better jointly estimate Value at Risk (VaR) and expected shortfall (ES) by using the joint regression combined forecasting (JRCF) model.
Design/methodology/approach
Combining different forecasting models in financial risk measurement can improve prediction accuracy by integrating the individual models' information. This paper applies the JRCF model to measure VaR and ES at the 5%, 2.5% and 1% probability levels in the Chinese stock market. While ES is not elicitable on its own, VaR and ES are jointly elicitable via joint consistent scoring functions, which further refines the backtesting of ES. In addition, a variety of backtesting and evaluation methods are used to analyze and compare the alternative risk measurement models.
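The joint elicitability property rests on strictly consistent joint scoring functions for the pair (VaR, ES), such as the Fissler–Ziegel FZ0 loss; the sketch below is a general illustration of that family, not necessarily the paper's specific scoring function, with VaR and ES written as negative return quantities so that es < 0.

```python
import numpy as np

def fz0_loss(returns, var, es, alpha):
    """Fissler-Ziegel FZ0 joint scoring function for (VaR, ES) at level alpha.
    var: alpha-quantile of returns; es: tail mean below var; es < 0 required.
    Its expectation is minimized at the true (VaR, ES) pair."""
    hit = (returns <= var).astype(float)
    return (-hit * (var - returns) / (alpha * es)
            + var / es + np.log(-es) - 1.0)
```

Averaging the loss over a return sample and comparing competing forecasters, the lower mean score indicates the better joint (VaR, ES) forecast, which is the basis of such comparisons.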
Findings
The empirical results show that the JRCF model outperforms the competing models. Based on the evaluation results of the joint scoring functions, the proposed model obtains the minimum scoring function value compared to the individual forecasting models and the average combined forecasting model overall. Moreover, Murphy diagrams’ results further reveal that this model has consistent comparative advantages among all considered models.
Originality/value
The JRCF model of risk measures is proposed, and the application of the joint scoring functions of VaR and ES is expanded. Additionally, this paper comprehensively backtests and evaluates the competing risk models and examines the characteristics of Chinese financial market risks.
Details
Keywords
This study updates the literature review of Jones (1987) published in this journal, paying particular attention to two important themes that have shaped the field over the past 35 years.
Abstract
Purpose
This study updates the literature review of Jones (1987) published in this journal. The study pays particular attention to two important themes that have shaped the field over the past 35 years: (1) the development of a range of innovative new statistical learning methods, particularly advanced machine learning methods such as stochastic gradient boosting, adaptive boosting, random forests and deep learning, and (2) the emergence of a wide variety of bankruptcy predictor variables extending beyond traditional financial ratios, including market-based variables, earnings management proxies, auditor going concern opinions (GCOs) and corporate governance attributes. Several directions for future research are discussed.
Design/methodology/approach
This study provides a systematic review of the corporate failure literature over the past 35 years, with a particular focus on the emergence of new statistical learning methodologies and predictor variables. The synthesis evaluates the strengths and limitations of different modelling approaches under different circumstances and provides an overall evaluation of the relative contribution of alternative predictor variables. The study aims to provide a transparent, reproducible and interpretable review of the literature, taking a theme-centric rather than author-centric approach and focusing on the structured themes that have dominated the literature since 1987.
Findings
There are several major findings. First, advanced machine learning methods appear to have the most promise for future firm failure research: not only do they predict significantly better than conventional models, but they also possess many appealing statistical properties. Second, a much wider range of variables is now being used to model and predict firm failure, although the literature needs to be interpreted with some caution given the many mixed findings. Finally, a number of unresolved methodological issues arising from the Jones (1987) study still require research attention.
Originality/value
The study explains the connections and derivations between a wide range of firm failure models, from simpler linear models to advanced machine learning methods such as gradient boosting, random forests, adaptive boosting and deep learning. The paper highlights the most promising models for future research, particularly in terms of their predictive power, underlying statistical properties and issues of practical implementation. The study also draws together an extensive literature on alternative predictor variables and provides insights into the role and behaviour of alternative predictor variables in firm failure research.
Details
Keywords
Rafa Madariaga, Ramon Oller and Joan Carles Martori
Abstract
Purpose
The purpose of this paper is to assess the capacity of two methodological approaches – discrete choice and survival analysis models – to investigate the relationship between socio-economic characteristics and turnover in a retailing company. A comparison of the estimation results under each model and their interpretation is carried out. The study provides a guide to determine, assess and interpret the effects of different driving factors behind turnover.
Design/methodology/approach
The authors use a data set containing information about 1,199 workers followed up between January 2007 and December 2009. First, not distinguishing voluntary and involuntary resignation, a binary logistic regression model and a Cox proportional hazards (PH) model for univariate survival data are set up and estimated. Second, distinguishing voluntary and involuntary resignation, a multinomial logistic regression model and a Cox PH model for competing risk data are set up and estimated.
Findings
When no distinction is made, the results indicate that wage and age exert a negative effect on turnover. The risk of resignation is higher for male, single (not married) and Spanish national workers. When the distinction is made, the previous results hold for voluntary turnover: wage, age, gender, marital status and nationality are significant. However, when explaining involuntary turnover, all variables except wage lose explanatory power. The survival analysis approach is better suited, as it measures the risk of resignation in a longitudinal way; discrete choice models only study the risk at a particular cut-off point (24 months in the case of this study).
Originality/value
This paper is a systematic application, evaluation and comparison of four different statistical models for analysing employee turnover in a single firm. This work is original because no systematic comparison has been done in the context of turnover.
Details
Keywords
Vítor Castro and Rodrigo Martins
Abstract
Purpose
This paper analyses the collapse of credit booms into soft landings or systemic banking crises.
Design/methodology/approach
A discrete-time competing risks duration model is employed to disentangle the factors behind the length of benign and harmful credit booms.
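A discrete-time competing risks duration model is typically estimated after expanding each spell into one record per period at risk, with a multinomial outcome (continue, soft landing, crisis). A minimal sketch of that expansion and of the resulting empirical hazards, with hypothetical event coding:

```python
import numpy as np

def expand_spells(durations, outcomes):
    """Expand credit-boom spells into person-period format for a
    discrete-time competing risks model.
    outcomes: 0 = censored, 1 = soft landing, 2 = banking crisis."""
    rows = []
    for i, (d, o) in enumerate(zip(durations, outcomes)):
        for t in range(1, d + 1):
            event = o if t == d else 0   # event only in the final period
            rows.append((i, t, event))
    return np.array(rows)

def discrete_hazard(expanded, cause):
    """Empirical cause-specific hazard at each duration t: events / at-risk."""
    haz = {}
    for t in np.unique(expanded[:, 1]):
        at_t = expanded[expanded[:, 1] == t]
        haz[int(t)] = (at_t[:, 2] == cause).mean()
    return haz
```

A multinomial logit of the event column on duration terms and covariates then yields the model's cause-specific hazards; empirical hazards that rise in t indicate positive duration dependence.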
Findings
The results show that economic growth and monetary authorities play the major role in explaining the differences in the length and outcome of credit booms. Moreover, both types of credit expansions display positive duration dependence, i.e. both are more likely to end as they grow older, but hard landing credit booms have proven to be longer than those that land softly.
Originality/value
This paper contributes to our understanding of what affects the length of credit booms and why some end up creating havoc while others do not. In particular, it calls attention to the important role that central bank independence plays in the length and outcome of credit booms.
Details
Keywords
Donald E. Hutto, Thomas Mazzuchi and Shahram Sarkani
Abstract
Purpose
The purpose of this paper is to provide maintenance personnel with a methodology for using masked field reliability data to determine the probability of each subassembly failure.
Design/methodology/approach
The paper compares an iterative maximum likelihood estimation method and a Bayesian methodology for handling masked data collected from 227 identical radar power supplies. The power supply consists of several subassemblies hereafter referred to as shop replaceable assemblies (SRAs).
Findings
The study examined two approaches for dealing with masking: an iterative maximum likelihood estimation procedure (IMLEP) and a Bayesian approach implemented with the application WinBUGS. The performances of IMLEP and WinBUGS in estimating the parameters of the SRA distributions are similar both with and without masking. However, the study indicates that WinBUGS may perform better than IMLEP when the competing risk responsible for a failure represents a smaller percentage of the total failures. Further study, expanding the number of SRAs into which the item under study is organized, is required to confirm this conclusion.
Research limitations/implications
If an item is composed of various subassemblies and the failure of the first subassembly causes the item to fail, the item is referred to in the literature as a series system. If the subassembly failures are statistically independent, the item can be represented by a competing risk model and the probability distributions of the subassemblies can be ascertained from the item's failure data. When the item's cause of failure is not known, the data are referred to in the literature as masked. Since competing risk theory requires both a cause of failure and a time of failure, masked data must be addressed in the competing risk model.
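In this spirit, masked causes can be handled with an EM-style iteration; the following is a simplified sketch of the iterative-MLE idea for two exponential causes, not the paper's IMLEP implementation.

```python
import numpy as np

def masked_exp_em(times, causes, n_iter=200):
    """EM-style iterative MLE for a two-cause exponential competing risks
    model with masked causes (simplified sketch of the IMLEP idea).
    causes: 0 or 1 when the failing SRA is known, -1 when masked."""
    rates = np.array([1.0, 1.0])
    total_time = times.sum()
    known0 = (causes == 0).sum()
    known1 = (causes == 1).sum()
    masked = (causes == -1).sum()
    for _ in range(n_iter):
        # E-step: split masked failures in proportion to current hazard rates
        p0 = rates[0] / rates.sum()
        n0 = known0 + masked * p0
        n1 = known1 + masked * (1 - p0)
        # M-step: exponential MLE is (expected event count) / total exposure
        rates = np.array([n0, n1]) / total_time
    return rates
```

The total hazard always equals total failures over total exposure; the iteration only learns how to apportion the masked failures between the causes.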
Practical implications
This study indicates that competing risk theory can be applied to the equipment field failure data to determine a SRA's probability of failure and thereby provide an efficient sequence of replacing suspect failed SRAs.
Originality/value
The analysis of masked failure data is an important area that has received only limited study in the literature because of the limited availability of failure data. This paper contributes to the research by providing complete historical equipment usage data for the item under study, gathered over a time frame of approximately seven years.
Details
Keywords
F.A.M. Elfaki, I. Bin Daud, N.A. Ibrahim, M.Y. Abdullah and M. Usman
Abstract
Purpose
Cox's model with the Weibull distribution and Cox's model with the exponential distribution are the most important models in reliability analysis. This paper seeks to show that, with a large sample size, both models give similar results when estimated with the expectation maximization (EM) algorithm.
Design/methodology/approach
The parameters of the models are estimated by the method of maximum likelihood based on the EM algorithm. The objective of the analysis is to fit the modification of Cox's model with the Weibull distribution and with the exponential distribution, examine its performance, and compare the results with those of Crowder et al.
Findings
A simulation study indicates that the parametric Cox's model with the Weibull distribution gives results similar to Cox's model with the exponential distribution, especially for a large sample size. Also, the modification of the two models showed better results than Crowder et al., especially for the second cause of failure.
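The similarity is unsurprising when the fitted Weibull shape is near one, since the Weibull hazard then collapses to the constant exponential hazard; a one-line illustration with hypothetical parameter values:

```python
import numpy as np

def weibull_hazard(t, shape, scale):
    """Weibull hazard h(t) = (k/lam) * (t/lam)**(k-1). With shape k = 1 the
    hazard is the constant 1/scale, i.e. the exponential special case."""
    return (shape / scale) * (t / scale) ** (shape - 1.0)
```

With shape = 1 the hazard equals 1/scale at every t, which is exactly the exponential model; shape values above or below 1 give increasing or decreasing hazards.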
Originality/value
A modification of the two competing risk models has mostly been applied to failure-time data and simulated data. The results of the simulation study indicate that the Weibull and exponential distributions are suitable for Cox's model, as they are easy to use and can achieve even higher accuracy than other distribution models.
Details
Keywords
Preeti Wanti Srivastava and Tanu Gupta
Abstract
Purpose
Accelerated life tests are undertaken to induce early failure in high-reliability products likely to last for several years. Most of these products are exposed to several fatal risk factors and fail due to one of them; an example is a solar lighting device with two failure modes, capacitor failure and controller failure. Each risk factor must be assessed in the presence of the others, as none can be studied in isolation. The purpose of this paper is to explore the formulation of an optimum time-censored accelerated life test model under modified ramp-stress loading when the different failure causes have independent exponential life distributions.
Design/methodology/approach
The modified ramp-stress loading uses one test chamber in place of the several chambers used in a normal ramp-stress accelerated life test, thus saving experimental cost. The stress-life relationship is modeled by the inverse power law, and for each failure cause a cumulative exposure model is assumed. The method of maximum likelihood is used to estimate the design parameters. The optimal plan consists of finding the relevant experimental variables, namely the stress rate and the stress rate change point(s).
Findings
The optimal plan is devised using the D-optimality criterion, which consists of finding the optimal stress rate and optimal stress rate change point by maximizing the base-10 logarithm of the determinant of the Fisher information matrix. This criterion is motivated by the fact that the volume of the joint confidence region of the model parameters is inversely proportional to the square root of the determinant of the Fisher information matrix. The results of a sensitivity analysis show that the plan is robust to small deviations from the true values of the baseline parameters.
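The criterion itself is simple to compute once the Fisher information matrix is available; a sketch (the matrices below are placeholders, not derived from the paper's ramp-stress model):

```python
import numpy as np

def d_optimality(fisher):
    """D-optimality objective: base-10 log of the determinant of the Fisher
    information matrix. Larger values mean a smaller joint confidence region
    for the model parameters."""
    sign, logdet = np.linalg.slogdet(fisher)
    assert sign > 0, "Fisher information must be positive definite"
    return logdet / np.log(10.0)

# An optimal plan would evaluate d_optimality over a grid of candidate
# stress rates and change points and pick the maximizer.
```

Using `slogdet` rather than `det` avoids overflow or underflow when the information matrix has entries of very different magnitudes.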
Originality/value
The model formulated can help reliability engineers obtain reliability estimates quickly of high-reliability products that are likely to last for several years.
Details