Search results
Kanak Patel and Ricardo Pereira
Abstract
This chapter analyses the ability of some structural models to predict corporate bankruptcy. The study extends the existing empirical work on default risk in two ways. First, it estimates the expected default probabilities (EDPs) for a sample of bankrupt companies in the USA as a function of volatility, debt ratio, and other company variables. Second, it computes default correlations using a copula function and extracts common or latent factors that drive companies’ default correlations using a factor-analytical technique. Idiosyncratic risk is observed to change significantly prior to bankruptcy and its impact on EDPs is found to be more important than that of total volatility. Information-related tests corroborate the results of prediction-orientated tests reported by other studies in the literature; however, only a weak explanatory power is found in the widely used market-to-book assets and book-to-market equity ratio. The results indicate that common factors, which capture the overall state of the economy, explain default correlations quite well.
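The copula-plus-common-factor machinery this abstract describes can be illustrated with a minimal sketch (not the authors' code; the asset correlation `rho` and marginal default probability `pd` below are hypothetical): a one-factor Gaussian copula in which each firm's latent return loads on a common factor, with the pairwise default correlation estimated by Monte Carlo.

```python
import math
import random
from statistics import NormalDist

def default_correlation(rho, pd, n_sims=200_000, seed=1):
    """Pairwise default correlation implied by a one-factor Gaussian copula:
    Z_i = sqrt(rho)*F + sqrt(1-rho)*e_i, with default when Z_i falls below
    the threshold matching the marginal default probability `pd`."""
    rng = random.Random(seed)
    thresh = NormalDist().inv_cdf(pd)        # default threshold Phi^-1(pd)
    a, b = math.sqrt(rho), math.sqrt(1.0 - rho)
    joint = 0
    for _ in range(n_sims):
        f = rng.gauss(0, 1)                  # common (latent) factor
        d1 = a*f + b*rng.gauss(0, 1) < thresh
        d2 = a*f + b*rng.gauss(0, 1) < thresh
        joint += d1 and d2
    p12 = joint / n_sims                     # joint default probability
    # Correlation of the two Bernoulli default indicators
    return (p12 - pd*pd) / (pd*(1.0 - pd))

corr = default_correlation(rho=0.3, pd=0.05)
```

Raising `rho` pushes the default correlation toward the asset correlation; with `rho = 0` the indicators are independent and the estimate is near zero.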
Youwei He, Kuan Tan, Chunming Fu and Jinliang Luo
Abstract
Purpose
The modeling cost of the gradient-enhanced kriging (GEK) method is prohibitive for high-dimensional problems. This study aims to develop an efficient modeling strategy for the GEK method.
Design/methodology/approach
A two-step tuning strategy is proposed for the construction of the GEK model. First, an auxiliary kriging model is built efficiently. Then, the hyperparameter of this kriging model serves as a good initial guess for that of the GEK model, and a local search is used to further explore the hyperparameter space to guarantee the accuracy of the GEK model. In the construction of the auxiliary kriging, the maximal information coefficient is adopted to estimate the relative magnitudes of the hyperparameters, which transforms the high-dimensional maximum likelihood estimation problem into a one-dimensional optimization. The tuning problem of the auxiliary kriging thus becomes independent of the dimension, so the modeling efficiency can be improved significantly.
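The two-step idea — fix the relative magnitudes of the hyperparameters from a per-dimension dependence score, then tune a single scale factor — can be sketched as follows. This is an illustrative stand-in, not the authors' implementation: |Pearson r| replaces the maximal information coefficient, and leave-one-out error of a Nadaraya-Watson kernel regression replaces the kriging likelihood, but the dimension reduction of the tuning problem is the same.

```python
import math
import random

def pearson(a, b):
    n = len(a)
    ma, mb = sum(a)/n, sum(b)/n
    cov = sum((x - ma)*(y - mb) for x, y in zip(a, b))
    sa = math.sqrt(sum((x - ma)**2 for x in a))
    sb = math.sqrt(sum((y - mb)**2 for y in b))
    return cov/(sa*sb)

def loo_error(X, y, theta):
    """Leave-one-out squared error of Nadaraya-Watson regression with an
    anisotropic Gaussian kernel (a cheap stand-in for the kriging MLE)."""
    err = 0.0
    for i, xi in enumerate(X):
        num = den = 0.0
        for j, xj in enumerate(X):
            if i == j:
                continue
            w = math.exp(-sum(t*(p - q)**2 for t, p, q in zip(theta, xi, xj)))
            num += w*y[j]; den += w
        err += (y[i] - num/den)**2
    return err/len(X)

# Toy 6-D problem: the first input dominates, the second matters, the third
# is weak, and the remaining three are pure noise dimensions.
rng = random.Random(0)
X = [[rng.uniform(0, 1) for _ in range(6)] for _ in range(80)]
y = [3*x[0] + 2*x[1]**2 + 0.1*x[2] for x in X]

# Step 1: per-dimension dependence scores (|Pearson r| standing in for MIC)
scores = [abs(pearson([x[d] for x in X], y)) + 1e-6 for d in range(6)]

# Step 2: theta_d = c*score_d turns 6-D hyperparameter tuning into a
# one-dimensional search over the scale factor c
best_c = min((10.0**k for k in range(-1, 3)),
             key=lambda c: loo_error(X, y, [c*s for s in scores]))
theta0 = [best_c*s for s in scores]  # initial guess for local GEK refinement
```

The dependence scores recover the anisotropy (large weight on the dominant input, near-zero weight on the noise dimensions), so only the scalar `c` needs a global search.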
Findings
The performance of the proposed method is studied with analytic problems ranging from 10D to 50D and an 18D aerodynamic airfoil example. It is further compared with two efficient GEK modeling methods. The empirical experiments show that the proposed model can significantly improve the modeling efficiency without sacrificing accuracy compared with other efficient modeling methods.
Originality/value
This paper developed an efficient modeling strategy for GEK and demonstrated the effectiveness of the proposed method in modeling high-dimensional problems.
Abstract
Purpose
The implementation of credit risk models has largely relied either on the use of historical default dependence, as proxied by the correlation of equity returns, or on risk neutral equicorrelation, as extracted from CDOs. Contrary to both approaches, the purpose of this paper is to infer risk neutral dependence from CDS data, taking counterparty risk into consideration and avoiding equicorrelation. The impact of risk neutral correlation on the fees of some higher dimensional credit derivatives is also explored.
Design/methodology/approach
Copula functions are used in order to capture dependency. An application to market data is provided.
Findings
In both the FtD and CDO cases, using the (correct) risk neutral measure instead of equity dependency has the same effect as adopting a copula with tail dependency instead of a Gaussian one. This should be important for those who resort to copulas in credit derivative pricing.
Originality/value
To the best of the authors' knowledge, several attempts have been made to compare the behavior of different copulas in derivative pricing; however, no attempt has been made to extract risk neutral dependence without the equicorrelation assumption, and hence none to understand which copula features could proxy for risk neutrality whenever risk neutral dependency cannot be inferred (for instance, because the CDSs involving that name are not actively traded).
Xueguang Yu, Xintian Liu, Xu Wang and Xiaolan Wang
Abstract
Purpose
This study aims to propose an improved affine interval truncation algorithm to restrain interval extension for interval functions.
Design/methodology/approach
To reduce the number of occurrences of correlated variables in the interval function, a processing method for the interval operation sequence is proposed.
Findings
The interval variable is evenly divided into several subintervals based on a correlation analysis of the interval variables. The interval function value is then modified by the interval truncation method to restrain the overestimation of interval operation results.
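The dependency effect that such truncation and subdivision target is easy to reproduce. A minimal sketch (illustrative, not the authors' algorithm): the variable x appears twice in f(x) = x - x*x, so naive interval arithmetic overestimates the range, while evaluating on subintervals and taking the hull restrains the extension.

```python
def interval_sub(a, b):
    """[a] - [b] in interval arithmetic."""
    return (a[0] - b[1], a[1] - b[0])

def interval_mul(a, b):
    """[a] * [b]: the hull of the four endpoint products."""
    ps = [a[0]*b[0], a[0]*b[1], a[1]*b[0], a[1]*b[1]]
    return (min(ps), max(ps))

def f_interval(x):
    """Naive interval evaluation of f(x) = x - x*x; the variable x occurs
    twice, so the dependency effect inflates the result."""
    return interval_sub(x, interval_mul(x, x))

def f_subdivided(x, n):
    """Evaluate on n equal subintervals and take the hull of the results;
    fewer correlated occurrences per subinterval restrain the extension."""
    lo, hi = x
    h = (hi - lo)/n
    parts = [f_interval((lo + i*h, lo + (i + 1)*h)) for i in range(n)]
    return (min(p[0] for p in parts), max(p[1] for p in parts))

naive = f_interval((0.0, 1.0))        # heavily overestimated
tight = f_subdivided((0.0, 1.0), 50)  # close to the exact range [0, 0.25]
```

The exact range of f on [0, 1] is [0, 0.25]; the naive evaluation returns [-1, 1], while 50 subintervals already tighten it to roughly [-0.02, 0.27].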
Originality/value
Through several uncertain displacement response engineering examples, the effectiveness and applicability of the proposed algorithm are verified by comparison with the interval method and an optimization algorithm.
Chuanqi Liu, Qicheng Sun and Guohua Zhang
Abstract
Purpose
Granular materials possess multiscale structures, i.e. micro-scales involving atoms and molecules in a solid particle, meso-scales involving individual particles and their correlated structure, and the macroscopic assembly. Strong and abundant dissipation is exhibited due to the mesoscopic unsteady motion of individual grains and the evolution of underlying structures (e.g. force chains, vortices, etc.), which defines the key differences between granular materials and ordinary objects. The purpose of this paper is to introduce the major studies that have been conducted in the recent two decades.
Design/methodology/approach
The main properties at individual scale are introduced, including the coordination number, pair-correlation function, force and mean stress distribution functions, and the dynamic correlation function. The relationship between meso- and macro-scales is analyzed, such as between contact force and stress, the elastic modulus, and bulk friction in granular flows. At macroscales, conventional engineering models (i.e. elasto-plastic and hypo-plastic ones) are introduced. In particular, the so-called granular hydrodynamics theory, derived from thermodynamics principles, is explained.
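One of the simplest of these meso-scale descriptors, the coordination number, can be computed directly from particle positions. A minimal sketch for equal discs (illustrative only; the square lattice below is a hypothetical packing, not data from the paper):

```python
import math

def coordination_numbers(centers, radius, tol=1e-3):
    """Contacts per particle for equal discs: two discs are in contact when
    their centre distance is within `tol` of twice the radius."""
    z = [0]*len(centers)
    for i in range(len(centers)):
        for j in range(i + 1, len(centers)):
            if math.dist(centers[i], centers[j]) <= 2*radius + tol:
                z[i] += 1
                z[j] += 1
    return z

# 5 x 5 square lattice of touching unit discs: interior particles have z = 4,
# edge particles z = 3, corner particles z = 2
centers = [(2*i, 2*j) for i in range(5) for j in range(5)]
z = coordination_numbers(centers, radius=1.0)
mean_z = sum(z)/len(z)
```

The same pairwise-distance loop, binned by separation, yields the pair-correlation function mentioned above.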
Findings
On the basis of recent studies conducted by the authors, the multiple scales (both spatial and temporal) in granular materials are first explained, and a multiscale framework is presented for the mechanics of granular materials.
Originality/value
The paper provides a panoramic view of multiscale studies of granular materials.
Pingan Zhu, Chao Zhang and Jun Zou
Abstract
Purpose
The purpose of the work is to provide a comprehensive review of the digital image correlation (DIC) technique for those who are interested in performing the DIC technique in the area of manufacturing.
Design/methodology/approach
As a review article, the paper does not employ a separate methodology.
Findings
Not applicable: the paper is a review article.
Originality/value
Herein, the historical development, main strengths and measurement setup of DIC are introduced. Subsequently, the basic principles of the DIC technique are outlined in detail. The analysis of measurement accuracy associated with experimental factors and correlation algorithms is discussed and some useful recommendations for reducing measurement errors are also offered. Then, the utilization of DIC in different manufacturing fields (e.g. cutting, welding, forming and additive manufacturing) is summarized. Finally, the current challenges and prospects of DIC in intelligent manufacturing are discussed.
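The core of the correlation algorithms mentioned above is subset matching under a correlation criterion. A minimal integer-pixel sketch (illustrative only; practical DIC adds sub-pixel interpolation and subset shape functions) using zero-normalized cross-correlation (ZNCC) to recover a rigid shift of a synthetic speckle pattern:

```python
import random

def zncc(a, b):
    """Zero-normalized cross-correlation between two equally sized subsets."""
    n = len(a)
    ma, mb = sum(a)/n, sum(b)/n
    num = sum((x - ma)*(y - mb) for x, y in zip(a, b))
    da = sum((x - ma)**2 for x in a)**0.5
    db = sum((y - mb)**2 for y in b)**0.5
    return num/(da*db)

def subset(img, r, c, size):
    """Flatten the size x size subset with top-left corner (r, c)."""
    return [img[r + i][c + j] for i in range(size) for j in range(size)]

def track(ref, cur, r, c, size, search):
    """Integer-pixel DIC: slide the reference subset over the deformed image
    and return (best ZNCC score, row shift, column shift)."""
    best = (-2.0, 0, 0)
    target = subset(ref, r, c, size)
    for dr in range(-search, search + 1):
        for dc in range(-search, search + 1):
            rr, cc = r + dr, c + dc
            if 0 <= rr and rr + size <= len(cur) and 0 <= cc and cc + size <= len(cur[0]):
                s = zncc(target, subset(cur, rr, cc, size))
                if s > best[0]:
                    best = (s, dr, dc)
    return best

# Synthetic speckle image and a copy rigidly shifted by (2, 3) pixels
rng = random.Random(3)
ref = [[rng.random() for _ in range(40)] for _ in range(40)]
cur = [[ref[(i - 2) % 40][(j - 3) % 40] for j in range(40)] for i in range(40)]
score, dr, dc = track(ref, cur, r=10, c=10, size=9, search=5)
```

The maximum-ZNCC displacement recovers the imposed shift exactly; repeating this per subset over a grid yields the full-field displacement map.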
Abstract
Purpose
The purpose of this paper is to construct a canonical correlation analysis (CCA) model for the Zimbabwe Stock Exchange (ZSE), analysing the impact of macroeconomic variables on its stock returns.
Design/methodology/approach
Data for the independent (macroeconomic) variables and the dependent variables (stock returns) were extracted from secondary sources for the period from January 1990 to December 2008; 132 sets of data were collected for each variable. Eight top trading companies at the ZSE were selected, and their monthly stock returns were calculated from monthly stock prices. The independent variables include the consumer price index, money supply, treasury bills, exchange rate, unemployment, and the mining and industrial indices. CCA was then used to construct the model for the ZSE.
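The first canonical correlation behind such a model can be computed by alternating least squares. A minimal sketch on synthetic data (illustrative only: the two "macro" and two "return" series below are hypothetical stand-ins for variables such as CPI and money supply, not ZSE data), with 132 monthly observations as in the study:

```python
import math
import random

def cov_matrix(A, B):
    """Cross-covariance of column sets A (n x p) and B (n x q)."""
    n, p, q = len(A), len(A[0]), len(B[0])
    ma = [sum(r[j] for r in A)/n for j in range(p)]
    mb = [sum(r[j] for r in B)/n for j in range(q)]
    return [[sum((A[i][j] - ma[j])*(B[i][k] - mb[k]) for i in range(n))/n
             for k in range(q)] for j in range(p)]

def inv2(m):
    """Inverse of a 2 x 2 matrix."""
    (a, b), (c, d) = m
    det = a*d - b*c
    return [[d/det, -b/det], [-c/det, a/det]]

def matvec(m, v):
    return [sum(mij*vj for mij, vj in zip(row, v)) for row in m]

def corr(u, v):
    n = len(u)
    mu, mv = sum(u)/n, sum(v)/n
    num = sum((a - mu)*(b - mv) for a, b in zip(u, v))
    return num/math.sqrt(sum((a - mu)**2 for a in u)*sum((b - mv)**2 for b in v))

def first_canonical_corr(X, Y, iters=100):
    """First canonical correlation by alternating least squares:
    a <- Sxx^-1 Sxy b, then b <- Syy^-1 Syx a, iterated (power iteration on
    the CCA eigenproblem); two variables per set keeps the algebra 2 x 2."""
    Sxx, Syy = cov_matrix(X, X), cov_matrix(Y, Y)
    Sxy, Syx = cov_matrix(X, Y), cov_matrix(Y, X)
    iSxx, iSyy = inv2(Sxx), inv2(Syy)
    b = [1.0, 1.0]
    for _ in range(iters):
        a = matvec(iSxx, matvec(Sxy, b))
        na = math.sqrt(sum(x*x for x in a)); a = [x/na for x in a]
        b = matvec(iSyy, matvec(Syx, a))
        nb = math.sqrt(sum(x*x for x in b)); b = [x/nb for x in b]
    u = [sum(w*x for w, x in zip(a, row)) for row in X]
    v = [sum(w*y for w, y in zip(b, row)) for row in Y]
    return abs(corr(u, v)), a, b

# 132 months of synthetic data sharing one common driver
rng = random.Random(7)
common = [rng.gauss(0, 1) for _ in range(132)]
X = [[c + 0.3*rng.gauss(0, 1), -c + 0.3*rng.gauss(0, 1)] for c in common]
Y = [[0.8*c + 0.5*rng.gauss(0, 1), 0.2*rng.gauss(0, 1)] for c in common]
rho, a, b = first_canonical_corr(X, Y)
```

The weight vectors `a` and `b` identify which variables in each set drive the relationship, which is how the study reads off the influential macroeconomic variables.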
Findings
Maximization of stock returns at the ZSE is mostly influenced by the changes in consumer price index, money supply, exchange rate and treasury bills. The four macroeconomic variables greatly affect the movement of stock prices which, in turn, affect stock returns. The stock returns for Hwange, Barclays, Falcon, Ariston, Border, Caps and Bindura were significant in forming the CCA model.
Research limitations/implications
During the research period, some companies delisted due to economic hardships, and this reduced the sample size for stock returns for respective companies.
Practical implications
The results from this research can be used by policymakers, stock market regulators and the government to make informed decisions when crafting economic policies for the country. The CCA model enables the stakeholders to identify the macroeconomic variables that play a pivotal role in maximizing the strength of the relationship with stock returns.
Social implications
Macroeconomic variables, such as the consumer price index, inflation, etc., directly affect the livelihoods of the general populace. They also impact the performance of companies. Society can monitor economic trends and make the right decisions based on the current trends of economic performance.
Originality/value
This research opens a new dimension to the study of macroeconomic variables and stock returns. Most studies carried out so far in Zimbabwe zeroed in on multiple regression as the central methodology. No study has been done using the CCA as the main methodology.
Yash Daultani, Ashish Dwivedi, Saurabh Pratap and Akshay Sharma
Abstract
Purpose
Natural disasters cause serious operational risks and disruptions, which further impact the food supply in and around the disaster-impacted area. Resilient functions in the supply chain are required to absorb the impact of resultant disruptions in perishable food supply chains (FSC). The present study identifies specific resilient functions to overcome the problems created by natural disasters in the FSC context.
Design/methodology/approach
The quality function deployment (QFD) method is utilized to relate the identified problems to the resilient functions. Fuzzy term sets and the analytic hierarchy process (AHP) are used to prioritize the identified problems. The results are then employed to construct a house of quality (HOQ) matrix between the problems and the candidate functions, to which the technique for order of preference by similarity to ideal solution (TOPSIS) is applied.
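The TOPSIS step can be sketched as follows (illustrative only: the three resilient functions, the criteria, and the AHP-style weights below are hypothetical, not the study's data):

```python
import math

def topsis(matrix, weights, benefit):
    """TOPSIS: vector-normalize the decision matrix, weight it, locate the
    ideal and anti-ideal solutions, and score each alternative by relative
    closeness. `benefit[j]` is True when criterion j is to be maximized."""
    n = len(matrix[0])
    norms = [math.sqrt(sum(row[j]**2 for row in matrix)) for j in range(n)]
    V = [[weights[j]*row[j]/norms[j] for j in range(n)] for row in matrix]
    best = [max(v[j] for v in V) if benefit[j] else min(v[j] for v in V)
            for j in range(n)]
    worst = [min(v[j] for v in V) if benefit[j] else max(v[j] for v in V)
             for j in range(n)]
    scores = []
    for v in V:
        dp = math.sqrt(sum((v[j] - best[j])**2 for j in range(n)))
        dm = math.sqrt(sum((v[j] - worst[j])**2 for j in range(n)))
        scores.append(dm/(dp + dm))
    return scores

# Hypothetical resilient functions rated on effectiveness (benefit),
# cost (cost) and lead time (cost), with AHP-style weights
functions = ["information sharing", "backup suppliers", "buffer inventory"]
matrix = [[9, 3, 2],
          [6, 7, 5],
          [5, 8, 6]]
weights = [0.5, 0.3, 0.2]
benefit = [True, False, False]
scores = topsis(matrix, weights, benefit)
ranking = sorted(zip(functions, scores), key=lambda t: -t[1])
```

With these made-up numbers, information sharing scores highest, echoing the study's headline finding.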
Findings
The results from the study reflect that the shortage of employees in affected areas is the major problem caused by a natural disaster, followed by the food movement problem. The results from the analysis matrix conclude that information sharing should be kept at the highest priority by policymakers to build and increase resilient functions and sustainable crisis management in a perishable FSC network.
Originality/value
The study suggests practical implications for managing an FSC crisis during a natural disaster. Its unique contribution lies in finding the correlation and importance ranking among the different resilience functions, which is crucial for such crisis management.
Yong Liu, Jiang Zhang, Junjie Cui, Changsong Zheng, Yajun Liu and Jian Shen
Abstract
Purpose
In the integrated transmissions of armored vehicles, residual life prediction based on oil spectrum data is crucial for condition monitoring and reliability assessment. Exploiting the real-time and accurate prediction afforded by the binary Wiener process, this paper studies the residual life prediction of the clutch.
Design/methodology/approach
First, combined with the wet clutch life test, the indicator elements Cu and Pb and the failure threshold for the clutch's residual life prediction are extracted through oil-replacement correction of the full-life-cycle spectral data; second, the correlation characteristics of the indicator elements are analyzed with a MATLAB copula function, from which the correlation function of residual life is derived; third, according to the inverse Gaussian principle, performance degradation models based on unary and binary Wiener processes of the two indicator elements are established; finally, the maximum likelihood method is used to estimate the parameters, and the unary and binary degradation models are used to predict the residual life of the tested clutch.
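The unary building block — a drift Wiener degradation process whose first passage time to the failure threshold is inverse Gaussian — can be sketched as follows (illustrative only: the simulated path and its parameters are hypothetical, and the copula coupling of the Cu and Pb channels is omitted for brevity):

```python
import math
import random

def fit_wiener(times, values):
    """MLE for a drift Wiener degradation model X(t) = mu*t + sigma*B(t),
    estimated from the increments of one observed degradation path."""
    dt = [t2 - t1 for t1, t2 in zip(times, times[1:])]
    dx = [x2 - x1 for x1, x2 in zip(values, values[1:])]
    mu = sum(dx)/sum(dt)
    sigma2 = sum((x - mu*t)**2/t for x, t in zip(dx, dt))/len(dx)
    return mu, math.sqrt(sigma2)

def mean_residual_life(mu, level, threshold):
    """First passage of a positive-drift Wiener process to the failure
    threshold is inverse Gaussian with mean (threshold - level)/mu."""
    return (threshold - level)/mu

# Simulated indicator-element concentration path (hypothetical units)
rng = random.Random(5)
mu_true, sigma_true, dt = 0.5, 0.8, 1.0
times, values = [0.0], [0.0]
for k in range(1, 201):
    times.append(k*dt)
    values.append(values[-1] + mu_true*dt
                  + sigma_true*math.sqrt(dt)*rng.gauss(0, 1))
mu_hat, sigma_hat = fit_wiener(times, values)
rul = mean_residual_life(mu_hat, level=values[-1], threshold=values[-1] + 30)
```

The binary model in the paper couples two such processes through a copula so that correlated Cu and Pb degradation jointly inform the residual life estimate.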
Findings
Comparing the predictions with the test results over time, 81.25% of the predicted values from the residual life prediction method based on the binary Wiener process have errors within 20%, whereas only 56.25% of the predicted values from the method based on the unary Wiener process do. Moreover, the prediction accuracy of the binary model is 2%–16.7% higher than that of the unary model.
Originality/value
This paper studies the residual life prediction theory of the wet clutch, which helps develop the theory and methods of comprehensive transmission health monitoring and provides theoretical and technical support for building a reliable health management system for high-speed tracked vehicles.