Search results
1–10 of over 13,000
Peter Wanke, Sahar Ostovan, Mohammad Reza Mozaffari, Javad Gerami and Yong Tan
Abstract
Purpose
This paper aims to present two-stage network models in the presence of stochastic ratio data.
Design/methodology/approach
Black-box, free-link and fix-link techniques are used to model the internal relations of the two-stage network. A deterministic linear programming model is derived from a stochastic two-stage network data envelopment analysis (DEA) model by assuming that some basic stochastic elements are related to the inputs, outputs and intermediate products. The linkages between the overall process and the two subprocesses are proposed. The authors obtain the relation between the efficiency scores obtained from the stochastic two-stage network DEA-ratio model under three different strategies: black-box, free-link and fix-link. The proposed approach is applied to 11 airlines in Iran.
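The classical input-oriented CCR model underlying such DEA approaches can be illustrated with a minimal sketch; this is not the authors' stochastic two-stage formulation, just the basic multiplier-form LP solved once per DMU, with hypothetical data and SciPy's `linprog`.

```python
import numpy as np
from scipy.optimize import linprog

def ccr_efficiency(X, Y, o):
    """Input-oriented CCR efficiency of DMU o (multiplier form).
    X: (n, m) inputs, Y: (n, s) outputs. Maximise u.y_o s.t. v.x_o = 1,
    u.y_j - v.x_j <= 0 for all j, u, v >= 0."""
    n, m = X.shape
    s = Y.shape[1]
    c = np.concatenate([-Y[o], np.zeros(m)])             # minimise -(u . y_o)
    A_ub = np.hstack([Y, -X])                            # u.y_j - v.x_j <= 0
    b_ub = np.zeros(n)
    A_eq = np.concatenate([np.zeros(s), X[o]])[None, :]  # v . x_o = 1
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=[1.0],
                  bounds=(0, None), method="highs")
    return -res.fun

# toy data: 4 DMUs, 2 inputs, 1 unit output
X = np.array([[2.0, 3.0], [4.0, 2.0], [4.0, 6.0], [5.0, 5.0]])
Y = np.array([[1.0], [1.0], [1.0], [1.0]])
scores = [round(ccr_efficiency(X, Y, o), 3) for o in range(4)]
print(scores)
```

The first two DMUs lie on the frontier and score 1; the dominated DMUs score below 1. A two-stage network extension would add intermediate products linking two such programs.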
Findings
In most of the scenarios, when alpha takes any value between 0.1 and 0.4, the three models (Charnes, Cooper and Rhodes (1978), free-link and fix-link) generate similar efficiency scores for the decision-making units (DMUs), while a relatively higher degree of variation in efficiency scores among the DMUs is generated when alpha takes the value of 0.5. When alpha takes values between 0.1 and 0.4, the DMUs also have the same ranking in terms of their efficiency scores.
Originality/value
The authors innovatively propose a deterministic linear programming model and, to the best of the authors' knowledge, for the first time the internal relationships of a two-stage network are analyzed by different techniques. The comparison of the results provides insights from both the policy and the methodological perspectives.
Mohammad Tavassoli, Amirali Fathi and Reza Farzipoor Saen
Abstract
Purpose
The purpose of this study is to propose a novel super-efficiency DEA model to appraise the relative efficiency of DMUs with zero data and stochastic data. Our model can work with both variable returns to scale (VRS) and constant returns to scale (CRS).
Design/methodology/approach
This study proposes a new stochastic super-efficiency DEA (SSDEA) model to assess the performance of airlines with stochastic and zero inputs and outputs.
Findings
This paper proposes a new analysis and contribution to the knowledge of efficiency assessment with a stochastic super-efficiency DEA model by (1) using an input-saving and output-surplus index for efficient DMUs to obtain the optimal solution and (2) obtaining efficiency scores from the proposed model that are equivalent to those of the original stochastic super-efficiency model when feasible solutions exist. A case study is given to illustrate the applicability of the proposed model. Also, the reasons for poor performance are identified to improve the performance of inefficient airlines.
Originality/value
For the first time, a new SSDEA model for ranking DMUs is proposed. The introduced model produces a feasible solution when dealing with zero input or output. This paper applies the input saving and output surplus concept to rectify the infeasibility problem in the stochastic DEA model.
Baixi Chen, Weining Mao, Yangsheng Lin, Wenqian Ma and Nan Hu
Abstract
Purpose
Fused deposition modeling (FDM) is an extensively used additive manufacturing method with the capacity to build complex functional components. Due to the machinery and environmental factors during manufacturing, the FDM parts inevitably demonstrated uncertainty in properties and performance. This study aims to identify the stochastic constitutive behaviors of FDM-fabricated polylactic acid (PLA) tensile specimens induced by the manufacturing process.
Design/methodology/approach
By conducting the tensile test, the effects of the printing machine selection and three major manufacturing parameters (i.e., printing speed S, nozzle temperature T and layer thickness t) on the stochastic constitutive behaviors were investigated. The influence of the loading rate was also explained. In addition, the data-driven models were established to quantify and optimize the uncertain mechanical behaviors of FDM-based tensile specimens under various printing parameters.
Findings
As indicated by the results, the uncertain behaviors of the stiffness and strength of the PLA tensile specimens were dominated by the printing speed and nozzle temperature, respectively. The manufacturing-induced stochastic constitutive behaviors could be accurately captured by the developed data-driven model, with an R² over 0.98 on the testing dataset. The optimal parameters obtained from the data-driven framework were T = 231.3595 °C, S = 40.3179 mm/min and t = 0.2343 mm, which were in good agreement with the experiments.
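The data-driven step can be sketched as follows, on synthetic data rather than the authors' measurements: a quadratic response surface fitted by least squares to stiffness observations over the three printing parameters, with R² reported on the fit.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 200

# hypothetical printing parameters: speed S (mm/min), temperature T (degC),
# layer thickness t (mm)
S = rng.uniform(20, 80, n)
T = rng.uniform(190, 240, n)
t = rng.uniform(0.1, 0.3, n)

# synthetic stiffness (GPa) with quadratic dependence plus manufacturing noise
E = (3.0 - 0.0004 * (S - 40) ** 2 - 0.001 * (T - 230) ** 2
     - 2.0 * (t - 0.23) ** 2 + rng.normal(0, 0.02, n))

# quadratic response-surface fit via ordinary least squares
X = np.column_stack([np.ones(n), S, T, t, S**2, T**2, t**2])
coef, *_ = np.linalg.lstsq(X, E, rcond=None)

pred = X @ coef
r2 = 1 - np.sum((E - pred) ** 2) / np.sum((E - E.mean()) ** 2)
print(f"R^2 = {r2:.3f}")
```

Maximising the fitted surface (e.g. on a grid) then yields optimal parameter settings, analogous to the optimisation the authors perform with their model.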
Practical implications
The developed data-driven models can also be integrated into the design and characterization of parts fabricated by extrusion and other additive manufacturing technologies.
Originality/value
Stochastic behaviors of additively manufactured products were revealed by considering extensive manufacturing factors. Data-driven models were proposed to facilitate the description and optimization of FDM products and to control their quality.
Aishee Aich and Mihir Kumar Pal
Abstract
The policy of globalization was a mixed bag for India, bringing both benefits and losses. Increased foreign trade, foreign exchange reserves and market expansion were contrasted with a fall in domestic industries, unemployment and an increase in inequality. The present study analyzes the presence of convergence or divergence of incomes across the states of India using the concepts of sigma convergence, beta convergence and stochastic convergence for the post-reform period of 1993–1994 to 2014–2015. The study tests for absolute β-convergence by using trend line analysis: regression of the CAGR (compound annual growth rate) as a function of the average PCSDP (per capita state domestic product) of the initial three years of the observed period, and regression of the point-to-point growth rate of per capita income on the growth rate of the initial three years. A negative relationship implies the presence of convergence. Further, the study uses panel unit root tests and relevant dynamic processes to test for conditional β- and stochastic convergence. It reveals evidence of divergence in income across the states.
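The absolute β-convergence regression can be sketched as follows, on hypothetical state income data rather than actual PCSDP figures: average growth is regressed on log initial income, and a negative slope indicates convergence (poorer states growing faster).

```python
import numpy as np

rng = np.random.default_rng(0)

# hypothetical initial per-capita incomes for 15 states
y0 = rng.uniform(10_000, 60_000, size=15)

# simulate beta-convergence: growth declines with log initial income
growth = 0.12 - 0.01 * np.log(y0) + rng.normal(0, 0.002, size=15)

# absolute beta-convergence test: OLS of growth on log initial income
A = np.column_stack([np.ones_like(y0), np.log(y0)])
beta = np.linalg.lstsq(A, growth, rcond=None)[0]

print(f"slope = {beta[1]:.4f}")  # negative slope => convergence
```

The divergence finding reported in the study corresponds to a positive (or insignificant) slope in this regression on the real data.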
Weak separability is an important concept in many fields of economic theory. This chapter uses Monte Carlo experiments to investigate the performance of newly developed…
Abstract
Weak separability is an important concept in many fields of economic theory. This chapter uses Monte Carlo experiments to investigate the performance of newly developed nonparametric revealed preference tests for weak separability. A main finding is that the bias of the sequentially implemented test for weak separability proposed by Fleissig and Whitney (2003) is low. The theoretically unbiased Swofford and Whitney (1994) test is found to perform better than all sequentially implemented test procedures but suffers from an empirical bias, most likely because of the complexity of executing the test procedure. As a further source of information, we also perform sensitivity analyses on the nonparametric revealed preference tests. The Fleissig and Whitney test is found to be sensitive to measurement errors in the data.
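The revealed preference machinery behind such tests can be sketched with a basic GARP check (the weak separability tests themselves are more involved): compute the cost of every observed bundle at every price vector, take the transitive closure of the direct revealed preference relation, and scan for strict violations.

```python
import numpy as np

def satisfies_garp(P, Q):
    """Check the Generalized Axiom of Revealed Preference for observed choices.
    P, Q: (T, n) arrays; row t holds the prices and the bundle chosen at t."""
    E = P @ Q.T                         # E[i, j] = cost of bundle j at prices i
    R = E.diagonal()[:, None] >= E      # direct revealed preference: q_i R0 q_j
    for k in range(len(P)):             # Warshall transitive closure
        R = R | (R[:, [k]] & R[[k], :])
    strict = E.diagonal()[:, None] > E  # q_i strictly directly preferred to q_j
    return not np.any(R & strict.T)     # GARP: no q_i R q_j with q_j P0 q_i

P = np.array([[1.0, 2.0], [2.0, 1.0]])
consistent = np.array([[3.0, 1.0], [1.0, 3.0]])  # choices consistent with GARP
violating = np.array([[2.0, 2.0], [3.0, 0.0]])   # choices that form a strict cycle
print(satisfies_garp(P, consistent), satisfies_garp(P, violating))
```

Sequential weak separability tests such as Fleissig and Whitney's build on checks of this kind applied to candidate sub-utility groupings.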
Paul Dawson, Hai Lin and Yangshu Liu
Abstract
Purpose
Longevity risk, that is, the uncertainty of the demographic survival rate, is an important risk for insurance companies and pension funds, which have large and long-term exposures to survivorship. The purpose of this paper is to propose a new model to describe this demographic survival risk.
Design/methodology/approach
The model proposed in this paper satisfies all the desired properties of a survival rate and has an explicit distribution for both single years and accumulative years.
Findings
The results show that it is important to consider the expected shift and risk premium of life table uncertainty and the stochastic behaviour of survival rates when pricing the survivor derivatives.
Originality/value
This model can be applied to the rapidly growing market for survivor derivatives.
Abstract
Purpose
A widely accepted belief indicates that terror activities have a negative impact on stock markets. Contrary to numerous empirical studies, the purpose of this paper is to consider this issue from another point of view, in the sense that markets can become desensitized to terror.
Design/methodology/approach
Here, instead of directly analyzing the existing data, the stochastic nature of the events is taken into consideration.
Findings
The author compares three countries and finds that the correlation between terror and stock markets is almost nil once terror events become commonplace.
Originality/value
This paper applies mean-reverting stochastic processes to terror incidents and brings out interesting results.
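A mean-reverting (Ornstein-Uhlenbeck) process of the kind the paper refers to can be simulated with an Euler-Maruyama scheme; the parameters below are hypothetical and not calibrated to any terror data.

```python
import numpy as np

rng = np.random.default_rng(42)

# Euler-Maruyama simulation of dX = theta * (mu - X) dt + sigma dW,
# a stylised mean-reverting model of incident intensity
theta, mu, sigma = 0.8, 5.0, 0.5    # hypothetical reversion speed, mean, volatility
dt, n_steps = 0.01, 10_000

x = np.empty(n_steps)
x[0] = 20.0                         # start far above the long-run mean
for s in range(1, n_steps):
    dw = rng.normal(0.0, np.sqrt(dt))
    x[s] = x[s-1] + theta * (mu - x[s-1]) * dt + sigma * dw

print(f"long-run average = {x[-2000:].mean():.2f}")  # reverts toward mu
```

Mean reversion is what makes "commonplace" terror plausible in this framework: shocks decay back toward a long-run level rather than accumulating.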
Jean-Joseph Minviel, Yawose Kudawoo and Faten Ben Bouheni
Abstract
Purpose
Recent advances in stochastic frontier analysis (SFA) suggest two alternative approaches to account for unobserved heterogeneity and to distinguish between persistent and transient inefficiency. The first approach is the generalized true random effects (GTRE) model, and the second approach is an autoregressive inefficiency (ARI) model. This study compares them to highlight whether they capture similar inefficiency aspects.
Design/methodology/approach
Using recent methodological advances in SFA, the authors estimate the GTRE and the ARI models using a Monte Carlo experiment and two real datasets from two industries (banking and agriculture).
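The GTRE composed-error structure can be illustrated with a toy simulation (estimation itself requires specialized routines): a time-invariant random firm effect and persistent half-normal inefficiency, plus time-varying noise and transient half-normal inefficiency. All parameter values here are assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(7)
n_firms, n_years = 200, 10

# four-component GTRE error: firm effect + noise - persistent u - transient u
firm_effect = rng.normal(0, 0.2, n_firms)                     # time-invariant heterogeneity
persistent_u = np.abs(rng.normal(0, 0.3, n_firms))            # half-normal, time-invariant
noise = rng.normal(0, 0.1, (n_firms, n_years))                # idiosyncratic noise
transient_u = np.abs(rng.normal(0, 0.2, (n_firms, n_years)))  # half-normal, time-varying

# a simple production frontier y = b0 + b1 * x with the composed error attached
x = rng.uniform(1, 5, (n_firms, n_years))
y = 1.0 + 0.5 * x + firm_effect[:, None] + noise - persistent_u[:, None] - transient_u

# overall inefficiency is the sum of the persistent and transient components
overall_u = persistent_u[:, None] + transient_u
print(f"mean overall inefficiency = {overall_u.mean():.3f}")
```

The comparison in the paper turns on how well different estimators recover and separate these two inefficiency components from data like `y`.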
Findings
The authors find that the two models provide quite different results in terms of inefficiency persistence and overall inefficiency (combination of transient and persistent inefficiency), regardless of the dataset considered.
Practical implications
The study findings suggest that researchers should be careful when referring to these two models because they do not capture the same inefficiency aspects, even though they have the same conceptual basis. This work is a warning about the empirical aspects of the persistent and transient efficiency framework, helping researchers convey a consistent story about firms' performance.
Originality/value
Even though the GTRE and ARI models are used in a large number of studies, the present paper contributes to the productivity and efficiency literature by providing the first comparison of the two.
Kozo Harimaya and Koichi Kagitani
Abstract
Purpose
The purpose of this paper is to investigate the efficiency of the banking business of Japan's agricultural cooperatives (JAs), which depend heavily on financial business with non-farmers, contrary to cooperative principles.
Design/methodology/approach
The authors construct a panel data set over 2005–2016 from the financial statements of JAs’ prefectural-level federations and use the input distance stochastic frontier model with a time-variant inefficiency effect for analysis. Both the flow and stock measures of the banking output are used in identical models and the efficiency results are compared. The authors also investigate the determinants of efficiency by using the Tobit and ordinary least squares regression models.
Findings
There is strong evidence of significant prefectural differences in efficiency values. The ratio of lending to non-members to total loans is positively related to efficiency. In contrast, the higher reliance on a central organization and credit business leads to lower efficiency.
Research limitations/implications
Apart from banking, JAs provide mutual insurance services. As the authors investigate only the efficiency of JAs' banking business in this study, it would be necessary to investigate the efficiency of their insurance business as well when evaluating JAs' overall financial business.
Originality/value
There are few studies that investigate the efficiency of JAs’ banking business and its determinants, although significant attention has been paid to their excessive dependence on the financial business.