Search results

1 – 10 of 363
Book part
Publication date: 4 April 2024

Ren-Raw Chen and Chu-Hua Kuei

Abstract

Because of its highly leveraged nature, a bank is acutely exposed to the credit risk it inherently bears. As a result, managing credit risk is the ultimate responsibility of a bank. In this chapter, we examine how efficiently banks manage their credit risk using a tool widely applied in the decision/management sciences: data envelopment analysis (DEA). Among the various existing versions, our DEA is a two-stage, dynamic model that captures how each bank performs relative to its peers in terms of value creation and credit risk control. Using data on the 22 largest banks in the United States over the period 1996–2013, we identify leading banks such as First Bank Systems and Bank of New York Mellon before and after mergers and acquisitions, respectively. With the goal of preventing financial crises such as the one that occurred in 2008, a conceptual model of credit risk reduction and management (CRR&M) is proposed in the final section of this study. Discussions of strategy formulation at both the individual bank level and the national level are provided. With the help of our two-stage DEA-based decision support systems and CRR&M-driven strategies, policy- and decision-makers in a banking sector can identify improvement opportunities regarding value creation and risk mitigation. The tool and procedures presented in this work will help banks worldwide manage the unknown and become more resilient to potential credit crises in the 21st century.
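
As a rough illustration of the kind of frontier technique this abstract refers to, the sketch below computes single-stage, input-oriented CCR efficiency scores for a few hypothetical banks with a standard linear program. It is not the chapter's two-stage dynamic model, and the inputs and outputs are made-up placeholders.

import numpy as np
from scipy.optimize import linprog

# Hypothetical data (illustrative only): rows = banks, columns = inputs / outputs.
inputs = np.array([        # e.g. operating expense, deposits
    [3.0, 5.0],
    [2.5, 4.5],
    [4.0, 6.0],
    [3.5, 6.5],
])
outputs = np.array([       # e.g. loans, fee income
    [2.0, 1.0],
    [2.2, 0.9],
    [2.5, 1.4],
    [1.8, 1.2],
])
n, m = inputs.shape        # number of banks, number of inputs
s = outputs.shape[1]       # number of outputs

def ccr_efficiency(o):
    """Input-oriented CCR efficiency of bank o (1.0 = on the frontier)."""
    # Decision variables: [theta, lambda_1, ..., lambda_n]; minimise theta.
    c = np.zeros(n + 1)
    c[0] = 1.0
    A_ub, b_ub = [], []
    for i in range(m):     # sum_j lambda_j * x_ij <= theta * x_io
        A_ub.append(np.r_[-inputs[o, i], inputs[:, i]])
        b_ub.append(0.0)
    for r in range(s):     # sum_j lambda_j * y_rj >= y_ro
        A_ub.append(np.r_[0.0, -outputs[:, r]])
        b_ub.append(-outputs[o, r])
    res = linprog(c, A_ub=np.array(A_ub), b_ub=np.array(b_ub),
                  bounds=[(0, None)] * (n + 1), method="highs")
    return res.fun

for bank in range(n):
    print(f"bank {bank}: CCR efficiency = {ccr_efficiency(bank):.3f}")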

Details

Advances in Pacific Basin Business, Economics and Finance
Type: Book
ISBN: 978-1-83753-865-2

Book part
Publication date: 5 April 2024

Zhichao Wang and Valentin Zelenyuk

Abstract

Estimation of (in)efficiency has become a popular practice, with applications in virtually every sector of the economy over the last few decades. Many different models have been deployed for such endeavors, with Stochastic Frontier Analysis (SFA) models dominating the econometric literature. Among the most popular variants of SFA are Aigner, Lovell, and Schmidt (1977), which launched the literature, and Kumbhakar, Ghosh, and McGuckin (1991), which pioneered the branch accounting for the (in)efficiency term via so-called environmental variables, or determinants of inefficiency. Focusing on these two prominent approaches in SFA, the goal of this chapter is to understand the production inefficiency of public hospitals in Queensland. In doing so, a recognized yet often overlooked phenomenon emerges: dramatically different results (and consequently very different policy implications) can be derived from different models, even within one paradigm of SFA models. This emphasizes the importance of exploring many alternative models, and scrutinizing their assumptions, before drawing policy implications, especially when such implications may substantially affect people’s lives, as is the case in the hospital sector.
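
For readers unfamiliar with the first of the two approaches mentioned above, the following is a minimal sketch of the Aigner-Lovell-Schmidt normal/half-normal composed-error frontier estimated by maximum likelihood on simulated data. It is illustrative only and is not the chapter's Queensland hospital specifications.

import numpy as np
from scipy.optimize import minimize
from scipy.stats import norm

rng = np.random.default_rng(0)
n = 500
x = rng.normal(size=n)                              # one hypothetical input (in logs)
X = np.column_stack([np.ones(n), x])
v = rng.normal(scale=0.3, size=n)                   # symmetric two-sided noise
u = np.abs(rng.normal(scale=0.6, size=n))           # one-sided inefficiency
y = X @ np.array([1.0, 0.5]) + v - u                # simulated production frontier data

def neg_loglik(theta):
    # theta = (beta_0, beta_1, log sigma_v, log sigma_u)
    beta, sv, su = theta[:2], np.exp(theta[2]), np.exp(theta[3])
    sigma = np.sqrt(sv**2 + su**2)
    lam = su / sv
    eps = y - X @ beta
    ll = np.log(2.0 / sigma) + norm.logpdf(eps / sigma) + norm.logcdf(-eps * lam / sigma)
    return -ll.sum()

fit = minimize(neg_loglik, x0=np.array([0.0, 0.0, np.log(0.5), np.log(0.5)]), method="BFGS")
print("beta:", fit.x[:2], "sigma_v:", np.exp(fit.x[2]), "sigma_u:", np.exp(fit.x[3]))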

Open Access
Article
Publication date: 15 December 2023

Nicola Castellano, Roberto Del Gobbo and Lorenzo Leto

Abstract

Purpose

The concept of productivity is central to performance management and decision-making, although it is complex and multifaceted. This paper aims to describe a methodology that combines Big Data-driven cluster analysis with data envelopment analysis (DEA) to provide accurate and reliable productivity measures in a large network of retailers.

Design/methodology/approach

The methodology is described using a case study of a leading kitchen furniture producer. More specifically, Big Data is used in a two-step analysis prior to the DEA to automatically cluster a large number of retailers into groups that are homogeneous in terms of structural and environmental factors, and then to assess the retailers' productivity within each group.
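
A hypothetical sketch of that two-step idea: reduce and cluster retailer descriptors, then assess productivity within each homogeneous group. The column names, the number of clusters and the simple output/input ratio used in place of the paper's within-group DEA are all placeholders, not the paper's actual variables.

import numpy as np
import pandas as pd
from sklearn.preprocessing import StandardScaler
from sklearn.decomposition import PCA
from sklearn.cluster import KMeans

rng = np.random.default_rng(1)
retailers = pd.DataFrame({
    "store_area_m2":  rng.normal(250, 60, 300),         # structural factors (made up)
    "staff_count":    rng.integers(3, 12, 300),
    "catchment_pop":  rng.normal(40_000, 12_000, 300),   # environmental factors (made up)
    "avg_income_eur": rng.normal(28_000, 5_000, 300),
    "sales_eur":      rng.normal(900_000, 200_000, 300),
})

# Step 1: reduce the descriptors and cluster retailers into homogeneous groups.
descriptors = ["store_area_m2", "staff_count", "catchment_pop", "avg_income_eur"]
Z = StandardScaler().fit_transform(retailers[descriptors])
scores = PCA(n_components=2).fit_transform(Z)
retailers["cluster"] = KMeans(n_clusters=4, n_init=10, random_state=0).fit_predict(scores)

# Step 2: assess productivity within each group (the paper runs DEA at this step;
# a simple output/input ratio is used here as a stand-in).
retailers["productivity"] = retailers["sales_eur"] / retailers["staff_count"]
print(retailers.groupby("cluster")["productivity"].agg(["count", "mean", "max"]).round(1))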

Findings

The proposed methodology helps reduce the heterogeneity among the units analysed, which is a major concern in DEA applications. The data-driven factorial and clustering technique achieves maximum within-group homogeneity and between-group heterogeneity while reducing the subjective bias and high dimensionality that come with the use of Big Data.

Practical implications

The use of Big Data in clustering applied to productivity analysis can provide managers with data-driven information about the structural and socio-economic characteristics of retailers' catchment areas, which is important in establishing potential productivity performance and optimizing resource allocation. The improved productivity indexes enable the setting of targets that are consistent with retailers' potential, which increases motivation and commitment.

Originality/value

This article proposes an innovative technique to enhance the accuracy of productivity measures through the use of Big Data clustering and DEA. To the best of the authors’ knowledge, no previous attempts have been made in the literature on retail store productivity to benefit from the use of Big Data.

Details

International Journal of Productivity and Performance Management, vol. 73 no. 11
Type: Research Article
ISSN: 1741-0401

Article
Publication date: 22 March 2024

João Eduardo Sampaio Brasil, Fabio Antonio Sartori Piran, Daniel Pacheco Lacerda, Maria Isabel Wolf Morandi, Debora Oliveira da Silva and Miguel Afonso Sellitto

Abstract

Purpose

The purpose of this study is to evaluate the efficiency of the reheating process in a Brazilian steelmaking company’s hot rolling mill.

Design/methodology/approach

The research method is quantitative modeling. The main research techniques are data envelopment analysis, Tobit regression and simulation supported by artificial neural networks. The model’s input and output variables consist of the average billet weight, the number of billets processed in a batch, gas consumption, thermal efficiency, backlog and production yield within a specific period. The analysis spans 20 months.
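
As a hedged illustration of how two of the named techniques commonly fit together, the sketch below regresses simulated DEA-style efficiency scores (censored at 1) on candidate drivers with a simple Tobit likelihood. The data and variable names are placeholders rather than the study's furnace data, and the study's neural-network simulation step is not reproduced here.

import numpy as np
from scipy.optimize import minimize
from scipy.stats import norm

rng = np.random.default_rng(2)
n = 200
billet_weight = rng.normal(1.0, 0.2, n)              # hypothetical drivers
gas_consumption = rng.normal(1.0, 0.2, n)
X = np.column_stack([np.ones(n), billet_weight, gas_consumption])
latent = 0.6 + 0.3 * billet_weight - 0.2 * gas_consumption + rng.normal(0, 0.1, n)
eff = np.minimum(latent, 1.0)                        # DEA-style scores censored at 1
censored = eff >= 1.0

def neg_loglik(theta):
    beta, sigma = theta[:-1], np.exp(theta[-1])
    xb = X @ beta
    ll_uncensored = norm.logpdf((eff - xb) / sigma) - np.log(sigma)
    ll_censored = norm.logcdf((xb - 1.0) / sigma)    # P(latent >= 1 | x)
    return -np.where(censored, ll_censored, ll_uncensored).sum()

fit = minimize(neg_loglik, x0=np.r_[np.zeros(X.shape[1]), np.log(0.1)], method="BFGS")
print("coefficients:", fit.x[:-1].round(3), "sigma:", np.exp(fit.x[-1]).round(3))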

Findings

The key findings include an average current efficiency of 81%, the identification of influential variables (average billet weight, billet count and gas consumption) and a simulation-based scenario analysis. Among the simulated scenarios, the most promising achieved an average efficiency of 95% through increased equipment availability and billet size.

Practical implications

Additional favorable simulated scenarios entail the use of higher pre-reheating temperatures for cold billets, representing substantial savings in gas consumption and a reduction in CO2 emissions.

Originality/value

This study’s primary innovation lies in providing steelmaking practitioners with a systematic approach to evaluating and enhancing the efficiency of reheating processes.

Details

Management of Environmental Quality: An International Journal, vol. ahead-of-print no. ahead-of-print
Type: Research Article
ISSN: 1477-7835

Open Access
Article
Publication date: 3 May 2023

Lars Stehn and Alexander Jimenez

Abstract

Purpose

The purpose of this paper is to understand if and how industrialized house building (IHB) could support productivity developments for housebuilding at the project and industry levels. The premise is that fragmentation of construction is one explanation for the lack of productivity growth, and that IHB could be an integrating means of overcoming horizontal and vertical fragmentation.

Design/methodology/approach

Single-factor productivity measures are calculated based on data reported by IHB companies and compared to officially produced and published research data. The survey covers the years 2013–2020 for IHB companies building multi-storey timber houses. Generalization is sought through descriptive statistics by contrasting the data samples with the means used to control vertical and horizontal fragmentation, formulated as three theoretical propositions.
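
As a small worked example of the kind of single-factor measure described above, the sketch below computes labour productivity (value added per worked hour) and its growth over a period from made-up company figures, not the paper's data.

import pandas as pd

data = pd.DataFrame({
    "year":         [2013, 2014, 2015, 2016, 2017, 2018, 2019, 2020],
    "value_added":  [100, 108, 115, 125, 131, 140, 148, 155],   # hypothetical index-like figures
    "worked_hours": [100, 104, 107, 112, 114, 118, 121, 123],
})

# Single-factor (labour) productivity: value added per worked hour, and its growth.
data["labour_productivity"] = data["value_added"] / data["worked_hours"]
data["growth_pct"] = data["labour_productivity"].pct_change() * 100
total = (data["labour_productivity"].iloc[-1] / data["labour_productivity"].iloc[0] - 1) * 100

print(data[["year", "labour_productivity", "growth_pct"]].round(2))
print(f"labour productivity growth over 2013-2020: {total:.1f}%")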

Findings

According to the results, IHB in timber is on average more productive than conventional housebuilding at both the company and project levels, in absolute terms and in growth terms over the eight-year period. At the company level, labour productivity was on average 10% higher for IHB than for general construction, positioning IHB between general construction and general manufacturing. At the project level, IHB displayed an average cost productivity growth of 19% for an employed prefabrication degree of about 45%.

Originality/value

Empirical evidence is presented that quantifies the so far only perceived advantages of IHB. By providing analysis of actual cost and project data derived from IHB companies, the article substantiates previous research suggesting that IHB is not only about prefabrication. The observed positive productivity growth in relation to the employed prefabrication degree indicates that off-site production alone is not a sufficient means for reaching high productivity and productivity growth. Instead, the capability to integrate the operative logic of conventional housebuilding with the logic of IHB platform development and use is a probable explanation of the observed positive productivity growth.

Details

Construction Innovation, vol. 24 no. 7
Type: Research Article
ISSN: 1471-4175

Open Access
Article
Publication date: 1 December 2023

Gianni Carvelli

Abstract

Purpose

The purpose of this study is to provide new insights into the relationship between fiscal policy and total factor productivity (TFP) while accounting for several economic and econometric issues of the phenomenon, such as non-stationarity, fiscal feedback effects, persistence in productivity, country heterogeneity, and unobserved global shocks and local spillovers that affect the countries in the sample heterogeneously.

Design/methodology/approach

The paper is empirical. It builds an Error Correction Model (ECM) specification within a dynamic heterogeneous framework with common correlated effects and models both reverse causality and feedback effects.
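
A rough sketch of that estimation idea, under strong simplifying assumptions: a country-by-country error-correction regression augmented with cross-sectional averages (common correlated effects), with the long-run coefficients then averaged across countries (mean group). The simulated panel, variable names and plain OLS step are illustrative only, not the paper's exact specification.

import numpy as np
import pandas as pd

rng = np.random.default_rng(3)
countries, T = 10, 40
common = np.cumsum(rng.normal(0, 0.3, T))            # unobserved global shock
rows = []
for c in range(countries):
    fiscal = np.cumsum(rng.normal(0, 0.5, T))        # fiscal balance, I(1)-like
    tfp = 0.4 * fiscal + (0.5 + 0.1 * c) * common + rng.normal(0, 0.2, T)
    rows.append(pd.DataFrame({"country": c, "t": np.arange(T), "tfp": tfp, "fiscal": fiscal}))
panel = pd.concat(rows, ignore_index=True)

# Cross-sectional averages proxy the unobserved common shocks (the "CCE" terms).
panel["tfp_bar"] = panel.groupby("t")["tfp"].transform("mean")
panel["fiscal_bar"] = panel.groupby("t")["fiscal"].transform("mean")

long_run = []
for c, g in panel.groupby("country"):
    g = g.sort_values("t")
    dy = g["tfp"].diff()
    X = pd.DataFrame({
        "const": 1.0,
        "tfp_lag": g["tfp"].shift(1),     # error-correction level terms
        "fiscal_lag": g["fiscal"].shift(1),
        "d_fiscal": g["fiscal"].diff(),   # short-run dynamics
        "tfp_bar": g["tfp_bar"],          # CCE augmentation
        "fiscal_bar": g["fiscal_bar"],
    })
    mask = dy.notna() & X.notna().all(axis=1)
    beta, *_ = np.linalg.lstsq(X[mask].to_numpy(), dy[mask].to_numpy(), rcond=None)
    phi, gamma = beta[1], beta[2]
    long_run.append(-gamma / phi)         # country-specific long-run effect

print("mean-group long-run effect of the fiscal balance on TFP:", round(float(np.mean(long_run)), 3))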

Findings

The results highlight some new findings relative to the existing literature and offer relevant evidence at both the academic and policy levels: (1) the causal effects going from fiscal deficit/surplus to TFP are heterogeneous across countries; (2) the effects depend on the time horizon considered; (3) the long-run dynamics of TFP are positively affected by improvements in the fiscal budget, but only if the austerity measures do not slow aggregate growth.

Originality/value

The main originality of this study is methodological, with possible extensions to related phenomena. Relative to the existing literature, the gains of this study lie in the way econometric techniques recently proposed in the literature are adapted to the economic relationship of interest. The endogeneity due to reverse causality is modelled without implying relevant losses in model performance. Moreover, this is the first article to question whether the effects of the fiscal budget on productivity depend on the impact of the former on aggregate output growth, thus emphasising the importance of the quality of fiscal adjustments.

Details

Journal of Economic Studies, vol. 51 no. 9
Type: Research Article
ISSN: 0144-3585

Book part
Publication date: 5 April 2024

Ziwen Gao, Steven F. Lehrer, Tian Xie and Xinyu Zhang

Abstract

Motivated by empirical features that characterize cryptocurrency volatility data, the authors develop a forecasting strategy that can account for both model uncertainty and heteroskedasticity of unknown form. The theoretical investigation establishes the asymptotic optimality of the proposed heteroskedastic model averaging heterogeneous autoregressive (H-MAHAR) estimator under mild conditions. The authors additionally examine the convergence rate of the estimated weights of the proposed H-MAHAR estimator. This analysis sheds new light on the asymptotic properties of the least squares model averaging estimator under alternative complicated data generating processes (DGPs). To examine the performance of the H-MAHAR estimator, the authors conduct an out-of-sample forecasting application involving 22 different cryptocurrency assets. The results emphasize the importance of accounting for both model uncertainty and heteroskedasticity in practice.
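
To give a concrete, simplified picture of the building blocks, the sketch below constructs heterogeneous autoregressive (HAR) regressors from simulated daily volatility, fits nested candidate models and combines them with non-negative weights that minimise in-sample squared error. This is not the authors' H-MAHAR weighting criterion (which targets heteroskedasticity-robust asymptotic optimality); it only illustrates HAR-type model averaging on toy data.

import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(4)
T = 600
vol = np.empty(T)
vol[0] = 1.0
for t in range(1, T):                                 # persistent, positive toy volatility series
    vol[t] = 0.3 + 0.7 * vol[t - 1] + 0.2 * abs(rng.normal())

def lagged_mean(x, k):                                # average of the previous k observations
    out = np.full(len(x), np.nan)
    for t in range(k, len(x)):
        out[t] = x[t - k:t].mean()
    return out

d, w, m = lagged_mean(vol, 1), lagged_mean(vol, 5), lagged_mean(vol, 22)
valid = ~np.isnan(m)
y = vol[valid]
X_full = np.column_stack([np.ones(valid.sum()), d[valid], w[valid], m[valid]])

# Nested HAR candidates: constant + daily, + weekly, + monthly components.
fits = []
for cols in (2, 3, 4):
    X = X_full[:, :cols]
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    fits.append(X @ beta)
P = np.column_stack(fits)

# Combine the candidates with non-negative weights summing to one.
k = P.shape[1]
sse = lambda wts: np.sum((y - P @ wts) ** 2)
res = minimize(sse, np.full(k, 1.0 / k), method="SLSQP", bounds=[(0.0, 1.0)] * k,
               constraints=[{"type": "eq", "fun": lambda wts: wts.sum() - 1.0}])
print("model-averaging weights (daily / +weekly / +monthly):", res.x.round(3))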

Book part
Publication date: 5 April 2024

Emir Malikov, Shunan Zhao and Jingfang Zhang

Abstract

There is growing empirical evidence that firm heterogeneity is technologically non-neutral. This chapter extends the Gandhi, Navarro, and Rivers (2020) proxy variable framework for structurally identifying production functions to a more general case when latent firm productivity is multi-dimensional, with both factor-neutral and (biased) factor-augmenting components. Unlike alternative methodologies, the proposed model can be identified under weaker data requirements, notably, without relying on the typically unavailable cross-sectional variation in input prices for instrumentation. When markets are perfectly competitive, point identification is achieved by leveraging the information contained in static optimality conditions, effectively adopting a system-of-equations approach. It is also shown how one can partially identify the non-neutral production technology in the traditional proxy variable framework when firms have market power.
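
As a stylized illustration of the distinction drawn above (not the chapter's exact specification), a gross-output production function in which latent productivity has both a Hicks-neutral and a labor-augmenting component can be written as

\[
  Y_{it} = F\!\left(K_{it},\, A_{it} L_{it},\, M_{it}\right) e^{\,\omega_{it} + \varepsilon_{it}},
\]

where $K_{it}$, $L_{it}$ and $M_{it}$ denote capital, labor and intermediate inputs, $\omega_{it}$ is the factor-neutral productivity component (scaling the contribution of all inputs proportionally), $A_{it}$ augments labor only and is therefore the (biased) factor-augmenting component, and $\varepsilon_{it}$ is noise.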

Book part
Publication date: 5 April 2024

Hung-pin Lai

Abstract

The standard method to estimate a stochastic frontier (SF) model is the maximum likelihood (ML) approach with distributional assumptions of a symmetric two-sided stochastic error v and a one-sided inefficiency random component u. When v or u has a nonstandard distribution, such as v following a generalized t distribution or u having a χ2 distribution, the likelihood function can be complicated or intractable. This chapter introduces the use of indirect inference to estimate SF models, in which only least squares estimation is used. There is no need to derive the density or likelihood function, so it is easier to handle a model with complicated distributions in practice. The author examines the finite-sample performance of the proposed estimator and also compares it with the standard ML estimator as well as the maximum simulated likelihood (MSL) estimator using Monte Carlo simulations. The author finds that the indirect inference estimator performs quite well in finite samples.
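
A toy sketch of the indirect-inference idea for an SF model: choose the structural parameters so that auxiliary statistics obtained by least squares on simulated data (OLS coefficients plus the second and third residual moments, which pick up the one-sided inefficiency) match the same statistics computed on the observed data. A normal/half-normal error structure and simulated "observed" data are used purely for illustration; the chapter's interest is in more complicated distributions.

import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(5)
n = 1000
x = rng.normal(size=n)
sim_v = rng.normal(size=(50, n))                     # fixed simulation draws, reused across evaluations
sim_u = np.abs(rng.normal(size=(50, n)))

def frontier(theta, v, u):
    b0, b1, sv, su = theta
    return b0 + b1 * x + sv * v - su * u             # structural SF model

def auxiliary_stats(y):
    X = np.column_stack([np.ones_like(x), x])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)     # least-squares auxiliary model
    e = y - X @ beta
    return np.r_[beta, e.var(), (e**3).mean()]       # coefficients + 2nd and 3rd residual moments

theta_true = np.array([1.0, 0.8, 0.3, 0.6])          # used only to create toy "observed" data
y_obs = frontier(theta_true, rng.normal(size=n), np.abs(rng.normal(size=n)))
target = auxiliary_stats(y_obs)

def distance(theta):
    sims = np.array([auxiliary_stats(frontier(theta, v, u)) for v, u in zip(sim_v, sim_u)])
    return np.sum((sims.mean(axis=0) - target) ** 2)

fit = minimize(distance, x0=np.array([0.5, 0.5, 0.5, 0.5]), method="Nelder-Mead",
               options={"maxiter": 2000, "xatol": 1e-4, "fatol": 1e-8})
print("indirect-inference estimates (b0, b1, sigma_v, sigma_u):", fit.x.round(3))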

Article
Publication date: 28 March 2024

Hatice Merve Yanardag Erdener and Ecem Edis

Abstract

Purpose

Living walls (LWs), vegetated walls with an integrated growth layer behind, are increasingly being incorporated in buildings. This study aimed to examine the comparative impacts of plant characteristics on LWs’ energy efficiency-related thermal behavior, considering that studies on their relative effects are limited. LWs of varying leaf albedo, leaf transmittance and leaf area index (LAI) were studied for Antalya, Turkey, on typical days of the four seasons.

Design/methodology/approach

Dynamic simulations run by Envi-met were used to assess the plant characteristics’ influence on seasonal and orientation-based heat fluxes. After model calibration, a sensitivity analysis was conducted through 112 simulations. The minimum, mean and maximum values were investigated for each plant characteristic. Energy need (regardless of orientation), temperature and heat flux results were compared among different scenarios, including a building without LW, to evaluate energy efficiency and variables’ impacts.
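
Purely as a hypothetical illustration of how such a simulation campaign could be organised, the snippet below enumerates a grid of living-wall scenarios over the three plant characteristics, orientations and seasons, to be run in the simulation tool and compared against a no-living-wall baseline. The levels and the resulting scenario count are placeholders and do not reproduce the study's 112-run design.

from itertools import product
import pandas as pd

levels = {                                            # placeholder min / mean / max values
    "leaf_albedo": [0.2, 0.3, 0.4],
    "leaf_transmittance": [0.1, 0.2, 0.3],
    "lai": [1.5, 3.0, 5.0],
}
orientations = ["north", "south", "east", "west"]
seasons = ["winter", "spring", "summer", "autumn"]

scenarios = pd.DataFrame(
    [dict(zip(levels, combo), orientation=o, season=s)
     for combo in product(*levels.values())
     for o in orientations
     for s in seasons])

print(len(scenarios), "living-wall scenarios (plus a baseline building without a living wall)")
print(scenarios.head())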

Findings

LWs reduced annual energy consumption in Antalya, despite increasing energy needs in winter. South and west facades were particularly advantageous for energy efficiency. Leaf albedo and leaf transmittance had a more significant impact (44–46%) than LAI (10%) in determining LWs’ effectiveness. Changes in the plant characteristics altered energy needs by up to approximately 1%.

Research limitations/implications

This study can potentially contribute to generating guiding principles for architects considering LW use in their designs in hot-humid climates.

Originality/value

The relative impacts of plant characteristics on energy efficiency, which cannot be easily determined by experimental studies, were examined using parametric simulation results for three plant characteristics.

Details

Built Environment Project and Asset Management, vol. ahead-of-print no. ahead-of-print
Type: Research Article
ISSN: 2044-124X
