Search results

1 – 10 of 595
Article
Publication date: 14 April 2023

Fatima Saeedi Aval Noughabi, Najmeh Malekmohammadi, Farhad Hosseinzadeh Lotfi and Shabnam Razavyan

Abstract

Purpose

The purpose of this paper is to improve the recent models for the evaluation of the efficiency of decision making units (DMUs) comprising a network structure with undesirable intermediate measures and fuzzy data.

Design/methodology/approach

In this paper, a three-stage network structure model with desirable and undesirable data is presented and solved as a set of linear programming problems with triangular fuzzy data.

Findings

A new three-stage network data envelopment analysis (DEA) model is established to evaluate the efficiency of industries with undesirable and desirable indicators in a fuzzy environment.

Practical implications

The practical implication of this study is the evaluation of the furniture services and the wood-lumber chipboard industries as a three-stage process.

Originality/value

In some cases, DMUs comprise two-stage or multi-stage processes (in series or in parallel) operating within a structure known as a network DEA. Moreover, in real-world problems the data are often imprecise, and intermediate measures under real-world conditions include both desirable and undesirable data. Together, these features demonstrate the value of the proposed model.
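The common engine behind the network and fuzzy extensions described above is the basic DEA linear program. As a minimal illustrative sketch only (a plain input-oriented CCR model with crisp data and no network structure, not the fuzzy three-stage model of this paper; all names are our own), one efficiency score per DMU can be computed as follows:

```python
import numpy as np
from scipy.optimize import linprog

def ccr_efficiency(X, Y):
    """Input-oriented CCR DEA. X is (n_dmus, n_inputs), Y is (n_dmus, n_outputs).
    Returns one efficiency score in (0, 1] per DMU."""
    n, m = X.shape
    s = Y.shape[1]
    scores = []
    for o in range(n):
        # Decision variables: [theta, lambda_1, ..., lambda_n]
        c = np.zeros(n + 1)
        c[0] = 1.0                      # minimise the contraction factor theta
        rows, rhs = [], []
        for i in range(m):              # sum_j lambda_j * x_ij <= theta * x_io
            rows.append(np.concatenate(([-X[o, i]], X[:, i])))
            rhs.append(0.0)
        for r in range(s):              # sum_j lambda_j * y_rj >= y_ro
            rows.append(np.concatenate(([0.0], -Y[:, r])))
            rhs.append(-Y[o, r])
        res = linprog(c, A_ub=np.array(rows), b_ub=np.array(rhs),
                      bounds=[(0, None)] * (n + 1), method="highs")
        scores.append(res.fun)
    return np.array(scores)
```

A DMU scoring 1.0 lies on the efficient frontier; lower scores indicate how far its inputs could be proportionally contracted.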

Details

International Journal of Intelligent Computing and Cybernetics, vol. 16 no. 4
Type: Research Article
ISSN: 1756-378X

Book part
Publication date: 4 April 2024

Ren-Raw Chen and Chu-Hua Kuei

Abstract

Owing to its highly leveraged nature, a bank is acutely exposed to the credit risk it inherently bears. As a result, managing credit risk is the ultimate responsibility of a bank. In this chapter, we examine how efficiently banks manage their credit risk via data envelopment analysis (DEA), a powerful tool used widely in the decision/management science area. Among the various existing versions, our DEA is a two-stage, dynamic model that captures how each bank performs relative to its peer banks in terms of value creation and credit risk control. Using data from the largest 22 banks in the United States over the period from 1996 to 2013, we identify leading banks such as First Bank System and Bank of New York Mellon before and after mergers and acquisitions, respectively. With the goal of preventing financial crises such as the one that occurred in 2008, a conceptual model of credit risk reduction and management (CRR&M) is proposed in the final section of this study. Discussions of strategy formulation at both the individual bank level and the national level are provided. With the help of our two-stage DEA-based decision support systems and CRR&M-driven strategies, policy- and decision-makers in a banking sector can identify improvement opportunities regarding value creation and risk mitigation. The effective tool and procedures presented in this work will help banks worldwide manage the unknown and become more resilient to potential credit crises in the 21st century.

Details

Advances in Pacific Basin Business, Economics and Finance
Type: Book
ISBN: 978-1-83753-865-2

Open Access
Article
Publication date: 15 December 2023

Nicola Castellano, Roberto Del Gobbo and Lorenzo Leto

Abstract

Purpose

The concept of productivity is central to performance management and decision-making, although it is complex and multifaceted. This paper aims to describe a methodology based on the use of Big Data in a cluster analysis combined with a data envelopment analysis (DEA) that provides accurate and reliable productivity measures in a large network of retailers.

Design/methodology/approach

The methodology is described using a case study of a leading kitchen furniture producer. More specifically, Big Data is used in a two-step analysis prior to the DEA to automatically cluster a large number of retailers into groups that are homogeneous in terms of structural and environmental factors, and then to assess the within-group level of productivity of the retailers.
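As an illustrative sketch only (not the authors' exact procedure), the two-step idea of clustering on structural variables first and then scoring productivity only within each cluster might look like the following in Python; a simple best-in-cluster ratio stands in for the DEA stage, and all names are assumptions:

```python
import numpy as np

def kmeans(X, k, iters=50, seed=0):
    """Plain k-means on structural/environmental features X (n_units, n_features)."""
    rng = np.random.default_rng(seed)
    centers = X[rng.choice(len(X), k, replace=False)]
    for _ in range(iters):
        # Assign each unit to its nearest center, then recompute the centers.
        labels = np.argmin(((X[:, None] - centers) ** 2).sum(-1), axis=1)
        for j in range(k):
            if (labels == j).any():
                centers[j] = X[labels == j].mean(0)
    return labels

def within_group_scores(labels, productivity):
    """Score each unit relative to the best performer in its own cluster,
    so heterogeneous units are never compared directly."""
    scores = np.empty_like(productivity, dtype=float)
    for j in np.unique(labels):
        mask = labels == j
        scores[mask] = productivity[mask] / productivity[mask].max()
    return scores
```

The design point is that benchmarking happens only among comparable peers, which is the heterogeneity concern the paper addresses.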

Findings

The proposed methodology helps reduce the heterogeneity among the units analysed, which is a major concern in DEA applications. The data-driven factorial and clustering technique maximizes within-group homogeneity and between-group heterogeneity while reducing the subjective bias and high dimensionality that come with the use of Big Data.

Practical implications

The use of Big Data in clustering applied to productivity analysis can provide managers with data-driven information about the structural and socio-economic characteristics of retailers' catchment areas, which is important in establishing potential productivity performance and optimizing resource allocation. The improved productivity indexes enable the setting of targets that are coherent with retailers' potential, which increases motivation and commitment.

Originality/value

This article proposes an innovative technique to enhance the accuracy of productivity measures through the use of Big Data clustering and DEA. To the best of the authors’ knowledge, no attempts have been made to benefit from the use of Big Data in the literature on retail store productivity.

Details

International Journal of Productivity and Performance Management, vol. 73 no. 11
Type: Research Article
ISSN: 1741-0401

Article
Publication date: 10 October 2023

Pejman Shabani and Mohsen Akbarpour Shirazi

Abstract

Purpose

This paper aims to evaluate commercial bank branches' performance in dynamic and competitive conditions where decision-making units (DMUs) seek a greater proportion of shared resources as it happens in the real world. By introducing the concepts of cross-shared and serial-shared resources, the authors have emphasized the role of evaluation results of past periods on branches' total efficiency.

Design/methodology/approach

In this study, a new mixed-integer data envelopment analysis (MI-DEA) model has been proposed to evaluate the performance of a dynamic network in the presence of cross-shared and serial-shared resources.

Findings

The proposed model helps bank managers to find the source of inefficiencies and establish a connection between the results of the periodic performance of the DMUs and the distribution of serial and cross-shared resources. The results show that the weighting coefficients of the periods do not significantly affect the overall efficiency of commercial bank branches, unlike desirable and undesirable intermediates.

Originality/value

This paper presents the following factors: (1) A new mixed-integer network data envelopment analysis model is developed under dynamic competitive conditions. (2) For the first time in DEA models, the concept of cross-shared resources is proposed to consider shared resources between DMUs. (3) All controllable, uncontrollable, desirable and undesirable outputs in the model are considered with the possibility to transfer to the next periods. (4) A case study is given for the performance evaluation of 38 branches of an Iranian commercial bank from 2016 to 2020.

Details

Journal of Economic Studies, vol. 51 no. 1
Type: Research Article
ISSN: 0144-3585

Article
Publication date: 9 May 2022

Narendra N. Dalei and Jignesh M. Joshi

Abstract

Purpose

In India, the operational performance of a refinery is influenced by many factors. It is important to identify the key drivers that can help refineries sustain and succeed in day-to-day production activities. Therefore, the purpose of this study is to evaluate the operational efficiency of seven Indian oil refineries over the period 2010 to 2018.

Design/methodology/approach

In this work, a two-stage empirical analysis is proposed. In the first stage, a data envelopment analysis (DEA) variable returns to scale model is used to evaluate the operational efficiency of the Indian oil refineries. In the second stage, ordinary least squares (OLS), random-effects generalized least squares (GLS) and Tobit models are used to identify the key determinants of efficiency and to explain the variation in refinery efficiency.
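The second-stage idea of explaining first-stage DEA scores with refinery characteristics reduces, in its simplest form, to an ordinary regression. A minimal OLS sketch follows (the paper also uses GLS and Tobit models, which need more machinery; the variable names here are assumptions, not the authors' data):

```python
import numpy as np

def second_stage_ols(scores, drivers):
    """Regress first-stage DEA efficiency scores on candidate drivers
    (e.g. utilization rate, distillate yield).
    Returns [intercept, slope_1, slope_2, ...]."""
    # Prepend a column of ones for the intercept term.
    X = np.column_stack([np.ones(len(scores)), drivers])
    beta, *_ = np.linalg.lstsq(X, scores, rcond=None)
    return beta
```

Because DEA scores are bounded in (0, 1], a Tobit model is often preferred over plain OLS for this stage; the sketch above only shows the mechanics.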

Findings

The first-stage DEA results showed that Numaligarh Refinery Limited and Chennai Petroleum Corporation Limited were more efficient than the rest of the sampled refineries, attaining efficiency scores of 0.993 and 0.981, respectively, during the study period. The second-stage regression analysis suggested three explanatory variables, namely refinery structure, utilization rate and distillate yield, which were found to be significant in explaining variations in refinery efficiency.

Practical implications

This study provides valuable information that would help policymakers to formulate policies toward improving the efficiency of underperforming Indian refineries, which reduces the excessive use of resources and gives a competitive advantage.

Originality/value

This study proposes the first-ever application of the profit frontier DEA model for assessing the operational efficiency of oil refineries and explains the variation in refinery efficiency using OLS, GLS and Tobit models.

Details

International Journal of Energy Sector Management, vol. 17 no. 3
Type: Research Article
ISSN: 1750-6220

Article
Publication date: 28 June 2022

Peter Wanke, Sahar Ostovan, Mohammad Reza Mozaffari, Javad Gerami and Yong Tan

Abstract

Purpose

This paper aims to present two-stage network models in the presence of stochastic ratio data.

Design/methodology/approach

Black-box, free-link and fix-link techniques are used to model the internal relations of the two-stage network. A deterministic linear programming model is derived from a stochastic two-stage network data envelopment analysis (DEA) model by assuming that some basic stochastic elements are related to the inputs, outputs and intermediate products. The linkages between the overall process and the two subprocesses are proposed. The authors obtain the relation between the efficiency scores from the stochastic two-stage network DEA-ratio model under three different strategies: black-box, free-link and fix-link. The proposed approach is applied to 11 airlines in Iran.

Findings

In most of the scenarios, when alpha takes any value between 0.1 and 0.4, the three models (the black-box model of Charnes, Cooper and Rhodes (1978), free-link and fix-link) generate similar efficiency scores for the decision-making units (DMUs), while a relatively higher degree of variation in efficiency scores among the DMUs is generated when alpha takes the value of 0.5. When alpha takes values between 0.1 and 0.4, the DMUs have the same ranking in terms of their efficiency scores.

Originality/value

The authors innovatively propose a deterministic linear programming model, and to the best of the authors’ knowledge, for the first time, the internal relationships of a two-stage network are analyzed by different techniques. The comparison of the results would be able to provide insights from both the policy perspective as well as the methodological perspective.

Details

Journal of Modelling in Management, vol. 18 no. 3
Type: Research Article
ISSN: 1746-5664

Article
Publication date: 12 April 2023

Ioannis Tampakoudis, Nikolaos Kiosses and Konstantinos Petridis

Abstract

Purpose

The purpose of this study is to evaluate the performance of mutual funds during the COVID-19 pandemic with environmental, social and governance (ESG) criteria. The main research question is whether mutual fund performance differs with respect to the level of the mutual fund’s ESG score.

Design/methodology/approach

The data set contains global fund data, and mutual fund performance is analyzed using two types of data envelopment analysis (DEA) models: the DEA portfolio index (DPEI) and the range direction measure (RDM) DEA. Propensity score matching and logistic regression are also applied.

Findings

The results reveal that nonequity mutual funds present significantly higher performance than equity mutual funds; mutual funds with high ESG scores are associated with significantly higher performance than those with low to medium ESG scores; funds with high ESG scores experience higher performance irrespective of their type; and efficiency scores derived from the RDM DEA model are significantly higher than those derived from the DPEI model.

Research limitations/implications

Investors, fund managers and market participants can benefit from the findings of this study and improve their investment decision-making process, including more sustainable funds in their portfolios. Regulators and policymakers should further promote or even require the inclusion of more sustainable investments in the financial products offered by institutional investors. The main limitation of the study is related to data availability regarding the ESG score of mutual funds.

Originality/value

To the best of the authors’ knowledge, this is the first study that provides robust evidence in support of a positive association between ESG scores and mutual fund performance during the pandemic-induced crisis applying a DEA methodology.

Details

Corporate Governance: The International Journal of Business in Society, vol. 23 no. 7
Type: Research Article
ISSN: 1472-0701

Article
Publication date: 18 July 2023

Driss El Kadiri Boutchich

Abstract

Purpose

This work aims to establish the relationship between painting art and sustainability, highlighting implications likely to improve sustainability for humanity's welfare.

Design/methodology/approach

To achieve this objective, painting art is measured by a composite index aggregating the quantity and quality represented by the market value. As for sustainable development, it is represented by a composite index comprising three variables: the climate change performance index (ecological dimension), the wage index reflecting distributive justice (social dimension) and the gross domestic product (economic dimension). The composite indices were determined through adjusted data envelopment analysis. In addition, two other methods are used in this work: correlation analysis and a neural network method. These methods are applied to data from 2007 to 2021 across the world.

Findings

The correlation method highlighted a perfect positive correlation between painting art and sustainability. The neural network method revealed that the quality of painting has the greatest impact on sustainability, and that the sustainability variable most positively affected by painting art is the social one, with a pseudo-probability of 0.90.

Originality/value

The relationship between painting art and sustainability is underexplored, particularly in terms of statistical analysis. Therefore, this research intends to fill this gap. Moreover, no prior work analyses the relationship between the two using composite indices computed via an original method (adjusted data envelopment analysis) together with a neural network method, which constitutes the novelty of this work.

Peer review

The peer review history for this article is available at: https://publons.com/publon/10.1108/IJSE-01-2023-0006

Details

International Journal of Social Economics, vol. 51 no. 1
Type: Research Article
ISSN: 0306-8293

Article
Publication date: 30 August 2023

Nitin Arora and Shubhendra Jit Talwar

Abstract

Purpose

Fiscal outlay efficiency matters when the central government makes performance-based allocations of funds to state governments in a federal economy such as India. The efficiency canon of public expenditure is also a key aspect of public economics. Thus, a study to evaluate the efficiency of the fiscal outlay of Indian states has been conducted.

Design/methodology/approach

The paper offers a three-division paradigm under the network data envelopment analysis (DEA) framework to compare the performance of fiscal entities (here, Indian state governments) in converting available fiscal resources into desired short-run and long-run growth and development objectives. The network efficiency score is taken as a measure of the quality of fiscal outlay management and is trifurcated into divisional efficiencies representing the budgeting process, the fiscal outlay efficiency process and the fiscal outlay effectiveness process.

Findings

It has been noticed that the states are underperforming in achieving short-run growth targets, and so the efficiency process division has been identified as a major source of fiscal underperformance. Suboptimal allocation of fiscal expenditure across various heads within the available fiscal resources, as captured by the budgeting process, is another major cause of fiscal underperformance.

Practical implications

The study proposes a three-division paradigm that takes into account the efficiency of a state in (1) planning its budget, (2) achieving short-run growth targets and (3) achieving long-run development targets. These three stages are named budgeting process efficiency, fiscal outlay efficiency and fiscal outlay effectiveness, respectively. Together they form a new paradigm, called the BEE paradigm, for evaluating the performance of fiscal entities in terms of fiscal outlay efficiency.

Originality/value

In the existing literature on measuring the efficiency of public expenditure, public sector outputs have been modelled as a function of fiscal expenditure as an input, treating the said outlay as an exogenous variable. In the present context, fiscal expenditure is treated as endogenous to the budgeting process. The high inefficiency found on account of the budgeting process supports this treatment.

Details

Benchmarking: An International Journal, vol. ahead-of-print no. ahead-of-print
Type: Research Article
ISSN: 1463-5771

Article
Publication date: 16 October 2023

Maedeh Gholamazad, Jafar Pourmahmoud, Alireza Atashi, Mehdi Farhoudi and Reza Deljavan Anvari

Abstract

Purpose

A stroke is a serious, life-threatening condition that occurs when the blood supply to a part of the brain is cut off. The earlier a stroke is treated, the less damage is likely to occur. One of the methods that can lead to faster treatment is timely and accurate prediction and diagnosis. This paper aims to compare the binary integer programming-data envelopment analysis (BIP-DEA) model and the logistic regression (LR) model for diagnosing and predicting the occurrence of stroke in Iran.

Design/methodology/approach

In this study, two algorithms of the BIP-DEA and LR methods were introduced and key risk factors leading to stroke were extracted.

Findings

The study population consisted of 2,100 samples (patients) divided into six subsamples of different sizes. The classification table of each algorithm showed that the BIP-DEA model produced more reliable results than the LR model for small sample sizes. After running each algorithm, the BIP-DEA and LR algorithms identified eight and five factors, respectively, as the more influential risk factors and causes of stroke. Finally, predictive models using the important risk factors were proposed.

Originality/value

The main objective of this study is to provide the integrated BIP-DEA algorithm as a fast, easy and suitable tool for evaluation and prediction. In fact, the BIP-DEA algorithm can be used as an alternative tool to the LR model when the sample size is small. These algorithms can be used in various fields, including the health-care industry, to predict and prevent various diseases before the patient’s condition becomes more dangerous.

Details

Journal of Modelling in Management, vol. 19 no. 2
Type: Research Article
ISSN: 1746-5664
