Search results

1 – 10 of 530
Open Access
Article
Publication date: 6 May 2020

Phong Hoang Nguyen and Duyen Thi Bich Pham

Abstract

Purpose

The paper aims to enrich previous findings for an emerging banking industry such as Vietnam's, reporting the differences between parametric and nonparametric methods of measuring cost efficiency. The purpose of the study is to assess the consistency of these methods as a basis for policies to improve the cost efficiency of Vietnamese commercial banks.

Design/methodology/approach

The cost efficiency of banks is assessed through data envelopment analysis (DEA) and stochastic frontier analysis (SFA). Next, five tests are conducted in succession to analyze the differences in cost efficiency measured by the two methods, covering the distribution of scores, the rankings, the identification of the best and worst banks, the consistency over time and the determinants of the efficiency frontier. The data are collected from the annual financial statements of Vietnamese banks during 2005–2017.
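
As an illustration of the nonparametric step, Farrell cost efficiency (minimum feasible cost divided by observed cost) can be sketched with a free-disposal-hull (FDH) simplification of DEA, in which each bank is benchmarked only against observed peers producing at least as much output. The banks, inputs and prices below are hypothetical and are not taken from the study.

```python
# FDH-style cost efficiency: CE = cheapest feasible cost / observed cost,
# where the reference set is restricted to observed banks producing at
# least as much output. All figures are hypothetical.

banks = {
    # name: (output, inputs (labour, capital), input prices (w_l, w_k))
    "A": (100.0, (10.0, 5.0), (2.0, 3.0)),
    "B": (120.0, (8.0, 6.0), (2.0, 3.0)),
    "C": (100.0, (14.0, 7.0), (2.0, 3.0)),
}

def cost(inputs, prices):
    return sum(x * w for x, w in zip(inputs, prices))

def fdh_cost_efficiency(name):
    y, x, w = banks[name]
    own_cost = cost(x, w)
    # cheapest observed input mix (valued at this bank's prices)
    # among peers producing at least y
    best = min(cost(px, w) for py, px, _ in banks.values() if py >= y)
    return best / own_cost  # 1.0 = fully cost efficient

for name in banks:
    print(name, round(fdh_cost_efficiency(name), 3))
```

A full DEA model would instead allow convex combinations of peers, but the ordering logic is the same.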

Findings

The results show that cost efficiency obtained under the SFA models is more consistent than under the DEA models. However, the DEA-based efficiency scores are more consistent in ranking order and more stable over time. The inconsistency in efficiency characteristics between the two methods reminds policy makers and bank administrators to compare and select the appropriate efficiency frontier measure for each stage and for specific economic conditions.

Originality/value

This paper shows the need to control for heterogeneity over banking groups and time as well as for random noise and outliers when measuring the cost efficiency.

Details

Journal of Economics and Development, vol. 22 no. 2
Type: Research Article
ISSN: 1859-0020

Open Access
Article
Publication date: 7 March 2022

María Rubio-Misas

Abstract

Purpose

This paper investigates why bancassurance coexists with alternative insurance distribution channels in the long run, considering the bank channel is known to involve lower costs than traditional distribution systems. It tests the product-quality hypothesis that maintains that the higher costs of some distribution systems represent expenses associated with producing higher product quality, greater service intensity and/or skills to solve principal-agent conflicts.

Design/methodology/approach

An analysis is conducted on firms operating in the life segment of the Spanish insurance industry over an eight-year sample period. First, the author estimates cost efficiency and profit inefficiency using data envelopment analysis. The cost-efficiency scores are used to evaluate whether use of the banking channel increases cost efficiency, while profit inefficiency is examined to identify the presence or absence of product-quality differences. The performance implications of using bancassurance are analyzed by applying Heckman's two-stage random-effects regression model.

Findings

The results support the product-quality arguments. The use of the banking channel was found to increase cost efficiency. However, the distribution channel(s) utilized did not affect profit inefficiency.

Practical implications

A regulatory environment that supports the development of bancassurance enables this and alternative distribution channels to be sorted into market niches, where each system enjoys comparative advantages in order to minimize insurer costs and maximize insurer revenues. There is no single optimal insurance distribution system.

Originality/value

This is the first study to investigate why bancassurance coexists with alternative insurance distribution channels.

Details

International Journal of Bank Marketing, vol. 40 no. 4
Type: Research Article
ISSN: 0265-2323

Open Access
Article
Publication date: 23 December 2019

Andrea Garlatti, Paolo Fedele, Silvia Iacuzzi and Grazia Garlatti Costa

Abstract

Purpose

Coproduction is both a recurrent way of organizing public services and a maturing academic field. The academic debate has analyzed several facets, but one deserves further analysis: its impact on the cost efficiency of public services. The purpose of this paper is to systematize the findings on the relationship between coproduction and cost efficiency and to develop insights for future research.

Design/methodology/approach

This paper is based on a structured literature review (SLR), following the approach proposed by Massaro, Dumay and Guthrie. The SLR approach differs from traditional narrative reviews since, like other meta-analysis methods, it adopts a replicable and transparent process. At the same time, compared to most common meta-analysis or systematic review logics, it is better suited to incorporating evidence from case studies and ethnographies. This makes the method especially suited to public administration and management studies.

Findings

The results shed light on the nature of the academic literature relating coproduction to cost efficiency, on what types of costs are affected and how, and on the meaningfulness of productivity measures when public services are co-produced.

Originality/value

In times of fiscal distress for many governments, the paper contributes to research and practice in systematically re-assessing the effects of coproduction on public budgets.

Details

Journal of Public Budgeting, Accounting & Financial Management, vol. 32 no. 1
Type: Research Article
ISSN: 1096-3367

Open Access
Article
Publication date: 8 December 2023

Tommaso Piseddu and Fedra Vanhuyse

Abstract

Purpose

With more cities aiming to achieve climate neutrality, identifying the funding to support these plans is essential. The purpose of this paper is to exploit the presence of a structured green bond framework in Sweden to investigate the typology of abatement projects Swedish municipalities invested in and to understand their effectiveness.

Design/methodology/approach

Marginal abatement cost curves of the green bond measures are constructed by using the financial and abatement data provided by municipalities on an annual basis.
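
The construction of such a curve can be sketched as sorting measures by cost per tonne of CO2e abated and accumulating their abatement potential. The project names and figures below are hypothetical, not taken from the study.

```python
# Marginal abatement cost curve (MACC): order measures by unit cost of
# abatement; on the plot, width = abatement potential, height = unit cost.
# All project figures below are hypothetical.

projects = [
    # (name, life-cycle cost in SEK, abatement in tonnes CO2e)
    ("wind",          4_000_000, 8_000),
    ("retrofit",      1_500_000, 1_000),
    ("district heat", 2_400_000, 3_000),
]

def macc(projects):
    """Return (name, unit cost in SEK/tCO2e, cumulative abatement),
    sorted from cheapest to most expensive abatement."""
    ranked = sorted(projects, key=lambda p: p[1] / p[2])
    curve, cum = [], 0.0
    for name, total_cost, abated in ranked:
        cum += abated
        curve.append((name, total_cost / abated, cum))
    return curve

for name, unit_cost, cum in macc(projects):
    print(f"{name}: {unit_cost:.0f} SEK/tCO2e, cumulative {cum:.0f} t")
```

Reading the curve left to right shows how much abatement is available before more expensive measures must be deployed.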

Findings

The results highlight the economic competitiveness of clean energy production, measured in abatement potential per unit of currency, even when compared to other emerging technologies that have attracted the interest of policymakers. A comparison with previous studies on the cost efficiency of carbon capture and storage reveals that clean energy projects, especially wind energy production, can contribute to the reduction of emissions in a more efficient way. The Swedish carbon tax is a good incentive tool for investments in clean energy projects.

Originality/value

The improvement over previous applications is twofold: the authors expand the financial considerations to include whole life-cycle costs, and they consider all the greenhouse gases. This research is a first in using financial and environmental data produced by local governments to assess the effectiveness of their environmental measures.

Details

Studies in Economics and Finance, vol. ahead-of-print no. ahead-of-print
Type: Research Article
ISSN: 1086-7376

Open Access
Article
Publication date: 10 March 2023

Sini Laari, Harri Lorentz, Patrik Jonsson and Roger Lindau

Abstract

Purpose

Drawing on information processing theory, the linkage between buffering and bridging and procurement's ability to resolve demand–supply imbalances is investigated, as well as the contexts in which these strategies may be particularly useful or detrimental. Buffering may be achieved through demand change or redundancy, while bridging may be achieved by means of collaboration or monitoring.

Design/methodology/approach

This study employs a hierarchical regression analysis of a survey of 150 Finnish and Swedish procurement and sales and operations planning professionals, each responding from the perspective of their own area of supply responsibility.

Findings

Both the demand-change and redundancy varieties of buffering are associated with procurement's ability to resolve demand–supply imbalances without delivery disruptions, but not with cost-efficient resolution. Bridging is associated with the cost-efficient resolution of imbalances: while collaboration offers benefits, monitoring seems to make things worse. Dynamism diminishes, and the co-management of procurement in S&OP improves, procurement's ability to resolve demand–supply imbalances. The most potent strategy for tackling problematic contexts appears to be buffering via demand change.

Practical implications

The results highlight the importance of procurement in the S&OP process and suggest tactical measures that can be taken to resolve and reduce the effects of supply and demand imbalances.

Originality/value

The results contribute to the procurement and S&OP literature by increasing knowledge regarding the role and integration of procurement in the crucial process of balancing demand and supply operations.

Details

International Journal of Operations & Production Management, vol. 43 no. 13
Type: Research Article
ISSN: 0144-3577

Open Access
Article
Publication date: 1 June 2021

Sarah Korein, Ahmed Abotalib, Mariusz Trojak and Heba Abou-El-Sood

Abstract

Purpose

This paper is motivated by the heated debates preceding the introduction of additional regulatory requirements of Basel III on the capital conservation buffer (CCB) and regulatory leverage (RLEV) in banks of emerging markets. The paper aims to examine which policy ratio can improve bank efficiency (BE) in one of the most resilient banking settings in the Middle East and North Africa (MENA) region.

Design/methodology/approach

The analysis is performed on a sample of 13 banks for the period 2010–2018 in Egypt and proceeds in two steps. In the first step, the data envelopment analysis model is used to derive bank-specific efficiency scores. In the second step, BE scores are regressed on the two types of regulatory capital and a set of control variables.
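
The second step can be sketched as a regression of the first-step efficiency scores on a regulatory variable. A single-regressor OLS fit is shown for illustration (the study controls for additional variables); the efficiency scores and leverage ratios below are hypothetical.

```python
# Second step of the two-step design: regress bank efficiency (BE) scores
# on a regulatory variable. One-regressor OLS in closed form; all figures
# below are hypothetical.

be_scores = [0.92, 0.85, 0.78, 0.88, 0.70]   # first-step DEA efficiency
rlev      = [0.05, 0.08, 0.11, 0.06, 0.13]   # regulatory leverage ratio

def ols_slope_intercept(x, y):
    """Closed-form OLS for y = intercept + slope * x."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    slope = sxy / sxx
    return slope, my - slope * mx

slope, intercept = ols_slope_intercept(rlev, be_scores)
print(f"BE = {intercept:.3f} + {slope:.3f} * RLEV")
```

In these made-up numbers the slope is negative, mirroring the direction of the paper's finding that higher RLEV is associated with lower BE.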

Findings

The results show that higher RLEV and CCB are associated with a reduction in BE and that RLEV is more strongly associated with BE than CCB. Hence, the results are relevant to policymakers in designing measures for improving BE in emerging markets.

Originality/value

The findings contribute to a small but growing stream of research on capital adequacy in emerging markets. This study provides results on the viability of risk-based vs non-risk-based capital requirements. The findings are also relevant to bank regulators in similar emerging market settings in their efforts to introduce and phase in minimum leverage requirements according to Basel III.

Details

Journal of Humanities and Applied Social Sciences, vol. 4 no. 4
Type: Research Article
ISSN:

Open Access
Article
Publication date: 13 October 2023

Roland Hellberg

Abstract

Purpose

A deteriorating security situation and an increased need for defence equipment call for new forms of collaboration between the Armed Forces and the defence industry. This paper aims to investigate the ways in which the accelerating demand for increased security of supply of equipment and supplies to the Armed Forces requires adaptability in a procurement process governed by laws on public procurement (PP).

Design/methodology/approach

This paper is based on a review of current literature as well as empirical data obtained through interviews with representatives from the Swedish Defence Materiel Administration and the Swedish defence industry.

Findings

Collaboration with the globalized defence industry requires new approaches, as the PP rules make it difficult to procure a secure supply of defence equipment.

Research limitations/implications

The study's empirical data and findings are based on the Swedish context. In order to draw more general conclusions in a defence context, the study should be expanded to cover more nations.

Practical implications

The findings will enable the defence industry and the procurement authorities to better understand the requirements of the Armed Forces and how to cooperate under applicable legal and regulatory requirements.

Originality/value

The paper extends the extant body of academic knowledge of the security of supply into the defence sector. It serves as a first step towards articulating a call for new approaches to collaboration in defence supply chains.

Details

Journal of Defense Analytics and Logistics, vol. 7 no. 2
Type: Research Article
ISSN: 2399-6439

Open Access
Article
Publication date: 3 May 2023

Lars Stehn and Alexander Jimenez

Abstract

Purpose

The purpose of this paper is to understand if and how industrialized house building (IHB) could support productivity developments for housebuilding on project and industry levels. The take is that fragmentation of construction is one explanation for the lack of productivity growth, and that IHB could be an integrating method of overcoming horizontal and vertical fragmentation.

Design/methodology/approach

Single-factor productivity measures are calculated based on data reported by IHB companies and compared to officially produced and published research data. The survey covers the years 2013–2020 for IHB companies building multi-storey timber houses. Generalization is sought through descriptive statistics by contrasting the data samples with the means used to control vertical and horizontal fragmentation, formulated as three theoretical propositions.
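
A single-factor productivity measure of this kind can be sketched as output value per labour hour, with growth compared across the endpoints of a period. The company figures below are hypothetical, not taken from the study.

```python
# Single-factor (labour) productivity: output value per hour worked,
# and its growth over a period. All figures below are hypothetical.

def labour_productivity(value_added, hours_worked):
    return value_added / hours_worked

def growth(p_start, p_end):
    """Relative growth between two productivity levels."""
    return p_end / p_start - 1.0

# hypothetical IHB company: value added (MSEK) and hours (thousands)
ihb_2013 = labour_productivity(500.0, 1000.0)
ihb_2020 = labour_productivity(650.0, 1100.0)
print(f"IHB labour productivity growth 2013-2020: {growth(ihb_2013, ihb_2020):.1%}")
```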

Findings

According to the results, IHB in timber is on average more productive than conventional housebuilding at the company level, project level, in absolute and in growth terms over the eight-year period. On the company level, the labour productivity was on average 10% higher for IHB compared to general construction and positioned between general construction and general manufacturing. On the project level, IHB displayed an average cost productivity growth of 19% for an employed prefabrication degree of about 45%.

Originality/value

Empirical evidence is presented quantifying the hitherto perceived advantages of IHB. By providing analysis of actual cost and project data derived from IHB companies, the article substantiates previous research indicating that IHB is not only about prefabrication. The observed positive productivity growth in relation to the employed prefabrication degree indicates that off-site production is not a sufficient means for reaching high productivity and productivity growth. Instead, the capability to integrate the operative logic of conventional housebuilding with the logic of IHB platform development and use is a probable explanation of the observed positive productivity growth.

Details

Construction Innovation, vol. 24 no. 7
Type: Research Article
ISSN: 1471-4175

Open Access
Article
Publication date: 21 June 2022

Abhishek Das and Mihir Narayan Mohanty

Abstract

Purpose

Timely and accurate detection of cancer can save the life of the person affected. According to the World Health Organization (WHO), breast cancer has the highest incidence among all cancers, while it ranks fifth in mortality. Among the many image processing techniques, some works have focused on convolutional neural networks (CNNs) for processing these images; however, deep learning models remain to be explored more thoroughly.

Design/methodology/approach

In this work, multivariate statistics-based kernel principal component analysis (KPCA) is used to extract essential features. KPCA is simultaneously helpful for denoising the data. These features are processed through a heterogeneous ensemble model that consists of three base models: a recurrent neural network (RNN), long short-term memory (LSTM) and a gated recurrent unit (GRU). The outcomes of these base learners are fed to a fuzzy adaptive resonance theory mapping (ARTMAP) model for decision making; nodes are added to the F_2^a layer only if the winning criteria are fulfilled, which makes the ARTMAP model more robust.
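
The ensemble stage can be sketched as three base learners whose class probabilities are combined by a decision module. In this sketch the base learners are stubbed with fixed outputs, and the paper's fuzzy ARTMAP combiner is replaced by simple probability averaging (soft voting) purely for illustration.

```python
# Heterogeneous ensemble sketch: combine class probabilities from several
# base learners by soft voting. The base-learner outputs are hypothetical
# stand-ins for trained RNN/LSTM/GRU predictions.

def soft_vote(prob_lists):
    """Average class probabilities across base learners and return
    (winning class index, averaged probabilities)."""
    n = len(prob_lists)
    n_classes = len(prob_lists[0])
    avg = [sum(p[i] for p in prob_lists) / n for i in range(n_classes)]
    return max(range(n_classes), key=avg.__getitem__), avg

# hypothetical outputs of the three base learners for one image patch:
rnn  = [0.30, 0.70]   # [P(benign), P(malignant)]
lstm = [0.40, 0.60]
gru  = [0.20, 0.80]

label, avg = soft_vote([rnn, lstm, gru])
print(label, [round(a, 2) for a in avg])
```

The fuzzy ARTMAP combiner described in the abstract is adaptive rather than a fixed average; this sketch only illustrates the ensemble data flow.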

Findings

The proposed model is verified using the breast histopathology image dataset publicly available on Kaggle. The model provides 99.36% training accuracy and 98.72% validation accuracy. The proposed model exploits data processing in all aspects: image denoising to reduce data redundancy, training by ensemble learning to provide higher accuracy than single models, and final classification by a fuzzy ARTMAP model that controls the number of nodes depending on performance, yielding robust, accurate classification.

Research limitations/implications

Research in the field of medical applications is ongoing, and more advanced algorithms are being developed for better classification. Still, there is scope to design models with better performance, practicability and cost efficiency in the future. The ensemble models may also be chosen with different combinations and characteristics, and signals instead of images may be verified with the proposed model. Experimental analysis shows the improved performance of the proposed model, but the method needs to be verified using practical models, and practical implementation will be carried out to assess its real-time performance and cost efficiency.

Originality/value

The proposed model is utilized for denoising and to reduce data redundancy, with feature selection performed using KPCA. Training and classification are performed using a heterogeneous ensemble model designed with RNN, LSTM and GRU as base classifiers, providing higher accuracy than single models. The use of the adaptive fuzzy mapping model makes the final classification accurate. The effectiveness of combining these methods into a single model is analyzed in this work.

Details

Applied Computing and Informatics, vol. ahead-of-print no. ahead-of-print
Type: Research Article
ISSN: 2634-1964

Content available
Article
Publication date: 1 April 2001

Details

Industrial Robot: An International Journal, vol. 28 no. 2
Type: Research Article
ISSN: 0143-991X
