Search results

1 – 10 of 363
Open Access
Article
Publication date: 6 May 2020

Phong Hoang Nguyen and Duyen Thi Bich Pham


Abstract

Purpose

The paper aims to enrich previous findings for an emerging banking industry such as that of Vietnam, reporting the difference between parametric and nonparametric methods when measuring cost efficiency. The purpose of the study is to assess the consistency of these measures as a basis for policies to improve the cost efficiency of Vietnamese commercial banks.

Design/methodology/approach

The cost efficiency of banks is assessed through the data envelopment analysis (DEA) and the stochastic frontier analysis (SFA). Next, five tests are conducted in succession to analyze the differences in cost efficiency measured by these two methods, covering the distribution, the rankings, the identification of the best and worst banks, the time consistency and the determinants of the efficiency frontier. The data are collected from the annual financial statements of Vietnamese banks during 2005–2017.
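
The DEA side of the comparison can be illustrated without a linear-programming solver in one special case: under constant returns to scale with a single input and a single output, a bank's input-oriented efficiency score reduces to its output/input ratio divided by the best ratio in the sample. The sketch below uses that special case; the bank names and figures are hypothetical and full DEA models with multiple inputs and outputs require an LP formulation.

```python
# Illustrative special case of DEA: constant returns to scale (CRS),
# one input (cost) and one output. The efficiency of each bank is its
# output/input ratio relative to the best ratio observed in the sample.

def dea_crs_single(banks):
    """banks: dict name -> (input_cost, output). Returns name -> efficiency in (0, 1]."""
    best_ratio = max(y / x for x, y in banks.values())
    return {name: (y / x) / best_ratio for name, (x, y) in banks.items()}

banks = {"A": (2.0, 4.0), "B": (4.0, 6.0), "C": (5.0, 5.0)}
scores = dea_crs_single(banks)
# Bank A defines the frontier (ratio 2.0), so its score is 1.0;
# B scores 1.5 / 2.0 = 0.75 and C scores 1.0 / 2.0 = 0.5.
```

SFA, by contrast, fits a parametric cost frontier with a composed error term (random noise plus inefficiency), which is why the two methods can rank the same banks differently.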

Findings

The results show that the cost efficiency obtained under the SFA models is more consistent than under the DEA models. However, the DEA-based efficiency scores are more similar in ranking order and stability over time. The inconsistency in efficiency characteristics under two different methods reminds policy makers and bank administrators to compare and select the appropriate efficiency frontier measure for each stage and specific economic conditions.

Originality/value

This paper shows the need to control for heterogeneity over banking groups and time as well as for random noise and outliers when measuring the cost efficiency.

Details

Journal of Economics and Development, vol. 22 no. 2
Type: Research Article
ISSN: 1859-0020


Open Access
Article
Publication date: 7 March 2022

María Rubio-Misas


Abstract

Purpose

This paper investigates why bancassurance coexists with alternative insurance distribution channels in the long run, considering the bank channel is known to involve lower costs than traditional distribution systems. It tests the product-quality hypothesis that maintains that the higher costs of some distribution systems represent expenses associated with producing higher product quality, greater service intensity and/or skills to solve principal-agent conflicts.

Design/methodology/approach

An analysis is conducted on firms operating in the life segment of the Spanish insurance industry over an eight-year sample period. First, the author estimates cost efficiency and profit inefficiency using data envelopment analysis. Cost efficiency enables one to evaluate if the use of the banking channel increases cost efficiency. Profit inefficiency is addressed to identify the existence/absence of product-quality differences. The performance implications of using bancassurance are analyzed by applying Heckman's two-stage random-effects regression model.

Findings

The results support the product-quality arguments. The use of banking channel was found to increase cost efficiency. However, the distribution channel/s utilized did not affect profit inefficiency.

Practical implications

A regulatory environment that supports the development of bancassurance enables this and alternative distribution channels to be sorted into market niches, where each system enjoys comparative advantages in order to minimize insurer costs and maximize insurer revenues. There is no single optimal insurance distribution system.

Originality/value

This is the first study to investigate why bancassurance coexists with alternative insurance distribution channels.

Details

International Journal of Bank Marketing, vol. 40 no. 4
Type: Research Article
ISSN: 0265-2323


Open Access
Article
Publication date: 23 December 2019

Andrea Garlatti, Paolo Fedele, Silvia Iacuzzi and Grazia Garlatti Costa


Abstract

Purpose

Coproduction is both a recurrent way of organizing public services and a maturing academic field. The academic debate has analyzed several facets, but one deserves further analysis: its impact on the cost efficiency of public services. The purpose of this paper is to systematize the findings on the relationship between coproduction and cost efficiency and to develop insights for future research.

Design/methodology/approach

This paper is based on a structured literature review (SLR), following the approach proposed by Massaro, Dumay and Guthrie. The SLR approach differs from traditional narrative reviews since, like other meta-analysis methods, it adopts a replicable and transparent process. At the same time, when compared to most common meta-analysis or systematic review logics, it is better suited to incorporate evidence from case studies and ethnographies. This makes the method especially suited to public administration and management studies.

Findings

Results shed light on the nature of the academic literature relating coproduction to cost efficiency, on what types of costs are affected and how, and on the meaningfulness of productivity measures when public services are co-produced.

Originality/value

In times of fiscal distress for many governments, the paper contributes to research and practice in systematically re-assessing the effects of coproduction on public budgets.

Details

Journal of Public Budgeting, Accounting & Financial Management, vol. 32 no. 1
Type: Research Article
ISSN: 1096-3367


Open Access
Article
Publication date: 8 December 2023

Tommaso Piseddu and Fedra Vanhuyse


Abstract

Purpose

With more cities aiming to achieve climate neutrality, identifying the funding to support these plans is essential. The purpose of this paper is to exploit the presence of a structured green bond framework in Sweden to investigate the typology of abatement projects Swedish municipalities invested in and understand their effectiveness.

Design/methodology/approach

Marginal abatement cost curves of the green bond measures are constructed by using the financial and abatement data provided by municipalities on an annual basis.
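A marginal abatement cost curve (MACC) of the kind described here ranks measures by cost per tonne of CO2-equivalent abated and plots cumulative abatement against that unit cost. The sketch below shows the construction step; the project names and figures are hypothetical, not data from the paper.

```python
# Sketch of marginal abatement cost curve (MACC) construction: sort measures
# by cost per tonne abated, then accumulate abatement along the ranked list.

def macc(measures):
    """measures: list of (name, total_cost, abatement_tonnes).
    Returns list of (name, unit_cost, cumulative_abatement), cheapest first."""
    ranked = sorted(measures, key=lambda m: m[1] / m[2])
    curve, cumulative = [], 0.0
    for name, cost, abated in ranked:
        cumulative += abated
        curve.append((name, cost / abated, cumulative))
    return curve

# hypothetical municipal projects: (name, total cost, tonnes CO2e abated)
projects = [("CCS pilot", 300.0, 50.0),
            ("wind farm", 100.0, 50.0),
            ("solar roof", 120.0, 40.0)]
curve = macc(projects)
# wind (2.0 per tonne) ranks first, then solar (3.0), then CCS (6.0);
# total cumulative abatement reaches 140 tonnes.
```

Ranking by unit cost is what allows the paper's comparison of clean energy projects against carbon capture storage: cheaper-per-tonne measures occupy the left of the curve.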

Findings

The results highlight the economic competitiveness of clean energy production, measured in abatement potential per unit of currency, even when compared to other emerging technologies that have attracted the interest of policymakers. A comparison with previous studies on the cost efficiency of carbon capture storage reveals that clean energy projects, especially wind energy production, can contribute to the reduction of emissions in a more efficient way. The Swedish carbon tax is a good incentive tool for investments in clean energy projects.

Originality/value

The improvement over previous applications is twofold: the authors expand the financial considerations to include whole life-cycle costs, and they consider all greenhouse gases. This research is among the first to use financial and environmental data produced by local governments to assess the effectiveness of their environmental measures.

Details

Studies in Economics and Finance, vol. 41 no. 3
Type: Research Article
ISSN: 1086-7376


Open Access
Article
Publication date: 10 March 2023

Sini Laari, Harri Lorentz, Patrik Jonsson and Roger Lindau


Abstract

Purpose

Drawing on information processing theory, the linkage between buffering and bridging and the ability on the part of procurement to resolve demand–supply imbalances is investigated, as well as contexts in which these strategies may be particularly useful or detrimental. Buffering may be achieved through demand change or redundancy, while bridging may be achieved by means of collaboration or monitoring.

Design/methodology/approach

This study employs a hierarchical regression analysis of a survey of 150 Finnish and Swedish procurement and sales and operations planning professionals, each responding from the perspective of their own area of supply responsibility.

Findings

Both the demand change and redundancy varieties of buffering are associated with procurement's ability to resolve demand–supply imbalances without delivery disruptions, but not with cost-efficient resolution. Bridging is associated with the cost-efficient resolution of imbalances: while collaboration offers benefits, monitoring seems to make things worse. Dynamism diminishes, and the co-management of procurement in S&OP improves, procurement's ability to resolve demand–supply imbalances. The most potent strategy for tackling problematic contexts appears to be buffering via demand change.

Practical implications

The results highlight the importance of procurement in the S&OP process and suggest tactical measures that can be taken to resolve and reduce the effects of supply and demand imbalances.

Originality/value

The results contribute to the procurement and S&OP literature by increasing knowledge regarding the role and integration of procurement into the crucial process of balancing demand and supply operations.

Details

International Journal of Operations & Production Management, vol. 43 no. 13
Type: Research Article
ISSN: 0144-3577


Open Access
Article
Publication date: 1 June 2021

Sarah Korein, Ahmed Abotalib, Mariusz Trojak and Heba Abou-El-Sood


Abstract

Purpose

This paper is motivated by the heated debates preceding the introduction of additional regulatory requirements of Basel III on capital conservation buffer (CCB) and regulatory leverage (RLEV) in banks of emerging markets. The paper aims to examine which policy ratio can improve bank efficiency (BE), in one of the most resilient banking settings in the Middle East and North Africa (MENA) region.

Design/methodology/approach

The analysis is performed on a sample of 13 banks for the period 2010–2018 in Egypt and proceeds in two steps. In the first step, the data envelopment analysis model is used to derive bank-specific efficiency scores. In the second step, BE scores are regressed on the two types of regulatory capital and a set of control variables.
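The second step described above regresses bank efficiency (BE) scores on regulatory capital measures. A minimal sketch of that step, assuming a single regressor and omitting the paper's control variables, is simple OLS via the normal equations; the leverage ratios and efficiency scores below are hypothetical.

```python
# Sketch of step two: one-variable OLS of DEA efficiency scores on a
# regulatory capital ratio, computed from the normal equations.

def ols_slope_intercept(x, y):
    """Returns (slope, intercept) of the least-squares line through (x, y)."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    slope = (sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
             / sum((xi - mx) ** 2 for xi in x))
    return slope, my - slope * mx

leverage = [0.04, 0.06, 0.08, 0.10]      # hypothetical regulatory leverage ratios
efficiency = [0.95, 0.90, 0.85, 0.80]    # hypothetical DEA efficiency scores
slope, intercept = ols_slope_intercept(leverage, efficiency)
# on this perfectly linear toy data: slope -2.5, intercept 1.05
```

A negative slope, as in this toy data, is the pattern the paper reports: higher regulatory leverage associated with lower bank efficiency.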

Findings

The paper is motivated by regulatory debates on the viability of RLEV and CCB in enhancing BE. The results show that higher RLEV and CCB are associated with a reduction in BE and that RLEV is more strongly associated with BE than CCB is. Hence, the results are relevant to policymakers in designing measures for improving BE in emerging markets.

Originality/value

The findings contribute to a small but growing stream of research on capital adequacy in emerging markets. This study provides results on the viability of risk-based vs non-risk-based capital requirements. The findings are also relevant to bank regulators in similar emerging market settings in their efforts to introduce and phase in minimum leverage requirements according to Basel III.

Details

Journal of Humanities and Applied Social Sciences, vol. 4 no. 4
Type: Research Article
ISSN:


Open Access
Article
Publication date: 3 May 2023

Lars Stehn and Alexander Jimenez


Abstract

Purpose

The purpose of this paper is to understand if and how industrialized house building (IHB) could support productivity developments for housebuilding on project and industry levels. The premise is that fragmentation of construction is one explanation for the lack of productivity growth, and that IHB could be an integrating means of overcoming horizontal and vertical fragmentation.

Design/methodology/approach

Single-factor productivity measures are calculated based on data reported by IHB companies and compared to officially produced and published research data. The survey covers the years 2013–2020 for IHB companies building multi-storey timber houses. Generalization is sought through descriptive statistics by contrasting the data samples against the means used to control vertical and horizontal fragmentation, formulated as three theoretical propositions.
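A single-factor productivity measure of the kind calculated here divides output by one input, e.g. value added per labour hour, and its growth is tracked between years. The sketch below uses hypothetical figures, not data from the survey.

```python
# Sketch of a single-factor (labour) productivity measure and its growth:
# value added per labour hour in two years, then relative growth.

def labour_productivity(value_added, hours):
    return value_added / hours

def growth(p_start, p_end):
    return (p_end - p_start) / p_start

p_2013 = labour_productivity(500_000.0, 10_000.0)   # 50.0 per hour
p_2020 = labour_productivity(660_000.0, 11_000.0)   # 60.0 per hour
g = growth(p_2013, p_2020)                          # 0.2, i.e. 20% over the period
```

Comparing such ratios across company, project and industry levels is what allows the paper's claim that IHB labour productivity sits between general construction and general manufacturing.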

Findings

According to the results, IHB in timber is on average more productive than conventional housebuilding at the company level, project level, in absolute and in growth terms over the eight-year period. On the company level, the labour productivity was on average 10% higher for IHB compared to general construction and positioned between general construction and general manufacturing. On the project level, IHB displayed an average cost productivity growth of 19% for an employed prefabrication degree of about 45%.

Originality/value

Empirical evidence is presented quantifying the hitherto only perceived advantages of IHB. By providing analysis of actual cost and project data derived from IHB companies, the article substantiates previous research indicating that IHB is not only about prefabrication. The observed positive productivity growth in relation to the employed prefabrication degree indicates that off-site production alone is not a sufficient means for reaching high productivity and productivity growth. Instead, the capability to integrate the operative logic of conventional housebuilding with the logic of IHB platform development and use is a probable explanation of the observed positive productivity growth.

Details

Construction Innovation, vol. 24 no. 7
Type: Research Article
ISSN: 1471-4175


Open Access
Article
Publication date: 21 June 2022

Abhishek Das and Mihir Narayan Mohanty


Abstract

Purpose

Timely and accurate detection of cancer can save the life of the person affected. According to the World Health Organization (WHO), breast cancer has the highest incidence among all cancers, while ranking fifth in mortality. Among the many image processing techniques, some works have focused on convolutional neural networks (CNNs) for processing these images. However, deep learning models remain to be explored more fully.

Design/methodology/approach

In this work, multivariate statistics-based kernel principal component analysis (KPCA) is used to extract essential features; KPCA simultaneously helps denoise the data. These features are processed through a heterogeneous ensemble model consisting of three base models: a recurrent neural network (RNN), long short-term memory (LSTM) and a gated recurrent unit (GRU). The outcomes of these base learners are fed to a fuzzy adaptive resonance theory mapping (ARTMAP) model for decision making; nodes are added to the F2a layer only when the winning criteria are fulfilled, which makes the ARTMAP model more robust.
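The decision stage of such a heterogeneous ensemble can be sketched in a few lines. In the sketch below the base-learner outputs are stand-in fixed probability vectors, and simple probability averaging replaces the fuzzy ARTMAP decision model, which is considerably more involved; the class labels and numbers are hypothetical.

```python
# Sketch of a heterogeneous-ensemble decision stage: each base learner
# (RNN, LSTM, GRU in the paper) emits class probabilities; here a simple
# average of those probabilities stands in for the fuzzy ARTMAP model.

def ensemble_decision(base_outputs):
    """base_outputs: list of per-model probability vectors over the classes.
    Returns (winning_class_index, averaged_probabilities)."""
    n_models = len(base_outputs)
    n_classes = len(base_outputs[0])
    avg = [sum(m[c] for m in base_outputs) / n_models for c in range(n_classes)]
    return max(range(n_classes), key=avg.__getitem__), avg

# hypothetical softmax outputs over two classes (benign, malignant)
rnn_out, lstm_out, gru_out = [0.6, 0.4], [0.3, 0.7], [0.2, 0.8]
label, avg = ensemble_decision([rnn_out, lstm_out, gru_out])
# averaged probabilities ~ [0.367, 0.633] -> class 1 (malignant)
```

The averaging makes the final decision less sensitive to any single base model's error, which is the general motivation for ensembling; the ARTMAP stage additionally adapts its own structure as nodes are added.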

Findings

The proposed model is verified using the breast histopathology image dataset publicly available on Kaggle. The model provides 99.36% training accuracy and 98.72% validation accuracy. The proposed model exploits data processing at every stage: image denoising to reduce data redundancy, training by ensemble learning to achieve higher accuracy than single models, and final classification by a fuzzy ARTMAP model that controls the number of nodes depending on performance, yielding robust and accurate classification.

Research limitations/implications

Research in the field of medical applications is ongoing, and more advanced algorithms are being developed for better classification. Still, there is scope to design models with better performance, practicability and cost efficiency in the future. The ensemble may also be built from different combinations of base models with different characteristics, and signals instead of images may be verified with the proposed model. Experimental analysis shows the improved performance of the proposed model, but it still needs to be verified on practical models; a practical implementation will be carried out to assess its real-time performance and cost efficiency.

Originality/value

The proposed model uses KPCA for denoising and for reducing data redundancy, so that feature selection is performed on the KPCA output. Training and classification are performed with a heterogeneous ensemble model designed using RNN, LSTM and GRU as base classifiers, providing higher accuracy than single models. The use of an adaptive fuzzy mapping model makes the final classification accurate. The effectiveness of combining these methods into a single model is analyzed in this work.

Details

Applied Computing and Informatics, vol. ahead-of-print no. ahead-of-print
Type: Research Article
ISSN: 2634-1964


Open Access
Article
Publication date: 23 May 2024

Ali İhsan Akgün


Abstract

Purpose

The purpose of this study is to examine the effects of financial reporting, under either international financial reporting standards (IFRS) or local generally accepted accounting principles (GAAP), as a corporate governance mechanism on mergers and acquisitions (M&As) of banking institutions during the global financial crisis.

Design/methodology/approach

I investigate the characteristics of bank financial statements before the start of the global crisis, which helps to explain the relationships between the accounting standards and the global financial crisis. The observations, which are based on 3,178 deals in a sample period, are crucially important for corporate governance and bank performance. The results from our analysis are robust to a wide variety of modifications in our research design and are corroborated by descriptive statistics, one-way ANOVA and a two-sample t-test on a sample of banks that voluntarily adopted IFRS for M&As.

Findings

I find that IFRS-based monitoring of banks' M&As, in terms of higher-quality financial reporting, is negatively linked with bank performance, whereas local GAAP-based monitoring of banks' M&As is positively associated with accounting performance. Finally, the main results for higher-quality financial reporting under local GAAP or IFRS generally hold after controlling for various analyses of the relationship between accounting standards and the financial crisis.

Practical implications

Financial reporting standards are considered as a corporate governance mechanism, since they were significantly affected during the recent global financial crisis and became a matter of great concern.

Originality/value

The value of this paper is determined by an empirical investigation of the relationships between bank performance and accounting and financial reporting standards in the context of the global economy.

Details

China Accounting and Finance Review, vol. 26 no. 3
Type: Research Article
ISSN: 1029-807X


Open Access
Article
Publication date: 9 April 2024

Krisztina Demeter, Levente Szász, Béla-Gergely Rácz and Lehel-Zoltán Györfy


Abstract

Purpose

The purpose of this paper is to investigate how different manufacturing technologies are bundled together and how these bundles influence operations performance and, indirectly, business performance. With the emergence of Industry 4.0 (I4.0) technologies, manufacturing companies can use a wide variety of advanced manufacturing technologies (AMT) to build an efficient and effective production system. Nevertheless, the literature offers little guidance on how these technologies, including novel I4.0 technologies, should be combined in practice and how these combinations might have a different impact on performance.

Design/methodology/approach

Using a survey study of 165 manufacturing plants from 11 different countries, we use factor analysis to empirically derive three distinct manufacturing technology bundles and structural equation modeling to quantify their relationship with operations and business performance.

Findings

Our findings support an evolutionary rather than a revolutionary perspective. I4.0 technologies build on traditional manufacturing technologies and do not constitute a separate direction that would point towards a fundamental digital transformation of companies within our sample. Performance effects are rather weak: of the three technology bundles identified, only "automation and robotization" has a positive influence on cost efficiency, while "base technologies" and "data-enabled technologies" do not offer a competitive advantage, neither in terms of cost nor in terms of differentiation. Furthermore, while the business performance impact is positive, it is quite weak, suggesting that financial returns on technology investments might require longer time periods.

Originality/value

Relying on a complementarity approach, our research offers a novel perspective on technology implementation in the I4.0 era by investigating novel and traditional manufacturing technologies together.

Details

Journal of Manufacturing Technology Management, vol. 35 no. 9
Type: Research Article
ISSN: 1741-038X
