Search results

1 – 10 of 19
Open Access
Article
Publication date: 25 June 2020

Paula Cruz-García, Anabel Forte and Jesús Peiró-Palomino

There is abundant literature analyzing the determinants of banks’ profitability through its main component: the net interest margin. Some of these determinants are suggested by…

Abstract

Purpose

There is abundant literature analyzing the determinants of banks’ profitability through its main component: the net interest margin. Some of these determinants are suggested by seminal theoretical models and subsequent expansions. Others are ad hoc selections. Up to now, there have been no studies assessing these models from a Bayesian model uncertainty perspective. This paper aims to analyze this issue for the EU-15 countries for the period 2008-2014, which mainly corresponds to the Great Recession years.

Design/methodology/approach

The paper follows a Bayesian variable selection approach to analyze, in a first step, which of the variables suggested by the literature are actually good predictors of banks’ net interest margin. In a second step, using a model selection approach, the authors select the model with the best fit. Finally, the paper provides inference and quantifies the economic impact of the variables selected as good candidates.
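
As a concrete flavour of the first step, the sketch below approximates posterior model probabilities with BIC weights over all subsets of candidate regressors, on synthetic data. It is a minimal stand-in: the paper's actual priors, dataset and estimation machinery are not reproduced, and the variable names are illustrative.

```python
# Sketch: BIC-approximated posterior inclusion probabilities for candidate
# regressors of a net-interest-margin-style response. Synthetic data only.
import itertools
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(0)
n = 200
names = ["capital", "risk_aversion", "volatility", "opcosts", "noise1", "noise2"]
X = rng.normal(size=(n, len(names)))
y = 1.5 * X[:, 0] + 0.8 * X[:, 2] + rng.normal(size=n)  # only two true predictors

scores = {}
for k in range(len(names) + 1):
    for subset in itertools.combinations(range(len(names)), k):
        cols = list(subset)
        if cols:
            pred = LinearRegression().fit(X[:, cols], y).predict(X[:, cols])
        else:
            pred = np.full(n, y.mean())   # intercept-only model
        rss = np.sum((y - pred) ** 2)
        scores[subset] = n * np.log(rss / n) + (len(cols) + 1) * np.log(n)  # BIC

# Convert BIC values to approximate posterior model weights.
bics = np.array(list(scores.values()))
w = np.exp(-0.5 * (bics - bics.min()))
w /= w.sum()

# Posterior inclusion probability = total weight of models containing the variable.
for j, name in enumerate(names):
    pip = sum(wi for subset, wi in zip(scores, w) if j in subset)
    print(f"{name:>13}: inclusion prob = {pip:.2f}")
```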

Findings

The results widely support the validity of the determinants proposed by the seminal models, with only minor discrepancies, reinforcing their capacity to explain net interest margin disparities also during the recent period of restructuring of the banking industry.

Originality/value

The paper is, to the best of the authors’ knowledge, the first one following a Bayesian variable selection approach in this field of the literature.

Details

Applied Economic Analysis, vol. 28 no. 83
Type: Research Article
ISSN: 2632-7627

Open Access
Article
Publication date: 12 January 2024

Patrik Jonsson, Johan Öhlin, Hafez Shurrab, Johan Bystedt, Azam Sheikh Muhammad and Vilhelm Verendel

This study aims to explore and empirically test variables influencing material delivery schedule inaccuracies.

Abstract

Purpose

This study aims to explore and empirically test variables influencing material delivery schedule inaccuracies.

Design/methodology/approach

A mixed-method case approach is applied. Explanatory variables are identified from the literature and explored in a qualitative analysis at an automotive original equipment manufacturer. Using logistic regression and random forest classification models, quantitative data (historical schedule transactions and internal data) enable testing of the predictive difference of variables under various planning horizons and inaccuracy levels.
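
A minimal sketch of the quantitative step follows, on synthetic data with illustrative stand-in features; the study's actual variables, schedule transactions and model tuning are not shown.

```python
# Sketch: logistic regression vs. random forest for predicting whether a
# schedule line turns out inaccurate. Feature names are hypothetical proxies.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(1)
n = 5000
X = np.column_stack([
    rng.integers(1, 30, n),       # planning horizon (weeks ahead)
    rng.poisson(3, n),            # product complexity proxy (no. of variants)
    rng.exponential(100, n),      # item order life cycle (days)
])
logit = -2.0 + 0.08 * X[:, 0] + 0.25 * X[:, 1] - 0.003 * X[:, 2]
y = rng.random(n) < 1 / (1 + np.exp(-logit))   # 1 = inaccurate schedule line

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
for model in (LogisticRegression(max_iter=1000), RandomForestClassifier(random_state=0)):
    auc = roc_auc_score(y_te, model.fit(X_tr, y_tr).predict_proba(X_te)[:, 1])
    print(type(model).__name__, f"AUC = {auc:.3f}")
```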

Findings

The effects on delivery schedule inaccuracies are contingent on a decoupling point, and a variable may have a combined amplifying (complexity generating) and stabilizing (complexity absorbing) moderating effect. Product complexity variables are significant regardless of the time horizon, and the item’s order life cycle is a significant variable with predictive differences that vary. Decoupling management is identified as a mechanism for generating complexity absorption capabilities contributing to delivery schedule accuracy.

Practical implications

The findings provide guidelines for exploring and finding patterns in specific variables in order to reduce material delivery schedule inaccuracies, and they offer input into predictive forecasting models.

Originality/value

The findings contribute to explaining material delivery schedule variations, identifying potential root causes and moderators, empirically testing and validating effects, and conceptualizing features that cause and moderate inaccuracies in relation to the decoupling management and complexity theory literature.

Details

International Journal of Operations & Production Management, vol. 44 no. 13
Type: Research Article
ISSN: 0144-3577

Open Access
Article
Publication date: 3 February 2018

M. Sudha and A. Kumaravel

Rough set theory is a simple and powerful methodology for extracting and minimizing rules from decision tables. Its concepts are core, reduct and discovering knowledge in the form…

Abstract

Rough set theory is a simple and powerful methodology for extracting and minimizing rules from decision tables. Its key concepts are the core, the reduct and the discovery of knowledge in the form of rules. The decision rules describe the decision states and support prediction in new situations. Rough set theory was initially proposed as a useful tool for the analysis of decision states. The approach produces two types of decision rules, certain and possible, based on lower and upper approximations. Prediction may be strongly affected as the size of the data grows, and the application of rough set theory in this direction has not yet been considered. Hence, the main objective of this paper is to study the influence of data size on the number of rules generated by rough set methods. The performance of these methods is reported through metrics such as accuracy and quality of classification. The results show the range of performance obtained and are the first of their kind in the current research trend.
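
To make the rough-set vocabulary concrete, the sketch below computes lower and upper approximations and the quality-of-classification metric for a toy decision table; the datasets and rule-induction tooling the paper actually evaluates are not reproduced.

```python
# Sketch: rough-set approximations on a tiny, made-up decision table.
from collections import defaultdict

# Rows: (condition attributes) -> decision.
table = [
    ({"temp": "high", "cough": "yes"}, "flu"),
    ({"temp": "high", "cough": "yes"}, "flu"),
    ({"temp": "high", "cough": "no"},  "flu"),
    ({"temp": "low",  "cough": "yes"}, "cold"),
    ({"temp": "low",  "cough": "yes"}, "flu"),   # conflicts with the row above
    ({"temp": "low",  "cough": "no"},  "cold"),
]

# Indiscernibility classes: objects with identical condition-attribute values.
classes = defaultdict(set)
for i, (cond, _) in enumerate(table):
    classes[tuple(sorted(cond.items()))].add(i)

def approximations(decision):
    """Lower/upper approximation of the set of objects with this decision."""
    target = {i for i, (_, d) in enumerate(table) if d == decision}
    lower = set().union(*(c for c in classes.values() if c <= target))
    upper = set().union(*(c for c in classes.values() if c & target))
    return lower, upper

# Quality of classification: fraction of objects in some lower approximation,
# i.e. covered by a certain rule rather than only a possible one.
covered = set()
for d in sorted({d for _, d in table}):
    lower, upper = approximations(d)
    covered |= lower
    print(f"{d}: lower={sorted(lower)}, upper={sorted(upper)}")
print("quality of classification =", len(covered) / len(table))
```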

Details

Applied Computing and Informatics, vol. 16 no. 1/2
Type: Research Article
ISSN: 2634-1964

Open Access
Article
Publication date: 13 June 2023

Marissa Condon

The purpose of the paper is the simulation of nonuniform transmission lines.

Abstract

Purpose

The purpose of the paper is the simulation of nonuniform transmission lines.

Design/methodology/approach

The method combines a Magnus expansion with a numerical Laplace transform, arranging the governing equations judiciously so as to enable efficient simulation.
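
For intuition, here is a minimal first-order Magnus (exponential-midpoint) propagator for the frequency-domain telegrapher's equations of a nonuniform line, on a hypothetical taper; the paper's specific arrangement of the governing equations and its numerical Laplace transform step are not reproduced.

```python
# Sketch: one-term Magnus propagation of d/dz [V, I]^T = A(z) [V, I]^T for a
# lossless nonuniform line. Profile and parameter values are illustrative.
import numpy as np
from scipy.linalg import expm

omega = 2 * np.pi * 1e9          # angular frequency (rad/s)
L0, C0 = 2.5e-7, 1.0e-10         # per-unit-length inductance / capacitance

def A(z, length=0.1):
    taper = 1.0 + 0.5 * z / length        # hypothetical linear taper
    L, C = L0 * taper, C0 / taper         # impedance rises along the line
    return np.array([[0.0, -1j * omega * L],
                     [-1j * omega * C, 0.0]])

def propagate(v0, i0, length=0.1, steps=200):
    state = np.array([v0, i0], dtype=complex)
    h = length / steps
    for k in range(steps):
        z_mid = (k + 0.5) * h                 # midpoint rule for \int A dz
        state = expm(h * A(z_mid)) @ state    # one first-order Magnus step
    return state

V, I = propagate(1.0, 1.0 / 50.0)   # drive with 1 V into ~50 ohm
print(f"|V(end)| = {abs(V):.3f}, |I(end)| = {abs(I):.4f}")
```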

Findings

The results confirm an effective and efficient numerical solver for inclusion of nonuniform transmission lines in circuit simulation.

Originality/value

The work combines a Magnus expansion and numerical Laplace transform algorithm in a novel manner and applies the resultant algorithm for the effective and efficient simulation of nonuniform transmission lines.

Details

COMPEL - The international journal for computation and mathematics in electrical and electronic engineering, vol. 43 no. 1
Type: Research Article
ISSN: 0332-1649

Open Access
Article
Publication date: 1 June 2021

Ondřej Bublík, Libor Lobovský, Václav Heidler, Tomáš Mandys and Jan Vimmr

The paper aims to provide new experimental data for validation of the well-established mathematical models within the framework of the lattice Boltzmann method (LBM), which…

Abstract

Purpose

The paper aims to provide new experimental data for validation of the well-established mathematical models within the framework of the lattice Boltzmann method (LBM), which are applied to problems of casting processes in complex mould cavities.

Design/methodology/approach

An experimental campaign focused on the free-surface flow within a system of narrow channels is designed and executed under well-controlled laboratory conditions. An in-house lattice Boltzmann solver is implemented. Its algorithm is described in detail and its performance is tested thoroughly against both the newly recorded experimental data and well-known analytical benchmark tests.
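
For orientation, the following is a minimal single-phase D2Q9 BGK sketch showing the collide-and-stream structure common to LBM solvers; the authors' in-house free-surface solver and its surface-tension treatment are far more elaborate and are not reproduced here.

```python
# Sketch: D2Q9 BGK lattice Boltzmann on a periodic box (decaying shear wave).
import numpy as np

nx, ny, tau = 64, 64, 0.8
c = np.array([[0,0],[1,0],[0,1],[-1,0],[0,-1],[1,1],[-1,1],[-1,-1],[1,-1]])
w = np.array([4/9] + [1/9]*4 + [1/36]*4)

def equilibrium(rho, u):
    cu = np.einsum("qd,dxy->qxy", c, u)        # c_q . u at every node
    usq = np.einsum("dxy,dxy->xy", u, u)
    return rho * w[:, None, None] * (1 + 3*cu + 4.5*cu**2 - 1.5*usq)

# Initial state: uniform density, small sinusoidal shear velocity.
rho = np.ones((nx, ny))
u = np.zeros((2, nx, ny))
u[0] = 0.05 * np.sin(2*np.pi*np.arange(ny)/ny)[None, :]
f = equilibrium(rho, u)

for _ in range(500):
    rho = f.sum(axis=0)                              # macroscopic density
    u = np.einsum("qd,qxy->dxy", c, f) / rho         # macroscopic velocity
    f += -(f - equilibrium(rho, u)) / tau            # BGK collision
    for q in range(9):                               # streaming (periodic)
        f[q] = np.roll(f[q], tuple(c[q]), axis=(0, 1))

print("max |u| after 500 steps:", np.abs(u).max())   # decays viscously
```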

Findings

The benchmark tests prove the ability of the implemented algorithm to provide a reliable solution when the surface tension effects become dominant. The convergence of the implemented method is assessed. The two new experimentally studied problems are resolved well by simulations using a coarse computational grid.

Originality/value

A detailed set of original experimental data for validation of computational schemes for simulations of free-surface gravity-driven flow within a system of narrow channels is presented.

Details

Engineering Computations, vol. 38 no. 10
Type: Research Article
ISSN: 0264-4401

Open Access
Article
Publication date: 4 January 2021

Stefano Costa and Eugenio Costamagna

This paper aims to solve inhomogeneous dielectric problems by matching boundary conditions at the interfaces among homogeneous subdomains. The capabilities of Hilbert transform…

Abstract

Purpose

This paper aims to solve inhomogeneous dielectric problems by matching boundary conditions at the interfaces between homogeneous subdomains. The capabilities of Hilbert transform computations are investigated in depth for limited numbers of samples, and a refined model is presented, with accuracies assessed in a case study with three subdomains.

Design/methodology/approach

The accuracies, refined by Richardson extrapolation to zero error, are compared with those of finite element (FEM) and finite difference methods. The boundary matching procedures can be easily applied to the results of a previous Schwarz–Christoffel (SC) conformal mapping stage in SC + BC procedures, to cope with field singularities or with open boundary problems.
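
As a reminder of the refinement step mentioned above, the sketch below applies Richardson extrapolation "to zero error" to a generic approximation with a known leading error order; the step sizes and test function are illustrative.

```python
# Sketch: Richardson extrapolation cancelling the O(h^p) leading error term.
import math

def richardson(a_h, a_h2, p=2, r=2.0):
    """Combine A(h) and A(h/r) to cancel the O(h^p) error term."""
    return (r**p * a_h2 - a_h) / (r**p - 1)

# Example: second-order central difference for d/dx sin(x) at x = 1.
f, x = math.sin, 1.0
approx = lambda h: (f(x + h) - f(x - h)) / (2 * h)
a1, a2 = approx(0.1), approx(0.05)
print("h = 0.1     :", a1)
print("h = 0.05    :", a2)
print("extrapolated:", richardson(a1, a2))   # much closer to cos(1)
print("exact       :", math.cos(1.0))
```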

Findings

The proposed field computations are of general interest both for electrostatic and magnetostatic field analysis and optimization. They can be useful as comparison tools for FEM results or when severe field singularities can impair the accuracies of other methods.

Research limitations/implications

This static field methodology can, of course, be used to analyse transverse electromagnetic (TEM) or quasi-TEM propagation modes. It is possible that, in some cases, it may also contribute to the analysis of axisymmetric problems.

Originality/value

The most relevant result is the possible introduction of SC + BC computations as a standard tool for solving inhomogeneous dielectric field problems.

Details

COMPEL - The international journal for computation and mathematics in electrical and electronic engineering, vol. 40 no. 2
Type: Research Article
ISSN: 0332-1649

Open Access
Article
Publication date: 16 September 2022

Chems Eddine Berrehail and Amar Makhlouf

The objective of this work is to study the periodic solutions for a class of sixth-order autonomous ordinary differential equations x…

Abstract

Purpose

The objective of this work is to study the periodic solutions for a class of sixth-order autonomous ordinary differential equations $x^{(6)} + (1 + p^2 + q^2)\,x^{(4)} + (p^2 + q^2 + p^2q^2)\,\ddot{x} + p^2q^2\,x = \varepsilon F(x, \dot{x}, \ddot{x}, \dddot{x}, x^{(4)}, x^{(5)})$, where $p$ and $q$ are rational numbers different from 1, 0 and −1, with $p \neq q$, $\varepsilon$ is a small enough parameter and $F \in C^2$ is a nonlinear autonomous function.

Design/methodology/approach

The authors use the averaging theory to study the periodic solutions for a class of perturbed sixth-order autonomous differential equations (DEs). Averaging theory is a classical tool for studying the dynamics of nonlinear differential systems under periodic forcing. It has a long history, beginning with the classical work of Lagrange and Laplace, and it is used to study periodic solutions of second- and higher-order DEs.
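
As a minimal illustration of the averaging machinery in the simplest second-order setting (not the sixth-order class treated here), the sketch below recovers the van der Pol limit cycle from the zero of the averaged radial function.

```python
# Sketch: first-order averaging for x'' + x = eps*F(x, x'), with the van der
# Pol term F = (1 - x^2) x'. Simple zeros of the averaged radial function
# correspond to periodic orbits that persist for small eps.
import numpy as np
from scipy.integrate import quad
from scipy.optimize import brentq

def F(x, xdot):
    return (1 - x**2) * xdot

def averaged(r):
    # f(r) = -(1/2pi) * int_0^{2pi} F(r cos t, -r sin t) sin t dt
    integrand = lambda t: F(r*np.cos(t), -r*np.sin(t)) * np.sin(t)
    val, _ = quad(integrand, 0, 2*np.pi)
    return -val / (2*np.pi)

r_star = brentq(averaged, 0.5, 3.0)     # sign change brackets the simple zero
print("amplitude of periodic orbit =", r_star)   # van der Pol limit cycle: r = 2
```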

Findings

All the main results for the periodic solutions for this class of perturbed sixth-order autonomous DEs are presented in Theorem 1. The authors also present some applications to illustrate the main results.

Originality/value

In previous work, the authors studied Equation 1, which depends explicitly on the independent variable t. Here, they study the autonomous case using a different approach.

Details

Arab Journal of Mathematical Sciences, vol. ahead-of-print no. ahead-of-print
Type: Research Article
ISSN: 1319-5166

Open Access
Article
Publication date: 19 April 2024

Bong-Gyu Jang and Hyeng Keun Koo

We present an approach for pricing American put options with a regime-switching volatility. Our method reveals that the option price can be expressed as the sum of two components…

Abstract

We present an approach for pricing American put options with a regime-switching volatility. Our method reveals that the option price can be expressed as the sum of two components: the price of a European put option and the premium associated with the early exercise privilege. Our analysis demonstrates that, under these conditions, the perpetual put option consistently commands a higher price during periods of high volatility compared to those of low volatility. Moreover, we establish that the optimal exercise boundary is lower in high-volatility regimes than in low-volatility regimes. Additionally, we develop an analytical framework to describe American puts with an Erlang-distributed random-time horizon, which allows us to propose a numerical technique for approximating the value of American puts with finite expiry. We also show that a combined approach involving randomization and Richardson extrapolation can be a robust numerical algorithm for estimating American put prices with finite expiry.
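
As a purely illustrative numerical cross-check for prices of this kind, the sketch below uses standard least-squares Monte Carlo with a two-state volatility chain. This is not the authors' Erlang-randomization and Richardson-extrapolation construction, and all parameter values are made up.

```python
# Sketch: Longstaff-Schwartz Monte Carlo for an American put whose volatility
# follows a two-state Markov chain under the risk-neutral measure.
import numpy as np

rng = np.random.default_rng(42)
S0, K, r, T, n_steps, n_paths = 100.0, 100.0, 0.05, 1.0, 50, 20000
sigma = np.array([0.2, 0.4])                 # low / high volatility regimes
q = np.array([[0.98, 0.02], [0.04, 0.96]])   # per-step transition probabilities
dt = T / n_steps

# Simulate regimes and stock paths.
regime = np.zeros((n_paths, n_steps + 1), dtype=int)
S = np.full((n_paths, n_steps + 1), S0)
for t in range(n_steps):
    u = rng.random(n_paths)
    regime[:, t+1] = np.where(u < q[regime[:, t], 0], 0, 1)
    vol = sigma[regime[:, t]]
    z = rng.standard_normal(n_paths)
    S[:, t+1] = S[:, t] * np.exp((r - 0.5*vol**2)*dt + vol*np.sqrt(dt)*z)

# Backward induction: regress continuation values on (S, S^2, regime).
cash = np.maximum(K - S[:, -1], 0.0)
for t in range(n_steps - 1, 0, -1):
    cash *= np.exp(-r*dt)                    # discount one step
    itm = K - S[:, t] > 0                    # regress on in-the-money paths only
    if itm.any():
        x = S[itm, t]
        basis = np.column_stack([np.ones(x.size), x, x**2, regime[itm, t]])
        coef, *_ = np.linalg.lstsq(basis, cash[itm], rcond=None)
        exercise = (K - x) > basis @ coef
        cash[itm] = np.where(exercise, K - x, cash[itm])
print("American put price =", np.exp(-r*dt) * cash.mean())
```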

Details

Journal of Derivatives and Quantitative Studies: 선물연구, vol. ahead-of-print no. ahead-of-print
Type: Research Article
ISSN: 1229-988X

Open Access
Article
Publication date: 2 December 2016

Taylor Boyd, Grace Docken and John Ruggiero

The purpose of this paper is to improve the estimation of the production frontier in cases where outliers exist. We focus on the case when outliers appear above the true frontier…

Abstract

Purpose

The purpose of this paper is to improve the estimation of the production frontier in cases where outliers exist. We focus on the case when outliers appear above the true frontier due to measurement error.

Design/methodology/approach

The authors use stochastic data envelopment analysis (SDEA) to allow observed points above the frontier. They supplement SDEA with distributional assumptions on efficiency and show that the true frontier can be derived in the presence of outliers.
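
For context, the deterministic input-oriented CCR score that SDEA generalizes can be computed with a small linear program; the sketch below does this on toy data and deliberately omits the stochastic and quantile layer the authors add.

```python
# Sketch: input-oriented CCR DEA efficiency via linear programming.
import numpy as np
from scipy.optimize import linprog

# Inputs X (n_dmus x n_inputs) and outputs Y (n_dmus x n_outputs); toy data.
X = np.array([[2.0], [4.0], [3.0], [5.0]])
Y = np.array([[1.0], [2.0], [1.2], [1.8]])

def ccr_efficiency(j0):
    n, m, s = X.shape[0], X.shape[1], Y.shape[1]
    # Variables: [theta, lambda_1..lambda_n]; minimize theta.
    c = np.r_[1.0, np.zeros(n)]
    # sum_j lambda_j x_ij - theta * x_i,j0 <= 0  (inputs scaled down by theta)
    A_in = np.hstack([-X[j0].reshape(m, 1), X.T])
    # -sum_j lambda_j y_rj <= -y_r,j0            (outputs at least those of j0)
    A_out = np.hstack([np.zeros((s, 1)), -Y.T])
    res = linprog(c, A_ub=np.vstack([A_in, A_out]),
                  b_ub=np.r_[np.zeros(m), -Y[j0]],
                  bounds=[(None, None)] + [(0, None)] * n)
    assert res.success
    return res.fun

for j in range(X.shape[0]):
    print(f"DMU {j}: efficiency = {ccr_efficiency(j):.3f}")
```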

Findings

This paper finds that the authors’ maximum likelihood approach outperforms super-efficiency measures. Using simulations, this paper shows that SDEA is a useful model for outlier detection.

Originality/value

The model developed in this paper is original; the authors add distributional assumptions to derive the optimal quantile with SDEA to remove outliers. The authors believe the paper will be widely cited because real-world data are often subject to outliers.

Details

Journal of Centrum Cathedra, vol. 9 no. 2
Type: Research Article
ISSN: 1851-6599

Open Access
Article
Publication date: 21 December 2021

Vahid Badeli, Sascha Ranftl, Gian Marco Melito, Alice Reinbacher-Köstinger, Wolfgang Von Der Linden, Katrin Ellermann and Oszkar Biro

This paper aims to introduce a non-invasive and convenient method to detect a life-threatening disease called aortic dissection. A Bayesian inference based on enhanced…

Abstract

Purpose

This paper aims to introduce a non-invasive and convenient method to detect a life-threatening disease called aortic dissection. Bayesian inference based on an enhanced multi-sensor impedance cardiography (ICG) method is applied to classify signals from healthy and sick patients.

Design/methodology/approach

A 3D numerical model consisting of simplified organ geometries is used to simulate the electrical impedance changes in the ICG-relevant domain of the human torso. Bayesian probability theory is used to detect an aortic dissection, providing the probabilities for both cases, a dissected and a healthy aorta. Thus, the reliability and the uncertainty of the disease identification are quantified by this method and may indicate the need for further diagnostic clarification.
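
The classification step ultimately reduces to Bayes' rule over the two hypotheses. The sketch below shows that step with made-up Gaussian likelihoods for a single impedance feature, standing in for the class-conditional distributions obtained from the authors' torso simulations.

```python
# Sketch: posterior probability of dissection from one feature via Bayes' rule.
import math

def gaussian(x, mu, sd):
    return math.exp(-0.5*((x - mu)/sd)**2) / (sd*math.sqrt(2*math.pi))

def posterior_dissection(feature, prior_sick=0.5):
    # Hypothetical class-conditional densities of the impedance feature.
    like_sick = gaussian(feature, mu=1.3, sd=0.2)      # dissected aorta
    like_healthy = gaussian(feature, mu=1.0, sd=0.15)  # healthy aorta
    evidence = like_sick*prior_sick + like_healthy*(1 - prior_sick)
    return like_sick*prior_sick / evidence

for x in (0.95, 1.15, 1.4):
    print(f"feature={x:.2f} -> P(dissection | data) = {posterior_dissection(x):.2f}")
```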

Findings

The Bayesian classification shows that the enhanced multi-sensor ICG is more reliable in detecting aortic dissection than conventional ICG. Bayesian probability theory allows a rigorous quantification of all uncertainties to draw reliable conclusions for the medical treatment of aortic dissection.

Originality/value

This paper presents a non-invasive and reliable method based on a numerical simulation that could be beneficial for the medical management of aortic dissection patients. With this method, clinicians would be able to monitor the patient’s status and make better decisions in the treatment procedure of each patient.

Details

COMPEL - The international journal for computation and mathematics in electrical and electronic engineering, vol. 41 no. 3
Type: Research Article
ISSN: 0332-1649
