Search results
1 – 10 of 19
Paula Cruz-García, Anabel Forte and Jesús Peiró-Palomino
There is abundant literature analyzing the determinants of banks’ profitability through its main component: the net interest margin. Some of these determinants are suggested by…
Abstract
Purpose
There is abundant literature analyzing the determinants of banks’ profitability through its main component: the net interest margin. Some of these determinants are suggested by seminal theoretical models and subsequent expansions; others are ad hoc selections. To date, no studies have assessed these models from a Bayesian model uncertainty perspective. This paper aims to analyze this issue for the EU-15 countries over the period 2008-2014, which largely corresponds to the Great Recession years.
Design/methodology/approach
The paper follows a Bayesian variable selection approach to analyze, in a first step, which of the variables suggested by the literature are actually good predictors of banks’ net interest margin. In a second step, using a model selection approach, the authors select the model with the best fit. Finally, the paper provides inference and quantifies the economic impact of the variables selected as good candidates.
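The abstract does not detail the authors’ implementation. As a rough illustration of the first step only, the sketch below enumerates all subsets of a few candidate predictors, weights them by BIC as a crude stand-in for the posterior model probabilities a full Bayesian treatment would supply, and reads off posterior inclusion probabilities; the data and variable count are hypothetical.

```python
# Minimal sketch of variable selection via BIC-weighted model
# enumeration (not the authors' implementation; hypothetical data).
from itertools import combinations

import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n, p = 200, 4
X = rng.normal(size=(n, p))        # hypothetical candidate determinants
y = 1.5 * X[:, 0] - 0.8 * X[:, 2] + rng.normal(size=n)

models, bics = [], []
for k in range(p + 1):
    for subset in combinations(range(p), k):
        Xs = sm.add_constant(X[:, list(subset)]) if subset else np.ones((n, 1))
        fit = sm.OLS(y, Xs).fit()
        models.append(subset)
        bics.append(fit.bic)

# BIC weights approximate posterior model probabilities under equal priors
b = np.array(bics)
w = np.exp(-0.5 * (b - b.min()))
w /= w.sum()

# Posterior inclusion probability of each candidate predictor
pip = [sum(wi for wi, m in zip(w, models) if j in m) for j in range(p)]
print("inclusion probabilities:", np.round(pip, 3))
```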
Findings
The results widely support the validity of the determinants proposed by the seminal models, with only minor discrepancies, reinforcing their capacity to explain net interest margin disparities even during the recent period of restructuring in the banking industry.
Originality/value
The paper is, to the best of the authors’ knowledge, the first to follow a Bayesian variable selection approach in this field of the literature.
Details
Keywords
Patrik Jonsson, Johan Öhlin, Hafez Shurrab, Johan Bystedt, Azam Sheikh Muhammad and Vilhelm Verendel
This study aims to explore and empirically test variables influencing material delivery schedule inaccuracies.
Abstract
Purpose
This study aims to explore and empirically test variables influencing material delivery schedule inaccuracies.
Design/methodology/approach
A mixed-method case approach is applied. Explanatory variables are identified from the literature and explored in a qualitative analysis at an automotive original equipment manufacturer. Using logistic regression and random forest classification models, quantitative data (historical schedule transactions and internal data) enables the testing of the predictive difference of variables under various planning horizons and inaccuracy levels.
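For readers unfamiliar with the two classifier families named above, the following is a minimal, self-contained scikit-learn sketch; the features and labels are synthetic stand-ins, not the study’s schedule-transaction data.

```python
# Illustrative sketch of the two classifier families used in the study,
# fitted to synthetic stand-ins for the schedule-transaction features.
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

# Hypothetical features (e.g. product complexity, order life cycle);
# label = 1 when a delivery schedule line turned out inaccurate.
X, y = make_classification(n_samples=2000, n_features=8, n_informative=4,
                           random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

for model in (LogisticRegression(max_iter=1000),
              RandomForestClassifier(n_estimators=200, random_state=0)):
    model.fit(X_tr, y_tr)
    auc = roc_auc_score(y_te, model.predict_proba(X_te)[:, 1])
    print(type(model).__name__, f"AUC = {auc:.3f}")
```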
Findings
The effects on delivery schedule inaccuracies are contingent on a decoupling point, and a variable may have a combined amplifying (complexity-generating) and stabilizing (complexity-absorbing) moderating effect. Product complexity variables are significant regardless of the time horizon, and the item’s order life cycle is a significant variable whose predictive strength varies. Decoupling management is identified as a mechanism for generating complexity-absorption capabilities that contribute to delivery schedule accuracy.
Practical implications
The findings provide guidelines for exploring and finding patterns in specific variables in order to reduce material delivery schedule inaccuracies, and offer input for predictive forecasting models.
Originality/value
The findings contribute to explaining material delivery schedule variations, identifying potential root causes and moderators, empirically testing and validating effects, and conceptualizing features that cause and moderate inaccuracies in relation to the decoupling management and complexity theory literature.
Details
Keywords
M. Sudha and A. Kumaravel
Rough set theory is a simple yet potent methodology for extracting and minimizing rules from decision tables. Its key concepts are the core, the reduct and the discovery of knowledge in the form…
Abstract
Rough set theory is a simple yet potent methodology for extracting and minimizing rules from decision tables. Its key concepts are the core, the reduct and the discovery of knowledge in the form of rules. The decision rules explain the decision state and support prediction in new situations. The theory was initially proposed as a useful tool for the analysis of decision states. The approach produces two types of decision rules, certain and possible, based on the lower and upper approximations. Prediction quality may be strongly affected when the data size grows large, and the application of rough set theory in this direction has not yet been considered. Hence, the main objective of this paper is to study the influence of data size on the number of rules generated by rough set methods. The performance of these methods is reported through metrics such as accuracy and quality of classification. The results show the range of performance obtained and are, to the authors’ knowledge, the first of their kind in this line of research.
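As a toy illustration of the certain/possible rule distinction, the following sketch computes lower and upper approximations of a decision class from a small, hypothetical decision table; it is a generic rough set construction, not the paper’s experimental pipeline.

```python
# Toy rough set approximations over a hypothetical decision table.
from collections import defaultdict

# (condition attributes) -> decision; rows 3 and 5 conflict on purpose
table = [
    (("high", "yes"), "approve"),
    (("high", "yes"), "approve"),
    (("high", "no"),  "reject"),
    (("low",  "no"),  "reject"),
    (("high", "no"),  "approve"),
]

# Indiscernibility classes over the condition attributes
classes = defaultdict(list)
for conditions, decision in table:
    classes[conditions].append(decision)

target = "approve"
lower = [c for c, ds in classes.items() if all(d == target for d in ds)]
upper = [c for c, ds in classes.items() if any(d == target for d in ds)]
print("lower approximation (certain rules): ", lower)
print("upper approximation (possible rules):", upper)

# Quality of classification = objects certainly classified / all objects
covered = sum(len(classes[c]) for c in lower)
print("quality of classification:", covered / len(table))
```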
Details
Keywords
The purpose of the paper is the simulation of nonuniform transmission lines.
Abstract
Purpose
The purpose of the paper is the simulation of nonuniform transmission lines.
Design/methodology/approach
The method combines a Magnus expansion with a numerical Laplace transform, and involves a judicious arrangement of the governing equations so as to enable efficient simulation.
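For orientation, the Magnus expansion writes the solution of a linear, position-varying system, such as the Laplace-domain telegrapher’s equations dV/dx = -Z(x,s)I, dI/dx = -Y(x,s)V, as a single matrix exponential. The paper’s exact arrangement is not spelled out in the abstract, but the first terms of the standard expansion are:

```latex
\frac{d\Psi}{dx} = A(x)\,\Psi(x)
\;\Longrightarrow\;
\Psi(\ell) = e^{\Omega(\ell)}\,\Psi(0),
\qquad
\Omega(\ell) = \int_0^{\ell} A(x_1)\,dx_1
+ \frac{1}{2}\int_0^{\ell}\!\!\int_0^{x_1} \bigl[A(x_1),A(x_2)\bigr]\,dx_2\,dx_1
+ \cdots
```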
Findings
The results confirm that the proposed numerical solver is effective and efficient for including nonuniform transmission lines in circuit simulation.
Originality/value
The work combines a Magnus expansion and numerical Laplace transform algorithm in a novel manner and applies the resultant algorithm for the effective and efficient simulation of nonuniform transmission lines.
Details
Keywords
Ondřej Bublík, Libor Lobovský, Václav Heidler, Tomáš Mandys and Jan Vimmr
The paper aims to provide new experimental data for the validation of well-established mathematical models within the framework of the lattice Boltzmann method (LBM), which…
Abstract
Purpose
The paper aims to provide new experimental data for the validation of well-established mathematical models within the framework of the lattice Boltzmann method (LBM), which are applied to problems of casting processes in complex mould cavities.
Design/methodology/approach
An experimental campaign addressing free-surface flow within a system of narrow channels is designed and executed under well-controlled laboratory conditions. An in-house lattice Boltzmann solver is implemented; its algorithm is described in detail and its performance is tested thoroughly against both the newly recorded experimental data and well-known analytical benchmark tests.
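To fix ideas, a bare-bones D2Q9 lattice Boltzmann solver with BGK collisions and periodic boundaries looks as follows; this is a generic sketch and omits the free-surface tracking and surface tension treatment that the in-house solver necessarily adds.

```python
# Minimal D2Q9 BGK lattice Boltzmann step (periodic box, no free surface).
import numpy as np

# D2Q9 lattice: discrete velocities and weights
c = np.array([[0, 0], [1, 0], [0, 1], [-1, 0], [0, -1],
              [1, 1], [-1, 1], [-1, -1], [1, -1]])
w = np.array([4/9] + [1/9] * 4 + [1/36] * 4)

def equilibrium(rho, u):
    cu = np.einsum('qd,dxy->qxy', c, u)      # c_q . u at every node
    usq = np.einsum('dxy,dxy->xy', u, u)     # |u|^2
    return rho * w[:, None, None] * (1 + 3*cu + 4.5*cu**2 - 1.5*usq)

nx, ny, tau = 64, 64, 0.8                    # grid size, relaxation time
x = np.arange(nx)[:, None]
rho = 1.0 + 0.01 * np.sin(2 * np.pi * x / nx) * np.ones((1, ny))
u = np.zeros((2, nx, ny))
f = equilibrium(rho, u)

for step in range(500):
    # streaming: shift each population along its lattice velocity
    for q in range(9):
        f[q] = np.roll(np.roll(f[q], c[q, 0], axis=0), c[q, 1], axis=1)
    # macroscopic moments
    rho = f.sum(axis=0)
    u = np.einsum('qd,qxy->dxy', c, f) / rho
    # BGK collision: relax toward local equilibrium
    f += (equilibrium(rho, u) - f) / tau

print("mean density (mass conserved):", f.sum() / (nx * ny))
```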
Findings
The benchmark tests prove the ability of the implemented algorithm to provide a reliable solution when the surface tension effects become dominant. The convergence of the implemented method is assessed. The two new experimentally studied problems are resolved well by simulations using a coarse computational grid.
Originality/value
A detailed set of original experimental data for validation of computational schemes for simulations of free-surface gravity-driven flow within a system of narrow channels is presented.
Details
Keywords
Stefano Costa and Eugenio Costamagna
This paper aims to solve inhomogeneous dielectric problems by matching boundary conditions at the interfaces among homogeneous subdomains. The capabilities of Hilbert transform…
Abstract
Purpose
This paper aims to solve inhomogeneous dielectric problems by matching boundary conditions at the interfaces among homogeneous subdomains. The capabilities of Hilbert transform computations are investigated in depth for limited numbers of samples, and a refined model is presented by examining the accuracies achieved in a case study with three subdomains.
Design/methodology/approach
The accuracies, refined by Richardson extrapolation to zero error, are compared with those of finite element (FEM) and finite difference methods. The boundary matching procedures can easily be applied to the results of a preliminary Schwarz–Christoffel (SC) conformal mapping stage in SC + BC procedures, in order to cope with field singularities or with open-boundary problems.
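Richardson extrapolation to zero error is a generic device: two estimates computed at step sizes h and h/2 with a method of known order p are combined to cancel the leading error term. A minimal sketch, using the trapezoidal rule (p = 2) as a stand-in for the field computations:

```python
import numpy as np

def richardson(a_h, a_h2, p):
    """Combine estimates at steps h and h/2 for a method of order p."""
    return (2**p * a_h2 - a_h) / (2**p - 1)

def trap(f, a, b, n):
    x = np.linspace(a, b, n + 1)
    y = f(x)
    return (b - a) / n * (y[0] / 2 + y[1:-1].sum() + y[-1] / 2)

a_h  = trap(np.sin, 0.0, np.pi, 4)   # coarse estimate, ~1.896
a_h2 = trap(np.sin, 0.0, np.pi, 8)   # halved step,     ~1.974
print(richardson(a_h, a_h2, p=2))    # ~2.0003 vs exact value 2
```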
Findings
The proposed field computations are of general interest both for electrostatic and magnetostatic field analysis and optimization. They can be useful as comparison tools for FEM results or when severe field singularities can impair the accuracies of other methods.
Research limitations/implications
This static-field methodology can, of course, be used to analyse transverse electromagnetic (TEM) or quasi-TEM propagation modes. In some cases, it may also contribute to the analysis of axisymmetric problems.
Originality/value
The most relevant result is the possible introduction of SC + BC computations as a standard tool for solving inhomogeneous dielectric field problems.
Details
Keywords
Chems Eddine Berrehail and Amar Makhlouf
The objective of this work is to study the periodic solutions of a class of sixth-order autonomous ordinary differential equations.
Abstract
Purpose
The objective of this work is to study the periodic solutions of a class of sixth-order autonomous ordinary differential equations.
Design/methodology/approach
The authors use the averaging theory to study the periodic solutions of a class of perturbed sixth-order autonomous differential equations (DEs). The averaging theory is a classical tool for studying the dynamics of nonlinear differential systems under periodic forcing; it has a long history that begins with the classical works of Lagrange and Laplace, and it has been used to study periodic solutions of second- and higher-order DEs.
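For reference, the first-order averaging result typically takes the following form (the precise hypotheses used for the sixth-order class are in the paper itself):

```latex
\dot{x} = \varepsilon F(t,x) + \varepsilon^{2} R(t,x,\varepsilon),
\qquad F,\,R \ \text{$T$-periodic in } t,
\qquad f(y) = \frac{1}{T}\int_{0}^{T} F(t,y)\,dt .
```

Each simple zero y0 of the averaged function f (that is, with det Df(y0) ≠ 0) then corresponds, for ε sufficiently small, to a T-periodic solution of the full system.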
Findings
All the main results on the periodic solutions of the class of perturbed sixth-order autonomous DEs are presented in Theorem 1. The authors present some applications to illustrate the main results.
Originality/value
Equation 1, which depends explicitly on the independent variable t, has been studied previously; here, the authors study the autonomous case using a different approach.
Details
Keywords
Bong-Gyu Jang and Hyeng Keun Koo
We present an approach for pricing American put options with a regime-switching volatility. Our method reveals that the option price can be expressed as the sum of two components…
Abstract
We present an approach for pricing American put options with a regime-switching volatility. Our method reveals that the option price can be expressed as the sum of two components: the price of a European put option and the premium associated with the early exercise privilege. Our analysis demonstrates that, under these conditions, the perpetual put option consistently commands a higher price during periods of high volatility compared to those of low volatility. Moreover, we establish that the optimal exercise boundary is lower in high-volatility regimes than in low-volatility regimes. Additionally, we develop an analytical framework to describe American puts with an Erlang-distributed random-time horizon, which allows us to propose a numerical technique for approximating the value of American puts with finite expiry. We also show that a combined approach involving randomization and Richardson extrapolation can be a robust numerical algorithm for estimating American put prices with finite expiry.
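The randomization scheme itself is involved. As a simpler stand-in that still shows the Richardson extrapolation step, the sketch below prices an American put on a plain CRR binomial tree at n and 2n steps and extrapolates in 1/n; parameters are illustrative, and this is not the paper’s regime-switching method.

```python
# CRR binomial American put plus two-point Richardson extrapolation
# (a standard stand-in for the paper's randomization approach).
import numpy as np

def american_put_crr(S0, K, r, sigma, T, n):
    dt = T / n
    u = np.exp(sigma * np.sqrt(dt)); d = 1 / u
    p = (np.exp(r * dt) - d) / (u - d)     # risk-neutral up-probability
    disc = np.exp(-r * dt)
    S = S0 * u ** np.arange(n, -1, -1) * d ** np.arange(0, n + 1)
    V = np.maximum(K - S, 0.0)             # terminal payoffs
    for _ in range(n):
        S = S[:-1] * d                     # prices one level earlier
        # early exercise vs discounted continuation value
        V = np.maximum(K - S, disc * (p * V[:-1] + (1 - p) * V[1:]))
    return V[0]

v_n  = american_put_crr(100, 100, 0.05, 0.2, 1.0, 200)
v_2n = american_put_crr(100, 100, 0.05, 0.2, 1.0, 400)
print(2 * v_2n - v_n)   # Richardson extrapolation in 1/n
```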
Details
Keywords
Taylor Boyd, Grace Docken and John Ruggiero
The purpose of this paper is to improve the estimation of the production frontier in cases where outliers exist. We focus on the case when outliers appear above the true frontier…
Abstract
Purpose
The purpose of this paper is to improve the estimation of the production frontier in cases where outliers exist. We focus on the case when outliers appear above the true frontier due to measurement error.
Design/methodology/approach
The authors use stochastic data envelopment analysis (SDEA) to allow observed points above the frontier. They supplement SDEA with distributional assumptions on efficiency and show that the true frontier can be derived in the presence of outliers.
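For context, the deterministic building block that SDEA relaxes is the standard input-oriented DEA linear program. Below is a minimal sketch with hypothetical single-input, single-output data; the authors’ quantile and maximum likelihood layer is not reproduced.

```python
# Standard input-oriented CCR DEA score via linear programming.
import numpy as np
from scipy.optimize import linprog

# Hypothetical data: rows = decision-making units (DMUs)
x = np.array([2.0, 3.0, 4.0, 5.0])   # single input
y = np.array([1.0, 3.0, 3.5, 4.0])   # single output

def dea_score(j0):
    n = len(x)
    # variables z = [theta, lam_1..lam_n]; minimize theta
    c = np.r_[1.0, np.zeros(n)]
    # sum_j lam_j * x_j <= theta * x_j0
    A_ub = [np.r_[-x[j0], x]]
    b_ub = [0.0]
    # sum_j lam_j * y_j >= y_j0  (written as <= with flipped signs)
    A_ub.append(np.r_[0.0, -y])
    b_ub.append(-y[j0])
    res = linprog(c, A_ub=np.array(A_ub), b_ub=np.array(b_ub),
                  bounds=[(0, None)] * (n + 1))
    return res.fun

print([round(dea_score(j), 3) for j in range(len(x))])  # 1.0 = efficient
```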
Findings
This paper finds that the authors’ maximum likelihood approach outperforms super-efficiency measures. Using simulations, this paper shows that SDEA is a useful model for outlier detection.
Originality/value
The model developed in this paper is original: the authors add distributional assumptions to derive the optimal quantile with SDEA for removing outliers. The authors believe the paper will attract many citations because real-world data are often subject to outliers.
Details
Keywords
Vahid Badeli, Sascha Ranftl, Gian Marco Melito, Alice Reinbacher-Köstinger, Wolfgang Von Der Linden, Katrin Ellermann and Oszkar Biro
This paper aims to introduce a non-invasive and convenient method to detect a life-threatening disease called aortic dissection. A Bayesian inference based on enhanced…
Abstract
Purpose
This paper aims to introduce a non-invasive and convenient method to detect a life-threatening disease called aortic dissection. Bayesian inference based on an enhanced multi-sensor impedance cardiography (ICG) method is applied to classify signals from healthy and sick patients.
Design/methodology/approach
A 3D numerical model consisting of simplified organ geometries is used to simulate the electrical impedance changes in the ICG-relevant domain of the human torso. Bayesian probability theory is used to detect an aortic dissection, providing the probabilities for both cases, a dissected and a healthy aorta. Thus, the method quantifies the reliability and the uncertainty of the disease identification and may indicate the need for further diagnostic clarification.
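In its simplest form, the two-class Bayesian decision described above reduces to Bayes’ rule over the two hypotheses. The toy sketch below uses assumed Gaussian likelihoods for a single ICG-derived feature; all numbers are made up rather than drawn from simulated torso data.

```python
# Toy two-class Bayesian decision: posterior probability of dissection
# given one ICG feature, with assumed Gaussian class likelihoods.
from scipy.stats import norm

prior_sick = 0.5                      # assumed prior probability
healthy = norm(loc=1.00, scale=0.05)  # hypothetical feature distributions
sick    = norm(loc=1.15, scale=0.08)

def p_dissection(z):
    num = sick.pdf(z) * prior_sick
    den = num + healthy.pdf(z) * (1 - prior_sick)
    return num / den                  # Bayes' rule

for z in (0.95, 1.05, 1.20):
    print(f"feature = {z:.2f} -> P(dissection) = {p_dissection(z):.3f}")
```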
Findings
The Bayesian classification shows that the enhanced multi-sensor ICG is more reliable in detecting aortic dissection than conventional ICG. Bayesian probability theory allows a rigorous quantification of all uncertainties to draw reliable conclusions for the medical treatment of aortic dissection.
Originality/value
This paper presents a non-invasive and reliable method based on a numerical simulation that could be beneficial for the medical management of aortic dissection patients. With this method, clinicians would be able to monitor the patient’s status and make better decisions throughout each patient’s treatment.
Details