Search results

1 – 10 of 217
Content available
Book part
Publication date: 14 November 2022

Karen McGregor Richmond

Abstract

Details

Marketisation and Forensic Science Provision in England and Wales
Type: Book
ISBN: 978-1-83909-124-7

Open Access
Article
Publication date: 27 August 2020

Dieter Koemle and Xiaohua Yu

Abstract

Purpose

This paper reviews the current literature on theoretical and methodological issues in discrete choice experiments, which have been widely used in non-market value analysis, such as elicitation of residents' attitudes toward recreation or biodiversity conservation of forests.

Design/methodology/approach

We review the literature and attribute the possible biases in choice experiments to theoretical and empirical aspects. In particular, we introduce regret minimization as an alternative to random utility theory and shed light on incentive compatibility, the status quo effect, attribute non-attendance, cognitive load, experimental design, survey methods, estimation strategies and other issues.
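The contrast between the two decision rules the abstract mentions can be made concrete in a small numerical sketch. The attribute values, taste weights and the log-sum pairwise-regret specification below are illustrative assumptions, not taken from the paper:

```python
import numpy as np

def rum_probs(X, beta):
    """Multinomial-logit choice probabilities under random utility theory."""
    v = X @ beta                     # systematic utility of each alternative
    e = np.exp(v - v.max())          # numerically stable softmax
    return e / e.sum()

def rrm_probs(X, beta):
    """Choice probabilities under random regret minimization, using the
    common log-sum pairwise-regret form."""
    n_alt = X.shape[0]
    regret = np.zeros(n_alt)
    for i in range(n_alt):
        for j in range(n_alt):
            if i != j:
                # attribute-level regret of i relative to j, summed over attributes
                regret[i] += np.log1p(np.exp(beta * (X[j] - X[i]))).sum()
    e = np.exp(-(regret - regret.min()))   # lower regret -> higher probability
    return e / e.sum()

# two alternatives described by (cost, quality); negative weight on cost
X = np.array([[1.0, 2.0],
              [2.0, 3.0]])
beta = np.array([-1.0, 0.5])
p_rum = rum_probs(X, beta)
p_rrm = rrm_probs(X, beta)
```

Under random utility only an alternative's own attributes matter, whereas under regret minimization each alternative is judged by pairwise comparison against every competitor, which is why the two models can rank alternatives differently in larger choice sets.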

Findings

Practitioners should pay attention to many issues when carrying out choice experiments in order to avoid possible biases. Many alternatives in theoretical foundations, experimental designs, estimation strategies and even interpretations should be taken into account in practice in order to obtain robust results.

Originality/value

The paper summarizes the recent developments in methodological and empirical issues of choice experiments and points out the pitfalls and future directions both theoretically and empirically.

Details

Forestry Economics Review, vol. 2 no. 1
Type: Research Article
ISSN: 2631-3030

Keywords

Open Access
Article
Publication date: 20 October 2023

Thembeka Sibahle Ngcobo, Lindokuhle Talent Zungu and Nomusa Yolanda Nkomo

Abstract

Purpose

This study aims to test the dynamic impact of public debt and economic growth on newly democratized African countries (South Africa and Namibia) and compare the findings with those of newly democratized European countries (Germany and Ukraine) during the period 1990–2022.

Design/methodology/approach

The methodology involves three stages: identifying the appropriate transition variable, assessing the linearity between public debt and economic growth, and selecting the order m of the transition function. The linearity test helps identify the nature of the relationship between public debt and economic growth, and the wild cluster bootstrap Lagrange multiplier test is used to evaluate the model's appropriateness; all of these tests are executed as Lagrange multiplier-type tests.

Findings

The results signify a policy switch: the authors find that the relationship between public debt and economic growth is characterized by two transitions, meaning that the relationship has moved beyond a U-shape to an S-shape. For the newly democratized African countries, the threshold during the first wave was 50% of GDP, represented by a U-shape, which then transits to an inverted U-shape with a threshold of 65% of GDP. For the European case, the threshold was 60% of GDP and is now 72% of GDP.

Originality/value

The findings suggest that an escalating level of public debt has a negative impact on economic growth; therefore, it is important to implement fiscal discipline, prioritize government spending and reduce reliance on debt financing. This can be achieved by focusing on revenue generation, implementing effective taxation policies, reducing wasteful expenditures and promoting investment and productivity-enhancing measures.

Details

International Journal of Development Issues, vol. ahead-of-print no. ahead-of-print
Type: Research Article
ISSN: 1446-8956

Keywords

Content available

Abstract

Details

Balance Sheet, vol. 12 no. 5
Type: Research Article
ISSN: 0965-7967

Open Access
Article
Publication date: 21 December 2021

Vahid Badeli, Sascha Ranftl, Gian Marco Melito, Alice Reinbacher-Köstinger, Wolfgang Von Der Linden, Katrin Ellermann and Oszkar Biro

Abstract

Purpose

This paper aims to introduce a non-invasive and convenient method to detect a life-threatening disease called aortic dissection. Bayesian inference based on an enhanced multi-sensor impedance cardiography (ICG) method has been applied to classify signals from healthy and sick patients.

Design/methodology/approach

A 3D numerical model consisting of simplified organ geometries is used to simulate the electrical impedance changes in the ICG-relevant domain of the human torso. Bayesian probability theory is used to detect an aortic dissection, providing the probabilities for both cases, a dissected and a healthy aorta. Thus, the reliability and the uncertainty of the disease identification are quantified by this method and may indicate the need for further diagnostic clarification.
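The classification step can be illustrated with a one-dimensional toy: under assumed Gaussian class-conditional densities for a single impedance feature, Bayes' rule yields the posterior probability of dissection. All numbers and the Gaussian form are illustrative assumptions, not the paper's simulation-derived likelihoods:

```python
import numpy as np

def posterior_dissection(z, mu_healthy, mu_dissected, sigma, prior=0.5):
    """Posterior probability of a dissected aorta given one impedance
    feature z, via Bayes' rule with Gaussian class-conditional densities."""
    def gauss(x, mu, s):
        return np.exp(-0.5 * ((x - mu) / s) ** 2) / (s * np.sqrt(2.0 * np.pi))
    num = gauss(z, mu_dissected, sigma) * prior           # P(z | sick) P(sick)
    den = num + gauss(z, mu_healthy, sigma) * (1 - prior)  # total evidence P(z)
    return num / den

# a measurement closer to the (assumed) dissected-class mean than the healthy one
p = posterior_dissection(z=1.4, mu_healthy=1.0, mu_dissected=1.5, sigma=0.2)
```

Because the output is a probability rather than a hard label, borderline values (near 0.5) can flag the case for further diagnostic clarification, which is the practical point the abstract makes.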

Findings

The Bayesian classification shows that the enhanced multi-sensor ICG is more reliable in detecting aortic dissection than conventional ICG. Bayesian probability theory allows a rigorous quantification of all uncertainties to draw reliable conclusions for the medical treatment of aortic dissection.

Originality/value

This paper presents a non-invasive and reliable method based on a numerical simulation that could be beneficial for the medical management of aortic dissection patients. With this method, clinicians would be able to monitor the patient’s status and make better decisions in the treatment procedure of each patient.

Details

COMPEL - The international journal for computation and mathematics in electrical and electronic engineering, vol. 41 no. 3
Type: Research Article
ISSN: 0332-1649

Keywords

Open Access
Article
Publication date: 16 June 2022

Dejan Živkov and Jasmina Đurašković

Abstract

Purpose

This paper aims to investigate how oil price uncertainty affects real gross domestic product (GDP) and industrial production in eight Central and Eastern European countries (CEEC).

Design/methodology/approach

In the research process, the authors use the Bayesian method of inference for the two applied methodologies – Markov switching generalized autoregressive conditional heteroscedasticity (GARCH) model and quantile regression.

Findings

The results clearly indicate that oil price uncertainty has a low effect on output in moderate market conditions in the selected countries. On the other hand, in the phases of contraction and expansion, which are portrayed by the tail quantiles, the authors find negative and positive Bayesian quantile parameters that are relatively high in magnitude. This implies that in periods of deep economic crisis, an increase in oil price uncertainty reduces output, thereby amplifying recessionary pressures in the economy. Conversely, when the economy is in expansion, oil price uncertainty has no influence on output. The probable reason is that the negative effect of oil volatility is not strong enough in the expansion phase to overpower all the other positive developments that characterize a growing economy. The evidence also suggests that increased oil uncertainty has a more negative effect on industrial production than on real GDP, and that the industrial share in GDP plays an important role in how strongly individual CEECs are affected by oil uncertainty.

Originality/value

This paper is the first to investigate the spillover effect from oil uncertainty to output in the CEECs.

Details

Applied Economic Analysis, vol. 31 no. 91
Type: Research Article
ISSN: 2632-7627

Keywords

Open Access
Article
Publication date: 25 June 2020

Paula Cruz-García, Anabel Forte and Jesús Peiró-Palomino

Abstract

Purpose

There is abundant literature analyzing the determinants of banks' profitability through its main component: the net interest margin. Some of these determinants are suggested by seminal theoretical models and subsequent extensions; others are ad hoc selections. To date, there are no studies assessing these models from a Bayesian model uncertainty perspective. This paper aims to analyze this issue for the EU-15 countries for the period 2008-2014, which mainly corresponds to the Great Recession years.

Design/methodology/approach

The paper follows a Bayesian variable selection approach to analyze, in a first step, which of the variables suggested by the literature are actually good predictors of banks' net interest margin. In a second step, using a model selection approach, the authors select the model with the best fit. Finally, the paper provides inference and quantifies the economic impact of the variables selected as good candidates.
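The flavor of Bayesian variable selection can be sketched with a lightweight stand-in: posterior model probabilities over all predictor subsets approximated from BIC differences. The variable names, the simulated data and the BIC shortcut are assumptions for illustration; the paper's actual approach places explicit priors over models:

```python
import numpy as np
from itertools import combinations

def bic_model_probs(X, y, names):
    """Approximate posterior model probabilities over all predictor subsets,
    using the BIC approximation to the marginal likelihood."""
    n, k = X.shape
    bics = {}
    for r in range(k + 1):
        for subset in combinations(range(k), r):
            Z = np.column_stack([np.ones(n)] + [X[:, j] for j in subset])
            beta, *_ = np.linalg.lstsq(Z, y, rcond=None)  # OLS fit of this subset
            rss = float(((y - Z @ beta) ** 2).sum())
            bics[tuple(names[j] for j in subset)] = (
                n * np.log(rss / n) + Z.shape[1] * np.log(n)
            )
    vals = np.array(list(bics.values()))
    w = np.exp(-0.5 * (vals - vals.min()))   # model weights from BIC differences
    w /= w.sum()
    return dict(zip(bics.keys(), w))

# simulated data: only the first regressor actually drives y
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 3))
y = 2.0 * X[:, 0] + rng.normal(scale=0.5, size=200)
probs = bic_model_probs(X, y, ["x0", "x1", "x2"])
best = max(probs, key=probs.get)             # most probable predictor subset
```

The weights concentrate on subsets containing the genuine predictor, mirroring the paper's first step of screening which literature-suggested variables are actually good predictors.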

Findings

The results widely support the validity of the determinants proposed by the seminal models, with only minor discrepancies, reinforcing their capacity to explain net interest margin disparities also during the recent period of restructuring of the banking industry.

Originality/value

The paper is, to the best of the authors' knowledge, the first to follow a Bayesian variable selection approach in this field of the literature.

Details

Applied Economic Analysis, vol. 28 no. 83
Type: Research Article
ISSN: 2632-7627

Keywords

Open Access
Article
Publication date: 2 September 2019

Pedro Albuquerque, Gisela Demo, Solange Alfinito and Kesia Rozzett

Abstract

Purpose

Factor analysis is the most used tool in organizational research, and its widespread use in scale validations contributes to decision-making in management. However, standard factor analysis is not always applied correctly, mainly due to the misuse of ordinal data as interval data and the inadequacy of the former for classical factor analysis. The purpose of this paper is to present and apply Bayesian factor analysis for mixed data (BFAMD) in the context of empirical research, using the Bayesian paradigm for the construction of scales.

Design/methodology/approach

Ignoring the categorical nature of some variables often used in management studies, such as the popular Likert scale, may result in a model with false accuracy and possibly biased estimates. To address this issue, Quinn (2004) proposed a Bayesian factor analysis model for mixed data, which is capable of modeling ordinal (qualitative) and continuous (quantitative) data jointly and allows the inclusion of qualitative information through prior distributions for the model's parameters. This model, adopted here, presents considerable advantages and allows the estimation of the posterior distribution of the latent variables, making the process of inference easier.

Findings

The results show that BFAMD is an effective approach for scale validation in management studies, making both exploratory and confirmatory analyses possible for the estimated factors and allowing analysts to insert a priori information regardless of the sample size, either by using the credible intervals for factor loadings or by conducting specific hypothesis tests. The flexibility of the Bayesian approach presented is counterbalanced by the fact that the main estimates used in factor analysis, such as uniquenesses and communalities, commonly lose their usual interpretation due to the choice of prior distributions.

Originality/value

Considering that the development of scales through factor analysis aims to contribute to appropriate decision-making in management, and given the increasing misuse of ordinal scales as interval scales in organizational studies, this proposal seems to be effective for mixed data analyses. The findings are not intended to be conclusive or limiting but offer a useful starting point from which further theoretical and empirical research on Bayesian factor analysis can be built.

Details

RAUSP Management Journal, vol. 54 no. 4
Type: Research Article
ISSN: 2531-0488

Keywords

Content available
Book part
Publication date: 15 April 2020

Abstract

Details

Essays in Honor of Cheng Hsiao
Type: Book
ISBN: 978-1-78973-958-9

Open Access
Article
Publication date: 10 May 2021

Chao Yu, Haiying Li, Xinyue Xu and Qi Sun

Abstract

Purpose

During rush hours, many passengers find it difficult to board the first train due to the insufficient capacity of metro vehicles, a situation known as the left-behind phenomenon. In this paper, a data-driven approach is presented to estimate left-behind patterns using automatic fare collection (AFC) data and train timetable data.

Design/methodology/approach

First, a data preprocessing method is introduced to obtain the waiting time of passengers at the target station. Second, a hierarchical Bayesian (HB) model is proposed to describe the left-behind phenomenon, in which the waiting time is expressed as a Gaussian mixture model. A sampling algorithm based on Markov chain Monte Carlo (MCMC) is then developed to estimate the parameters of the model. Third, a case study of the Beijing metro system is presented as an application of the proposed method.
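The mixture-plus-MCMC idea can be sketched minimally: synthetic waiting times drawn from two components (boarded the first train vs left behind once) and a random-walk Metropolis-Hastings sampler for the mixture weight. The component means, standard deviations and the two-component simplification are assumptions for illustration; the paper's hierarchical model is richer:

```python
import numpy as np

rng = np.random.default_rng(42)

# synthetic waiting times (minutes): passengers who board the first train
# (~3 min) vs those left behind once (~6 min); the share w is what we estimate
true_w = 0.7
n = 500
boarded = rng.random(n) < true_w
data = np.where(boarded, rng.normal(3.0, 0.5, n), rng.normal(6.0, 0.5, n))

def log_post(w):
    """Log posterior of the first-train share w under a flat prior;
    component means and standard deviations are fixed for illustration."""
    if not 0.0 < w < 1.0:
        return -np.inf
    f1 = np.exp(-0.5 * ((data - 3.0) / 0.5) ** 2)
    f2 = np.exp(-0.5 * ((data - 6.0) / 0.5) ** 2)
    return np.log(w * f1 + (1 - w) * f2).sum()

# random-walk Metropolis-Hastings over the mixture weight
w, chain = 0.5, []
lp = log_post(w)
for _ in range(3000):
    prop = w + rng.normal(0.0, 0.05)
    lp_prop = log_post(prop)
    if np.log(rng.random()) < lp_prop - lp:   # accept with prob min(1, ratio)
        w, lp = prop, lp_prop
    chain.append(w)
estimate = float(np.mean(chain[1000:]))       # posterior mean after burn-in
```

The posterior over the mixture weight is what turns raw AFC waiting times into a quantitative left-behind pattern, with credible intervals available directly from the chain.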

Findings

The comparison shows that the proposed method performs better at estimating left-behind patterns than existing maximum likelihood estimation. Finally, three main reasons for the left-behind phenomenon are summarized to inform relevant strategies for metro managers.

Originality/value

First, an HB model is constructed to describe the left-behind phenomenon at a target station in the target direction on the basis of AFC data and train timetable data. Second, an MCMC-based sampling method, the Metropolis-Hastings algorithm, is proposed to estimate the model parameters and obtain quantitative results on left-behind patterns. Third, a case study of the Beijing metro is presented to test the applicability and accuracy of the proposed method.

Details

Smart and Resilient Transportation, vol. 3 no. 2
Type: Research Article
ISSN: 2632-0487

Keywords
