Search results

1 – 10 of over 2000
Open Access
Article
Publication date: 2 September 2019

Pedro Albuquerque, Gisela Demo, Solange Alfinito and Kesia Rozzett

Factor analysis is the most used tool in organizational research and its widespread use in scale validations contributes to decision-making in management. However, standard factor…


Abstract

Purpose

Factor analysis is the most used tool in organizational research, and its widespread use in scale validations contributes to decision-making in management. However, standard factor analysis is not always applied correctly, mainly due to the misuse of ordinal data as interval data and the inadequacy of the former for classical factor analysis. The purpose of this paper is to present and apply Bayesian factor analysis for mixed data (BFAMD) in an empirical context, using the Bayesian paradigm for the construction of scales.

Design/methodology/approach

Ignoring the categorical nature of some variables often used in management studies, such as the popular Likert scale, may result in a model with false accuracy and possibly biased estimates. To address this issue, Quinn (2004) proposed a Bayesian factor analysis model for mixed data, which is capable of modeling ordinal (qualitative) and continuous (quantitative) data jointly and allows the inclusion of qualitative information through prior distributions for the model's parameters. This model, adopted here, presents considerable advantages and allows the estimation of the posterior distribution of the estimated latent variables, making the process of inference easier.
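
To make the modeling idea concrete, a generic mixed-data factor model in the spirit of Quinn (2004) can be sketched as follows; the notation (x*, Λ, φ, γ) is ours, and the paper's exact specification may differ:

```latex
% Latent linear factor structure for respondent i
x_i^{*} = \Lambda \phi_i + \epsilon_i, \qquad \epsilon_i \sim \mathcal{N}(0, \Psi)

% Continuous items are observed directly, x_{ij} = x_{ij}^{*};
% ordinal items are observed through cutpoints \gamma_j:
x_{ij} = c \iff \gamma_{j,c-1} < x_{ij}^{*} \le \gamma_{j,c}

% Priors on \Lambda, \phi_i and \gamma_j carry the analyst's a priori information.
```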

Findings

The results show that BFAMD is an effective approach for scale validation in management studies, making both exploratory and confirmatory analyses possible for the estimated factors and allowing analysts to insert a priori information regardless of the sample size, either by using credible intervals for factor loadings or by conducting specific hypothesis tests. The flexibility of the Bayesian approach presented is counterbalanced by the fact that the main estimates used in factor analysis, such as uniqueness and communalities, commonly lose their usual interpretation due to the choice of prior distributions.

Originality/value

Considering that the development of scales through factor analysis aims to contribute to appropriate decision-making in management, and given the increasing misuse of ordinal scales as interval scales in organizational studies, this proposal seems to be effective for mixed data analyses. The findings presented here are not intended to be conclusive or limiting but offer a useful starting point from which further theoretical and empirical research on Bayesian factor analysis can be built.

Details

RAUSP Management Journal, vol. 54 no. 4
Type: Research Article
ISSN: 2531-0488


Open Access
Article
Publication date: 4 September 2017

Yuanxing Zhang, Zhuqi Li, Kaigui Bian, Yichong Bai, Zhi Yang and Xiaoming Li

Projecting the population distribution in geographical regions is important for many applications such as launching marketing campaigns or enhancing the public safety in certain…

Abstract

Purpose

Projecting the population distribution in geographical regions is important for many applications, such as launching marketing campaigns or enhancing public safety in certain densely populated areas. Conventional studies require the collection of people’s trajectory data through offline means, which is limited in terms of cost and data availability. The wide use of online social network (OSN) apps on smartphones has provided the opportunity to devise a lightweight approach to conducting the study using the online data of smartphone apps. This paper aims to reveal the relationship between online social networks and offline communities, as well as to project the population distribution by modeling geo-homophily in online social networks.

Design/methodology/approach

In this paper, the authors propose the concept of geo-homophily in OSNs to determine how much the data of an OSN can help project the population distribution in a given division of geographical regions. Specifically, the authors establish a three-layered theoretic framework that first maps the online message diffusion among friends in the OSN to the offline population distribution over a given division of regions via a Dirichlet process and then projects the floating population across the regions.
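
The paper's three-layered Dirichlet-process framework is not reproduced here; as a deliberately simpler stand-in, the sketch below (hypothetical message counts, a Dirichlet-multinomial model rather than a Dirichlet process) illustrates the basic step of turning per-region OSN activity into a posterior over population shares:

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical OSN message counts, one entry per geographical region.
message_counts = np.array([1200, 450, 300, 50])

# Symmetric Dirichlet prior over population shares; the posterior is Dirichlet(alpha + counts).
alpha = np.ones_like(message_counts, dtype=float)
posterior_shares = rng.dirichlet(alpha + message_counts, size=10_000)

# Posterior mean and 95% interval of each region's share.
print(posterior_shares.mean(axis=0))
print(np.percentile(posterior_shares, [2.5, 97.5], axis=0))
```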

Findings

By experiments over large-scale OSN data sets, the authors show that the proposed prediction models have a high prediction accuracy in characterizing the process of how the population distribution forms and how the floating population changes over time.

Originality/value

This paper tries to project population distribution by modeling geo-homophily in OSNs.

Details

International Journal of Crowd Science, vol. 1 no. 3
Type: Research Article
ISSN: 2398-7294


Open Access
Article
Publication date: 19 June 2019

Sherine Al-shawarby and Mai El Mossallamy

This paper aims to estimate a New Keynesian small open economy dynamic stochastic general equilibrium (DSGE) model for Egypt using Bayesian techniques and data for the period…


Abstract

Purpose

This paper aims to estimate a New Keynesian small open economy dynamic stochastic general equilibrium (DSGE) model for Egypt using Bayesian techniques and data for the period FY2004/2005:Q1-FY2015/2016:Q4 to assess monetary and fiscal policy interactions and their impact on economic stabilization. Outcomes of monetary and fiscal authority commitment to policy instruments (the interest rate, government spending and taxes) are evaluated using Taylor-type and optimal simple rules.

Design/methodology/approach

The study extends the stylized micro-founded small open economy New Keynesian DSGE model proposed by Lubik and Schorfheide (2007) by explicitly introducing fiscal policy behavior into the model (Fragetta and Kirsanova, 2010; Çebi, 2011). The model is calibrated using quarterly data for Egypt on key macroeconomic variables during FY2004/2005:Q1-FY2015/2016:Q4, and Bayesian methods are used in estimation.
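
For orientation, Taylor-type monetary and fiscal rules of the kind estimated in this literature take roughly the following form; the exact specification and notation used in the paper may differ:

```latex
% Interest rate rule with smoothing, responding to inflation, output and the exchange rate
R_t = \rho_R R_{t-1} + (1-\rho_R)\left[\psi_\pi \pi_t + \psi_y y_t + \psi_e \Delta e_t\right] + \varepsilon_{R,t}

% Fiscal rules: spending and taxes respond to output and lagged government debt
g_t    = \rho_g g_{t-1}       + (1-\rho_g)\left[\phi_{gy} y_t + \phi_{gb} b_{t-1}\right]           + \varepsilon_{g,t}
\tau_t = \rho_\tau \tau_{t-1} + (1-\rho_\tau)\left[\phi_{\tau y} y_t + \phi_{\tau b} b_{t-1}\right] + \varepsilon_{\tau,t}
```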

Findings

The results show that monetary and fiscal policy instruments in Egypt contribute to economic stability through their effects on inflation, output and debt stock. The monetary policy Taylor rule estimates reveal that the Central Bank of Egypt (CBE) attaches significant importance to anti-inflationary policy and (to a lesser extent) to output targeting but responds weakly to nominal exchange rate variations. CBE decisions are significantly influenced by interest rate smoothing. Egyptian fiscal policy has an important role in output and government debt stabilization. Additionally, the fiscal authority chooses pro-cyclical government spending and counter-cyclical tax policies for output stabilization. Again, past values of the fiscal instruments are influential in the evolution of the future fiscal policy-making process.

Originality/value

A few studies have examined the interaction between monetary and fiscal policy in Egypt within a unified framework. The present paper integrates the monetary and fiscal policy analysis within a unified dynamic general equilibrium open economy rational expectations framework. Without such a framework, it would not be easy to jointly analyze monetary and fiscal transmission mechanisms for output, inflation and debt. Also, it would be possible neither to contrast the outcome of the monetary and fiscal authorities' commitment to a simple Taylor instrument rule with optimal policy outcomes nor to assess the behavior of monetary and fiscal agents in macroeconomic stabilization within an active/passive policy decisions framework.

Details

Review of Economics and Political Science, vol. 4 no. 2
Type: Research Article
ISSN: 2631-3561


Open Access
Article
Publication date: 30 July 2021

Tien Ha My Duong, Thi Anh Nhu Nguyen and Van Diep Nguyen

The paper aims to examine the impact of social capital on the size of the shadow economy in the BRICS countries over the period 1995–2014.


Abstract

Purpose

The paper aims to examine the impact of social capital on the size of the shadow economy in the BRICS countries over the period 1995–2014.

Design/methodology/approach

The authors employ the Bayesian linear regression method to uncover the relationship between social capital and the shadow economy. The method applies a normal distribution for the prior probability distribution while the posterior distribution is determined using the Markov chain Monte Carlo technique.
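
As a minimal illustration of the estimation approach described above (Bayesian linear regression with normal priors, sampled by MCMC), the following Python sketch uses PyMC with synthetic data; the variable names and dimensions are hypothetical, not the study's actual panel:

```python
import numpy as np
import pymc as pm
import arviz as az

# Synthetic stand-in data: y is the shadow-economy size, X holds standardized
# regressors (e.g. trust, tax morale, unemployment, tax burden, trade openness).
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 5))
y = X @ np.array([-0.4, -0.3, 0.5, 0.6, -0.2]) + rng.normal(scale=0.5, size=100)

with pm.Model():
    beta = pm.Normal("beta", mu=0.0, sigma=10.0, shape=X.shape[1])  # normal priors
    intercept = pm.Normal("intercept", mu=0.0, sigma=10.0)
    sigma = pm.HalfNormal("sigma", sigma=1.0)
    mu = intercept + pm.math.dot(X, beta)
    pm.Normal("y_obs", mu=mu, sigma=sigma, observed=y)
    idata = pm.sample(draws=2000, tune=1000, chains=2, random_seed=0)  # MCMC

print(az.summary(idata, var_names=["beta"]))
```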

Findings

The results indicate that the unemployment rate and tax burden positively affect the size of the shadow economy. By contrast, corruption control and trade openness are negatively associated with the development of this informal sector. Moreover, the paper's primary finding is that social capital, represented by social trust and tax morale, can curb the size of the shadow economy.

Research limitations/implications

This study is limited to the case of the BRICS countries for the period 1995–2014. The determinants of the shadow economy in different groups of countries can be heterogeneous. Moreover, social capital is a multidimensional concept that may consist of various components. The difficulty of measuring social capital calls for further research on the relationship between other dimensions of social capital and the shadow economy.

Originality/value

Many studies investigate the effect of economic factors on the size of the shadow economy. This paper applies a new approach to examine the issue. Notably, the authors use the Bayesian linear regression method to analyze the relationship between social capital and the shadow economy in the BRICS countries.

Details

Asian Journal of Economics and Banking, vol. 5 no. 3
Type: Research Article
ISSN: 2615-9821


Open Access
Article
Publication date: 19 September 2023

Cleyton Farias and Marcelo Silva

The authors explore the hypothesis that some movements in commodity prices are anticipated (news shocks) and can trigger aggregate fluctuations in small open emerging economies…

Abstract

Purpose

The authors explore the hypothesis that some movements in commodity prices are anticipated (news shocks) and can trigger aggregate fluctuations in small open emerging economies. This paper aims to discuss the aforementioned objective.

Design/methodology/approach

The authors build a multi-sector dynamic stochastic general equilibrium model with endogenous commodity production. There are five exogenous processes: a country-specific interest rate shock that responds to commodity price fluctuations, a productivity (TFP) shock for each sector and a commodity price shock. Both TFP and commodity price shocks are composed of unanticipated and anticipated components.
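
A commodity price process with both components is commonly written as below, with news arriving q periods in advance; the paper's exact specification may differ:

```latex
p_t = \rho_p \, p_{t-1} + u_t + v_{t-q}, \qquad
u_t \sim \mathcal{N}(0,\sigma_u^2), \quad v_t \sim \mathcal{N}(0,\sigma_v^2)
% u_t is the unanticipated shock; v_{t-q} is the news shock announced q periods earlier.
```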

Findings

The authors show that news shocks to commodity prices lead to higher output, investment and consumption, and a countercyclical movement in the trade-balance-to-output ratio. The authors also show that commodity price news shocks explain about 24% of output aggregate fluctuations in the small open economy.

Practical implications

Given the importance of both anticipated and unanticipated commodity price shocks, policymakers should pay attention to developments in commodity markets when designing policies to attenuate business cycles. Future research should investigate the design of optimal fiscal and monetary policies in small open economies subject to news shocks in commodity prices.

Originality/value

This paper contributes to the knowledge of the sources of fluctuations in emerging economies highlighting the importance of a new source: news shocks in commodity prices.

Details

EconomiA, vol. 24 no. 2
Type: Research Article
ISSN: 1517-7580


Open Access
Article
Publication date: 17 October 2019

Mahmoud ELsayed and Amr Soliman

The purpose of this study is to estimate the linear regression parameters using two alternative techniques. The first technique is to apply the generalized linear model (GLM) and the…


Abstract

Purpose

The purpose of this study is to estimate the linear regression parameters using two alternative techniques. The first technique is to apply the generalized linear model (GLM) and the second is the Markov chain Monte Carlo (MCMC) method.

Design/methodology/approach

In this paper, the authors adopted the incurred claims of the Egyptian non-life insurance market as the dependent variable over a 10-year period. MCMC uses Gibbs sampling to generate a sample from the posterior distribution of a linear regression to estimate the parameters of interest. The authors used R to estimate the parameters of the linear regression with both techniques.
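
The paper carries out the estimation in R; purely as an illustration of the Gibbs-sampling step described above, the following self-contained Python sketch draws from the posterior of a normal linear regression under conjugate normal and inverse-gamma priors (all hyperparameters and the demo data are placeholders):

```python
import numpy as np

rng = np.random.default_rng(0)

def gibbs_linreg(X, y, n_iter=5000, a0=2.0, d0=1.0):
    """Gibbs sampler for y = X b + e, e ~ N(0, s2 I),
    with b ~ N(0, 100 I) and s2 ~ Inv-Gamma(a0, d0) priors."""
    n, k = X.shape
    B0_inv = np.eye(k) / 100.0
    XtX, Xty = X.T @ X, X.T @ y
    beta, s2 = np.zeros(k), 1.0
    draws_beta, draws_s2 = [], []
    for _ in range(n_iter):
        # beta | s2, y  ~  N(mn, Vn)
        Vn = np.linalg.inv(B0_inv + XtX / s2)
        mn = Vn @ (Xty / s2)
        beta = rng.multivariate_normal(mn, Vn)
        # s2 | beta, y  ~  Inv-Gamma(a0 + n/2, d0 + 0.5 * SSR)
        resid = y - X @ beta
        s2 = 1.0 / rng.gamma(a0 + n / 2.0, 1.0 / (d0 + 0.5 * resid @ resid))
        draws_beta.append(beta)
        draws_s2.append(s2)
    return np.array(draws_beta), np.array(draws_s2)

# Quick demo on synthetic data (illustrative only).
X = np.column_stack([np.ones(200), rng.normal(size=(200, 2))])
y = X @ np.array([1.0, 2.0, -0.5]) + rng.normal(scale=0.3, size=200)
betas, s2s = gibbs_linreg(X, y)
print(betas[1000:].mean(axis=0), s2s[1000:].mean())
```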

Findings

These procedures will guide the decision-maker in estimating the reserve and setting a proper investment strategy.

Originality/value

In this paper, the authors estimate the parameters of a linear regression model using the MCMC method in R. Furthermore, MCMC uses Gibbs sampling to generate a sample from the posterior distribution of a linear regression to estimate parameters and predict future claims. Along the same lines, these procedures will guide the decision-maker in estimating the reserve and setting a proper investment strategy.

Details

Journal of Humanities and Applied Social Sciences, vol. 2 no. 1
Type: Research Article
ISSN: 2632-279X


Open Access
Article
Publication date: 7 August 2019

Markus Neumayer, Thomas Suppan and Thomas Bretterklieber

The application of statistical inversion theory provides a powerful approach for solving estimation problems including the ability for uncertainty quantification (UQ) by means of…

Abstract

Purpose

The application of statistical inversion theory provides a powerful approach for solving estimation problems including the ability for uncertainty quantification (UQ) by means of Markov chain Monte Carlo (MCMC) methods and Monte Carlo integration. This paper aims to analyze the application of a state reduction technique within different MCMC techniques to improve the computational efficiency and the tuning process of these algorithms.

Design/methodology/approach

A reduced state representation is constructed from a general prior distribution. For sampling, the Metropolis-Hastings (MH) algorithm and the Gibbs sampler are used. Efficient proposal generation techniques and techniques for conditional sampling are proposed and evaluated for an exemplary inverse problem.
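
For reference, a minimal random-walk Metropolis-Hastings loop over a reduced coefficient vector looks roughly as follows; the log-posterior, state dimension and step size are placeholders, not the authors' tomography problem:

```python
import numpy as np

rng = np.random.default_rng(1)

def log_post(theta):
    # Placeholder log-posterior over the reduced state; in practice this is
    # the prior plus the likelihood of the inverse problem at hand.
    return -0.5 * np.sum(theta ** 2)

def metropolis_hastings(n_iter=20_000, dim=10, step=0.3):
    theta = np.zeros(dim)
    lp = log_post(theta)
    samples, accepted = [], 0
    for _ in range(n_iter):
        proposal = theta + step * rng.normal(size=dim)  # simple random-walk kernel
        lp_prop = log_post(proposal)
        if np.log(rng.uniform()) < lp_prop - lp:        # MH acceptance test
            theta, lp = proposal, lp_prop
            accepted += 1
        samples.append(theta.copy())
    return np.array(samples), accepted / n_iter

samples, acc_rate = metropolis_hastings()
print(f"acceptance rate: {acc_rate:.2f}")
```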

Findings

For the MH-algorithm, high acceptance rates can be obtained with a simple proposal kernel. For the Gibbs sampler, an efficient technique for conditional sampling was found. The state reduction scheme stabilizes the ill-posed inverse problem, allowing a solution without a dedicated prior distribution. The state reduction is suitable to represent general material distributions.

Practical implications

The state reduction scheme and the MCMC techniques can be applied in different imaging problems. The stabilizing nature of the state reduction improves the solution of ill-posed problems. The tuning of the MCMC methods is simplified.

Originality/value

The paper presents a method to improve the solution process of inverse problems within the Bayesian framework. The stabilization of the inverse problem due to the state reduction improves the solution. The approach simplifies the tuning of MCMC methods.

Details

COMPEL: The International Journal for Computation and Mathematics in Electrical and Electronic Engineering, vol. 38 no. 5
Type: Research Article
ISSN: 0332-1649


Open Access
Article
Publication date: 24 March 2021

Ilenia Confente, Ivan Russo, Simone Peinkofer and Robert Frankel

While remanufactured products represent an increasingly researched phenomenon in the literature, not much is known about consumers' understanding and acceptance of such products…


Abstract

Purpose

While remanufactured products represent an increasingly researched phenomenon in the literature, not much is known about consumers' understanding and acceptance of such products. This study explores this issue in the context of the theory of perceived risk (TPR), investigating return policy leniency and distribution channel choice as potential factors to foster remanufactured products' sales.

Design/methodology/approach

This research utilizes an experimental design composed of a pre-test and a scenario-based main experiment to explore how return policy leniency might mitigate consumers' perceived risk and how their related purchase intention differs across two types of retail distribution channel structures (i.e. brick-and-mortar vs. online).

Findings

The investigation into the efficacy of return policy leniency within two retail distribution channel settings (i.e. brick-and-mortar vs. online) illustrates that providing a lenient return policy is an effective “cue” in increasing consumer purchase intention for remanufactured products. While prior literature has established that consumers value return policy leniency for new products, the authors provide empirical evidence that this preference also applies to remanufactured products. Notably, that return policy preference holds true in both channel settings (i.e. brick-and-mortar vs. online) under consideration. Additionally, and contrary to the authors’ predictions, consumers perceived remanufactured products sold via both channel settings as equally risky, thus highlighting that both are appropriate distribution channels for remanufactured products. Finally, while research on new products provides some initial guidance on consumer perceptions of quality and risk, the study provides empirical evidence into the difference of perceived risk with regard to new versus remanufactured products.

Originality/value

By employing the TPR, this research explored the role played by two supply chain management-related factors (return policy and channel structure) in reducing consumers' perceived risk and increasing purchase intention. In doing so, this study answers the call for more consumer-based supply chain management research in a controlled experimental research setting.

Details

International Journal of Physical Distribution & Logistics Management, vol. 51 no. 4
Type: Research Article
ISSN: 0960-0035


Open Access
Article
Publication date: 7 September 2015

Hubert Zangl and Stephan Mühlbacher-Karrer

The purpose of this paper is to reduce the artifacts in fast Bayesian reconstruction images in electrical tomography. This is in particular important with respect to object…


Abstract

Purpose

The purpose of this paper is to reduce the artifacts in fast Bayesian reconstruction images in electrical tomography. This is particularly important with respect to object detection in electrical tomography applications.

Design/methodology/approach

The authors suggest applying the Box-Cox transformation in Bayesian linear minimum mean square error (BMMSE) reconstruction to better accommodate the non-linear relation between the capacitance matrix and the permittivity distribution. The authors compare the results of the original algorithm with the modified algorithm and with the ground truth in both simulation and experiments.
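
For context, the Box-Cox transformation and a linear minimum mean square error estimate built on the transformed data can be written as follows; the notation is ours and the implementation details in the paper may differ:

```latex
% Box-Cox transformation of the measured capacitances c
z = \begin{cases} \dfrac{c^{\lambda}-1}{\lambda}, & \lambda \neq 0 \\[4pt] \ln c, & \lambda = 0 \end{cases}

% Linear MMSE (Bayesian) reconstruction of the permittivity distribution x from z
\hat{x} = \mu_x + \Gamma_{xz}\,\Gamma_{zz}^{-1}\,(z - \mu_z)
```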

Findings

The results show a reduction of 50 percent in the mean square error caused by artifacts in low-permittivity regions. Furthermore, the algorithm does not increase the computational complexity significantly, so the hard real-time constraints can still be met. The authors demonstrate that the algorithm also works with limited observation angles. This allows for object detection in real time, e.g. in robot collision avoidance.

Originality/value

This paper shows that the extension of BMMSE by applying the Box-Cox transformation leads to a significant improvement of the quality of the reconstruction image while hard real time constraints are still met.

Details

COMPEL: The International Journal for Computation and Mathematics in Electrical and Electronic Engineering, vol. 34 no. 5
Type: Research Article
ISSN: 0332-1649


Open Access
Article
Publication date: 8 December 2022

James Christopher Westland

This paper tests whether Bayesian A/B testing yields better decisions than traditional Neyman-Pearson hypothesis testing. It proposes a model and tests it using a large, multiyear…


Abstract

Purpose

This paper tests whether Bayesian A/B testing yields better decisions than traditional Neyman-Pearson hypothesis testing. It proposes a model and tests it using a large, multiyear Google Analytics (GA) dataset.

Design/methodology/approach

This paper is an empirical study. Competing A/B testing models were used to analyze a large, multiyear GA dataset for a firm that relies entirely on its website and online transactions for customer engagement and sales.
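
The paper's competing models are fitted to GA metrics; purely as a minimal illustration of the Bayesian A/B logic, the sketch below compares two website variants using hypothetical conversion counts and Beta-Binomial posteriors, which is far simpler than the authors' full model:

```python
import numpy as np

rng = np.random.default_rng(7)

# Hypothetical A/B data: visitors and conversions for two website variants.
visitors_a, conversions_a = 10_000, 410
visitors_b, conversions_b = 10_000, 465

# Beta(1, 1) priors; Beta-Binomial conjugacy gives the posteriors in closed form.
post_a = rng.beta(1 + conversions_a, 1 + visitors_a - conversions_a, size=100_000)
post_b = rng.beta(1 + conversions_b, 1 + visitors_b - conversions_b, size=100_000)

lift = post_b - post_a
print("P(B beats A):", (lift > 0).mean())
print("95% credible interval for the lift:", np.percentile(lift, [2.5, 97.5]))
```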

Findings

Bayesian A/B tests of the data not only yielded a clear delineation of the timing and impact of the intellectual property fraud, but calculated the loss of sales dollars, traffic and time on the firm’s website, with precise confidence limits. Frequentist A/B testing identified fraud in bounce rate at 5% significance, and bounces at 10% significance, but was unable to ascertain fraud at the standard significance cutoffs for scientific studies.

Research limitations/implications

None within the scope of the research plan.

Practical implications

Bayesian A/B tests of the data not only yielded a clear delineation of the timing and impact of the IP fraud, but calculated the loss of sales dollars, traffic and time on the firm’s website, with precise confidence limits.

Social implications

Bayesian A/B testing can derive economically meaningful statistics, whereas frequentist A/B testing only provides p-values, whose meaning may be hard to grasp and whose misuse is widespread and has been a major topic in metascience. While misuse of p-values in scholarly articles may simply be grist for academic debate, the uncertainty surrounding the meaning of p-values in business analytics can actually cost firms money.

Originality/value

There is very little empirical research in e-commerce that uses Bayesian A/B testing. Almost all corporate testing is done via frequentist Neyman-Pearson methods.

Details

Journal of Electronic Business & Digital Economics, vol. 1 no. 1/2
Type: Research Article
ISSN: 2754-4214
