Search results

1 – 10 of 512
Book part
Publication date: 27 June 2023

Richa Srivastava and M A Sanjeev

Abstract

Several inferential procedures are advocated in the literature. The most commonly used techniques are the frequentist and the Bayesian inferential procedures. Bayesian methods afford inferences based on small data sets and are especially useful in studies with limited data availability. Bayesian approaches also help incorporate prior knowledge, especially subjective knowledge, into predictions. Considering the increasing difficulty of data acquisition, the application of Bayesian techniques can be hugely beneficial to managers, especially in analysing limited-data situations such as studies of expert opinion. Another factor that has constrained the broader application of Bayesian statistics in business is its computational power requirements and the availability of appropriate analytical tools. However, with the increase in computational power and connectivity and the development of appropriate software, Bayesian applications have become more attractive. This chapter attempts to unravel the applications of the Bayesian inferential procedure in marketing management.
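
The prior-plus-small-sample logic the abstract describes can be sketched with a conjugate Beta-binomial update. This is an illustrative example, not code from the chapter; the expert prior and the campaign-response numbers are invented.

```python
def beta_binomial_update(alpha, beta, successes, failures):
    """Conjugate update: Beta(alpha, beta) prior + binomial data -> Beta posterior."""
    return alpha + successes, beta + failures

def beta_mean(alpha, beta):
    """Mean of a Beta(alpha, beta) distribution."""
    return alpha / (alpha + beta)

# Hypothetical expert prior: roughly 70% of customers respond (Beta(7, 3)).
a0, b0 = 7.0, 3.0
# A small pilot data set: 4 responses out of 10 customers.
a1, b1 = beta_binomial_update(a0, b0, successes=4, failures=6)

print(beta_mean(a0, b0))   # prior mean 0.7
print(beta_mean(a1, b1))   # posterior mean 11/20 = 0.55
```

Even with only ten observations, the posterior blends the subjective prior with the data rather than relying on the data alone, which is the appeal for limited-data managerial settings.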

Book part
Publication date: 1 January 2008

Arnold Zellner

Abstract

After briefly reviewing the past history of Bayesian econometrics and Alan Greenspan's (2004) recent description of his use of Bayesian methods in managing policy-making risk, some of the issues and needs that he mentions are discussed and linked to past and present Bayesian econometric research. Then a review of some recent Bayesian econometric research and needs is presented. Finally, some thoughts are presented that relate to the future of Bayesian econometrics.

Details

Bayesian Econometrics
Type: Book
ISBN: 978-1-84855-308-8

Book part
Publication date: 18 October 2019

Edward George, Purushottam Laud, Brent Logan, Robert McCulloch and Rodney Sparapani

Abstract

Bayesian additive regression trees (BART) is a fully Bayesian approach to modeling with ensembles of trees. BART can uncover complex regression functions with high-dimensional regressors in a fairly automatic way and provide Bayesian quantification of the uncertainty through the posterior. However, BART assumes independent and identically distributed (i.i.d.) normal errors. This strong parametric assumption can lead to misleading inference and uncertainty quantification. In this chapter we use the classic Dirichlet process mixture (DPM) mechanism to nonparametrically model the error distribution. A key strength of BART is that default prior settings work reasonably well in a variety of problems. The challenge in extending BART is to choose the parameters of the DPM so that the strengths of the standard BART approach are not lost when the errors are close to normal, while the DPM retains the ability to adapt to non-normal errors.
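
As a rough illustration of the DPM mechanism the chapter builds on, the sketch below draws non-normal errors from a truncated stick-breaking Dirichlet process mixture of normals. The base-measure choices, truncation level, and concentration parameter are illustrative assumptions, not the authors' settings.

```python
import numpy as np

def stick_breaking_weights(concentration, n_atoms, rng):
    """Truncated stick-breaking construction of Dirichlet process mixture weights."""
    betas = rng.beta(1.0, concentration, size=n_atoms)
    remaining = np.concatenate([[1.0], np.cumprod(1.0 - betas[:-1])])
    return betas * remaining

rng = np.random.default_rng(0)
w = stick_breaking_weights(concentration=1.0, n_atoms=50, rng=rng)
mu = rng.normal(0.0, 2.0, size=50)      # component means drawn from a normal base measure
sigma = rng.gamma(2.0, 0.5, size=50)    # component scales (illustrative base measure)

# Errors drawn from the resulting mixture can be skewed, multimodal, or heavy-tailed,
# unlike the i.i.d. normal errors standard BART assumes.
comp = rng.choice(50, size=1000, p=w / w.sum())
errors = rng.normal(mu[comp], sigma[comp])
```

With the concentration parameter small and the base measure tight around a single normal, the mixture collapses toward normal errors, which is the adaptivity trade-off the abstract describes.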

Details

Topics in Identification, Limited Dependent Variables, Partial Observability, Experimentation, and Flexible Modeling: Part B
Type: Book
ISBN: 978-1-83867-419-9

Details

Functional Structure and Approximation in Econometrics
Type: Book
ISBN: 978-0-44450-861-4

Book part
Publication date: 22 November 2012

Enrique Martínez-García, Diego Vilán and Mark A. Wynne

Abstract

Open-economy models are central to the discussion of the trade-offs monetary policy faces in an increasingly globalized world (e.g., Martínez-García & Wynne, 2010), but bringing them to the data is not without its challenges. Controlling for misspecification bias, we trace the problem of uncertainty surrounding structural parameter estimation in the context of a fully specified New Open Economy Macro (NOEM) model partly to sample size. We suggest that standard macroeconomic time series with a coverage of less than forty years may not be informative enough for some parameters of interest to be recovered with precision. We also illustrate how uncertainty arises from weak structural identification, irrespective of the sample size. This remains a concern for empirical research, and we recommend estimation with simulated observations before using actual data as a way of detecting structural parameters that are prone to weak identification. We also recommend careful evaluation and documentation of the implementation strategy (especially the selection of observables), as it can have significant effects on the strength of identification of key model parameters.
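
The recommendation to estimate on simulated observations first can be illustrated with a deliberately under-identified toy model (unrelated to the NOEM model itself): when two parameters enter only through their sum, simulated data fit equally well at very different parameter values, and no sample size fixes that.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 200
x = rng.normal(size=n)
theta1_true, theta2_true = 0.4, 0.6          # only their sum enters the model
y = (theta1_true + theta2_true) * x + 0.1 * rng.normal(size=n)

def sse(theta1, theta2):
    """Sum of squared errors of the fitted model at (theta1, theta2)."""
    return float(np.sum((y - (theta1 + theta2) * x) ** 2))

# Two very different parameter points with the same sum fit identically:
print(sse(0.4, 0.6) == sse(-5.0, 6.0))   # theta1, theta2 are not separately identified
```

Running such a check on simulated data before touching actual series flags which structural parameters the likelihood cannot pin down, which is the diagnostic the abstract advocates.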

Details

DSGE Models in Macroeconomics: Estimation, Evaluation, and New Developments
Type: Book
ISBN: 978-1-78190-305-6

Book part
Publication date: 1 January 2008

Gary Koop

Abstract

Equilibrium job search models allow for labor markets with homogeneous workers and firms to yield nondegenerate wage densities. However, the resulting wage densities do not accord well with empirical regularities. Accordingly, many extensions to the basic equilibrium search model have been considered (e.g., heterogeneity in productivity, heterogeneity in the value of leisure, etc.). It is increasingly common to use nonparametric forms for these extensions and, hence, researchers can obtain a perfect fit (in a kernel smoothed sense) between theoretical and empirical wage densities. This makes it difficult to carry out model comparison of different model extensions. In this paper, we first develop Bayesian parametric and nonparametric methods which are comparable to the existing non-Bayesian literature. We then show how Bayesian methods can be used to compare various nonparametric equilibrium search models in a statistically rigorous sense.

Details

Bayesian Econometrics
Type: Book
ISBN: 978-1-84855-308-8

Book part
Publication date: 1 January 2008

Michiel de Pooter, Francesco Ravazzolo, Rene Segers and Herman K. van Dijk

Abstract

Several lessons learnt from a Bayesian analysis of basic macroeconomic time-series models are presented for the situation where some model parameters have substantial posterior probability near the boundary of the parameter region. This feature refers to near-instability within dynamic models, to forecasting with near-random walk models and to clustering of several economic series in a small number of groups within a data panel. Two canonical models are used: a linear regression model with autocorrelation and a simple variance components model. Several well-known time-series models like unit root and error correction models and further state space and panel data models are shown to be simple generalizations of these two canonical models for the purpose of posterior inference. A Bayesian model averaging procedure is presented in order to deal with models with substantial probability both near and at the boundary of the parameter region. Analytical, graphical, and empirical results using U.S. macroeconomic data, in particular on GDP growth, are presented.
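
A minimal sketch of the model-averaging step, assuming hypothetical log marginal likelihoods for a stationary model and a boundary (random-walk) model; the numbers and forecasts are invented for illustration and computed in log space for numerical stability.

```python
import math

def bma_weights(log_marglik, prior_probs):
    """Posterior model probabilities from log marginal likelihoods (log-sum-exp)."""
    logs = [math.log(p) + l for p, l in zip(prior_probs, log_marglik)]
    m = max(logs)                              # subtract the max to avoid underflow
    unnorm = [math.exp(s - m) for s in logs]
    total = sum(unnorm)
    return [u / total for u in unnorm]

# Illustrative numbers: stationary AR(1) vs. random-walk (boundary) model.
log_ml = [-120.3, -121.0]
weights = bma_weights(log_ml, prior_probs=[0.5, 0.5])

# The averaged forecast mixes both models instead of discarding the boundary one.
forecast = weights[0] * 2.1 + weights[1] * 1.8
```

Because the weights come from marginal likelihoods, a model sitting at the boundary of the parameter region keeps nonzero weight whenever the data do not decisively rule it out.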

Details

Bayesian Econometrics
Type: Book
ISBN: 978-1-84855-308-8

Book part
Publication date: 18 January 2022

Badi H. Baltagi, Georges Bresson, Anoop Chaturvedi and Guy Lacroix

Abstract

This chapter extends the work of Baltagi, Bresson, Chaturvedi, and Lacroix (2018) to the popular dynamic panel data model. The authors investigate the robustness of Bayesian panel data models to possible misspecification of the prior distribution. The proposed robust Bayesian approach departs from the standard Bayesian framework in two ways. First, the authors consider the ε-contamination class of prior distributions for the model parameters as well as for the individual effects. Second, both the base elicited priors and the ε-contamination priors use Zellner’s (1986) g-priors for the variance–covariance matrices. The authors propose a general “toolbox” for a wide range of specifications which includes the dynamic panel model with random effects, with cross-correlated effects à la Chamberlain, for the Hausman–Taylor world and for dynamic panel data models with homogeneous/heterogeneous slopes and cross-sectional dependence. Using a Monte Carlo simulation study, the authors compare the finite sample properties of the proposed estimator to those of standard classical estimators. The chapter contributes to the dynamic panel data literature by proposing a general robust Bayesian framework which encompasses the conventional frequentist specifications and their associated estimation methods as special cases.
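
A sketch of an ε-contamination prior as a two-component mixture density. The normal base prior and the single diffuse contamination component are simplifying assumptions for illustration; they are not the elicited g-priors or the contamination class used in the chapter.

```python
import math

def normal_pdf(x, mean, sd):
    """Density of a normal distribution at x."""
    return math.exp(-0.5 * ((x - mean) / sd) ** 2) / (sd * math.sqrt(2.0 * math.pi))

def eps_contaminated_prior(theta, eps, base_mean, base_sd, contam_sd):
    """pi(theta) = (1 - eps) * elicited base prior + eps * contamination."""
    base = normal_pdf(theta, base_mean, base_sd)
    contam = normal_pdf(theta, 0.0, contam_sd)   # a diffuse alternative prior
    return (1.0 - eps) * base + eps * contam

# With eps = 0 the prior is exactly the elicited base prior; as eps grows,
# the analyst concedes more possible misspecification of that elicitation.
p0 = eps_contaminated_prior(1.0, eps=0.0, base_mean=1.0, base_sd=0.5, contam_sd=10.0)
```

Robustness then amounts to checking how posterior conclusions move as eps ranges over plausible contamination levels, rather than trusting a single elicited prior.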

Details

Essays in Honor of M. Hashem Pesaran: Panel Modeling, Micro Applications, and Econometric Methodology
Type: Book
ISBN: 978-1-80262-065-8

Abstract

This article surveys recent developments in the evaluation of point and density forecasts in the context of forecasts made by vector autoregressions. Specific emphasis is placed on highlighting those parts of the existing literature that are applicable to direct multistep forecasts and those parts that are applicable to iterated multistep forecasts. This literature includes advancements in the evaluation of forecasts in population (based on true, unknown model coefficients) and the evaluation of forecasts in the finite sample (based on estimated model coefficients). The article then examines in Monte Carlo experiments the finite-sample properties of some tests of equal forecast accuracy, focusing on the comparison of VAR forecasts to AR forecasts. These experiments show the tests to behave as should be expected given the theory. For example, using critical values obtained by bootstrap methods, tests of equal accuracy in population have empirical size about equal to nominal size.
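
A toy equal-accuracy comparison in the spirit of the tests surveyed, computing a Diebold–Mariano-style statistic on squared-error loss differentials. The AR and VAR forecast errors below are made up, and the article's bootstrap critical values and population/finite-sample distinctions are not reproduced here.

```python
import math

def dm_statistic(errors_a, errors_b):
    """Diebold-Mariano-style statistic on squared-error loss differentials.

    Negative values indicate model A's losses are smaller in sample.
    (Serial correlation in the differentials is ignored in this sketch.)
    """
    d = [ea ** 2 - eb ** 2 for ea, eb in zip(errors_a, errors_b)]
    n = len(d)
    mean = sum(d) / n
    var = sum((x - mean) ** 2 for x in d) / (n - 1)
    return mean / math.sqrt(var / n)

# Hypothetical one-step-ahead forecast errors from an AR and a VAR model.
ar_err  = [0.5, -0.3, 0.8, -0.6, 0.4, -0.7, 0.2, 0.9]
var_err = [0.4, -0.2, 0.7, -0.5, 0.3, -0.6, 0.1, 0.8]

stat = dm_statistic(var_err, ar_err)   # negative here: VAR losses smaller in sample
```

In practice the statistic would be compared against bootstrap critical values, as the Monte Carlo experiments in the article do, rather than against a standard normal.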

Details

VAR Models in Macroeconomics – New Developments and Applications: Essays in Honor of Christopher A. Sims
Type: Book
ISBN: 978-1-78190-752-8

Book part
Publication date: 19 November 2014

Garland Durham and John Geweke

Abstract

Massively parallel desktop computing capabilities now well within the reach of individual academics modify the environment for posterior simulation in fundamental and potentially quite advantageous ways. But to fully exploit these benefits, algorithms that conform to parallel computing environments are needed. This paper presents a sequential posterior simulator designed to operate efficiently in this context. The simulator makes fewer analytical and programming demands on investigators, and is faster, more reliable, and more complete than conventional posterior simulators. The paper extends existing sequential Monte Carlo methods and theory to provide a thorough and practical foundation for sequential posterior simulation that is well suited to massively parallel computing environments. It provides detailed recommendations on implementation, yielding an algorithm that requires only code for simulation from the prior and evaluation of prior and data densities and works well in a variety of applications representative of serious empirical work in economics and finance. The algorithm facilitates Bayesian model comparison by producing marginal likelihood approximations of unprecedented accuracy as an incidental by-product, is robust to pathological posterior distributions, and provides estimates of numerical standard error and relative numerical efficiency intrinsically. The paper concludes with an application that illustrates the potential of these simulators for applied Bayesian inference.
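
A stripped-down sequential posterior simulator for a toy normal-mean problem, using only what the paper says the investigator must supply: simulation from the prior and evaluation of densities. Reweighting one observation at a time, an effective-sample-size check, and multinomial resampling stand in for the full algorithm; the prior, data, and particle count are invented.

```python
import numpy as np

def smc_posterior(data, n_particles=5000, seed=0):
    """Sequential posterior simulation for the mean theta of N(theta, 1) data.

    Prior: theta ~ N(0, 10^2). Particles are reweighted one observation at a
    time and resampled when the effective sample size drops below half.
    """
    rng = np.random.default_rng(seed)
    particles = rng.normal(0.0, 10.0, size=n_particles)   # draws from the prior
    logw = np.zeros(n_particles)
    for y in data:
        logw += -0.5 * (y - particles) ** 2               # N(theta, 1) log-likelihood term
        w = np.exp(logw - logw.max())
        w /= w.sum()
        if 1.0 / np.sum(w ** 2) < n_particles / 2:        # effective sample size check
            idx = rng.choice(n_particles, size=n_particles, p=w)
            particles = particles[idx]
            logw = np.zeros(n_particles)
    w = np.exp(logw - logw.max())
    w /= w.sum()
    return float(np.sum(w * particles))                   # posterior mean estimate

data = [1.9, 2.1, 2.0, 1.8, 2.2]
est = smc_posterior(data)   # close to the exact posterior mean, sum(data)/(5 + 0.01)
```

The per-particle reweighting step is embarrassingly parallel, which is the property that makes this family of simulators attractive on massively parallel hardware; the full algorithm adds the move steps, grouping, and accuracy diagnostics this sketch omits.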
