Search results

1 – 10 of 833
Book part
Publication date: 1 January 2008

Gary Koop, Roberto Leon-Gonzalez and Rodney Strachan

Abstract

This paper develops methods of Bayesian inference in a cointegrating panel data model. This model involves each cross-sectional unit having a vector error correction representation. It is flexible in the sense that different cross-sectional units can have different cointegration ranks and cointegration spaces. Furthermore, the parameters that characterize short-run dynamics and deterministic components are allowed to vary over cross-sectional units. In addition to a noninformative prior, we introduce an informative prior which allows for information about the likely location of the cointegration space and about the degree of similarity in coefficients in different cross-sectional units. A collapsed Gibbs sampling algorithm is developed which allows for efficient posterior inference. Our methods are illustrated using real and artificial data.
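
A generic per-unit vector error correction representation consistent with this description (the notation here is illustrative rather than the authors' exact parameterization) is

    \Delta y_{i,t} = \Pi_i y_{i,t-1} + \sum_{l=1}^{p_i} \Gamma_{i,l} \Delta y_{i,t-l} + \Phi_i d_t + \varepsilon_{i,t},
    \qquad \Pi_i = \alpha_i \beta_i', \qquad \mathrm{rank}(\Pi_i) = r_i,

where i indexes cross-sectional units, r_i is the unit-specific cointegration rank, the space spanned by \beta_i is the unit-specific cointegration space, and \Gamma_{i,l} and \Phi_i capture the unit-specific short-run dynamics and deterministic terms. The informative prior described above concerns the likely location of the \beta_i and the degree of similarity of these coefficients across units.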

Details

Bayesian Econometrics
Type: Book
ISBN: 978-1-84855-308-8

Abstract

This article surveys recent developments in the evaluation of point and density forecasts in the context of forecasts made by vector autoregressions. Specific emphasis is placed on highlighting those parts of the existing literature that are applicable to direct multistep forecasts and those parts that are applicable to iterated multistep forecasts. This literature includes advancements in the evaluation of forecasts in population (based on true, unknown model coefficients) and the evaluation of forecasts in the finite sample (based on estimated model coefficients). The article then examines in Monte Carlo experiments the finite-sample properties of some tests of equal forecast accuracy, focusing on the comparison of VAR forecasts to AR forecasts. These experiments show the tests to behave as should be expected given the theory. For example, using critical values obtained by bootstrap methods, tests of equal accuracy in population have empirical size about equal to nominal size.
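
As a rough illustration of the kind of equal-accuracy comparison discussed here (the article covers several tests; this sketch shows only a Diebold-Mariano-type statistic on squared-error loss, and the function and variable names are not from the article):

    import numpy as np

    def equal_accuracy_stat(e_var, e_ar, h=1):
        """Diebold-Mariano-type statistic for equal squared-error forecast loss.

        e_var, e_ar : arrays of h-step-ahead forecast errors from the VAR and the AR.
        The long-run variance uses a simple rectangular window with h-1 autocovariances.
        """
        d = e_var ** 2 - e_ar ** 2                      # loss differential
        dbar, T = d.mean(), d.size
        var = np.mean((d - dbar) ** 2)
        for j in range(1, h):                           # HAC-type correction for h > 1
            cov_j = np.mean((d[j:] - dbar) * (d[:-j] - dbar))
            var += 2.0 * cov_j
        return dbar / np.sqrt(var / T)

In line with the Monte Carlo evidence summarized above, comparing a VAR with a nested AR would typically call for bootstrap rather than standard normal critical values.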

Details

VAR Models in Macroeconomics – New Developments and Applications: Essays in Honor of Christopher A. Sims
Type: Book
ISBN: 978-1-78190-752-8

Book part
Publication date: 18 October 2019

Justin L. Tobias and Joshua C. C. Chan

Abstract

We present a new procedure for nonparametric Bayesian estimation of regression functions. Specifically, our method makes use of an idea described in Frühwirth-Schnatter and Wagner (2010) to impose linearity exactly (conditional upon an unobserved binary indicator), yet also permits departures from linearity while imposing smoothness of the regression curves. An advantage of this approach is that the posterior probability of linearity is essentially produced as a by-product of the procedure. We apply our methods in both generated data experiments as well as in an illustrative application involving the impact of body mass index (BMI) on labor market earnings.
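
A minimal sketch of how the posterior probability of linearity emerges as a by-product, assuming a Gibbs sampler that draws the unobserved binary indicator at each sweep (the array name and placeholder draws below are purely illustrative):

    import numpy as np

    # Hypothetical output of the Gibbs sampler: one draw per sweep of the unobserved
    # binary indicator, equal to 1 when the sweep imposes exact linearity and 0 when
    # it permits a smooth departure from linearity.
    delta_draws = np.random.binomial(1, 0.7, size=5000)   # placeholder, not real output

    # The posterior probability of an exactly linear regression function is just the
    # Monte Carlo average of these draws, the by-product referred to above.
    print("P(linear | data):", delta_draws.mean())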

Details

Topics in Identification, Limited Dependent Variables, Partial Observability, Experimentation, and Flexible Modeling: Part B
Type: Book
ISBN: 978-1-83867-419-9

Book part
Publication date: 29 February 2008

Francesco Ravazzolo, Richard Paap, Dick van Dijk and Philip Hans Franses

Abstract

This chapter develops a return forecasting methodology that allows for instability in the relationship between stock returns and predictor variables, model uncertainty, and parameter estimation uncertainty. The predictive regression specification that is put forward allows for occasional structural breaks of random magnitude in the regression parameters, uncertainty about the inclusion of forecasting variables, and uncertainty about parameter values by employing Bayesian model averaging. The implications of these three sources of uncertainty and their relative importance are investigated from an active investment management perspective. It is found that the economic value of incorporating all three sources of uncertainty is considerable. A typical investor would be willing to pay up to several hundred basis points annually to switch from a passive buy-and-hold strategy to an active strategy based on a return forecasting model that allows for model and parameter uncertainty as well as structural breaks in the regression parameters.
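
A minimal sketch of the model-averaging step, assuming each candidate predictive regression supplies a point forecast and a log marginal likelihood (names are illustrative, and the chapter's actual procedure also averages over break processes and parameter draws):

    import numpy as np

    def bma_forecast(point_forecasts, log_marg_lik):
        """Average model-specific return forecasts using posterior model probabilities.

        point_forecasts : one next-period return forecast per candidate regression
        log_marg_lik    : the corresponding log marginal likelihoods (equal prior
                          model probabilities are assumed here for simplicity)
        """
        w = np.exp(log_marg_lik - np.max(log_marg_lik))   # numerically stable weights
        w /= w.sum()
        return float(w @ point_forecasts)

    # Hypothetical example with three candidate predictor sets
    print(bma_forecast(np.array([0.4, 0.9, 0.2]), np.array([-102.3, -101.1, -104.8])))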

Details

Forecasting in the Presence of Structural Breaks and Model Uncertainty
Type: Book
ISBN: 978-1-84950-540-6

Book part
Publication date: 16 September 2022

Luis Uzeda

Abstract

This chapter investigates the impact of different state correlation assumptions on the out-of-sample performance of unobserved components (UC) models with stochastic volatility. Using several measures of US inflation, the author finds that allowing for correlation between inflation’s trend and cyclical (or gap) components is a useful feature for predicting inflation in the short run. In contrast, orthogonality between such components improves out-of-sample performance as the forecasting horizon widens. Accordingly, trend inflation from orthogonal trend-gap UC models closely tracks survey-based measures of long-run inflation expectations, while trend dynamics in the correlated-component case behave similarly to survey-based nowcasts. To carry out estimation, an efficient algorithm that builds upon properties of Toeplitz matrices and recent advances in precision-based samplers is provided.
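
A generic trend-plus-gap unobserved components specification with stochastic volatility of the kind described here (the gap's AR order and the laws of motion for the log-volatilities are illustrative choices, not necessarily the chapter's):

    \pi_t = \tau_t + c_t,
    \qquad \tau_t = \tau_{t-1} + \varepsilon^{\tau}_t, \quad \varepsilon^{\tau}_t \sim N(0, e^{h^{\tau}_t}),
    \qquad c_t = \phi_1 c_{t-1} + \phi_2 c_{t-2} + \varepsilon^{c}_t, \quad \varepsilon^{c}_t \sim N(0, e^{h^{c}_t}),
    \qquad \mathrm{corr}(\varepsilon^{\tau}_t, \varepsilon^{c}_t) = \rho,

with \rho = 0 giving the orthogonal trend-gap specification and \rho \neq 0 the correlated-component case compared above; the log-volatilities h^{\tau}_t and h^{c}_t follow their own stochastic processes (for example, random walks).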

Details

Essays in Honour of Fabio Canova
Type: Book
ISBN: 978-1-80382-636-3

Book part
Publication date: 19 November 2014

Garland Durham and John Geweke

Abstract

Massively parallel desktop computing capabilities, now well within the reach of individual academics, modify the environment for posterior simulation in fundamental and potentially quite advantageous ways. But to fully exploit these benefits, algorithms that conform to parallel computing environments are needed. This paper presents a sequential posterior simulator designed to operate efficiently in this context. The simulator makes fewer analytical and programming demands on investigators, and is faster, more reliable, and more complete than conventional posterior simulators. The paper extends existing sequential Monte Carlo methods and theory to provide a thorough and practical foundation for sequential posterior simulation that is well suited to massively parallel computing environments. It provides detailed recommendations on implementation, yielding an algorithm that requires only code for simulation from the prior and evaluation of prior and data densities, and that works well in a variety of applications representative of serious empirical work in economics and finance. The algorithm facilitates Bayesian model comparison by producing marginal likelihood approximations of unprecedented accuracy as an incidental by-product, is robust to pathological posterior distributions, and intrinsically provides estimates of numerical standard error and relative numerical efficiency. The paper concludes with an application that illustrates the potential of these simulators for applied Bayesian inference.
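
A stripped-down sketch of a sequential (data-tempered) posterior simulator in this spirit, assuming the user supplies only prior simulation and density evaluation code as emphasized above (all function names are hypothetical, and the paper's actual algorithm adds parallel particle groups, adaptive tuning, and numerical-accuracy diagnostics):

    import numpy as np

    def sequential_posterior_simulator(prior_sim, log_prior, log_lik_t, T, N=4096, rw_scale=0.1, seed=0):
        # prior_sim(N)        -> (N, k) array of draws from the prior
        # log_prior(theta)    -> (N,)  log prior density at each particle
        # log_lik_t(theta, t) -> (N,)  log density of observation t at each particle
        rng = np.random.default_rng(seed)
        theta = prior_sim(N)
        cum_ll = np.zeros(N)                 # log-likelihood of observations seen so far
        log_ml = 0.0                         # marginal likelihood accumulates as a by-product
        for t in range(T):
            incr = log_lik_t(theta, t)       # incremental weights from observation t
            m = incr.max()
            log_ml += m + np.log(np.mean(np.exp(incr - m)))
            w = np.exp(incr - m)
            idx = rng.choice(N, size=N, p=w / w.sum())      # resample (selection phase)
            theta, cum_ll = theta[idx], (cum_ll + incr)[idx]
            # one random-walk Metropolis rejuvenation sweep (mutation phase)
            prop = theta + rw_scale * rng.standard_normal(theta.shape)
            prop_ll = sum(log_lik_t(prop, s) for s in range(t + 1))
            log_acc = log_prior(prop) + prop_ll - log_prior(theta) - cum_ll
            accept = np.log(rng.random(N)) < log_acc
            theta[accept] = prop[accept]
            cum_ll[accept] = prop_ll[accept]
        return theta, log_ml

The diagnostics highlighted in the abstract (numerical standard errors, relative numerical efficiency) are omitted from this sketch.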

Book part
Publication date: 19 November 2014

Enrique Martínez-García and Mark A. Wynne

Abstract

We investigate the Bayesian approach to model comparison within a two-country framework with nominal rigidities, using the workhorse New Keynesian open-economy model of Martínez-García and Wynne (2010). We discuss the trade-offs that monetary policy – characterized by a Taylor-type rule – faces in an interconnected world with perfectly flexible exchange rates. We then use posterior model probabilities to evaluate the weight of evidence in support of such a model when estimated against more parsimonious specifications that either abstract from monetary frictions or assume autarky, by means of controlled experiments that employ simulated data. We argue that Bayesian model comparison with posterior odds is sensitive to sample size and the choice of observable variables for estimation. We show that posterior model probabilities strongly penalize overfitting, which can lead us to favor a less parameterized model over the true data-generating process when the two become arbitrarily close to each other. We also illustrate that the spillovers from monetary policy across countries have an added confounding effect.
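
For reference, the posterior model probabilities and posterior odds used in this kind of comparison take the standard form (M_j denotes a candidate model and y the simulated data set):

    P(M_j \mid y) = \frac{p(y \mid M_j) P(M_j)}{\sum_k p(y \mid M_k) P(M_k)},
    \qquad \frac{P(M_i \mid y)}{P(M_j \mid y)} = \frac{p(y \mid M_i)}{p(y \mid M_j)} \times \frac{P(M_i)}{P(M_j)},

so the marginal likelihoods p(y \mid M_j), which integrate the likelihood over the prior, are what impose the penalty for overfitting discussed above.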

Book part
Publication date: 21 February 2008

Mingliang Li and Justin L. Tobias

Abstract

We describe a new Bayesian estimation algorithm for fitting a binary treatment, ordered outcome selection model in a potential outcomes framework. We show how recent advances in simulation methods – namely data augmentation, the Gibbs sampler, and the Metropolis-Hastings algorithm – can be used to fit this model efficiently, and we also introduce a reparameterization to help accelerate the convergence of our posterior simulator. Conventional “treatment effects” such as the Average Treatment Effect (ATE), the effect of treatment on the treated (TT), and the Local Average Treatment Effect (LATE) are adapted for this specific model, and Bayesian strategies for calculating these treatment effects are introduced. Finally, we review how one can potentially learn (or at least bound) the non-identified cross-regime correlation parameter and use this learning to calculate (or bound) parameters of interest beyond mean treatment effects.
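
In generic potential outcomes notation, with Y(1) and Y(0) the outcomes with and without treatment and D the treatment indicator (the chapter adapts these definitions to its ordered-outcome setting), the three treatment effects mentioned here are

    \mathrm{ATE} = E[Y(1) - Y(0)], \qquad
    \mathrm{TT} = E[Y(1) - Y(0) \mid D = 1], \qquad
    \mathrm{LATE} = E[Y(1) - Y(0) \mid \text{compliers}],

where the compliers are the units whose treatment status is shifted by the instrument; Bayesian versions of these quantities are averages over posterior draws of the model parameters.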

Details

Modelling and Evaluating Treatment Effects in Econometrics
Type: Book
ISBN: 978-0-7623-1380-8

Book part
Publication date: 30 August 2019

Gary J. Cornwall, Jeffrey A. Mills, Beau A. Sauley and Huibin Weng

Abstract

This chapter develops a predictive approach to Granger causality (GC) testing that utilizes k-fold cross-validation and posterior simulation to perform out-of-sample testing. A Monte Carlo study indicates that the cross-validation predictive procedure has improved power in comparison to previously available out-of-sample testing procedures, matching the performance of the in-sample F-test while retaining the credibility of post-sample inference. An empirical application to the Phillips curve is provided, evaluating the evidence on GC between inflation and unemployment rates.
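
A rough sketch of the cross-validated out-of-sample design, using plain OLS and squared-error loss purely for illustration (the chapter's test combines this design with posterior simulation, and all names below are hypothetical):

    import numpy as np

    def cv_granger_loss(y, x, p=2, k=5):
        """k-fold cross-validated out-of-sample comparison for Granger causality.

        Fits a restricted model (lags of y only) and an unrestricted model (lags of
        y and of x) by OLS on each training split and accumulates squared prediction
        errors on the held-out fold.
        """
        T = len(y)
        Y = y[p:]
        Zr = np.column_stack([np.ones(T - p)] + [y[p - j:T - j] for j in range(1, p + 1)])
        Zu = np.column_stack([Zr] + [x[p - j:T - j] for j in range(1, p + 1)])
        folds = np.array_split(np.arange(T - p), k)
        loss_r = loss_u = 0.0
        for test in folds:
            train = np.setdiff1d(np.arange(T - p), test)
            br, _, _, _ = np.linalg.lstsq(Zr[train], Y[train], rcond=None)
            bu, _, _, _ = np.linalg.lstsq(Zu[train], Y[train], rcond=None)
            loss_r += np.sum((Y[test] - Zr[test] @ br) ** 2)
            loss_u += np.sum((Y[test] - Zu[test] @ bu) ** 2)
        return loss_r, loss_u   # markedly lower loss_u suggests x helps predict y

A markedly lower held-out loss for the unrestricted model is evidence that x improves out-of-sample prediction of y, which is the predictive notion of Granger causality used here.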

Details

Topics in Identification, Limited Dependent Variables, Partial Observability, Experimentation, and Flexible Modeling: Part A
Type: Book
ISBN: 978-1-78973-241-2

Book part
Publication date: 30 August 2019

Zhe Yu, Raquel Prado, Steve C. Cramer, Erin B. Quinlan and Hernando Ombao

Abstract

We develop a Bayesian approach for modeling brain activation and connectivity from functional magnetic resonance image (fMRI) data. Our approach simultaneously estimates local hemodynamic response functions (HRFs) and activation parameters, as well as global effective and functional connectivity parameters. Existing methods assume identical HRFs across brain regions, which may lead to erroneous conclusions in inferring activation and connectivity patterns. Our approach addresses this limitation by estimating region-specific HRFs. Additionally, it enables neuroscientists to compare effective connectivity networks for different experimental conditions. Furthermore, the use of spike and slab priors on the connectivity parameters allows us to directly select significant effective connectivities in a given network.

We include a simulation study demonstrating that, compared to the standard generalized linear model (GLM) approach, our model generally has higher power and lower type I error and bias, and that it is also able to capture condition-specific connectivities. We applied our approach to a dataset from a stroke study and found different effective connectivity patterns for task and rest conditions in certain brain regions of interest (ROIs).
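
A generic spike and slab prior of the kind described above for an effective connectivity coefficient A_{jk} linking region k to region j (the notation and the Gaussian slab are illustrative):

    A_{jk} \mid \gamma_{jk} \sim (1 - \gamma_{jk}) \, \delta_0 + \gamma_{jk} \, N(0, \tau^2),
    \qquad \gamma_{jk} \sim \mathrm{Bernoulli}(\pi),

so the posterior inclusion probability P(\gamma_{jk} = 1 \mid \text{data}) directly indicates which effective connectivities are significant in a given network.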

Details

Topics in Identification, Limited Dependent Variables, Partial Observability, Experimentation, and Flexible Modeling: Part A
Type: Book
ISBN: 978-1-78973-241-2
