Search results

1 – 10 of over 1000
Book part
Publication date: 19 November 2014

Enrique Martínez-García and Mark A. Wynne

Abstract

We investigate the Bayesian approach to model comparison within a two-country framework with nominal rigidities using the workhorse New Keynesian open-economy model of Martínez-García and Wynne (2010). We discuss the trade-offs that monetary policy – characterized by a Taylor-type rule – faces in an interconnected world, with perfectly flexible exchange rates. We then use posterior model probabilities to evaluate the weight of evidence in support of such a model when estimated against more parsimonious specifications that either abstract from monetary frictions or assume autarky, by means of controlled experiments that employ simulated data. We argue that Bayesian model comparison with posterior odds is sensitive to sample size and the choice of observable variables for estimation. We show that posterior model probabilities strongly penalize overfitting, which can lead us to favor a less parameterized model over the true data-generating process when the two become arbitrarily close to each other. We also illustrate that the spillovers from monetary policy across countries have an added confounding effect.
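
At its core, the model comparison described above turns each candidate model's marginal likelihood and prior model probability into a posterior model probability via Bayes' rule. A minimal sketch of that calculation (the log marginal likelihood values below are placeholders, not estimates from the chapter):

```python
import numpy as np

def posterior_model_probs(log_marginal_likelihoods, prior_probs=None):
    """Convert log marginal likelihoods and prior model probabilities into
    posterior model probabilities, using log-sum-exp for numerical stability."""
    logml = np.asarray(log_marginal_likelihoods, dtype=float)
    if prior_probs is None:
        prior_probs = np.full(logml.shape, 1.0 / logml.size)  # equal prior weights
    log_post = logml + np.log(prior_probs)
    log_post -= log_post.max()          # stabilize before exponentiating
    post = np.exp(log_post)
    return post / post.sum()

# Hypothetical log marginal likelihoods for three competing specifications,
# e.g. the full open-economy model, one without monetary frictions, and autarky.
print(posterior_model_probs([-1523.4, -1538.9, -1561.2]))
```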

Book part
Publication date: 1 January 2008

Michiel de Pooter, Francesco Ravazzolo, Rene Segers and Herman K. van Dijk

Abstract

Several lessons learnt from a Bayesian analysis of basic macroeconomic time-series models are presented for the situation where some model parameters have substantial posterior probability near the boundary of the parameter region. This feature refers to near-instability within dynamic models, to forecasting with near-random walk models and to clustering of several economic series in a small number of groups within a data panel. Two canonical models are used: a linear regression model with autocorrelation and a simple variance components model. Several well-known time-series models like unit root and error correction models and further state space and panel data models are shown to be simple generalizations of these two canonical models for the purpose of posterior inference. A Bayesian model averaging procedure is presented in order to deal with models with substantial probability both near and at the boundary of the parameter region. Analytical, graphical, and empirical results using U.S. macroeconomic data, in particular on GDP growth, are presented.
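
The model-averaging step described above amounts to weighting each candidate model's output by its posterior model probability. A minimal sketch for point forecasts, with invented forecasts and weights rather than the chapter's estimates:

```python
import numpy as np

def bma_forecast(forecasts, posterior_model_probs):
    """Bayesian model averaging of point forecasts: a weighted average with
    weights given by posterior model probabilities."""
    f = np.asarray(forecasts, dtype=float)
    w = np.asarray(posterior_model_probs, dtype=float)
    w = w / w.sum()                     # guard against rounding error in the weights
    return float(np.dot(w, f))

# Hypothetical one-step-ahead GDP growth forecasts from a stationary AR model,
# a near-unit-root model, and a random walk, with illustrative weights.
print(bma_forecast([2.1, 1.8, 1.6], [0.5, 0.3, 0.2]))
```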

Details

Bayesian Econometrics
Type: Book
ISBN: 978-1-84855-308-8

Book part
Publication date: 1 January 2008

Gary Koop, Roberto Leon-Gonzalez and Rodney Strachan

Abstract

This paper develops methods of Bayesian inference in a cointegrating panel data model. This model involves each cross-sectional unit having a vector error correction representation. It is flexible in the sense that different cross-sectional units can have different cointegration ranks and cointegration spaces. Furthermore, the parameters that characterize short-run dynamics and deterministic components are allowed to vary over cross-sectional units. In addition to a noninformative prior, we introduce an informative prior which allows for information about the likely location of the cointegration space and about the degree of similarity in coefficients in different cross-sectional units. A collapsed Gibbs sampling algorithm is developed which allows for efficient posterior inference. Our methods are illustrated using real and artificial data.

Details

Bayesian Econometrics
Type: Book
ISBN: 978-1-84855-308-8

Book part
Publication date: 30 August 2019

Timothy Cogley and Richard Startz

Abstract

Standard estimation of ARMA models in which the AR and MA roots nearly cancel, so that individual coefficients are only weakly identified, often produces inferential ranges for individual coefficients that give a spurious appearance of accuracy. We remedy this problem with a model that uses a simple mixture prior. The posterior mixing probability is derived using Bayesian methods, but we show that the method works well in both Bayesian and frequentist setups. In particular, we show that our mixture procedure weights standard results heavily when given data from a well-identified ARMA model (which does not exhibit near root cancellation) and weights heavily an uninformative inferential region when given data from a weakly-identified ARMA model (with near root cancellation). When our procedure is applied to a well-identified process the investigator gets the “usual results,” so there is no important statistical cost to using our procedure. On the other hand, when our procedure is applied to a weakly identified process, the investigator learns that the data tell us little about the parameters – and is thus protected against making spurious inferences. We recommend that mixture models be computed routinely when inference about ARMA coefficients is of interest.
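
The reported posterior is a two-component mixture: the standard, well-identified result weighted by the posterior mixing probability, and an essentially uninformative component weighted by its complement. A stylized sketch, assuming a normal approximation for the well-identified component and a flat density over the stationary region for the uninformative one (all numbers are placeholders):

```python
import numpy as np

rng = np.random.default_rng(0)

def mixture_posterior_draws(p_identified, mean, se, n=100_000):
    """Draws from a two-component mixture posterior for an AR coefficient:
    with probability p_identified use the standard normal approximation,
    otherwise a flat (uninformative) component over the stationary region."""
    use_standard = rng.random(n) < p_identified
    standard = rng.normal(mean, se, size=n)
    diffuse = rng.uniform(-1.0, 1.0, size=n)
    return np.where(use_standard, standard, diffuse)

# Hypothetical weakly identified case: the posterior mixing probability
# strongly favors the uninformative component, so the interval is wide.
draws = mixture_posterior_draws(p_identified=0.15, mean=0.8, se=0.05)
print(np.percentile(draws, [2.5, 97.5]))
```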

Details

Topics in Identification, Limited Dependent Variables, Partial Observability, Experimentation, and Flexible Modeling: Part A
Type: Book
ISBN: 978-1-78973-241-2

Book part
Publication date: 29 February 2008

Francesco Ravazzolo, Richard Paap, Dick van Dijk and Philip Hans Franses

Abstract

This chapter develops a return forecasting methodology that allows for instability in the relationship between stock returns and predictor variables, model uncertainty, and parameter estimation uncertainty. The predictive regression specification that is put forward allows for occasional structural breaks of random magnitude in the regression parameters, uncertainty about the inclusion of forecasting variables, and uncertainty about parameter values by employing Bayesian model averaging. The implications of these three sources of uncertainty and their relative importance are investigated from an active investment management perspective. It is found that the economic value of incorporating all three sources of uncertainty is considerable. A typical investor would be willing to pay up to several hundred basis points annually to switch from a passive buy-and-hold strategy to an active strategy based on a return forecasting model that allows for model and parameter uncertainty as well as structural breaks in the regression parameters.

Details

Forecasting in the Presence of Structural Breaks and Model Uncertainty
Type: Book
ISBN: 978-1-84950-540-6

Book part
Publication date: 30 December 2004

Leslie W. Hepple

Abstract

Within spatial econometrics a whole family of different spatial specifications has been developed, with associated estimators and tests. This leads to issues of model comparison and model choice, measuring the relative merits of alternative specifications and then using appropriate criteria to choose the “best” model or relative model probabilities. Bayesian theory provides a comprehensive and coherent framework for such model choice, including both nested and non-nested models within the choice set. The paper reviews the potential application of this Bayesian theory to spatial econometric models, examining the conditions and assumptions under which application is possible. Problems of prior distributions are outlined, and Bayes factors and marginal likelihoods are derived for a particular subset of spatial econometric specifications. These are then applied to two well-known spatial data sets to illustrate the methods. Future possibilities, and comparisons with other approaches to both Bayesian and non-Bayesian model choice, are discussed.
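
For any pair of specifications in the choice set, the Bayes factor is the ratio of their marginal likelihoods, and multiplying it by the prior odds gives the posterior odds. A small sketch with invented log marginal likelihoods for two hypothetical spatial specifications:

```python
import math

def posterior_odds(log_ml_a, log_ml_b, prior_odds=1.0):
    """Posterior odds of model A versus model B: the Bayes factor
    exp(log m_A - log m_B) times the prior odds."""
    bayes_factor = math.exp(log_ml_a - log_ml_b)
    return bayes_factor * prior_odds

# Hypothetical log marginal likelihoods for a spatial autoregressive (SAR)
# specification versus a spatial error (SEM) specification on the same data.
odds = posterior_odds(log_ml_a=-210.3, log_ml_b=-213.1)
print(f"Posterior odds (SAR vs SEM): {odds:.1f}")
```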

Details

Spatial and Spatiotemporal Econometrics
Type: Book
ISBN: 978-0-76231-148-4

Article
Publication date: 24 May 2013

Premkumar Thodi, Faisal Khan and Mahmoud Haddara

Abstract

Purpose

The purpose of this paper is to develop a risk‐based integrity model for the optimal replacement of offshore process components, based on the likelihood and consequence of failure arising from time‐dependent degradation mechanisms.

Design/methodology/approach

Risk is a combination of the probability of failure and its likely consequences. Offshore process component degradation mechanisms are modeled using Bayesian prior‐posterior analysis. The failure consequences are developed in terms of the cost incurred as a result of failure, inspection and maintenance. By combining the cumulative posterior probability of failure and the equivalent cost of degradations, the operational life‐risk curve is produced. The optimal replacement strategy is obtained as the global minimum of the operational risk curve.
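
One way to picture this construction is as an expected-cost-per-year curve over candidate replacement times, minimized to find the optimal interval. The sketch below is only a stylized stand-in, assuming a Weibull-type cumulative failure probability and invented cost figures rather than the paper's degradation models:

```python
import numpy as np

def operational_risk_curve(t, shape, scale, cost_failure, cost_replacement):
    """A stylized risk curve over service life t: expected failure cost
    (cumulative failure probability times failure cost) plus the planned
    replacement cost, expressed per year of service."""
    p_fail = 1.0 - np.exp(-(t / scale) ** shape)   # Weibull CDF as a stand-in
    expected_cost = p_fail * cost_failure + cost_replacement
    return expected_cost / t                        # cost rate per year

t = np.linspace(0.5, 20.0, 400)                     # candidate replacement times (years)
risk = operational_risk_curve(t, shape=2.5, scale=12.0,
                              cost_failure=5e6, cost_replacement=2e5)
print(f"Optimal replacement interval ≈ {t[np.argmin(risk)]:.1f} years")
```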

Findings

The offshore process component degradation mechanisms are random processes. The proposed risk-based integrity model can be used to model these processes effectively to obtain an optimal replacement strategy. Bayesian analysis can be used to model the uncertainty in the degradation data. The Bayesian posterior estimation using a Metropolis-Hastings (M-H) algorithm converged to satisfactory results with 10,000 simulations. The computed operational risk curve is observed to be a convex function of the service life. Furthermore, it is observed that the application of this model will reduce the risk of operation to a level close to ALARP (as low as reasonably practicable) and consequently will promote the safety of operation.

Research limitations/implications

The developed model is applicable to offshore process components which suffer time‐dependent stochastic degradation mechanisms. Furthermore, this model is developed based on an assumption that the component degradation processes are independent. In reality, the degradation processes may not be independent.

Practical implications

The developed methodology and models will assist asset integrity engineers/managers in estimating optimal replacement intervals for offshore process components. This can reduce operating costs and resources required for inspection and maintenance (IM) tasks.

Originality/value

The frequent replacement of offshore process components involves higher cost and risk. Similarly, the late replacement of components may result in failure and costly breakdown maintenance. The developed model estimates an optimal replacement strategy for offshore process components suffering stochastic degradation. Implementation of the developed model improves component integrity, increases safety, reduces potential shutdown and reduces operational cost.

Details

Journal of Quality in Maintenance Engineering, vol. 19 no. 2
Type: Research Article
ISSN: 1355-2511

Article
Publication date: 15 January 2024

Michael O'Connell

Abstract

Purpose

The author examines the impact that the time-series efficient factors of Ehsani and Linnainmaa (2022) have on factor model comparison tests in US stock returns, using the Bayesian model scan approach of Chib et al. (2020) and Chib et al. (2022).

Design/methodology/approach

Ehsani and Linnainmaa (2022) show that time-series efficient investment factors in US stock returns span the original factors and earn 40% higher Sharpe ratios.

Findings

The author shows that the optimal asset pricing model is an eight-factor model which contains efficient versions of the market factor, the value factor (HML) and the long-horizon behavioral factor (FIN). The findings show that efficient factors enhance the performance of US factor models. The top-performing asset pricing model does not change in recent data.

Originality/value

The author is the first to examine whether the efficient factors developed by Ehsani and Linnainmaa (2022) have an impact on model comparison tests in US stock returns.

Details

Journal of Economic Studies, vol. ahead-of-print no. ahead-of-print
Type: Research Article
ISSN: 0144-3585

Abstract

Details

Nonlinear Time Series Analysis of Business Cycles
Type: Book
ISBN: 978-0-44451-838-5

Book part
Publication date: 1 January 2008

Deborah Gefang

Abstract

This paper proposes a Bayesian procedure to investigate the purchasing power parity (PPP) utilizing an exponential smooth transition vector error correction model (VECM). Employing a simple Gibbs sampler, we jointly estimate the cointegrating relationship along with the nonlinearities caused by the departures from the long-run equilibrium. By allowing for nonlinear regime changes, we provide strong evidence that PPP holds between the US and each of the remaining G7 countries. The model we employed implies that the dynamics of the PPP deviations can be rather complex, which is attested to by the impulse response analysis.
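
The Gibbs sampler mentioned above alternates draws from each parameter block's full conditional distribution. The toy example below illustrates that principle on a simple linear regression with conjugate conditionals; it is a generic sketch, not the smooth transition VECM sampler used in the chapter:

```python
import numpy as np

rng = np.random.default_rng(1)

# Simulated data: y = X beta + e, with e ~ N(0, sigma^2)
n, beta_true, sigma_true = 200, np.array([1.0, -0.5]), 0.8
X = np.column_stack([np.ones(n), rng.normal(size=n)])
y = X @ beta_true + rng.normal(scale=sigma_true, size=n)

def gibbs_regression(y, X, n_draws=5000):
    """Gibbs sampler for a normal linear regression with a flat prior on beta,
    alternating the full conditionals of sigma^2 and beta."""
    n, k = X.shape
    XtX_inv = np.linalg.inv(X.T @ X)
    beta = np.zeros(k)
    draws = np.empty((n_draws, k + 1))
    for i in range(n_draws):
        resid = y - X @ beta
        sigma2 = 1.0 / rng.gamma(n / 2.0, 2.0 / (resid @ resid))          # sigma^2 | beta, y
        beta = rng.multivariate_normal(XtX_inv @ X.T @ y, sigma2 * XtX_inv)  # beta | sigma^2, y
        draws[i] = np.append(beta, sigma2)
    return draws

draws = gibbs_regression(y, X)
print(draws[1000:].mean(axis=0))   # posterior means after discarding burn-in
```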

Details

Bayesian Econometrics
Type: Book
ISBN: 978-1-84855-308-8
