Search results

1 – 10 of 47
Book part
Publication date: 19 November 2014

Enrique Martínez-García and Mark A. Wynne

Abstract

We investigate the Bayesian approach to model comparison within a two-country framework with nominal rigidities using the workhorse New Keynesian open-economy model of Martínez-García and Wynne (2010). We discuss the trade-offs that monetary policy – characterized by a Taylor-type rule – faces in an interconnected world with perfectly flexible exchange rates. We then use posterior model probabilities, in controlled experiments with simulated data, to evaluate the weight of evidence in support of such a model when it is estimated against more parsimonious specifications that either abstract from monetary frictions or assume autarky. We argue that Bayesian model comparison with posterior odds is sensitive to sample size and to the choice of observable variables for estimation. We show that posterior model probabilities strongly penalize overfitting, which can lead us to favor a less parameterized model over the true data-generating process when the two become arbitrarily close to each other. We also illustrate that the spillovers from monetary policy across countries have an added confounding effect.
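
As a rough illustration of the comparison mechanism described above, the sketch below converts log marginal likelihoods into posterior model probabilities under equal prior model probabilities; the three candidate models and the numerical values are hypothetical, not taken from the chapter.

```python
# Sketch (hypothetical models and values): posterior model probabilities from
# log marginal likelihoods under equal prior model probabilities.
import numpy as np

def posterior_model_probs(log_marglik, prior_probs=None):
    log_marglik = np.asarray(log_marglik, dtype=float)
    if prior_probs is None:
        prior_probs = np.full(log_marglik.size, 1.0 / log_marglik.size)
    log_post = log_marglik + np.log(prior_probs)
    log_post -= log_post.max()              # stabilize before exponentiating
    probs = np.exp(log_post)
    return probs / probs.sum()

# Hypothetical log marginal likelihoods: open-economy model, autarky variant,
# and a variant without monetary frictions.
print(posterior_model_probs([-1012.4, -1015.9, -1021.7]))
```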

Book part
Publication date: 18 October 2019

Hedibert Freitas Lopes, Matthew Taddy and Matthew Gardner

Abstract

Heavy-tailed distributions present a tough setting for inference. They are also common in industrial applications, particularly with internet transaction datasets, and machine learners often analyze such data without considering the biases and risks associated with the misuse of standard tools. This chapter outlines a procedure for inference about the mean of a (possibly conditional) heavy-tailed distribution that combines nonparametric analysis for the bulk of the support with Bayesian parametric modeling – motivated from extreme value theory – for the heavy tail. The procedure is fast and massively scalable. The work should find application wherever correct inference is important and reward tails are heavy; we illustrate the framework in causal inference for A/B experiments involving hundreds of millions of users of eBay.com.
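
A minimal sketch of the bulk-plus-tail idea, assuming a fixed tail threshold and a maximum-likelihood generalized Pareto fit standing in for the chapter's Bayesian tail model:

```python
# Semiparametric mean estimate: empirical mean for the bulk, generalized Pareto
# mean above a high threshold for the tail (a frequentist stand-in for the
# Bayesian tail model described above).
import numpy as np
from scipy.stats import genpareto

def semiparametric_mean(x, tail_quantile=0.99):
    x = np.asarray(x, dtype=float)
    u = np.quantile(x, tail_quantile)               # tail threshold
    bulk, tail = x[x <= u], x[x > u]
    xi, _, sigma = genpareto.fit(tail - u, floc=0.0)
    if xi >= 1.0:
        raise ValueError("tail too heavy for a finite mean (xi >= 1)")
    tail_mean = u + sigma / (1.0 - xi)              # GPD mean above the threshold
    p_tail = tail.size / x.size
    return (1.0 - p_tail) * bulk.mean() + p_tail * tail_mean
```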

Details

Topics in Identification, Limited Dependent Variables, Partial Observability, Experimentation, and Flexible Modeling: Part B
Type: Book
ISBN: 978-1-83867-419-9

Book part
Publication date: 1 December 2016

Roman Liesenfeld, Jean-François Richard and Jan Vogler

Abstract

We propose a generic algorithm for numerically accurate likelihood evaluation of a broad class of spatial models characterized by a high-dimensional latent Gaussian process and non-Gaussian response variables. The class of models under consideration includes specifications for discrete choices, event counts and limited-dependent variables (truncation, censoring, and sample selection) among others. Our algorithm relies upon a novel implementation of efficient importance sampling (EIS) specifically designed to exploit typical sparsity of high-dimensional spatial precision (or covariance) matrices. It is numerically very accurate and computationally feasible even for very high-dimensional latent processes. Thus, maximum likelihood (ML) estimation of high-dimensional non-Gaussian spatial models, hitherto considered to be computationally prohibitive, becomes feasible. We illustrate our approach with ML estimation of a spatial probit for US presidential voting decisions and spatial count data models (Poisson and Negbin) for firm location choices.
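
The sketch below conveys only the basic idea of simulation-based likelihood evaluation for such models: a naive Monte Carlo estimate of a spatial probit likelihood that draws the latent Gaussian field from its prior. The chapter's EIS algorithm constructs a far more efficient proposal and exploits sparse precision matrices; the dense factorization here is purely illustrative.

```python
# Naive Monte Carlo sketch (not the chapter's EIS): estimate the spatial probit
# likelihood L = E_z[ prod_i Phi((2*y_i - 1) * z_i) ] by drawing the latent
# Gaussian field z from its prior N(0, Q^{-1}). A dense factorization stands in
# for the sparse Cholesky a real implementation would use.
import numpy as np
from scipy.stats import norm

def probit_loglik_mc(y, Q, n_draws=2000, seed=0):
    rng = np.random.default_rng(seed)
    C = np.linalg.cholesky(np.linalg.inv(Q))            # covariance factor, Q^{-1} = C C'
    z = rng.standard_normal((n_draws, len(y))) @ C.T    # prior draws of the latent field
    log_w = norm.logcdf((2 * np.asarray(y) - 1) * z).sum(axis=1)
    m = log_w.max()
    return m + np.log(np.exp(log_w - m).mean())         # log of the Monte Carlo average
```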

Details

Spatial Econometrics: Qualitative and Limited Dependent Variables
Type: Book
ISBN: 978-1-78560-986-2

Book part
Publication date: 22 November 2012

Anna Kormilitsina and Denis Nekipelov

Abstract

The Laplace-type estimator (LTE) is a simulation-based alternative to the classical extremum estimator that has gained popularity in applied research. We show that even though the estimator has desirable asymptotic properties, in small samples the point estimate provided by the LTE need not converge to the extremum of the sample objective function. Furthermore, we suggest a simple test to verify whether the estimator converges. We illustrate these results by estimating a prototype dynamic stochastic general equilibrium model widely used in macroeconomics research.
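
A toy sketch of the idea, assuming a one-dimensional objective and a random-walk Metropolis sampler: the LTE point estimate is the quasi-posterior mean of exp(L_n(theta)), which in small samples can differ from the maximizer of L_n. The objective and tuning constants are made up for illustration.

```python
# Toy LTE sketch: quasi-posterior mean of exp(L_n(theta)) obtained by
# random-walk Metropolis. Not the chapter's code; names are illustrative.
import numpy as np

def lte_mean(objective, theta0, n_iter=20_000, step=0.1, seed=0):
    rng = np.random.default_rng(seed)
    theta, logp = theta0, objective(theta0)
    draws = []
    for _ in range(n_iter):
        prop = theta + step * rng.standard_normal()
        logp_prop = objective(prop)
        if np.log(rng.uniform()) < logp_prop - logp:    # Metropolis accept/reject
            theta, logp = prop, logp_prop
        draws.append(theta)
    return np.mean(draws[n_iter // 2:])                 # quasi-posterior mean after burn-in

# A skewed sample objective: its quasi-posterior mean need not coincide with
# its maximizer in small samples.
obj = lambda t: -0.5 * (t - 1.0) ** 2 + 0.3 * np.tanh(3.0 * t)
print(lte_mean(obj, theta0=0.0))
```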

Details

DSGE Models in Macroeconomics: Estimation, Evaluation, and New Developments
Type: Book
ISBN: 978-1-78190-305-6

Book part
Publication date: 30 August 2019

Timothy Cogley and Richard Startz

Abstract

Standard estimation of ARMA models in which the AR and MA roots nearly cancel, so that individual coefficients are only weakly identified, often produces inferential ranges for individual coefficients that give a spurious appearance of accuracy. We remedy this problem with a model that uses a simple mixture prior. The posterior mixing probability is derived using Bayesian methods, but we show that the method works well in both Bayesian and frequentist setups. In particular, we show that our mixture procedure weights standard results heavily when given data from a well-identified ARMA model (which does not exhibit near root cancellation) and puts heavy weight on an uninformative inferential region when given data from a weakly identified ARMA model (with near root cancellation). When our procedure is applied to a well-identified process, the investigator gets the “usual results,” so there is no important statistical cost to using our procedure. On the other hand, when our procedure is applied to a weakly identified process, the investigator learns that the data tell us little about the parameters – and is thus protected against making spurious inferences. We recommend that mixture models be computed routinely when inference about ARMA coefficients is of interest.
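
As a crude stand-in for the posterior mixing probability, the sketch below weights a fitted ARMA(1,1) against a white-noise specification using BIC-based approximate posterior odds; the chapter derives the mixing probability more carefully, so this is only meant to convey the flavor of the procedure.

```python
# Crude BIC-based stand-in for the posterior mixing probability: weight on the
# ARMA(1,1) specification versus a white-noise model.
import numpy as np
from statsmodels.tsa.arima.model import ARIMA

def arma_weight(y):
    y = np.asarray(y, dtype=float)
    bic_arma = ARIMA(y, order=(1, 0, 1)).fit().bic
    bic_noise = ARIMA(y, order=(0, 0, 0)).fit().bic
    # exp(-BIC/2) is proportional to an approximate marginal likelihood
    odds = np.exp(-0.5 * (bic_arma - bic_noise))
    return odds / (1.0 + odds)

# Near root cancellation the weight should drift toward the parsimonious model,
# flagging that the individual ARMA coefficients are weakly identified.
```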

Details

Topics in Identification, Limited Dependent Variables, Partial Observability, Experimentation, and Flexible Modeling: Part A
Type: Book
ISBN: 978-1-78973-241-2

Book part
Publication date: 1 January 2008

Paolo Giordani and Robert Kohn

Abstract

Our paper discusses simulation-based Bayesian inference using information from previous draws to build the proposals. The aim is to produce samplers that are easy to implement, that explore the target distribution effectively, and that are computationally efficient and mix well.
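
A generic adaptive random-walk Metropolis sampler in the spirit described, with the proposal covariance rebuilt from earlier draws (a textbook scheme, not the specific samplers developed in the chapter):

```python
# Adaptive random-walk Metropolis: the proposal covariance is learned from
# past draws once an initial adaptation window has been collected.
import numpy as np

def adaptive_metropolis(log_target, x0, n_iter=10_000, adapt_start=500, seed=0):
    rng = np.random.default_rng(seed)
    x = np.asarray(x0, dtype=float)
    d = x.size
    logp = log_target(x)
    draws = np.empty((n_iter, d))
    cov = np.eye(d)
    for t in range(n_iter):
        if t >= adapt_start:                             # rebuild proposal from past draws
            cov = np.cov(draws[:t].T) + 1e-6 * np.eye(d)
        prop = rng.multivariate_normal(x, (2.38 ** 2 / d) * cov)
        logp_prop = log_target(prop)
        if np.log(rng.uniform()) < logp_prop - logp:     # Metropolis accept/reject
            x, logp = prop, logp_prop
        draws[t] = x
    return draws

# Example: sample a bivariate Gaussian target with precision matrix Q.
Q = np.array([[2.0, 1.6], [1.6, 2.0]])
samples = adaptive_metropolis(lambda x: -0.5 * x @ Q @ x, np.zeros(2))
```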

Details

Bayesian Econometrics
Type: Book
ISBN: 978-1-84855-308-8

Book part
Publication date: 1 December 2016

Jacob Dearmon and Tony E. Smith

Abstract

Statistical methods of spatial analysis are often successful at either prediction or explanation, but not necessarily both. In a recent paper, Dearmon and Smith (2016) showed that by combining Gaussian Process Regression (GPR) with Bayesian Model Averaging (BMA), a modeling framework could be developed in which both needs are addressed. In particular, the smoothness properties of GPR together with the robustness of BMA allow local spatial analyses of individual variable effects that yield remarkably stable results. However, this GPR-BMA approach is not without its limitations. Specifically, the standard (isotropic) covariance kernel of GPR treats all explanatory variables in a symmetric way that limits the analysis of their individual effects. Here we extend this approach by introducing a mixture of kernels (both isotropic and anisotropic) that allows different length scales for each variable. To do so in a computationally efficient manner, we also explore a number of Bayes-factor approximations that avoid the need for costly reversible-jump Markov chain Monte Carlo methods.

To demonstrate the effectiveness of this Variable Length Scale (VLS) model in terms of both predictions and local marginal analyses, we employ selected simulations to compare VLS with Geographically Weighted Regression (GWR), which is currently the most popular method for such spatial modeling. In addition, we employ the classical Boston Housing data to compare VLS not only with GWR but also with other well-known spatial regression models that have been applied to this same data. Our main results show that VLS not only compares favorably with spatial regression at the aggregate level but is also far more accurate than GWR at the local level.
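
A standard anisotropic (ARD) Gaussian-process regression, shown below, is a rough proxy for the per-variable length scales that the VLS model introduces; the chapter's kernel mixture, BMA machinery, and Bayes-factor approximations are not reproduced, and the data and variable roles are synthetic.

```python
# ARD Gaussian-process regression: one fitted length scale per explanatory
# variable, so irrelevant variables receive large length scales.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

rng = np.random.default_rng(0)
X = rng.uniform(size=(200, 3))                       # three explanatory variables
y = np.sin(4 * X[:, 0]) + 0.2 * X[:, 1] + rng.normal(scale=0.1, size=200)

kernel = RBF(length_scale=np.ones(3)) + WhiteKernel(noise_level=0.1)
gpr = GaussianProcessRegressor(kernel=kernel, normalize_y=True).fit(X, y)
# A large fitted length scale for the third variable signals that it is irrelevant.
print(gpr.kernel_)
```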

Details

Spatial Econometrics: Qualitative and Limited Dependent Variables
Type: Book
ISBN: 978-1-78560-986-2

Book part
Publication date: 28 February 2002

Lars Muus, Hiek van der Scheer and Tom Wansbeek

Abstract

One of the most important issues facing a firm involved in direct marketing is the selection of addresses from a mailing list. When the parameters of the model describing consumers' reaction to a mailing are known, addresses for a future mailing can be selected in a profit-maximizing way. Usually, these parameters are unknown and have to be estimated. These estimates are used to rank the potential addressees and to select the best targets.

Several methods for this selection process have been proposed in the recent literature. All of these methods consider the estimation and selection steps separately. Since estimation uncertainty is neglected, these methods lead to a suboptimal decision rule and hence to suboptimal profits. We derive an optimal Bayes decision rule that follows from the firm's profit function and explicitly takes estimation uncertainty into account. We show that the integral resulting from the Bayes decision rule can either be approximated through a normal posterior or evaluated numerically by a Laplace approximation or by Markov chain Monte Carlo integration. An empirical example shows that higher profits indeed result.
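
A stylized version of the decision rule: average the profit of mailing an address over posterior draws of the response-model parameters rather than plugging in a point estimate. The logit response model, margin, and cost figures below are illustrative assumptions, not the chapter's specification.

```python
# Select addresses whose expected profit, integrated over posterior parameter
# draws, is positive (hypothetical logit response model, margin, and cost).
import numpy as np

def select_addresses(X, beta_draws, margin=50.0, cost=1.0):
    """X: (n, k) addressee covariates; beta_draws: (m, k) posterior draws."""
    p = 1.0 / (1.0 + np.exp(-X @ beta_draws.T))       # (n, m) response probabilities
    expected_profit = margin * p.mean(axis=1) - cost  # integrates out parameter uncertainty
    return expected_profit > 0.0                      # mail only profitable addresses
```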

Details

Advances in Econometrics
Type: Book
ISBN: 978-1-84950-142-2

Book part
Publication date: 21 December 2010

Tore Selland Kleppe, Jun Yu and H.J. Skaug

Abstract

In this chapter we develop and implement a method for maximum simulated likelihood estimation of the continuous-time stochastic volatility model with constant elasticity of volatility. The approach requires observations on neither option prices nor volatility. To integrate out latent volatility from the joint density of return and volatility, a modified efficient importance sampling technique is used after the continuous-time model is approximated using the Euler–Maruyama scheme. The Monte Carlo studies show that the method works well, and the empirical applications illustrate the usefulness of the method. The empirical results provide strong evidence against the Heston model.
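
An Euler–Maruyama discretization of a stochastic-volatility model with a constant-elasticity-of-volatility diffusion term, sketched below with made-up parameter names and values, illustrates the approximation step; it is not the authors' estimation code.

```python
# Euler-Maruyama simulation of a CEV-type stochastic-volatility model
# (drift mu, mean reversion kappa, long-run level theta, vol-of-vol gamma,
# elasticity beta, leverage rho are illustrative values).
import numpy as np

def simulate_cev_sv(n=1000, dt=1.0 / 252, mu=0.05, kappa=3.0, theta=0.04,
                    gamma=0.5, beta=1.0, rho=-0.5, seed=0):
    rng = np.random.default_rng(seed)
    r = np.empty(n)                      # discretized returns
    v = np.empty(n)                      # latent variance path
    v_prev = theta
    for t in range(n):
        z1 = rng.standard_normal()
        z2 = rho * z1 + np.sqrt(1.0 - rho ** 2) * rng.standard_normal()
        r[t] = mu * dt + np.sqrt(v_prev * dt) * z1
        v_prev = max(v_prev + kappa * (theta - v_prev) * dt
                     + gamma * v_prev ** beta * np.sqrt(dt) * z2, 1e-12)
        v[t] = v_prev
    return r, v
```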

Details

Maximum Simulated Likelihood Methods and Applications
Type: Book
ISBN: 978-0-85724-150-4

Book part
Publication date: 22 November 2012

Juan Carlos Escanciano, Thomas B. Fomby, R. Carter Hill, Eric Hillebrand and Ivan Jeliazkov

Abstract

This volume of Advances in Econometrics is devoted to dynamic stochastic general equilibrium (DSGE) models, which have gained popularity in both academic and policy circles as a theoretically and methodologically coherent way of analyzing a variety of issues in empirical macroeconomics. The volume is divided into two parts. The first part covers important topics in DSGE modeling and estimation practice, including the modeling and role of expectations, the study of alternative pricing models, the problem of non-invertibility in structural VARs, the possible weak identification in new open economy macro models, and the modeling of trend inflation. The second part is devoted to innovations in econometric methodology. The papers in this section advance new techniques for addressing key theoretical and inferential problems and include discussion and applications of Laplace-type, frequency domain, empirical likelihood, and method of moments estimators.

Details

DSGE Models in Macroeconomics: Estimation, Evaluation, and New Developments
Type: Book
ISBN: 978-1-78190-305-6
