Search results

1 – 10 of 63
Book part
Publication date: 22 November 2012

Fabio Milani

Abstract

This paper surveys the treatment of expectations in estimated Dynamic Stochastic General Equilibrium (DSGE) macroeconomic models.

A recent notable development in the empirical macroeconomics literature has been the rapid growth of papers that build structural models, which include a number of frictions and shocks, and which are confronted with the data using sophisticated full-information econometric approaches, often using Bayesian methods.

A widespread assumption in these estimated models, as in most of the macroeconomic literature in general, is that economic agents' expectations are formed according to the Rational Expectations Hypothesis (REH). Various alternative ways to model the formation of expectations have, however, emerged: some are simple refinements that maintain the REH, but change the information structure along different dimensions, while others imply more significant departures from rational expectations.

I review here the modeling of the expectation formation process and discuss related econometric issues in current structural macroeconomic models. The discussion covers benchmark models that assume rational expectations; extensions that allow for sunspots, news, or sticky information; and models that abandon the REH in favor of learning, heuristics, or subjective expectations.
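As a purely illustrative aside (not drawn from the chapter itself), the learning alternatives surveyed here typically replace rational forecasts with a recursive estimation rule. The sketch below implements textbook constant-gain recursive least squares, one common such rule: agents nudge their belief vector about a perceived law of motion toward each period's forecast error. The function name, the `gain` value, and the initial conditions `phi0` and `R0` are hypothetical choices.

```python
import jax.numpy as jnp

def constant_gain_learning(y, X, phi0, R0, gain=0.02):
    """Constant-gain recursive least squares: agents re-estimate their
    perceived law of motion y_t = x_t' phi + noise each period, discounting
    old observations so beliefs keep adapting rather than converging."""
    phi, R = phi0, R0
    beliefs = []
    for t in range(y.shape[0]):
        x = X[t]
        # update the running estimate of the regressors' second-moment matrix
        R = R + gain * (jnp.outer(x, x) - R)
        # move beliefs toward the latest forecast error, weighted by R^{-1} x
        phi = phi + gain * jnp.linalg.solve(R, x * (y[t] - x @ phi))
        beliefs.append(phi)
    return jnp.stack(beliefs)
```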

Details

DSGE Models in Macroeconomics: Estimation, Evaluation, and New Developments
Type: Book
ISBN: 978-1-78190-305-6

Book part
Publication date: 1 January 2008

Arnold Zellner

Abstract

After briefly reviewing the history of Bayesian econometrics and Alan Greenspan's (2004) description of his use of Bayesian methods in managing policy-making risk, some of the issues and needs that he mentions are discussed and linked to past and present Bayesian econometric research. A review of some recent Bayesian econometric research and outstanding needs is then presented. Finally, some thoughts are offered on the future of Bayesian econometrics.

Details

Bayesian Econometrics
Type: Book
ISBN: 978-1-84855-308-8

Article
Publication date: 7 December 2018

Renato Camodeca, Alex Almici and Umberto Sagliaschi

Abstract

Purpose

The purpose of this paper is to use a theoretical and empirical model to investigate the adoption of the integrated reporting (IR) framework as a strategic choice to signal intellectual capital (IC) to equity investors, with specific reference to the pharmaceutical industry.

Design/methodology/approach

The choice of drafting an integrated report is modelled as a means for managers to strategically disclose price-relevant information related to IC. The analysis builds on the voluntary disclosure model of Verrecchia (1983), extended with a role for financial analysts, to derive a directly estimable empirical equation.

Findings

Theoretically, because IR requires managers to exert effort in their reporting activity, the model shows that in equilibrium only firms with sufficient IC choose to adopt IR, so that rational investors are willing to pay more only for the forecasted earnings of integrated reporters. This theory is tested in the pharmaceutical sector, where the modelling assumptions are most likely to hold, with mixed results.

Research limitations/implications

When compliant with the International Integrated Reporting Council’s (IIRC) standards, IR provides the means to disclose IC in a perfectly verifiable way. Furthermore, since the IIRC has only recently been established, the conclusions have only been tested on a limited data set.

Originality/value

This work connects the value relevance of IR to IC by adopting an equilibrium approach, which, in turn, provides specific indications of how to build a consistent empirical test of the theory.
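To make the threshold logic in the findings concrete: in a Verrecchia (1983)-type equilibrium with a disclosure cost, the marginal discloser is indifferent between disclosing (and being priced at value minus cost) and pooling with nondisclosers (and being priced at the conditional mean of the pool). A minimal numeric sketch follows; it omits the paper's extensions (financial analysts, IR-specific effort), and `mu`, `sigma`, and `cost` are illustrative parameters, not estimates.

```python
import jax.numpy as jnp
from jax.scipy.stats import norm

def disclosure_threshold(mu=0.0, sigma=1.0, cost=0.3, iters=200):
    """Fixed point for a Verrecchia-style disclosure threshold t:
    firms with value x >= t disclose; the marginal firm satisfies
    t - cost = E[x | x < t], the price of the nondisclosure pool."""
    t = mu  # initial guess
    for _ in range(iters):
        z = (t - mu) / sigma
        # E[x | x < t] for a normally distributed value: mu - sigma * pdf(z) / cdf(z)
        pool_price = mu - sigma * norm.pdf(z) / norm.cdf(z)
        t = pool_price + cost
    return t
```

Firms above the resulting threshold adopt the costly report, matching the finding that only firms with sufficient IC find adoption worthwhile.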

Details

Journal of Intellectual Capital, vol. 20 no. 1
Type: Research Article
ISSN: 1469-1930

Book part
Publication date: 19 November 2014

Enrique Martínez-García and Mark A. Wynne

Abstract

We investigate the Bayesian approach to model comparison within a two-country framework with nominal rigidities, using the workhorse New Keynesian open-economy model of Martínez-García and Wynne (2010). We discuss the trade-offs that monetary policy – characterized by a Taylor-type rule – faces in an interconnected world with perfectly flexible exchange rates. We then use posterior model probabilities to evaluate the weight of evidence in support of such a model when estimated against more parsimonious specifications that either abstract from monetary frictions or assume autarky, by means of controlled experiments that employ simulated data. We argue that Bayesian model comparison with posterior odds is sensitive to sample size and the choice of observable variables for estimation. We show that posterior model probabilities strongly penalize overfitting, which can lead us to favor a less parameterized model over the true data-generating process when the two become arbitrarily close to each other. We also illustrate that the spillovers from monetary policy across countries have an added confounding effect.
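For readers unfamiliar with the mechanics: once log marginal likelihoods are in hand (however estimated), posterior model probabilities follow from Bayes' rule across the model space. A minimal sketch, with the log marginal likelihoods taken as given inputs rather than computed:

```python
import jax.numpy as jnp
from jax.scipy.special import logsumexp

def posterior_model_probs(log_marginal_liks, log_prior_probs=None):
    """P(M_i | y) is proportional to p(y | M_i) * P(M_i); normalize in
    log space for numerical stability. Equal prior odds by default."""
    lml = jnp.asarray(log_marginal_liks)
    lp = jnp.zeros_like(lml) if log_prior_probs is None else jnp.asarray(log_prior_probs)
    log_post = lml + lp
    return jnp.exp(log_post - logsumexp(log_post))
```

The overfitting penalty the authors highlight enters through the marginal likelihood itself, which integrates the likelihood over the prior and so automatically discounts heavily parameterized models.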

Details

Essays in Honor of Cheng Hsiao
Type: Book
ISBN: 978-1-78973-958-9

Book part
Publication date: 13 December 2013

Fabio Canova and Matteo Ciccarelli

Abstract

This article provides an overview of the panel vector autoregressive (VAR) models used in macroeconomics and finance to study the dynamic relationships between heterogeneous assets, households, firms, sectors, and countries. We discuss their distinctive features, what they are used for, and how they can be derived from economic theory. We also describe how they are estimated and how shock identification is performed. We compare panel VAR models to other approaches used in the literature to estimate dynamic models involving heterogeneous units. Finally, we show how structural time variation can be dealt with.
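As a toy illustration of the object being surveyed (not the chapter's own estimator), the simplest member of the class is a pooled panel VAR(1) with homogeneous dynamics across units, estimable by stacked OLS; the function name and array shapes below are hypothetical conventions.

```python
import jax.numpy as jnp

def pooled_panel_var1(Y):
    """Pooled panel VAR(1), y_{it} = c + A y_{i,t-1} + u_{it}, with the same
    (c, A) for every unit. Y has shape (N_units, T_periods, k_variables)."""
    N, T, k = Y.shape
    X = Y[:, :-1, :].reshape(N * (T - 1), k)        # lagged values, stacked over units
    Z = jnp.hstack([jnp.ones((X.shape[0], 1)), X])  # prepend an intercept column
    y = Y[:, 1:, :].reshape(N * (T - 1), k)
    B, *_ = jnp.linalg.lstsq(Z, y)                  # (k+1, k): intercept row, then A'
    return B[0], B[1:].T                            # c with shape (k,), A with shape (k, k)
```

Everything the chapter emphasizes – unit heterogeneity, dynamic interdependencies, time variation – amounts to relaxing the homogeneity baked into this baseline.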

Details

VAR Models in Macroeconomics – New Developments and Applications: Essays in Honor of Christopher A. Sims
Type: Book
ISBN: 978-1-78190-752-8

Book part
Publication date: 30 August 2019

Joshua C. C. Chan, Liana Jacobi and Dan Zhu

Abstract

Vector autoregressions (VARs) combined with Minnesota-type priors are widely used for macroeconomic forecasting. The fact that strong but sensible priors can substantially improve forecast performance implies that VAR forecasts are sensitive to prior hyperparameters, but the nature of this sensitivity is seldom investigated. We develop a general method based on Automatic Differentiation to systematically compute the sensitivities of forecasts – both point and interval – with respect to any prior hyperparameters. In a forecasting exercise using US data, we find that forecasts are relatively sensitive to the strength of shrinkage for the VAR coefficients, but are not much affected by the prior mean of the error covariance matrix or the strength of shrinkage for the intercepts.
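A minimal sketch of the idea (not the authors' code): write the posterior-mean forecast as a differentiable function of the shrinkage hyperparameter, then let automatic differentiation produce the sensitivity. The simplifications are loud ones: a single equation, known unit error variance, and a ridge-form prior b ~ N(b0, I/lam) standing in for the full Minnesota prior; interval forecasts are handled analogously in the chapter.

```python
import jax
import jax.numpy as jnp

def point_forecast(lam, X, y, b0, x_next):
    """One-step posterior-mean forecast of a Bayesian regression equation
    under prior b ~ N(b0, I/lam): larger lam shrinks the posterior mean
    harder toward b0 (e.g. a random-walk coefficient on the own first lag)."""
    k = X.shape[1]
    b_post = jnp.linalg.solve(X.T @ X + lam * jnp.eye(k), X.T @ y + lam * b0)
    return x_next @ b_post

# Sensitivity of the forecast to the shrinkage strength, by automatic differentiation:
forecast_sensitivity = jax.grad(point_forecast, argnums=0)
```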

Details

Topics in Identification, Limited Dependent Variables, Partial Observability, Experimentation, and Flexible Modeling: Part A
Type: Book
ISBN: 978-1-78973-241-2

Book part
Publication date: 30 August 2019

Md. Nazmul Ahsan and Jean-Marie Dufour

Abstract

Statistical inference (estimation and testing) for the stochastic volatility (SV) model of Taylor (1982, 1986) is challenging, especially for likelihood-based methods, which are difficult to apply due to the presence of latent variables. Existing methods are computationally costly and/or inefficient. In this paper, we propose computationally simple estimators for the SV model that are at the same time highly efficient. The proposed class of estimators uses a small number of moment equations derived from an ARMA representation associated with the SV model, along with the possibility of using "winsorization" to improve stability and efficiency. We call these ARMA-SV estimators. Closed-form expressions for ARMA-SV estimators are obtained, and no numerical optimization procedure or choice of initial parameter values is required. The asymptotic distributional theory of the proposed estimators is studied. Due to their computational simplicity, the ARMA-SV estimators allow one to make reliable – even exact – simulation-based inference, through the application of Monte Carlo (MC) tests or bootstrap methods. We compare them in a simulation experiment with a wide array of alternative estimation methods, in terms of bias, root mean square error, and computation time. In addition to confirming the enormous computational advantage of the proposed estimators, the results show that ARMA-SV estimators match (or exceed) alternative estimators in terms of precision, including the widely used Bayesian estimator. The proposed methods are applied to daily observations on the returns of three major stocks (Coca-Cola, Walmart, Ford) and the S&P Composite Price Index (2000–2017). The results confirm the presence of stochastic volatility with strong persistence.
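The flavor of a closed-form moment estimator for SV can be conveyed in a few lines. The sketch below is not the authors' ARMA-SV estimator (which uses a richer set of ARMA-implied moment equations plus optional winsorization); it exploits only the fact that, for a basic SV model, y_t = log r_t^2 is an AR(1) log-volatility plus independent noise, so its autocovariances satisfy Cov(y_t, y_{t-k}) = phi^k Var(h) for k >= 1.

```python
import jax.numpy as jnp

def simple_moment_sv(returns, eps=1e-12):
    """Closed-form moment estimator for a basic SV model: recover the
    log-volatility persistence phi and innovation variance from the lag-1
    and lag-2 autocovariances of log squared returns. No optimization."""
    y = jnp.log(returns**2 + eps)       # eps guards against exact-zero returns
    yc = y - y.mean()
    T = yc.shape[0]
    g1 = (yc[1:] * yc[:-1]).sum() / T   # autocovariance at lag 1
    g2 = (yc[2:] * yc[:-2]).sum() / T   # autocovariance at lag 2
    phi = g2 / g1                       # persistence: gamma(2)/gamma(1) = phi
    var_h = g1 / phi                    # stationary variance of log-volatility
    sigma_v2 = (1.0 - phi**2) * var_h   # AR(1) innovation variance
    return phi, sigma_v2
```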

Details

Topics in Identification, Limited Dependent Variables, Partial Observability, Experimentation, and Flexible Modeling: Part A
Type: Book
ISBN: 978-1-78973-241-2

Details

Nonlinear Time Series Analysis of Business Cycles
Type: Book
ISBN: 978-0-44451-838-5

Book part
Publication date: 6 January 2016

Antonello D’Agostino, Domenico Giannone, Michele Lenza and Michele Modugno

Abstract

We develop a framework for measuring and monitoring business cycles in real time. Following a long tradition in macroeconometrics, inference is based on a variety of indicators of economic activity, treated as imperfect measures of an underlying index of business cycle conditions. We extend existing approaches by allowing for heterogeneous lead–lag patterns of the various indicators along the business cycle. The framework is well suited for high-frequency monitoring of current economic conditions in real time – nowcasting – since inference can be conducted in the presence of mixed-frequency data and irregular patterns of data availability. Our assessment of the underlying index of business cycle conditions is accurate and more timely than popular alternatives, including the Chicago Fed National Activity Index (CFNAI). A formal real-time forecasting evaluation shows that the framework produces well-calibrated probability nowcasts that resemble the consensus assessment of the Survey of Professional Forecasters.
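A stripped-down sketch of the filtering mechanism (the chapter's model is richer, with heterogeneous lead–lag loadings and real-time data flows): a single latent activity index followed by several indicators, where a missing release is coded as NaN and simply skipped in the update step. That skipping is what lets the same filter digest mixed frequencies and ragged-edge data availability. All parameter names below are hypothetical.

```python
import jax.numpy as jnp

def factor_nowcast(Y, lam, a, q, r):
    """Kalman filter for f_t = a f_{t-1} + e_t, y_{it} = lam[i] f_t + u_{it}.
    Y is (T, n) with NaN marking indicators not yet released at time t;
    measurement errors are independent, so updates can be done one at a time."""
    T, n = Y.shape
    f, P = 0.0, q / (1.0 - a**2)          # start from the stationary distribution
    path = []
    for t in range(T):
        f, P = a * f, a**2 * P + q        # predict one step ahead
        for i in range(n):                # sequential univariate updates
            yi = Y[t, i]
            if not jnp.isnan(yi):         # skip indicators that are missing
                S = lam[i]**2 * P + r[i]  # forecast-error variance
                K = P * lam[i] / S        # Kalman gain
                f = f + K * (yi - lam[i] * f)
                P = (1.0 - K * lam[i]) * P
        path.append(f)
    return jnp.array(path)                # filtered index of business cycle conditions
```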
