Search results

1 – 10 of over 3000
Book part
Publication date: 22 November 2012

Enrique Martínez-García, Diego Vilán and Mark A. Wynne

Abstract

Open-economy models are central to the discussion of the trade-offs monetary policy faces in an increasingly globalized world (e.g., Martínez-García & Wynne, 2010), but bringing them to the data is not without its challenges. Controlling for misspecification bias, we trace the problem of uncertainty surrounding structural parameter estimation in the context of a fully specified New Open Economy Macro (NOEM) model partly to sample size. We suggest that standard macroeconomic time series covering less than forty years may not be informative enough for some parameters of interest to be recovered with precision. We also illustrate how uncertainty arises from weak structural identification, irrespective of the sample size. This remains a concern for empirical research, and we recommend estimation with simulated observations before using actual data as a way of detecting structural parameters that are prone to weak identification. We also recommend careful evaluation and documentation of the implementation strategy (especially in the selection of observables), as it can have significant effects on the strength of identification of key model parameters.
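
The recommendation to estimate on simulated observations before using actual data can be illustrated with a deliberately simple stand-in model (an AR(1), not the NOEM model itself; all names and values below are illustrative): simulate roughly the sample size at hand from known parameters, then check whether the likelihood profile is sharply peaked at the truth. A flat profile would flag weak identification at that sample size.

```python
import numpy as np

rng = np.random.default_rng(0)

def simulate(rho, sigma, T):
    """Simulate an AR(1), standing in for model-generated observations."""
    y = np.zeros(T)
    for t in range(1, T):
        y[t] = rho * y[t - 1] + sigma * rng.normal()
    return y

def profile_loglik(y, rho_grid, sigma):
    """Conditional Gaussian log-likelihood profile over the persistence parameter."""
    return np.array([-0.5 * np.sum((y[1:] - r * y[:-1]) ** 2) / sigma**2
                     for r in rho_grid])

# roughly forty years of quarterly data, as in the chapter's warning
y_sim = simulate(rho=0.7, sigma=1.0, T=160)
grid = np.linspace(0.0, 0.99, 100)
ll = profile_loglik(y_sim, grid, sigma=1.0)
rho_hat = grid[int(np.argmax(ll))]
# a flat ll around its peak, or rho_hat far from 0.7, would signal trouble
```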

Details

DSGE Models in Macroeconomics: Estimation, Evaluation, and New Developments
Type: Book
ISBN: 978-1-78190-305-6

Book part
Publication date: 3 June 2008

Nathaniel T. Wilcox

Abstract

Choice under risk has a large stochastic (unpredictable) component. This chapter examines five stochastic models for binary discrete choice under risk and how they combine with “structural” theories of choice under risk. Stochastic models are substantive theoretical hypotheses: they are frequently testable in and of themselves, and they also serve as identifying restrictions for hypothesis tests, estimation and prediction. Econometric comparisons suggest that for the purpose of prediction (as opposed to explanation), choices of stochastic models may be far more consequential than choices of structures such as expected utility or rank-dependent utility.
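
As a rough sketch of how a stochastic model combines with a structural theory (not code from the chapter; the CRRA exponent and logit noise parameter below are illustrative choices), a logit choice rule layered on expected utility turns a deterministic structure into a binary choice probability:

```python
import math

def eu(lottery, crra=0.5):
    """Expected utility of a lottery [(prob, outcome), ...] under CRRA utility."""
    return sum(p * x**crra for p, x in lottery)

def logit_choice_prob(lottery_a, lottery_b, noise=0.1, crra=0.5):
    """Probability of choosing A over B under a logit stochastic model
    layered on an EU structure; smaller `noise` means less stochastic choice."""
    diff = (eu(lottery_a, crra) - eu(lottery_b, crra)) / noise
    return 1.0 / (1.0 + math.exp(-diff))

safe = [(1.0, 5.0)]
risky = [(0.5, 12.0), (0.5, 0.0)]
p = logit_choice_prob(safe, risky)   # > 0.5: the safe option has higher EU here
```

Swapping the structure (say, rank-dependent utility for `eu`) or the stochastic layer (say, a trembling-hand rule for the logit) changes predictions independently, which is the comparison the chapter formalizes.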

Details

Risk Aversion in Experiments
Type: Book
ISBN: 978-1-84950-547-5

Book part
Publication date: 1 July 2015

Enrique Martínez-García

Abstract

The global slack hypothesis is central to the discussion of the trade-offs that monetary policy faces in an increasingly integrated world. The workhorse New Open Economy Macro (NOEM) model of Martínez-García and Wynne (2010), which fleshes out this hypothesis, shows how expected future local inflation and global slack affect current local inflation. In this chapter, I propose the use of the orthogonalization method of Aoki (1981) and Fukuda (1993) on the workhorse NOEM model to further decompose local inflation into a global component and an inflation differential component. I find that the log-linearized rational expectations model of Martínez-García and Wynne (2010) can be solved with two separate subsystems to describe each of these two components of inflation.

I estimate the full NOEM model with Bayesian techniques using data for the United States and an aggregate of its 38 largest trading partners from 1980Q1 until 2011Q4. The Bayesian estimation recognizes the parameter uncertainty surrounding the model and calls on the data (inflation and output) to discipline the parameterization. My findings show that the strength of the international spillovers through trade – even in the absence of common shocks – is reflected in the response of global inflation and is incorporated into local inflation dynamics. Furthermore, I find that key features of the economy can have different impacts on global and local inflation – in particular, I show that the parameters that determine the import share and the price-elasticity of trade matter in explaining the inflation differential component but not the global component of inflation.
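
The Aoki (1981)/Fukuda (1993) orthogonalization for symmetric two-country systems amounts to rotating the two local series into their average (the global component) and half-difference (the differential component), which then evolve in separate subsystems. A minimal numerical sketch of the rotation itself (not the chapter's code):

```python
import numpy as np

def aoki_decompose(pi_home, pi_foreign):
    """Aoki-style orthogonalization: split two symmetric-country series into
    a global (average) and a differential (half-difference) component, so that
    pi_home = global + diff and pi_foreign = global - diff."""
    pi_home = np.asarray(pi_home, float)
    pi_foreign = np.asarray(pi_foreign, float)
    global_comp = 0.5 * (pi_home + pi_foreign)
    diff_comp = 0.5 * (pi_home - pi_foreign)
    return global_comp, diff_comp

g, d = aoki_decompose([2.0, 3.0], [1.0, 5.0])
```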

Details

Monetary Policy in the Context of the Financial Crisis: New Challenges and Lessons
Type: Book
ISBN: 978-1-78441-779-6

Book part
Publication date: 13 December 2013

Kirstin Hubrich and Timo Teräsvirta

Abstract

This survey focuses on two families of nonlinear vector time series models, the family of vector threshold regression (VTR) models and that of vector smooth transition regression (VSTR) models. These two model classes contain incomplete models in the sense that strongly exogenous variables are allowed in the equations. The emphasis is on stationary models, but the considerations also include nonstationary VTR and VSTR models with cointegrated variables. Model specification, estimation and evaluation are considered, and the use of the models is illustrated by macroeconomic examples from the literature.
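
A single-equation sketch of the smooth transition idea (illustrative only; the survey treats full vector systems): regimes are blended by a logistic transition function of an observed variable, and for fixed transition parameters the model is linear in the regime coefficients, so OLS recovers them.

```python
import numpy as np

def lstr_fit(y, x, s, gamma, c):
    """Two-regime logistic smooth transition regression:
    y = (b0 + b1*x)*(1 - G(s)) + (b2 + b3*x)*G(s),
    with transition G(s) = 1/(1 + exp(-gamma*(s - c))). For fixed (gamma, c)
    the model is linear in (b0, b1, b2, b3), so least squares recovers them."""
    G = 1.0 / (1.0 + np.exp(-gamma * (s - c)))
    X = np.column_stack([1 - G, x * (1 - G), G, x * G])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    return beta

# noiseless data from known regime coefficients: recovery should be exact
rng = np.random.default_rng(2)
x, s = rng.normal(size=200), rng.normal(size=200)
G = 1.0 / (1.0 + np.exp(-5.0 * s))
y = (1.0 + 2.0 * x) * (1 - G) + (-1.0 + 0.5 * x) * G
beta = lstr_fit(y, x, s, gamma=5.0, c=0.0)
```

In practice (gamma, c) are unknown and estimated jointly, typically by gridding or nonlinear least squares; a threshold (VTR) model is the limit gamma → infinity, where G becomes a step function.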

Details

VAR Models in Macroeconomics – New Developments and Applications: Essays in Honor of Christopher A. Sims
Type: Book
ISBN: 978-1-78190-752-8

Book part
Publication date: 30 November 2011

Massimo Guidolin

Abstract

I review the burgeoning literature on applications of Markov regime switching models in empirical finance. In particular, distinct attention is devoted to the ability of Markov switching models to fit the data, to filter unknown regimes and states on the basis of the data, to provide a powerful tool for testing hypotheses formulated in light of financial theories, and to their forecasting performance with reference to both point and density predictions. The review covers papers from a multiplicity of sub-fields in financial economics, ranging from empirical analyses of stock returns and the term structure of default-free interest rates to the dynamics of exchange rates and the joint process of stock and bond returns.
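
The "filtering unknown regimes" step these applications rely on is the Hamilton filter; a minimal sketch for a two-state switching mean (illustrative values, not from the survey):

```python
import numpy as np

def hamilton_filter(y, mu, sigma, P):
    """Filtered regime probabilities for a two-state Markov switching mean,
    y_t = mu[s_t] + sigma * eps_t, with P[i, j] = Pr(s_t = j | s_{t-1} = i)."""
    probs = np.full(2, 0.5)                       # flat initial state distribution
    out = np.zeros((len(y), 2))
    for t, yt in enumerate(np.asarray(y, float)):
        pred = probs @ P                          # predict: one-step-ahead state probs
        dens = np.exp(-0.5 * ((yt - mu) / sigma) ** 2) / (sigma * np.sqrt(2 * np.pi))
        probs = pred * dens / np.sum(pred * dens) # update: Bayes rule
        out[t] = probs
    return out

mu = np.array([0.0, 5.0])
P = np.array([[0.9, 0.1], [0.1, 0.9]])
filt = hamilton_filter([0.1, -0.2, 0.0, 5.1, 4.8, 5.2], mu, 0.5, P)
# filt[t] tracks which regime generated observation t
```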

Details

Missing Data Methods: Time-Series Methods and Applications
Type: Book
ISBN: 978-1-78052-526-6

Book part
Publication date: 23 October 2023

Glenn W. Harrison and J. Todd Swarthout

Abstract

We take Cumulative Prospect Theory (CPT) seriously by rigorously estimating structural models using the full set of CPT parameters. Much of the literature only estimates a subset of CPT parameters, or more simply assumes CPT parameter values from prior studies. Our data are from laboratory experiments with undergraduate students and MBA students facing substantial real incentives and losses. We also estimate structural models from Expected Utility Theory (EUT), Dual Theory (DT), Rank-Dependent Utility (RDU), and Disappointment Aversion (DA) for comparison. Our major finding is that a majority of individuals in our sample locally asset integrate. That is, they see a loss frame for what it is, a frame, and behave as if they evaluate the net payment rather than the gross loss when one is presented to them. This finding is devastating to the direct application of CPT to these data for those subjects. Support for CPT is greater when losses are covered out of an earned endowment rather than house money, but RDU is still the best single characterization of individual and pooled choices. Defenders of the CPT model claim, correctly, that the CPT model exists “because the data says it should.” In other words, the CPT model was born of a wide range of stylized facts culled from parts of the cognitive psychology literature. If one is to take the CPT model seriously and rigorously, then it needs to do a much better job of explaining the data than we see here.
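
For readers unfamiliar with the full CPT parameter set being estimated, a sketch of the CPT value of a mixed lottery (not the authors' code; parameter defaults are the familiar Tversky–Kahneman (1992) estimates, and the function deliberately handles only the special case of at most one gain and one loss, where the rank-dependent decision weights reduce to w+(p_gain) and w-(p_loss)):

```python
def tk_weight(p, gamma=0.61):
    """Tversky-Kahneman probability weighting function."""
    return p**gamma / (p**gamma + (1 - p)**gamma) ** (1 / gamma)

def cpt_value(lottery, alpha=0.88, beta=0.88, lam=2.25, gamma=0.61, delta=0.69):
    """CPT value of a lottery [(prob, outcome), ...] with at most one strictly
    positive and one strictly negative outcome. Gains are valued as x**alpha,
    losses as -lam * (-x)**beta (lam > 1 is loss aversion); gain and loss
    probabilities get separate weighting curvatures gamma and delta."""
    total = 0.0
    for p, x in lottery:
        if x >= 0:
            total += tk_weight(p, gamma) * x**alpha
        else:
            total -= tk_weight(p, delta) * lam * (-x)**beta
    return total
```

Under these defaults a symmetric 50:50 gain/loss gamble has negative CPT value, driven by loss aversion; the chapter's point is that estimating all five parameters jointly, rather than assuming such defaults, changes the verdict on CPT.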

Details

Models of Risk Preferences: Descriptive and Normative Challenges
Type: Book
ISBN: 978-1-83797-269-2

Book part
Publication date: 23 October 2023

Nathaniel T. Wilcox

Abstract

The author presents new estimates of the probability weighting functions found in rank-dependent theories of choice under risk. These estimates are unusual in two senses. First, they are free of functional form assumptions about both utility and weighting functions, and they are based entirely on binary discrete choices rather than on matching or valuation tasks, though they depend on assumptions concerning the nature of probabilistic choice under risk. Second, the estimated weighting functions contradict widely held priors of an inverse-s shape with a fixed point well in the interior of the (0,1) interval: instead, the author usually finds populations dominated by “optimists” who uniformly overweight best outcomes in risky options. The choice pairs used here mostly do not provoke similarity-based simplifications. In a third experiment, the author shows that the presence of choice pairs that provoke similarity-based computational shortcuts does indeed flatten estimated probability weighting functions.
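
The two shapes being contrasted can be written down directly (illustrative parametric examples only; the chapter's estimates are deliberately free of functional form assumptions): an inverse-s function crosses w(p) = p in the interior, while an "optimist" function lies above the diagonal everywhere.

```python
import math

def prelec(p, a=0.65, b=1.0):
    """Prelec (1998) weighting: inverse-s for a < 1, overweighting small p
    and underweighting large p, crossing w(p) = p inside (0, 1)."""
    return math.exp(-b * (-math.log(p)) ** a) if p > 0 else 0.0

def optimist(p, a=0.5):
    """'Optimist' weighting: w(p) = p**a with a < 1 satisfies w(p) > p on the
    whole of (0, 1), uniformly overweighting the best outcome under RDU."""
    return p**a
```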

Details

Models of Risk Preferences: Descriptive and Normative Challenges
Type: Book
ISBN: 978-1-83797-269-2

Book part
Publication date: 24 March 2006

Ngai Hang Chan and Wilfredo Palma

Abstract

Since the seminal works of Granger and Joyeux (1980) and Hosking (1981), estimation of long-memory time series models has received considerable attention and a number of parameter estimation procedures have been proposed. This paper gives an overview of this plethora of methodologies, with special focus on likelihood-based techniques. Broadly speaking, likelihood-based techniques can be classified into the following categories: exact maximum likelihood (ML) estimation (Sowell, 1992; Dahlhaus, 1989), ML estimates based on autoregressive approximations (Granger & Joyeux, 1980; Li & McLeod, 1986), Whittle estimates (Fox & Taqqu, 1986; Giraitis & Surgailis, 1990), Whittle estimates with autoregressive truncation (Beran, 1994a), approximate estimates based on the Durbin–Levinson algorithm (Haslett & Raftery, 1989), state-space-based maximum likelihood estimates for ARFIMA models (Chan & Palma, 1998), and estimation of stochastic volatility models (Ghysels, Harvey, & Renault, 1996; Breidt, Crato, & de Lima, 1998; Chan & Petris, 2000), among others. Given the diversified applications of these techniques in different areas, this review aims at providing a succinct survey of these methodologies as well as an overview of important related problems such as ML estimation with missing data (Palma & Chan, 1997), the influence of subsets of observations on estimates, and the estimation of seasonal long-memory models (Palma & Chan, 2005). The asymptotic properties, finite-sample performances, and inter-connections of these procedures are compared and examined. Finally, applications of these methodologies to financial time series are discussed.
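
As a working illustration of the Whittle family of estimators the survey covers (a sketch, not from the paper: it simulates ARFIMA(0, d, 0) by truncating its MA(infinity) expansion and estimates d by a grid-search local Whittle objective on the first few Fourier frequencies):

```python
import numpy as np

rng = np.random.default_rng(1)

def simulate_arfima(d, T, trunc=500):
    """Approximate ARFIMA(0, d, 0) via its truncated MA(infinity) representation,
    with coefficients psi_k = Gamma(k + d) / (Gamma(d) * Gamma(k + 1))."""
    psi = np.ones(trunc)
    for k in range(1, trunc):
        psi[k] = psi[k - 1] * (k - 1 + d) / k
    e = rng.normal(size=T + trunc)
    return np.convolve(e, psi, mode="full")[trunc - 1 : trunc - 1 + T]

def local_whittle_d(x, m):
    """Local Whittle estimate of the memory parameter d: minimize
    log(mean(I_j * lam_j**(2d))) - 2d * mean(log lam_j) over a grid,
    using the periodogram I_j at the first m Fourier frequencies."""
    T = len(x)
    lam = 2 * np.pi * np.arange(1, m + 1) / T
    I = np.abs(np.fft.fft(x)[1 : m + 1]) ** 2 / (2 * np.pi * T)
    grid = np.linspace(-0.45, 0.45, 181)
    obj = [np.log(np.mean(I * lam ** (2 * d))) - 2 * d * np.mean(np.log(lam))
           for d in grid]
    return grid[int(np.argmin(obj))]

x = simulate_arfima(d=0.3, T=2000)
d_hat = local_whittle_d(x, m=96)
```

The bandwidth m trades bias against variance, one of the practical issues the likelihood-based alternatives in the survey address differently.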

Details

Econometric Analysis of Financial and Economic Time Series
Type: Book
ISBN: 978-1-84950-388-4

Details

Applying Maximum Entropy to Econometric Problems
Type: Book
ISBN: 978-0-76230-187-4

Book part
Publication date: 22 November 2012

Sara Riscado

Abstract

In this chapter we approach the estimation of dynamic stochastic general equilibrium models through a moments-based estimator, the empirical likelihood. We attempt to show that this inference process can be a valid alternative to maximum likelihood, which has been one of the preferred choices in the related literature for estimating these models. The empirical likelihood estimator is characterized by a simple setup and only requires knowledge of the moments of the data generating process of the model. In this context, we exploit the fact that these economies can be formulated as a set of moment conditions to infer their parameters through this technique. For illustrative purposes, we consider a standard real business cycle model with a constant relative risk aversion utility function and indivisible labor, driven by a normal technology shock.
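
The empirical likelihood machinery can be sketched for the simplest possible moment condition, E[x − μ] = 0 (a toy stand-in for the RBC model's moment conditions, not the chapter's implementation): reweight the observations to satisfy the condition while staying as close as possible to the uniform empirical weights.

```python
import numpy as np

def el_logratio(x, mu, iters=50):
    """Empirical likelihood log-ratio for E[x - mu] = 0. The optimal weights
    are w_i = 1 / (n * (1 + lam * (x_i - mu))), where the multiplier lam
    solves sum((x_i - mu) / (1 + lam * (x_i - mu))) = 0; it is found here by
    Newton's method (the sketch assumes mu lies inside the range of x).
    Returns sum(log(n * w_i)), which is 0 at the sample mean and negative
    elsewhere."""
    g = np.asarray(x, float) - mu
    lam = 0.0
    for _ in range(iters):
        denom = 1.0 + lam * g
        f = np.sum(g / denom)
        fp = -np.sum(g**2 / denom**2)
        lam -= f / fp
    return -np.sum(np.log(1.0 + lam * g))
```

For a DSGE model, g would stack the model-implied moment conditions as functions of the structural parameters, and the parameters maximizing the EL criterion are the estimates.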

Details

DSGE Models in Macroeconomics: Estimation, Evaluation, and New Developments
Type: Book
ISBN: 978-1-78190-305-6
