Search results

1 – 10 of over 6000

Details

Applying Maximum Entropy to Econometric Problems
Type: Book
ISBN: 978-0-76230-187-4

Details

New Directions in Macromodelling
Type: Book
ISBN: 978-1-84950-830-8

Book part
Publication date: 15 April 2020

Yubo Tao and Jun Yu

Abstract

This chapter examines the limit properties of information criteria (such as AIC, BIC, and HQIC) for distinguishing between the unit-root (UR) model and the various kinds of explosive models. The explosive models include the local-to-unit-root model from the explosive side, the mildly explosive (ME) model, and the regular explosive model. Initial conditions with different orders of magnitude are considered. Both the OLS estimator and the indirect inference estimator are studied. It is found that BIC and HQIC, but not AIC, consistently select the UR model when data come from the UR model. When data come from the local-to-unit-root model from the explosive side, both BIC and HQIC select the wrong model with probability approaching 1, while AIC has a positive probability of selecting the right model in the limit. When data come from the regular explosive model or from the ME model in the form of 1 + n^α/n with α ∈ (0, 1), all three information criteria consistently select the true model. Indirect inference estimation can increase or decrease the probability that the information criteria select the right model asymptotically relative to OLS, depending on the information criterion and the true model. Simulation results confirm the asymptotic results in finite samples.
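
A minimal numerical sketch of the kind of comparison described above, assuming a Gaussian AR(1) likelihood, unit-root simulated data, and a fixed sample size; it is not the chapter's exact design, but it shows how AIC, BIC, and HQIC trade fit against a parameter-count penalty when choosing between the restricted UR model and an unrestricted (possibly explosive) AR(1).

```python
# Illustrative sketch (not the chapter's exact setup): comparing a unit-root
# AR(1) against an unrestricted (possibly explosive) AR(1) with AIC/BIC/HQIC.
# Model, sample size, and initial condition are assumptions made for the demo.
import numpy as np

rng = np.random.default_rng(0)
n = 400
y = np.cumsum(rng.standard_normal(n))          # data generated from the UR model

y_lag, y_cur = y[:-1], y[1:]

def gaussian_ic(resid, n_params, n_obs):
    """AIC/BIC/HQIC from Gaussian residuals; smaller is better."""
    sigma2 = np.mean(resid**2)
    loglik = -0.5 * n_obs * (np.log(2 * np.pi * sigma2) + 1)
    penalties = {"AIC": 2.0, "BIC": np.log(n_obs), "HQIC": 2.0 * np.log(np.log(n_obs))}
    return {name: -2 * loglik + c * n_params for name, c in penalties.items()}

# Unit-root model: rho fixed at 1, only sigma^2 estimated.
ic_ur = gaussian_ic(y_cur - y_lag, n_params=1, n_obs=n - 1)

# Unrestricted AR(1): rho estimated by OLS (can exceed 1 for explosive data).
rho_ols = (y_lag @ y_cur) / (y_lag @ y_lag)
ic_ar = gaussian_ic(y_cur - rho_ols * y_lag, n_params=2, n_obs=n - 1)

for name in ("AIC", "BIC", "HQIC"):
    pick = "UR" if ic_ur[name] < ic_ar[name] else "explosive/AR"
    print(f"{name}: UR={ic_ur[name]:.2f}  AR={ic_ar[name]:.2f}  -> selects {pick}")
```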

Book part
Publication date: 19 November 2014

Enrique Martínez-García and Mark A. Wynne

Abstract

We investigate the Bayesian approach to model comparison within a two-country framework with nominal rigidities using the workhorse New Keynesian open-economy model of Martínez-García and Wynne (2010). We discuss the trade-offs that monetary policy – characterized by a Taylor-type rule – faces in an interconnected world, with perfectly flexible exchange rates. We then use posterior model probabilities to evaluate the weight of evidence in support of such a model when estimated against more parsimonious specifications that either abstract from monetary frictions or assume autarky by means of controlled experiments that employ simulated data. We argue that Bayesian model comparison with posterior odds is sensitive to sample size and the choice of observable variables for estimation. We show that posterior model probabilities strongly penalize overfitting, which can lead us to favor a less parameterized model against the true data-generating process when the two become arbitrarily close to each other. We also illustrate that the spillovers from monetary policy across countries have an added confounding effect.
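
As a rough illustration of model comparison with posterior odds, the sketch below converts log marginal likelihoods into posterior model probabilities under equal prior odds. The model labels and numerical values are placeholders, not estimates from the Martínez-García and Wynne (2010) model; in practice the log marginal likelihoods would come from the estimation step.

```python
# Minimal sketch of Bayesian model comparison via posterior model probabilities.
# The log marginal likelihood values below are placeholders, not estimates from
# any of the models discussed in the chapter; in practice they would come from
# a marginal-likelihood estimator applied to each candidate specification.
import numpy as np

log_marglik = {"open-economy NK": -1251.3,          # placeholder values
               "no monetary frictions": -1263.8,
               "autarky": -1270.1}
prior = {m: 1.0 / len(log_marglik) for m in log_marglik}      # equal prior odds

log_post = np.array([log_marglik[m] + np.log(prior[m]) for m in log_marglik])
log_post -= log_post.max()                                     # stabilise the exponentials
post_prob = np.exp(log_post) / np.exp(log_post).sum()

for model, p in zip(log_marglik, post_prob):
    print(f"P({model} | data) = {p:.4f}")
```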

Book part
Publication date: 5 April 2024

Bruce E. Hansen and Jeffrey S. Racine

Abstract

Classical unit root tests are known to suffer from potentially crippling size distortions, and a range of procedures have been proposed to attenuate this problem, including the use of bootstrap procedures. It is also known that the estimating equation’s functional form can affect the outcome of the test, and various model selection procedures have been proposed to overcome this limitation. In this chapter, the authors adopt a model averaging procedure to deal with model uncertainty at the testing stage. In addition, the authors leverage an automatic model-free dependent bootstrap procedure where the null is imposed by simple differencing (the block length is automatically determined using recent developments for bootstrapping dependent processes). Monte Carlo simulations indicate that this approach exhibits the lowest size distortions among its peers in settings that confound existing approaches, while it has superior power relative to those peers whose size distortions do not preclude their general use. The proposed approach is fully automatic, and there are no nuisance parameters that have to be set by the user, which ought to appeal to practitioners.
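
A rough sketch of the bootstrap idea, assuming a Dickey-Fuller-type t-statistic, a fixed block length, and simulated random-walk data; the chapter's procedure additionally averages over candidate specifications and selects the block length automatically, neither of which is reproduced here.

```python
# Moving-block bootstrap unit root test in which the null is imposed by simple
# differencing. The DF-type statistic, fixed block length, and example data are
# assumptions for illustration only.
import numpy as np

rng = np.random.default_rng(1)

def df_tstat(y):
    """t-statistic on y_{t-1} in the regression dy_t = a + b*y_{t-1} + e_t."""
    dy, ylag = np.diff(y), y[:-1]
    X = np.column_stack([np.ones_like(ylag), ylag])
    beta, res_ss = np.linalg.lstsq(X, dy, rcond=None)[:2]
    sigma2 = res_ss[0] / (len(dy) - 2)
    cov = sigma2 * np.linalg.inv(X.T @ X)
    return beta[1] / np.sqrt(cov[1, 1])

def block_bootstrap_pvalue(y, n_boot=999, block_len=10):
    stat = df_tstat(y)
    d = np.diff(y)                                 # differencing imposes the unit-root null
    d = d - d.mean()
    starts = np.arange(len(d) - block_len + 1)
    n_blocks = int(np.ceil(len(d) / block_len))
    count = 0
    for _ in range(n_boot):
        idx = rng.choice(starts, size=n_blocks, replace=True)
        d_star = np.concatenate([d[i:i + block_len] for i in idx])[:len(d)]
        y_star = np.concatenate([[y[0]], y[0] + np.cumsum(d_star)])
        count += df_tstat(y_star) <= stat          # left-tailed test
    return (1 + count) / (1 + n_boot)

y = np.cumsum(rng.standard_normal(300))            # unit-root data: should not reject
print("bootstrap p-value:", block_bootstrap_pvalue(y))
```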

Details

Essays in Honor of Subal Kumbhakar
Type: Book
ISBN: 978-1-83797-874-8

Details

Applying Maximum Entropy to Econometric Problems
Type: Book
ISBN: 978-0-76230-187-4

Book part
Publication date: 22 November 2012

Tae-Seok Jang

Abstract

This chapter analyzes the empirical relationship between price-setting/consumption behavior and the sources of persistence in inflation and output. First, a small-scale New-Keynesian model (NKM) is examined using method-of-moments and maximum likelihood estimators with US data from 1960 to 2007. Then a formal test is used to compare the fit of two competing specifications in the New-Keynesian Phillips Curve (NKPC) and the IS equation, that is, backward- and forward-looking behavior. Accordingly, the inclusion of a lagged term in the NKPC and the IS equation improves the fit of the model while offsetting the influence of inherited and extrinsic persistence; it is shown that intrinsic persistence plays a major role in approximating inflation and output dynamics for the Great Inflation period. However, the null hypothesis cannot be rejected at the 5% level for the Great Moderation period, that is, the NKM with purely forward-looking behavior and its hybrid variant are equivalent. Monte Carlo experiments investigate the validity of the chosen moment conditions and the finite-sample properties of the chosen estimation methods. Finally, the empirical performance of the formal test is discussed along the lines of the Akaike and Bayesian information criteria.
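
For orientation, the sketch below estimates a hybrid NKPC by GMM with an identity first-step weighting matrix, using realised future inflation as a rational-expectations proxy. The toy data, instrument set, and starting values are assumptions for illustration and do not reproduce the chapter's estimators or tests.

```python
# Simplified sketch of estimating a hybrid New-Keynesian Phillips Curve
#   pi_t = gamma_b * pi_{t-1} + gamma_f * pi_{t+1} + lam * x_t + u_t
# by GMM with lagged variables as instruments. Data, instrument choice, and
# identity first-step weighting are assumptions for illustration only.
import numpy as np
from scipy.optimize import minimize

def gmm_objective(theta, pi, x, Z, W):
    gamma_b, gamma_f, lam = theta
    # residual uses realised pi_{t+1} as a rational-expectations proxy for E_t pi_{t+1}
    u = pi[1:-1] - gamma_b * pi[:-2] - gamma_f * pi[2:] - lam * x[1:-1]
    g = Z.T @ u / len(u)                  # sample moment conditions E[z_t u_t] = 0
    return g @ W @ g

def estimate_nkpc(pi, x):
    # instruments: constant, lagged inflation, and the lagged output gap
    Z = np.column_stack([np.ones(len(pi) - 2), pi[:-2], x[:-2]])
    W = np.eye(Z.shape[1])                # first-step identity weighting matrix
    res = minimize(gmm_objective, x0=np.array([0.3, 0.6, 0.05]),
                   args=(pi, x, Z, W), method="BFGS")
    return res.x

# toy data only -- replace with US inflation and an output-gap measure
rng = np.random.default_rng(2)
x = rng.standard_normal(200)
pi = np.zeros(200)
for t in range(1, 200):
    pi[t] = 0.5 * pi[t - 1] + 0.1 * x[t] + 0.2 * rng.standard_normal()

print("(gamma_b, gamma_f, lambda) =", estimate_nkpc(pi, x))
```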

Details

DSGE Models in Macroeconomics: Estimation, Evaluation, and New Developments
Type: Book
ISBN: 978-1-78190-305-6

Book part
Publication date: 1 January 2005

T.J. Brailsford, J.H.W. Penm and R.D. Terrell

Abstract

Vector error-correction models (VECMs) have become increasingly important in their application to financial markets. Standard full-order VECM models assume non-zero entries in all their coefficient matrices. However, applications of VECM models to financial market data have revealed that zero entries are often a necessary part of efficient modelling. In such cases, the use of full-order VECM models may lead to incorrect inferences. Specifically, if indirect causality or Granger non-causality exists among the variables, the use of over-parameterised full-order VECM models may weaken the power of statistical inference. In this paper, it is argued that the zero–non-zero (ZNZ) patterned VECM is a more straightforward and effective means of testing for both indirect causality and Granger non-causality. For a ZNZ patterned VECM framework for time series integrated of order two, we provide a new algorithm to select cointegrating and loading vectors that can contain zero entries. Two case studies are used to demonstrate the usefulness of the algorithm in tests of purchasing power parity and a three-variable system involving the stock market.
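
A stylized sketch of what a ZNZ pattern looks like in a bivariate VECM with a known cointegrating vector: each equation is estimated by OLS and coefficients with small t-statistics are flagged as candidate zero entries. This crude thresholding stands in for the selection algorithm proposed in the chapter, which is not reproduced here.

```python
# Illustrative sketch of a zero-non-zero (ZNZ) pattern in a bivariate VECM,
#   dy_t = alpha * (beta' y_{t-1}) + Gamma * dy_{t-1} + e_t,
# estimated equation by equation with OLS and a known cointegrating vector
# beta = (1, -1)'. Simulated data and the |t| > 2 cut-off are assumptions.
import numpy as np

rng = np.random.default_rng(3)
n = 500
common = np.cumsum(rng.standard_normal(n))               # shared stochastic trend
y = np.column_stack([common + rng.standard_normal(n),
                     common + rng.standard_normal(n)])   # cointegrated pair

beta = np.array([1.0, -1.0])
ect = y[:-1] @ beta                                      # error-correction term
dy = np.diff(y, axis=0)
X = np.column_stack([ect[:-1], dy[:-1]])                 # regressors: ect_{t-1}, dy_{t-1}
Y = dy[1:]

for eq in range(2):
    coef, res_ss = np.linalg.lstsq(X, Y[:, eq], rcond=None)[:2]
    sigma2 = res_ss[0] / (len(Y) - X.shape[1])
    tstats = coef / np.sqrt(sigma2 * np.diag(np.linalg.inv(X.T @ X)))
    pattern = ["non-zero" if abs(t) > 2 else "zero" for t in tstats]
    print(f"equation {eq}: coefficients {np.round(coef, 3)}, ZNZ pattern {pattern}")
```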

Details

Research in Finance
Type: Book
ISBN: 978-0-76231-277-1

Book part
Publication date: 1 January 2008

Cathy W.S. Chen, Richard Gerlach and Mike K.P. So

Abstract

It is well known that volatility asymmetry exists in financial markets. This paper reviews and investigates recently developed techniques for Bayesian estimation and model selection applied to a large group of modern asymmetric heteroskedastic models. These include the GJR-GARCH, threshold autoregression with GARCH errors, TGARCH, and double threshold heteroskedastic model with auxiliary threshold variables. Further, we briefly review recent methods for Bayesian model selection, such as reversible-jump Markov chain Monte Carlo, Monte Carlo estimation via independent sampling from each model, and importance sampling methods. Seven heteroskedastic models are then compared, for three long series of daily Asian market returns, in a model selection study illustrating the preferred model selection method. Major evidence of nonlinearity in mean and volatility is found, with the preferred model having a weighted threshold variable of local and international market news.
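
As a compact illustration of Bayesian estimation for one member of this model class, the sketch below runs a random-walk Metropolis sampler for a GJR-GARCH(1,1) with flat priors on the admissible region. The proposal scale, priors, and simulated returns are assumptions; the paper's full model set and its model-selection machinery (reversible jump, independent sampling, importance sampling) are not reproduced.

```python
# Compact sketch of Bayesian estimation for a GJR-GARCH(1,1) model,
#   h_t = omega + (alpha + gamma * 1[r_{t-1} < 0]) * r_{t-1}^2 + beta * h_{t-1},
# via random-walk Metropolis with flat priors on the admissible region.
import numpy as np

rng = np.random.default_rng(4)

def gjr_loglik(theta, r):
    omega, alpha, gamma, beta = theta
    if omega <= 0 or alpha < 0 or gamma < 0 or beta < 0 or alpha + gamma / 2 + beta >= 1:
        return -np.inf                              # outside the admissible region
    h = np.empty_like(r)
    h[0] = r.var()
    for t in range(1, len(r)):
        h[t] = omega + (alpha + gamma * (r[t - 1] < 0)) * r[t - 1] ** 2 + beta * h[t - 1]
    return -0.5 * np.sum(np.log(2 * np.pi * h) + r ** 2 / h)

def rw_metropolis(r, n_iter=5000, scale=0.01):
    theta = np.array([0.05, 0.05, 0.05, 0.8])
    logp = gjr_loglik(theta, r)
    draws = []
    for _ in range(n_iter):
        prop = theta + scale * rng.standard_normal(4)
        logp_prop = gjr_loglik(prop, r)
        if np.log(rng.uniform()) < logp_prop - logp:   # flat prior: likelihood ratio
            theta, logp = prop, logp_prop
        draws.append(theta.copy())
    return np.array(draws)

r = rng.standard_normal(1000) * 0.8                    # placeholder returns series
draws = rw_metropolis(r)
print("posterior means (omega, alpha, gamma, beta):", draws[2500:].mean(axis=0))
```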

Details

Bayesian Econometrics
Type: Book
ISBN: 978-1-84855-308-8

Book part
Publication date: 31 January 2015

Wei Zhu and Harry Timmermans

Abstract

Purpose

Increasing evidence suggests that choice behaviour in the real world may be guided by principles of bounded rationality, as opposed to the typically assumed fully rational behaviour based on the principle of utility maximization. Under such circumstances, conventional rational choice models cannot capture the decision processes. The purpose of the chapter is to propose a modeling framework that can capture both the decision outcome and the decision process.

Methodology

The modeling framework incorporates a discrete cognitive representation structure and implies several decision heuristics, such as conjunctive, disjunctive and lexicographic rules. This allows modeling unobserved decision heterogeneity involved in a single decision, for example, in the form of a latent-class specification, taking into account mental effort, risk perception and expected outcome as explanatory factors.

Findings

Two models based on this framework are applied to decision problems underlying pedestrian shopping behaviour and compared with conventional multinomial logit models. The results show that the proposed models may not be superior to logit models in terms of model selection criteria, owing to the extra complexity of selecting heuristics, but they offer more interesting insights into the underlying decision mechanisms.

Research implications

Understanding decision processes in addition to outcomes is a promising research direction. A more developed model should take into account more contextual and socio-demographic factors in the heuristic-selection part. The assumptions about information processing must be subjected to empirical tests to validate the model.

Originality

The proposed modeling framework bridges the long-standing contradictory approaches in the field of decision modeling, namely the rational approach and the bounded-rationality approach, by proving that non-compensatory decision heuristics can be inferred from compensatory model formulations with discretized information representations and assumed decision criteria. It also incorporates a heuristic choice part into the decision processes in the form of latent-class specifications and shows the viability of the new modeling framework.
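
A toy sketch of the latent-class idea described under Methodology: the choice probability is a mixture of a compensatory multinomial logit rule and a non-compensatory conjunctive screening rule, weighted by a latent class probability. The attribute values, thresholds, and class weight are invented for illustration and are not taken from the pedestrian shopping application.

```python
# Toy latent-class mixture of a compensatory (logit) rule and a conjunctive
# screening rule. All numbers below are illustrative assumptions.
import numpy as np

attributes = np.array([[3.0, 2.0],      # alternative 0: (attractiveness, distance)
                       [2.0, 1.0],
                       [4.0, 3.5]])
beta = np.array([0.8, -0.6])            # compensatory taste weights
thresholds = np.array([2.5, 3.0])       # conjunctive cut-offs: attr >= 2.5, dist <= 3.0

# compensatory rule: multinomial logit over utilities
util = attributes @ beta
p_logit = np.exp(util - util.max()) / np.exp(util - util.max()).sum()

# conjunctive rule: acceptable alternatives are chosen with equal probability
accept = (attributes[:, 0] >= thresholds[0]) & (attributes[:, 1] <= thresholds[1])
p_conj = accept / accept.sum() if accept.any() else np.full(len(attributes), 1 / len(attributes))

w = 0.6                                  # latent probability of using the logit rule
p_choice = w * p_logit + (1 - w) * p_conj
print("choice probabilities:", np.round(p_choice, 3))
```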

Details

Bounded Rational Choice Behaviour: Applications in Transport
Type: Book
ISBN: 978-1-78441-071-1
