Search results

1 – 10 of over 9000
Book part
Publication date: 30 August 2019

Md. Nazmul Ahsan and Jean-Marie Dufour

Abstract

Statistical inference (estimation and testing) for the stochastic volatility (SV) model of Taylor (1982, 1986) is challenging, especially for likelihood-based methods, which are difficult to apply due to the presence of latent variables. Existing methods are computationally costly, inefficient, or both. In this paper, we propose computationally simple estimators for the SV model which are at the same time highly efficient. The proposed class of estimators uses a small number of moment equations derived from an ARMA representation associated with the SV model, along with the possibility of using “winsorization” to improve stability and efficiency. We call these ARMA-SV estimators. Closed-form expressions for the ARMA-SV estimators are obtained, and no numerical optimization procedure or choice of initial parameter values is required. The asymptotic distributional theory of the proposed estimators is studied. Due to their computational simplicity, the ARMA-SV estimators allow one to make reliable – even exact – simulation-based inference, through the application of Monte Carlo (MC) tests or bootstrap methods. We compare them in a simulation experiment with a wide array of alternative estimation methods, in terms of bias, root mean square error, and computation time. In addition to confirming the enormous computational advantage of the proposed estimators, the results show that ARMA-SV estimators match (or exceed) alternative estimators in terms of precision, including the widely used Bayesian estimator. The proposed methods are applied to daily returns on three major stocks (Coca-Cola, Walmart, Ford) and the S&P Composite Price Index (2000–2017). The results confirm the presence of stochastic volatility with strong persistence.
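As a rough illustration of the moment-equation idea (not the authors' exact ARMA-SV estimator), the basic log-normal SV model implies that w_t = log(y_t^2) has the autocovariance structure of an ARMA(1,1) process, so two sample autocovariances already yield closed-form parameter estimates. A minimal sketch, with variable names of our own choosing:

```python
# Minimal sketch: closed-form moment estimation for the basic log-normal SV model
#   y_t = exp(h_t / 2) * eps_t,   h_t = mu_h + phi * (h_{t-1} - mu_h) + sigma_v * v_t.
# For lags k >= 1 the autocovariances of w_t = log(y_t^2) equal those of h_t, so
# two sample autocovariances identify phi and sigma_v^2. Illustrative only; the
# chapter's ARMA-SV estimators (and its winsorization device) differ in detail.
import numpy as np

def sv_moment_estimator(y):
    w = np.log(y**2 + 1e-12)                 # log squared returns; offset guards log(0)
    wc = w - w.mean()
    g1 = np.mean(wc[:-1] * wc[1:])           # lag-1 sample autocovariance
    g2 = np.mean(wc[:-2] * wc[2:])           # lag-2 sample autocovariance
    phi = g2 / g1                            # gamma_w(2) / gamma_w(1) = phi
    var_h = g1 / phi                         # gamma_w(1) = phi * Var(h_t)
    sigma_v2 = var_h * (1.0 - phi**2)        # innovation variance of the latent AR(1)
    mu_h = w.mean() + 1.2704                 # E[log eps_t^2] = psi(1/2) + log 2 ~ -1.2704
    return phi, sigma_v2, mu_h
```

In finite samples the ratio g2/g1 can fall outside (0, 1), which is one reason stabilization devices such as winsorization can help.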

Details

Topics in Identification, Limited Dependent Variables, Partial Observability, Experimentation, and Flexible Modeling: Part A
Type: Book
ISBN: 978-1-78973-241-2

Book part
Publication date: 22 November 2012

Tae-Seok Jang

Abstract

This chapter analyzes the empirical relationship between price-setting/consumption behavior and the sources of persistence in inflation and output. First, a small-scale New-Keynesian model (NKM) is examined using method-of-moments and maximum likelihood estimators with US data from 1960 to 2007. Then a formal test is used to compare the fit of two competing specifications of the New-Keynesian Phillips Curve (NKPC) and the IS equation, that is, backward- and forward-looking behavior. Accordingly, the inclusion of a lagged term in the NKPC and the IS equation improves the fit of the model while offsetting the influence of inherited and extrinsic persistence; it is shown that intrinsic persistence plays a major role in approximating inflation and output dynamics for the Great Inflation period. However, the null hypothesis cannot be rejected at the 5% level for the Great Moderation period; that is, the NKM with purely forward-looking behavior and its hybrid variant are equivalent. Monte Carlo experiments investigate the validity of the chosen moment conditions and the finite-sample properties of the chosen estimation methods. Finally, the empirical performance of the formal test is discussed in light of the Akaike and Bayesian information criteria.
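For concreteness, the backward- versus forward-looking comparison amounts to testing whether lagged terms enter hybrid specifications of the two equations. A common textbook form, in generic notation (pi_t inflation, x_t the output gap, i_t the nominal interest rate; not necessarily the chapter's exact equations):

```latex
% Hybrid NKPC and IS equations (generic notation; an assumption -- the
% chapter's exact specification may differ). Purely forward-looking
% behavior is the restriction gamma_b = chi = 0.
\begin{aligned}
\pi_t &= \gamma_b\,\pi_{t-1} + \gamma_f\,\mathbb{E}_t[\pi_{t+1}] + \kappa\,x_t + \varepsilon_t,\\
x_t   &= \chi\,x_{t-1} + (1-\chi)\,\mathbb{E}_t[x_{t+1}] - \sigma^{-1}\bigl(i_t - \mathbb{E}_t[\pi_{t+1}]\bigr) + u_t.
\end{aligned}
```

In this literature, intrinsic persistence is usually associated with the lagged terms, inherited persistence with the dynamics of the driving process, and extrinsic persistence with serially correlated shocks.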

Details

DSGE Models in Macroeconomics: Estimation, Evaluation, and New Developments
Type: Book
ISBN: 978-1-78190-305-6

Book part
Publication date: 22 November 2012

Sara Riscado

Abstract

In this chapter we approach the estimation of dynamic stochastic general equilibrium (DSGE) models through a moments-based estimator, the empirical likelihood. We attempt to show that this inference procedure can be a valid alternative to maximum likelihood, which has been one of the preferred choices in the related literature for estimating these models. The empirical likelihood estimator is characterized by a simple setup and requires only knowledge of the moments of the data-generating process of the model. In this context, we exploit the fact that these economies can be formulated as a set of moment conditions to infer their parameters through this technique. For illustrative purposes, we consider a standard real business cycle model with a constant relative risk aversion utility function and indivisible labor, driven by a normal technology shock.
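A minimal sketch of how such an estimator can be computed, using Owen's saddle-point (dual) formulation of empirical likelihood, with a generic moment function and toy data rather than the chapter's RBC implementation:

```python
# Minimal sketch of empirical likelihood (EL) estimation for a moment-condition
# model E[g(x, theta)] = 0, via the saddle-point formulation. Generic
# illustration only -- not the chapter's RBC implementation.
import numpy as np
from scipy.optimize import minimize

def el_profile(theta, g, data):
    """Profile empirical log-likelihood at theta, up to an additive constant."""
    G = g(data, theta)                        # n x m matrix of moment contributions
    m = G.shape[1]
    def neg_inner(lam):                       # inner problem: maximize over lambda
        u = 1.0 + G @ lam
        if np.any(u <= 1e-8):                 # implied weights must stay positive
            return 1e10
        return -np.sum(np.log(u))
    lam = minimize(neg_inner, np.zeros(m), method="Nelder-Mead").x
    return -np.sum(np.log(1.0 + G @ lam))     # larger is better

# Toy usage: estimate a mean under an overidentifying unit-variance restriction.
rng = np.random.default_rng(0)
x = rng.normal(1.5, 1.0, size=400)
g = lambda d, th: np.column_stack([d - th, (d - th) ** 2 - 1.0])
grid = np.linspace(1.0, 2.0, 101)             # grid search stands in for the outer optimizer
theta_hat = grid[np.argmax([el_profile(t, g, x) for t in grid])]
```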

Details

DSGE Models in Macroeconomics: Estimation, Evaluation, and New Developments
Type: Book
ISBN: 978-1-78190-305-6

Details

Applying Maximum Entropy to Econometric Problems
Type: Book
ISBN: 978-0-76230-187-4

Details

Applying Maximum Entropy to Econometric Problems
Type: Book
ISBN: 978-0-76230-187-4

Book part
Publication date: 29 March 2006

Jean-Marie Dufour and Pascale Valéry

Abstract

In this paper, we consider the estimation of volatility parameters in the context of a linear regression where the disturbances follow a stochastic volatility (SV) model of order one with Gaussian log-volatility. The linear regression represents the conditional mean of the process and may have a fairly general form, including for example finite-order autoregressions. We provide a computationally simple two-step estimator available in closed form. Under general regularity conditions, we show that this two-step estimator is asymptotically normal. We study its statistical properties by simulation, compare it with alternative generalized method-of-moments (GMM) estimators, and present an application to the S&P composite index.
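The two-step structure can be sketched as follows: OLS for the mean equation, then a crude moment-based volatility estimator on the residuals. This is a simplified illustration; the chapter's closed-form second step differs in detail.

```python
# Simplified illustration of a two-step approach: OLS for the conditional mean,
# then moment-based estimation of the SV parameters from the residuals.
import numpy as np

def two_step_sv(y, X):
    beta = np.linalg.lstsq(X, y, rcond=None)[0]    # step 1: OLS mean equation
    u = y - X @ beta                               # SV disturbance estimates
    w = np.log(u**2 + 1e-12)
    wc = w - w.mean()
    g1 = np.mean(wc[:-1] * wc[1:])                 # lag-1 autocovariance of log u^2
    g2 = np.mean(wc[:-2] * wc[2:])                 # lag-2 autocovariance
    phi = g2 / g1                                  # log-volatility persistence
    sigma_v2 = (g1 / phi) * (1.0 - phi**2)         # latent innovation variance
    return beta, phi, sigma_v2
```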

Details

Econometric Analysis of Financial and Economic Time Series
Type: Book
ISBN: 978-0-76231-274-0

Book part
Publication date: 19 November 2012

Naceur Naguez and Jean-Luc Prigent

Abstract

Purpose – The purpose of this chapter is to estimate non-Gaussian distributions by means of Johnson distributions. An empirical illustration on hedge fund returns is detailed.

Methodology/approach – To fit non-Gaussian distributions, the chapter introduces the family of Johnson distributions and its general extensions. We use both parametric and non-parametric approaches. As a first step, we analyze the serial correlation of our sample of hedge fund returns and unsmooth the series to correct for these correlations. Then, we estimate the distribution using the standard Johnson system of laws. Finally, we search for a more general distribution of the Johnson type, using a non-parametric approach.

Findings – We use data from the Credit Suisse/Tremont Hedge Fund indexes (CSFB/Tremont) provided by Credit Suisse. For the parametric approach, we find that the Johnson SU distribution is the most appropriate, except for Managed Futures. For the non-parametric approach, we determine the best polynomial approximation of the function characterizing the transformation from the initial Gaussian law to the generalized Johnson distribution.

Originality/value – These findings are novel because we use an extension of the Johnson distributions to better fit non-Gaussian distributions, in particular hedge fund returns. We illustrate the power of this methodology, which can be further developed in the multidimensional case.
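For the parametric step, a Johnson SU fit can be sketched with SciPy's built-in johnsonsu distribution (maximum likelihood). The file name is a placeholder; the unsmoothing step and the non-parametric extension discussed above are not reproduced here.

```python
# Minimal sketch: maximum likelihood fit of a Johnson SU distribution with SciPy.
import numpy as np
from scipy import stats

returns = np.loadtxt("hedge_fund_returns.csv")    # placeholder: one column of returns
a, b, loc, scale = stats.johnsonsu.fit(returns)   # (gamma, delta, xi, lambda) in Johnson notation
# Quick goodness-of-fit check against the fitted SU law.
ks = stats.kstest(returns, "johnsonsu", args=(a, b, loc, scale))
print(f"gamma={a:.3f} delta={b:.3f} xi={loc:.3f} lambda={scale:.3f} KS p={ks.pvalue:.3f}")
```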

Details

Recent Developments in Alternative Finance: Empirical Assessments and Economic Implications
Type: Book
ISBN: 978-1-78190-399-5

Book part
Publication date: 19 December 2012

Nicky Grant

Abstract

Principal component (PC) techniques are commonly used to improve the small-sample properties of the linear instrumental variables (IV) estimator. Carrasco (2012) argues that PC-type methods provide a natural ranking of instruments with which to reduce the size of the instrument set. This chapter shows how reducing the instrument set based on PC methods can lead to poor small-sample properties of IV estimators. A new approach to ordering instruments, termed ‘normalized principal components’ (NPCs), is introduced to overcome this problem. A simulation study shows the favourable small-sample properties of IV estimators that use NPC methods, relative to PC methods, to reduce the size of the instrument set. Using NPCs, we provide evidence that the IV setup in Angrist and Krueger (1992) may not suffer from the weak-instrument problem.
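A sketch of PC-based instrument reduction followed by 2SLS appears below. The 'normalized' ordering used here (unit-variance components ranked by absolute covariance with the endogenous regressor) is only our plausible reading of NPC, not necessarily the chapter's definition.

```python
# Sketch: PC-based instrument reduction + 2SLS for a single endogenous regressor.
import numpy as np

def pc_2sls(y, x, Z, k, normalized=True):
    Zc = Z - Z.mean(axis=0)
    U, s, _ = np.linalg.svd(Zc, full_matrices=False)
    F = U * s                                         # principal components (scores) of Z
    if normalized:
        F = F / F.std(axis=0)                         # put components on a common scale
        order = np.argsort(-np.abs(F.T @ (x - x.mean())))
        F = F[:, order]                               # re-rank by first-stage relevance
    W = np.column_stack([np.ones_like(y), F[:, :k]])  # keep the first k components
    x_hat = W @ np.linalg.lstsq(W, x, rcond=None)[0]  # first stage
    X2 = np.column_stack([np.ones_like(y), x_hat])
    return np.linalg.lstsq(X2, y, rcond=None)[0][1]   # 2SLS slope on x
```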

Details

Essays in Honor of Jerry Hausman
Type: Book
ISBN: 978-1-78190-308-7

Book part
Publication date: 23 June 2016

Yong Bao, Yanqin Fan, Liangjun Su and Victoria Zinde-Walsh

Abstract

This paper examines Aman Ullah’s contributions to robust inference, finite sample econometrics, nonparametrics and semiparametrics, and panel and spatial models. His early works on robust inference and finite sample theory were mostly motivated by his thesis advisor, Professor Anirudh Lal Nagar. They eventually led to his most original rethinking of many statistical and econometric models, which developed into the monograph Finite Sample Econometrics, published in 2004. His desire to relax distributional and functional-form assumptions led him in the direction of nonparametric estimation, and he summarized his views in his most influential textbook, Nonparametric Econometrics (with Adrian Pagan), published in 1999, which has influenced a whole generation of econometricians. His innovative contributions in the areas of seemingly unrelated regressions; parametric, semiparametric, and nonparametric panel data models; and spatial models have also inspired a large literature on nonparametric and semiparametric estimation and inference and spurred research on robust estimation and inference in these and related areas.

Book part
Publication date: 19 October 2020

Julian TszKin Chan

Abstract

This chapter studies a snowball sampling method for social networks with endogenous peer selection. Snowball sampling is a sampling design that preserves the dependence structure of the network: it sequentially collects information on the vertices linked to the vertices collected in the previous iteration. Snowball samples suffer from a sample selection problem because of endogenous peer selection. The author proposes a new estimation method that uses the relationship between samples in different iterations to correct for this selection. The author uses snowball samples collected from Facebook to estimate the proportion of users who support the Umbrella Movement in Hong Kong.
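A minimal sketch of the sampling scheme itself (wave-by-wave collection on a graph, using networkx); the selection-correction estimator proposed in the chapter is not reproduced.

```python
# Minimal sketch of wave-by-wave snowball sampling on a graph with networkx.
import networkx as nx

def snowball(G, seeds, n_iter):
    waves = [set(seeds)]
    collected = set(seeds)
    for _ in range(n_iter):
        nxt = {v for u in waves[-1] for v in G.neighbors(u)} - collected
        if not nxt:
            break
        waves.append(nxt)                 # vertices first reached in this iteration
        collected |= nxt
    return waves

# Toy usage on a random graph standing in for the social network.
G = nx.erdos_renyi_graph(200, 0.03, seed=1)
waves = snowball(G, seeds=[0], n_iter=3)
```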
