Search results
1 – 10 of 222
Abstract
The BEKK GARCH class of models presents a popular set of tools for applied analysis of dynamic conditional covariances. Within this class the analyst faces a range of model choices that trade off flexibility against parameter parsimony. In the most flexible unrestricted BEKK, the parameter dimensionality increases quickly with the number of variables. Covariance targeting decreases model dimensionality but induces a set of nonlinear constraints on the underlying parameter space that are difficult to implement. Recently, the rotated BEKK (RBEKK) has been proposed, whereby a targeted BEKK model is applied after the spectral decomposition of the conditional covariance matrix. An easily estimable RBEKK implies a full, albeit constrained, BEKK for the unrotated returns. However, the degree of the implied restrictiveness is currently unknown. In this paper, we suggest a Bayesian approach to estimation of the BEKK model with targeting based on Constrained Hamiltonian Monte Carlo (CHMC). We take advantage of suitable parallelization of the problem within CHMC, utilizing the newly available computing power of multi-core CPUs and Graphics Processing Units (GPUs), which enables us to deal effectively with the inherent nonlinear constraints posed by covariance targeting in relatively high dimensions. Using parallel CHMC, we perform a model comparison in terms of predictive ability between the targeted BEKK and the RBEKK in the context of an application concerning multivariate dynamic volatility analysis of a Dow Jones Industrial returns portfolio. Although the RBEKK does improve over a diagonal BEKK restriction, it is clearly dominated by the full targeted BEKK model.
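As a rough illustration of the targeting idea in this abstract, the sketch below filters BEKK(1,1) conditional covariances with the intercept pinned down by the sample covariance. The parameter matrices `A` and `B` and the function name are hypothetical choices for illustration; the requirement that the implied intercept remain positive semidefinite is the kind of nonlinear constraint the paper's CHMC sampler is designed to respect. This is a minimal sketch, not the paper's estimator.

```python
import numpy as np

def bekk_targeted_filter(returns, A, B):
    """Conditional covariance filter for a BEKK(1,1) model with
    covariance targeting: the intercept C is chosen so that the implied
    unconditional covariance equals the sample covariance S.
    Illustrative sketch only; A and B are hypothetical inputs."""
    T, n = returns.shape
    S = np.cov(returns, rowvar=False)          # targeting matrix
    # Intercept implied by targeting: C = S - A' S A - B' S B
    # (C must stay positive semidefinite -- the nonlinear constraint)
    C = S - A.T @ S @ A - B.T @ S @ B
    H = np.empty((T, n, n))
    H[0] = S                                   # initialize at the target
    for t in range(1, T):
        r = returns[t - 1][:, None]            # n x 1 lagged return
        H[t] = C + A.T @ (r @ r.T) @ A + B.T @ H[t - 1] @ B
    return H
```

For diagonal `A` and `B` with small enough entries the implied `C` is positive definite and the recursion stays positive definite; for a full unrestricted BEKK the admissible region is far harder to characterize, which motivates the constrained sampler.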
Md. Nazmul Ahsan and Jean-Marie Dufour
Abstract
Statistical inference (estimation and testing) for the stochastic volatility (SV) model of Taylor (1982, 1986) is challenging; likelihood-based methods in particular are difficult to apply due to the presence of latent variables. Existing methods are computationally costly, inefficient, or both. In this paper, we propose computationally simple estimators for the SV model which are at the same time highly efficient. The proposed class of estimators uses a small number of moment equations derived from an ARMA representation associated with the SV model, along with the possibility of using "winsorization" to improve stability and efficiency. We call these ARMA-SV estimators. Closed-form expressions for ARMA-SV estimators are obtained, and no numerical optimization procedure or choice of initial parameter values is required. The asymptotic distributional theory of the proposed estimators is studied. Due to their computational simplicity, the ARMA-SV estimators allow one to make reliable, even exact, simulation-based inference through the application of Monte Carlo (MC) tests or bootstrap methods. We compare them in a simulation experiment with a wide array of alternative estimation methods, in terms of bias, root mean square error, and computation time. In addition to confirming the enormous computational advantage of the proposed estimators, the results show that ARMA-SV estimators match (or exceed) alternative estimators in terms of precision, including the widely used Bayesian estimator. The proposed methods are applied to daily observations on the returns of three major stocks (Coca-Cola, Walmart, Ford) and the S&P Composite Price Index (2000–2017). The results confirm the presence of stochastic volatility with strong persistence.
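The moment-equation idea behind closed-form SV estimation can be sketched briefly. In the standard SV model the log volatility follows an AR(1), so the autocovariances of demeaned log squared returns satisfy gamma_k = phi * gamma_{k-1} for lags k >= 2, giving a closed-form persistence estimate phi_hat = gamma_2 / gamma_1. The function below is a simplified sketch of this idea under those standard assumptions, not the paper's exact winsorized ARMA-SV estimator.

```python
import numpy as np

def arma_sv_persistence(returns, eps=1e-12):
    """Closed-form estimate of SV persistence phi from the ARMA(1,1)
    representation of log squared returns: for y_t = log r_t^2 (demeaned),
    gamma_y(k) = phi * gamma_y(k-1) for k >= 2, so phi_hat = g(2)/g(1).
    Simplified illustration of the moment-equation approach."""
    y = np.log(returns ** 2 + eps)   # eps guards against log(0)
    y = y - y.mean()
    T = len(y)
    gamma = lambda k: float((y[:T - k] * y[k:]).sum() / T)  # sample autocovariance
    return gamma(2) / gamma(1)
```

No optimization or starting values are needed, which is what makes simulation-based (MC test or bootstrap) inference cheap to wrap around such estimators.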
Garland Durham and John Geweke
Abstract
Massively parallel desktop computing capabilities, now well within the reach of individual academics, modify the environment for posterior simulation in fundamental and potentially quite advantageous ways. But to fully exploit these benefits, algorithms that conform to parallel computing environments are needed. This paper presents a sequential posterior simulator designed to operate efficiently in this context. The simulator makes fewer analytical and programming demands on investigators, and is faster, more reliable, and more complete than conventional posterior simulators. The paper extends existing sequential Monte Carlo methods and theory to provide a thorough and practical foundation for sequential posterior simulation that is well suited to massively parallel computing environments. It provides detailed recommendations on implementation, yielding an algorithm that requires only code for simulation from the prior and evaluation of prior and data densities, and works well in a variety of applications representative of serious empirical work in economics and finance. The algorithm facilitates Bayesian model comparison by producing marginal likelihood approximations of unprecedented accuracy as an incidental by-product, is robust to pathological posterior distributions, and provides estimates of numerical standard error and relative numerical efficiency intrinsically. The paper concludes with an application that illustrates the potential of these simulators for applied Bayesian inference.
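The abstract's claim that the simulator "requires only code for simulation from the prior and evaluation of prior and data densities" can be illustrated with a stripped-down sequential simulator. The sketch below is a generic data-tempered particle scheme under that interface, not the paper's algorithm: it omits the mutation (jittering) phase a full sequential Monte Carlo sampler would include, and the function names are hypothetical.

```python
import numpy as np

def smc_posterior(y, prior_draw, loglik, n_particles=2000, rng=None):
    """Minimal sequential posterior simulator: particles drawn from the
    prior are reweighted observation by observation and resampled when
    the effective sample size collapses. Illustrative sketch only; a
    practical SMC sampler would also mutate particles after resampling."""
    if rng is None:
        rng = np.random.default_rng(0)
    theta = prior_draw(n_particles, rng)        # user code: draws from the prior
    logw = np.zeros(n_particles)
    for t in range(len(y)):
        logw = logw + loglik(theta, y[t])       # user code: incremental log likelihood
        w = np.exp(logw - logw.max())
        ess = w.sum() ** 2 / (w ** 2).sum()     # effective sample size
        if ess < n_particles / 2:               # resample when degenerate
            idx = rng.choice(n_particles, size=n_particles, p=w / w.sum())
            theta = theta[idx]
            logw = np.zeros(n_particles)
    w = np.exp(logw - logw.max())
    return theta, w / w.sum()
```

The user supplies only `prior_draw` and `loglik`; everything else is generic, and the particle loop is exactly the part that parallelizes naturally across cores or GPU threads.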
Pierre Guérin and Danilo Leiva-León
Abstract
The authors introduce a new approach to estimating high-dimensional factor-augmented vector autoregressive models (FAVAR) where the loadings are subject to idiosyncratic regime-switching dynamics. Their Bayesian estimation method alleviates computational challenges and makes the estimation of high-dimensional FAVARs with heterogeneous regime switching straightforward to implement. The authors perform extensive simulation experiments to study the finite-sample performance of the estimation method, demonstrating its relevance in high-dimensional settings. Next, they illustrate the performance of the proposed framework by studying the impact of credit market disruptions on a large set of macroeconomic variables. The results of this study underline the importance of accounting for non-linearities in factor loadings when evaluating the propagation of aggregate shocks.
Richa Srivastava and M A Sanjeev
Abstract
Several inferential procedures are advocated in the literature; the most commonly used are the frequentist and Bayesian procedures. Bayesian methods afford inferences based on small data sets and are especially useful in studies with limited data availability. Bayesian approaches also help incorporate prior knowledge, especially subjective knowledge, into predictions. Considering the increasing difficulty of data acquisition, Bayesian techniques can be hugely beneficial to managers, especially in limited-data situations such as studies of expert opinion. Other factors that have constrained the broader application of Bayesian statistics in business are its computational power requirements and the availability of appropriate analytical tools. However, with the increase in computational power and connectivity and the development of appropriate software programmes, Bayesian applications have become more attractive. This chapter attempts to unravel the applications of the Bayesian inferential procedure in marketing management.
Daniel Felix Ahelegbey and Paolo Giudici
Abstract
The latest financial crisis has stressed the need to understand the world financial system as a network of interconnected institutions, where financial linkages play a fundamental role in the spread of systemic risk. In this paper we propose to enrich the topological perspective of network models with a more structured statistical framework, that of Bayesian Gaussian graphical models. From a statistical viewpoint, we propose a new class of hierarchical Bayesian graphical models that can split correlations between institutions into country-specific and idiosyncratic components, in a way that parallels the decomposition of returns in the well-known Capital Asset Pricing Model. From a financial economics viewpoint, we suggest a way to model systemic risk that can explicitly take into account frictions between different financial markets, particularly suited to studying the ongoing banking union process in Europe. From a computational viewpoint, we develop a novel Markov chain Monte Carlo algorithm based on Bayes factor thresholding.
Abstract
In a Bayesian approach, we compare the forecasting performance of five classes of models: ARCH, GARCH, SV, SV-STAR, and MSSV, using daily Tehran Stock Exchange (TSE) market data. To estimate the parameters of the models, Markov chain Monte Carlo (MCMC) methods are applied. The results show that the models in the fourth and fifth classes (SV-STAR and MSSV) perform better than the models in the other classes.
Laura E. Jackson, M. Ayhan Kose, Christopher Otrok and Michael T. Owyang
Abstract
We compare methods to measure comovement in business cycle data using multi-level dynamic factor models. To do so, we employ a Monte Carlo procedure to evaluate model performance for different specifications of factor models across three different estimation procedures. We consider three general factor model specifications used in applied work. The first is a single-factor model, the second a two-level factor model, and the third a three-level factor model. Our estimation procedures are the Bayesian approach of Otrok and Whiteman (1998), the Bayesian state-space approach of Kim and Nelson (1998), and a frequentist principal components approach. The latter serves as a benchmark to measure any potential gains from the more computationally intensive Bayesian procedures. We then apply the three methods to a novel dataset on house prices in advanced and emerging markets from Cesa-Bianchi, Cespedes, and Rebucci (2015) and interpret the empirical results in light of the Monte Carlo results.
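The frequentist principal-components benchmark mentioned in this abstract has a compact generic form: factors are estimated from the leading eigenvectors of the sample covariance of the standardized panel. The sketch below shows that single-level benchmark only (the paper's multi-level specifications and Bayesian procedures are more involved); the function name is a hypothetical label.

```python
import numpy as np

def pc_factors(X, k):
    """Principal-components factor extraction for a T x N panel X:
    standardize each series, eigendecompose the sample covariance, and
    take the k leading eigenvectors as loadings. Generic single-level
    benchmark, not a multi-level dynamic factor model."""
    Z = (X - X.mean(axis=0)) / X.std(axis=0)   # standardize each series
    T = Z.shape[0]
    evals, evecs = np.linalg.eigh(Z.T @ Z / T) # eigenvalues in ascending order
    order = np.argsort(evals)[::-1][:k]        # indices of k largest eigenvalues
    loadings = evecs[:, order]                 # N x k estimated loadings
    factors = Z @ loadings                     # T x k estimated factors
    return factors, loadings
```

Because this requires only one eigendecomposition, it is essentially free computationally, which is why it is a natural yardstick against MCMC-based alternatives.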
Phillip Li and Mohammad Arshad Rahman
Abstract
We consider Bayesian estimation of a multivariate sample selection model with p pairs of selection and outcome variables. Each of the variables may be discrete or continuous with a parametric marginal distribution, and their dependence structure is modeled through a Gaussian copula function. Markov chain Monte Carlo methods are used to simulate from the posterior distribution of interest. The methods are illustrated in a simulation study and an application from transportation economics.
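The Gaussian copula construction referred to here couples arbitrary parametric margins through correlated normals: latent normals are mapped to uniforms with the standard normal CDF and then pushed through each marginal quantile function. The sketch below samples one discrete/continuous pair under that mechanism; the Bernoulli and Exponential margins and the function name are illustrative assumptions, not the paper's selection/outcome specification, and no posterior simulation is shown.

```python
import numpy as np
from math import erf, sqrt

def gaussian_copula_sample(n, rho, p=0.3, rng=None):
    """Draw n (discrete, continuous) pairs whose dependence comes from a
    Gaussian copula with correlation rho. Margins are illustrative:
    Bernoulli(p) for the discrete variable (e.g. a selection indicator)
    and Exponential(1) for the continuous one (e.g. an outcome)."""
    if rng is None:
        rng = np.random.default_rng(0)
    cov = [[1.0, rho], [rho, 1.0]]
    z = rng.multivariate_normal([0.0, 0.0], cov, size=n)   # latent normals
    std_norm_cdf = np.vectorize(lambda v: 0.5 * (1.0 + erf(v / sqrt(2.0))))
    u = std_norm_cdf(z)                         # uniform margins
    x_binary = (u[:, 0] > 1.0 - p).astype(int)  # Bernoulli(p) quantile map
    y_cont = -np.log(1.0 - u[:, 1])             # Exponential(1) quantile map
    return x_binary, y_cont
```

Swapping in other quantile functions changes the margins without touching the dependence structure, which is what makes the copula formulation convenient for mixed discrete/continuous models.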