
Causality and Markovianity: Information Theoretic Measures

Essays in Honor of Aman Ullah

ISBN: 978-1-78560-787-5, eISBN: 978-1-78560-786-8

Publication date: 23 June 2016

Abstract

Many information theoretic measures have been proposed for a quantitative assessment of causality relationships. Gouriéroux, Monfort, and Renault (1987) introduced the so-called “Kullback causality measures,” extending Geweke’s (1982) work in the context of Gaussian VAR processes, while Schreiber (2000) focused on Granger causality and dubbed the same measure “transfer entropy.” Both papers measure causality in the context of Markov processes. One contribution of this paper is to focus on the interplay between the measurement of (non-)Markovianity and the measurement of Granger causality. Both can be framed in terms of prediction: how much does forecast accuracy deteriorate when some relevant conditioning information is forgotten? We argue that this common feature of (non-)Markovianity and Granger causality has led people to overestimate the amount of causality, because what they regard as a causality measure may also convey a measure of the amount of (non-)Markovianity. We set a special focus on the design of measures that properly disentangle these two components. This disentangling also leads us to revisit the equivalence between the Sims and Granger concepts of noncausality and the log-likelihood ratio tests for each of them. We argue that testing for Granger causality involves non-nested hypotheses.
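The prediction-based framing in the abstract has a standard nonparametric counterpart: Schreiber’s transfer entropy can be written as the conditional mutual information I(X_{t+1}; Y_t | X_t), i.e., the expected Kullback-Leibler deterioration of the one-step forecast of X when the conditioning variable Y_t is forgotten. The sketch below is a minimal plug-in estimator for discretized series with a one-lag (Markov) information set; it is not the chapter’s own estimator, and the toy data are hypothetical, included only to illustrate the “forecast deterioration” reading of the measure.

```python
# Minimal plug-in sketch (illustrative only) of transfer entropy
# TE_{Y->X} = I(X_{t+1}; Y_t | X_t) for two discrete-valued series.
from collections import Counter
from math import log
import random

def transfer_entropy(x, y):
    """Plug-in estimate of TE_{Y->X} in nats; x is the target series, y the source."""
    triples = list(zip(x[1:], x[:-1], y[:-1]))           # (x_{t+1}, x_t, y_t)
    n = len(triples)
    p_xyz = Counter(triples)                              # counts of (x_{t+1}, x_t, y_t)
    p_yz  = Counter((xt, yt) for _, xt, yt in triples)    # counts of (x_t, y_t)
    p_xz  = Counter((x1, xt) for x1, xt, _ in triples)    # counts of (x_{t+1}, x_t)
    p_z   = Counter(xt for _, xt, _ in triples)           # counts of x_t
    te = 0.0
    for (x1, xt, yt), c in p_xyz.items():
        # empirical weight * log[ p(x_{t+1}|x_t,y_t) / p(x_{t+1}|x_t) ]
        te += (c / n) * log((c / p_yz[(xt, yt)]) / (p_xz[(x1, xt)] / p_z[xt]))
    return te

# Hypothetical toy data: y leads x by one period, so TE_{Y->X} should be clearly
# positive (about log 2) while TE_{X->Y} stays near zero.
random.seed(0)
y = [random.randint(0, 1) for _ in range(5000)]
x = [0] + y[:-1]                                          # x_t copies y_{t-1}
print(round(transfer_entropy(x, y), 3), round(transfer_entropy(y, x), 3))
```

Note that the estimator above conditions only on X_t; if the target series is not first-order Markov, part of the resulting number reflects omitted own-history (non-Markovianity) rather than genuine causality, which is precisely the confusion the chapter proposes to disentangle.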

Citation

Renault, E. and Scidá, D. (2016), "Causality and Markovianity: Information Theoretic Measures", Essays in Honor of Aman Ullah (Advances in Econometrics, Vol. 36), Emerald Group Publishing Limited, Leeds, pp. 349-385. https://doi.org/10.1108/S0731-905320160000036019

Publisher

Emerald Group Publishing Limited

Copyright © 2016 Emerald Group Publishing Limited