Possibility‐Probability Transformation: A New Result via Information Theory of Deterministic Functions

Guy Jumarie (Université du Québec à Montréal, Montréal, Canada)

Kybernetes

ISSN: 0368-492X

Article publication date: 1 July 1994

Abstract

By combining the theory of relative information with the information theory of deterministic functions, one can obtain a model of possibility‐probability transformation. Relative information defines possibility in terms of the syntax‐semantics coupling of natural languages, while the entropy of deterministic functions rests on the maximum conditional entropy principle. Both theories use only the basic concepts of Shannon's information theory and do not invoke more recent notions such as belief, necessity, confusion, dissonance or nonspecificity. The model thus appears as a direct consequence of Shannon theory itself.
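
The article's own transformation is derived from the maximum conditional entropy principle and is not reproduced in this abstract. Purely as an illustrative baseline, the short Python sketch below shows the simple ratio-scale (max-normalization) correspondence often used when relating probability and possibility distributions; the function names and the example distribution are assumptions made for this sketch and are not taken from the paper.

# Illustrative sketch only: a ratio-scale (max-normalization)
# possibility-probability correspondence, not the specific model
# derived in the article from the entropy of deterministic functions.

def probability_to_possibility(p):
    """Map a probability distribution p to a possibility profile via
    pi_i = p_i / max_j p_j, so the most probable outcome gets possibility 1."""
    m = max(p)
    return [x / m for x in p]

def possibility_to_probability(pi):
    """Map a possibility profile back to a probability distribution
    by simple renormalization so that the values sum to one."""
    s = sum(pi)
    return [v / s for v in pi]

if __name__ == "__main__":
    p = [0.5, 0.3, 0.15, 0.05]             # probabilities summing to 1
    pi = probability_to_possibility(p)      # [1.0, 0.6, 0.3, 0.1]
    print(pi)
    print(possibility_to_probability(pi))   # recovers [0.5, 0.3, 0.15, 0.05]

Because max-normalization is a single scale factor, renormalizing the resulting possibility profile recovers the original probabilities in this round trip; richer transformations (including the information-theoretic one developed in the article) impose additional consistency conditions.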

Citation

Jumarie, G. (1994), "Possibility‐Probability Transformation: A New Result via Information Theory of Deterministic Functions", Kybernetes, Vol. 23 No. 5, pp. 56-59. https://doi.org/10.1108/03684929410064509

Publisher

MCB UP Ltd

Copyright © 1994, MCB UP Limited
