In our previous papers, we proposed an analysis of the relations between information theory and observation theory. Referring to the parameters of the observation and using Shannon's formulation, we defined and calculated the information associated with a result of observation and the entropy of a random variable in a process of observation. This paper extends that theory to other definitions of entropy (Onicescu, hyperbolic). In particular, the method makes it possible to define Onicescu and hyperbolic entropies for continuous random variables. We examine the main properties of these new functions and propose a comparative analysis.
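The quantities named in the abstract can be illustrated in the discrete case: Shannon entropy, H(p) = -Σ p_i log₂ p_i, and Onicescu's informational energy, E(p) = Σ p_i². The sketch below is an illustration only; the function names are ours, and the paper's extension to continuous random variables (and its hyperbolic entropy) is not reproduced here.

```python
import math

def shannon_entropy(p):
    """Shannon entropy H(p) = -sum(p_i * log2(p_i)), in bits."""
    return -sum(pi * math.log2(pi) for pi in p if pi > 0)

def onicescu_energy(p):
    """Onicescu informational energy E(p) = sum(p_i ** 2).

    Behaves inversely to entropy: it is 1 for a degenerate
    (certain) distribution and 1/n for the uniform one on n outcomes.
    """
    return sum(pi * pi for pi in p)

uniform = [0.25, 0.25, 0.25, 0.25]
peaked = [0.7, 0.1, 0.1, 0.1]

print(shannon_entropy(uniform))  # 2.0 bits (maximal on 4 outcomes)
print(onicescu_energy(uniform))  # 0.25 (minimal on 4 outcomes)
print(shannon_entropy(peaked) < shannon_entropy(uniform))  # True
print(onicescu_energy(peaked) > onicescu_energy(uniform))  # True
```

This inverse relationship between the two measures is one reason a comparative analysis, as the paper proposes, is informative.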
Copyright © 1985, MCB UP Limited