A definition of the entropy of maps which does not involve probability, but is nevertheless fully consistent with Shannon entropy, can be derived from the informational equation H(X,Y) = H(X) + H(Y|X). This approach is extended to obtain the "Shannon entropy" of distributed maps. The resulting model involves two parameters which characterise the scanning procedures normally used by the cortex in human vision. The results are then used to redefine the entropy of a fuzzy set and to extract the value of a membership from a small sample of observed data. A measure of entropic distance between patterns which does not use probability is also considered.
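The informational equation the abstract starts from is the standard Shannon chain rule, H(X,Y) = H(X) + H(Y|X). A minimal numerical sketch of that identity, using a toy joint distribution chosen for illustration (not data from the paper), might look like:

```python
import math

# Toy joint distribution p(x, y) over two binary variables.
# These numbers are illustrative only; they are not from the paper.
p_xy = {
    (0, 0): 0.25, (0, 1): 0.25,
    (1, 0): 0.40, (1, 1): 0.10,
}

def H(probs):
    """Shannon entropy (in bits) of a collection of probabilities."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Marginal p(x), summing the joint over y.
p_x = {}
for (x, _), p in p_xy.items():
    p_x[x] = p_x.get(x, 0.0) + p

H_xy = H(p_xy.values())   # joint entropy H(X, Y)
H_x = H(p_x.values())     # marginal entropy H(X)

# Conditional entropy H(Y|X) = sum_x p(x) * H(Y | X = x).
H_y_given_x = 0.0
for x, px in p_x.items():
    cond = [p_xy[(x, y)] / px for y in (0, 1)]
    H_y_given_x += px * H(cond)

# Chain rule: H(X, Y) = H(X) + H(Y|X), up to floating-point error.
assert abs(H_xy - (H_x + H_y_given_x)) < 1e-12
```

The same decomposition underlies the paper's probability-free definition, which replaces the probabilistic terms while keeping the additive structure of the identity.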
Jumarie, G. (1989), "Informational Entropy of Distributed Deterministic Maps: Applications to Small Samples of Fuzzy Sets and to Pattern Recognition", Kybernetes, Vol. 18 No. 1, pp. 32-47. https://doi.org/10.1108/eb005806
Copyright © 1989, MCB UP Limited