In the information-theoretic framework, it is customary to address the problem of defining and analyzing the complexity and organization of systems either by using Shannon entropy, via Jaynes' maximum entropy principle, or by means of the Kullback informational divergence, which measures the informational distance between two probability distributions. In the present paper, it is shown that the so-called self-divergence of Markovian processes can be a useful complement to this approach. After a short background on entropy and organization, we recall the definition of the divergence of Markovian processes and then use it to analyze organization and complexity. We arrive at a principle of maximum self-divergence, which characterizes systems with maximum organization.
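For reference, the standard quantities invoked above are the Shannon entropy of a distribution p, the Kullback divergence between two distributions p and q, and, for Markovian processes, the usual divergence rate between two stationary Markov chains; the textbook forms are sketched below in LaTeX notation. These are background definitions only; the paper's specific notion of self-divergence is not reproduced here.

% Shannon entropy and Kullback divergence (standard forms)
H(p) = -\sum_i p_i \log p_i , \qquad
D(p \,\|\, q) = \sum_i p_i \log \frac{p_i}{q_i} .

% Divergence rate between two stationary Markov chains with transition
% matrices P and Q, where \pi denotes the stationary distribution of P
D(P \,\|\, Q) = \sum_i \pi_i \sum_j P_{ij} \log \frac{P_{ij}}{Q_{ij}} .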
Copyright © 1998, MCB UP Limited