The Mathematical Theory of Information:
Summary
The general concept of information is here, for the first time, defined mathematically by adding a single axiom to probability theory. This Mathematical Theory of Information is explored in fourteen chapters:
1. Information can be measured in different units, in anything from bits to dollars. We argue here that any measure is acceptable if it does not violate the Law of Diminishing Information. This law is supported by two independent arguments: one derived from the Bar-Hillel ideal receiver, the other based on Shannon's noisy channel. The entropy of 'classical information theory' is one of the measures conforming to the Law of Diminishing Information, but it has properties, such as being symmetric, that make it unsuitable for some applications. The reliability measure is found to be a universal information measure.
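(For reference, and not a claim taken from the book itself: the entropy-based measure in question is Shannon's mutual information, which is symmetric by construction,

    I(X;Y) = H(X) - H(X \mid Y) = H(Y) - H(Y \mid X) = I(Y;X),

so it cannot distinguish what a received signal says about a message from what the message says about the signal; this is the symmetry cited above as a limitation.)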
2. For discrete and finite signals, the Law of Diminishing Information is defined mathematically, using probability theory and matrix algebra.
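A sketch of the form in which this axiom is usually quoted (the inf(·@·) notation for 'the information a signal gives about a message' is assumed here; the book's exact formulation may differ): if a signal C is produced from B by a channel that has no independent access to the original message A, so that A → B → C forms a Markov chain, then the intermediary can only lose information:

    \operatorname{inf}(C @ A) \;\le\; \operatorname{inf}(B @ A) \qquad \text{whenever } A \to B \to C .

The entropy-based data processing inequality, I(A;C) ≤ I(A;B), is one instance of this pattern.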
3. The Law of Diminishing Information is used as an axiom to derive essential properties of information. Byron's law: there is more information in a lie than in gibberish. Preservation: no information is lost in a reversible channel. And so on. The Mathematical Theory of Information supports colligation, the ability to bind facts together, making 'two plus two greater than four'. Colligation is a must when the information carries knowledge or serves as a basis for decisions. In such cases, reliability is always a useful information measure. Entropy does not allow colligation.
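A minimal numerical sketch of the preservation property (illustrative only, not from the book; the entropy measure is used here merely as one admissible example): a reversible (permutation) channel leaves the mutual information between message and signal unchanged, while a lossy channel reduces it.

    # Illustrative only: a reversible (permutation) channel preserves the
    # mutual information between message X and signal Y; a lossy channel
    # can only decrease it.
    import numpy as np

    def mutual_information_bits(p_xy):
        """I(X;Y) in bits, computed from the joint distribution matrix."""
        px = p_xy.sum(axis=1, keepdims=True)
        py = p_xy.sum(axis=0, keepdims=True)
        mask = p_xy > 0
        return float(np.sum(p_xy[mask] * np.log2(p_xy[mask] / (px @ py)[mask])))

    p_x = np.array([0.5, 0.3, 0.2])              # message distribution P(x)
    reversible = np.eye(3)[[2, 0, 1]]            # a permutation channel P(y|x)
    lossy = np.array([[0.8, 0.1, 0.1],           # a noisy channel P(y|x)
                      [0.1, 0.8, 0.1],
                      [0.1, 0.1, 0.8]])

    for name, channel in [("reversible", reversible), ("lossy", lossy)]:
        joint = np.diag(p_x) @ channel           # P(x, y) = P(x) * P(y|x)
        print(name, round(mutual_information_bits(joint), 3))

Running it prints about 1.485 bits (the full entropy of the message) for the reversible channel and a smaller value for the lossy one.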
4. Recipes are provided to generate whole families of different information measures. Entropy is not a unique information measure, in spite of several uniqueness 'proofs'.
5. Selected applications: The determinant measures information in algebraic equations. The number of bits measures information, but only in deterministic systems. And Popper was wrong: surprise is not an information measure.
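A minimal illustration of the determinant point (standard linear algebra, not quoted from the book): a linear system y = Ax can be solved uniquely for x, so that nothing about x is lost, exactly when det A ≠ 0.

    y = A x, \qquad x = A^{-1} y \iff \det A \neq 0; \qquad
    \text{e.g. } A = \begin{pmatrix} 1 & 1 \\ 2 & 2 \end{pmatrix},\ \det A = 0:\quad
    A\begin{pmatrix}1\\0\end{pmatrix} = A\begin{pmatrix}0\\1\end{pmatrix} = \begin{pmatrix}1\\2\end{pmatrix},

so with a vanishing determinant the output no longer distinguishes distinct inputs.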
6. Infodynamics: How information decays in Markov time-chains is explained by Gerschgorin's theorem. Social and legal consequences of the Law of Diminishing Information are investigated. Information decay does not imply a direction of time: time has an arrow only in systems that are either dissipative or condensing. The Gamow kitchen demonstrates that entropy is not a measure of disorder.
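The link to Gerschgorin's theorem can be sketched with standard linear algebra (the chapter's own derivation may differ): every row of a Markov transition matrix P is a probability vector, so each Gerschgorin disc is centred at a diagonal entry p_ii with radius 1 - p_ii and lies inside the unit disc,

    |\lambda - p_{ii}| \;\le\; \sum_{j \neq i} p_{ij} \;=\; 1 - p_{ii}
    \quad\Longrightarrow\quad |\lambda| \;\le\; 1 ,

and the components along eigenvalues with |λ| < 1 are damped geometrically as the chain is iterated, which is one way to see the decay of information in time.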
7. The two main applications of statistical information theory are telecommunications (i.e. classical information theory) and thermodynamics. The two kinds of statistical entropy, Gibbs' and Boltzmann's, are identical not only in ideal gas thermodynamics but also in classical information theory. Outside telecommunications, the use of classical information theory is limited: eight independent conditions are listed that must all be satisfied for classical information theory to apply.
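For reference, the two entropies mentioned, in their textbook forms (the book's conventions may differ):

    S_{\mathrm{Gibbs}} = -k_B \sum_i p_i \ln p_i, \qquad
    S_{\mathrm{Boltzmann}} = k_B \ln W ,

and the two coincide whenever the W microstates are equally probable, p_i = 1/W.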
8. Algorithmic information theory is built upon software experience. The measurement of algorithmic information is limited by the problem of induction: the inability to reach an objective conclusion. Induction can only tell whether a given hypothesis is more probable than another. Algorithmic theory indicates a limitation to information measurement: there are channels where the transmitted information cannot be measured by any measure.
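One standard result consistent with this limitation (stated for orientation, not as a quotation from the chapter): the Kolmogorov complexity of a string, the length of the shortest program that produces it on a universal computer U,

    K(x) \;=\; \min \{\, |p| : U(p) = x \,\},

is not a computable function, so algorithmic information content can in general only be bounded from above, never determined exactly.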
9. The Fredholm integral equations replace matrix algebra in continuous systems.
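In outline (the kernel notation here is an assumption, not taken from the chapter): where a finite channel maps an input distribution to an output distribution by a matrix product, a continuous channel does so by an integral kernel,

    p_Y(y) \;=\; \int k(y \mid x)\, p_X(x)\, dx ,

and recovering the input from the output is a Fredholm integral equation of the first kind.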
10. The information content of a continuous signal can be infinite. Statistical correlation pretends to be a measure of information, but it is not. Information gain, a measure relative to a given hypothesis, is an objective information measure if the hypothesis is ideal: either the ideal divergent hypothesis for dissipative systems (information gain = entropy), or the ideal convergent hypothesis for condensing systems (information gain = Fisher information).
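For reference, the two named measures in their standard forms (conventions may differ from the book's): the information gain of a density p relative to a hypothesis q is the Kullback-Leibler divergence, and the Fisher information measures the sensitivity of a parametric density to its parameter:

    D(p \,\|\, q) = \int p(x) \ln \frac{p(x)}{q(x)}\, dx, \qquad
    I_F(\theta) = \int p(x \mid \theta) \left( \frac{\partial \ln p(x \mid \theta)}{\partial \theta} \right)^{\!2} dx .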
11. Information is never destroyed in a reversible system, but it can disappear. For either diverging trajectories (chaos) or converging trajectories (stability), the information is lost to the macroscopic observer, but not to the ideal receiver. Still, only the macroscopic observer can comprehend reality.
12. Control, communication, and evolution. The Law of Diminishing Information is the missing link in cybernetics: a real system which cannot forget is beyond theory. Biological evolution proceeds by the elimination of history: by random variation (divergence) and by selection (convergence).
13. Communication chains in spacetime are formed by Einstein causality. Another interpretation of the EPR paradox is offered: the nonlocality implied by Bell's inequalities is not a quantum mechanical effect but is caused by relativity. Not only 'simultaneity' but also 'locality' is relative. The dual universe: at subluminal velocities Einstein causality rules; at luminal velocities the Cramer transactions take over.
14. The Law of Diminishing Information is a prerequisite of comprehensible physics: thermodynamics explains a dispersive world in which negentropy decreases. Quantum mechanics explains a condensing world in which Schrödinger's wave equation derives from a decreasing Fisher information. A cyclic Szilard engine is invented to demonstrate that the second law of thermodynamics is in the eye of the macroscopic observer.
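For orientation, the standard energy bookkeeping of the Szilard engine (a textbook result, not a summary of the chapter's cyclic construction): one bit of information about which half of the box the molecule occupies can be converted into at most

    W \;\le\; k_B T \ln 2

of extracted work per cycle, and erasing that bit costs at least the same, so whether the second law appears violated depends on whose information is counted, in line with the chapter's conclusion that the law is in the eye of the macroscopic observer.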
Contact: jankahre (at) hotmail.com