
The Mathematical Theory of Information: New results


NB: you will need either Mathematica or the free MathReader to view the files linked from this page.

The Ordering Theorem

As an answer to the question "Can the choice of information measure influence the ordering of messages by informativity?", a new theorem was found.


Rényi information

My book, The Mathematical Theory of Information, presents a measure of the information B gives about A, ent_alpha(B@A), derived from Rényi's generalization of Shannon's entropy. The range of alpha was given as 0 <= alpha < 1, and it was proved that ent_alpha(B@A) conforms to the Law of Diminishing Information.
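The definition of ent_alpha(B@A) itself is given in the book, not here, but the Rényi entropy it builds on is standard and easy to compute. A minimal Python sketch, with a made-up example distribution p:

```python
import math

def renyi_entropy(p, alpha):
    """Rényi entropy of a probability vector p, in bits:
    H_alpha(p) = log2(sum p_i^alpha) / (1 - alpha).
    As alpha -> 1 this tends to Shannon's entropy -sum p_i log2 p_i."""
    if alpha == 1:  # Shannon limit, handled as a special case
        return -sum(pi * math.log2(pi) for pi in p if pi > 0)
    return math.log2(sum(pi ** alpha for pi in p if pi > 0)) / (1 - alpha)

p = [0.5, 0.25, 0.25]            # example distribution (an assumption)
print(renyi_entropy(p, 0))       # Hartley entropy: log2(3) ≈ 1.585
print(renyi_entropy(p, 1))       # Shannon entropy: 1.5
print(renyi_entropy(p, 2))       # collision entropy: ≈ 1.415
```

H_alpha is non-increasing in alpha, which the three printed values illustrate.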

Then Jan Hajek pointed out in a question that the literature places no upper limit on alpha. I guessed it could be proved that alpha > 1 would also be allowed. Hajek then came back with a paper, Renyi's disinformation, presenting the result that some values of alpha > 2 lead to a violation of the Law of Diminishing Information.

To sort this out I had to purchase Mathematica, as my usual pencil-and-paper methods would have been too cumbersome. An experienced user will find my handling of the program inelegant, but I muddled through somehow, even when the program refused to recognize that 0^2 = 0 etc.

The paper Rényi Rate Formula (Mathematica/MathReader file, 620 kB) explores ent_alpha(B@A) and finds the allowed range to be 0 <= alpha <= 2 (where alpha = 1 yields Shannon's entropy as a limit).
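The claim that alpha = 1 yields Shannon's entropy as a limit can be checked numerically for the plain Rényi entropy (the example distribution is an assumption, not from the paper):

```python
import math

def renyi_entropy(p, alpha):
    # Rényi entropy in bits; valid for alpha != 1
    return math.log2(sum(pi ** alpha for pi in p if pi > 0)) / (1 - alpha)

def shannon_entropy(p):
    # Shannon entropy in bits
    return -sum(pi * math.log2(pi) for pi in p if pi > 0)

p = [0.7, 0.2, 0.1]
for eps in (1e-2, 1e-4, 1e-6):
    print(renyi_entropy(p, 1 + eps))  # approaches shannon_entropy(p) as eps -> 0
print(shannon_entropy(p))
```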

In Renyi's disinformation, Hajek also points out that ent_alpha(B@A) is asymmetric: ent_alpha(B@A) may differ from ent_alpha(A@B). In the paper Symmetric Rényi (Mathematica/MathReader file, 440 kB), a symmetric measure of the information B gives about A, ent_alpha(B;A), is constructed; that is, ent_alpha(B;A) = ent_alpha(A;B). The allowed range was found to be 0 <= alpha <= 1 (where alpha = 1 yields Shannon's entropy as a limit).
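The actual construction of ent_alpha(B;A) is in the linked file; as a generic illustration only, the sketch below shows the same two ideas on a different Rényi quantity: the Rényi divergence is asymmetric in its two arguments, and averaging the two directions yields a symmetric measure (all distributions are made-up examples):

```python
import math

def renyi_divergence(p, q, alpha):
    """Rényi divergence D_alpha(p||q) in bits; asymmetric in general."""
    s = sum(pi ** alpha * qi ** (1 - alpha) for pi, qi in zip(p, q) if pi > 0)
    return math.log2(s) / (alpha - 1)

def symmetrized(p, q, alpha):
    # Generic symmetrization by averaging the two directions; this is
    # an illustration, NOT the book's ent_alpha(B;A) construction.
    return 0.5 * (renyi_divergence(p, q, alpha) + renyi_divergence(q, p, alpha))

p, q = [0.9, 0.1], [0.5, 0.5]
print(renyi_divergence(p, q, 2), renyi_divergence(q, p, 2))  # two different values
print(symmetrized(p, q, 2) == symmetrized(q, p, 2))          # True by construction
```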


Contact: jankahre (at) hotmail.com