
The Mathematical Theory of Information: New results

NB: You will need either Mathematica or the free MathReader to view the files linked to this page. Go to <http://www.wolfram.com/products/mathreader/> to download MathReader and follow the instructions on the MathReader pages to install the program. The MathReader pages will also tell you how to integrate MathReader with your browser, so that clicking a link to a MathReader file (.nb) automatically launches MathReader. Alternatively, save the file on your hard disk by right-clicking a link to a Mathematica file and choosing "Save Target As" (download file type "all files" or similar, NOT "text file"). Once you have saved the Mathematica file on your disk, you can open it by double-clicking the file.

The Ordering Theorem

As an answer to the question "Can the selection of information measure influence the ordering of messages according to informativity?", a new theorem was found.

Rényi information

My book, The Mathematical Theory of Information, presents a measure of the information B gives about A, ent_alpha(B@A), derived from Rényi's generalization of Shannon's entropy. The range of alpha was given as 0 <= alpha < 1, and it was proved that ent_alpha(B@A) conforms to the Law of Diminishing Information. Then Jan Hajek pointed out, in a question, that the literature puts no upper limit on alpha. I guessed it could be proved that alpha > 1 would also be allowed. Hajek then came back with a paper, Renyi's disinformation, presenting the result that some values of alpha > 2 lead to a violation of the Law of Diminishing Information. To sort this out I had to purchase Mathematica, as my usual pencil-and-paper methods would have been too cumbersome. An experienced user will find my handling of the program lacking in elegance; I muddled through somehow, even when the program refused to recognize that 0^2 = 0, etc.
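The book's exact definition of ent_alpha(B@A) is not reproduced on this page, but the underlying Rényi entropy and its Shannon limit at alpha = 1 are standard. A minimal numerical sketch (function name illustrative, not from the book; entropies in nats):

```python
import math

def renyi_entropy(p, alpha):
    """Rényi entropy H_alpha(p) = log(sum_i p_i^alpha) / (1 - alpha), in nats.

    alpha = 1 is treated as the Shannon limit, where H_alpha reduces to
    the ordinary Shannon entropy -sum_i p_i log p_i.
    """
    if alpha == 1:
        return -sum(pi * math.log(pi) for pi in p if pi > 0)
    return math.log(sum(pi ** alpha for pi in p if pi > 0)) / (1 - alpha)

p = [0.5, 0.25, 0.25]
shannon = renyi_entropy(p, 1)
# As alpha -> 1, the Rényi entropy approaches the Shannon entropy.
for alpha in (0.9, 0.99, 0.999):
    print(alpha, renyi_entropy(p, alpha), "Shannon:", shannon)
```

This only illustrates the alpha = 1 limit mentioned above; it says nothing about the allowed range 0 <= alpha <= 2 for ent_alpha(B@A), which is established in the Rényi Rate Formula notebook itself.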
The paper Rényi Rate Formula (Mathematica/MathReader file, 620 kB) explores ent_alpha(B@A) and finds the allowed range to be 0 <= alpha <= 2 (where alpha = 1 produces Shannon's entropy as a limit). Hajek also points out in Renyi's disinformation that ent_alpha(B@A) is asymmetric: ent_alpha(B@A) may differ from ent_alpha(A@B). In the paper Symmetric Rényi (Mathematica/MathReader file, 440 kB), a symmetric measure of the information B gives about A, ent_alpha(B;A), is constructed; that is, ent_alpha(B;A) = ent_alpha(A;B). Here the allowed range was found to be 0 <= alpha <= 1 (where alpha = 1 again produces Shannon's entropy as a limit).

Contact: jankahre (at) hotmail.com
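The construction in Symmetric Rényi is not reproduced here, but the alpha = 1 (Shannon) case shows what symmetry means in practice: Shannon mutual information satisfies I(A;B) = I(B;A). A minimal sketch (function name hypothetical, joint distribution chosen for illustration):

```python
import math

def mutual_information(joint):
    """Shannon mutual information I(A;B) in nats, computed from a joint
    distribution given as a dict {(a, b): probability}."""
    pa, pb = {}, {}
    for (a, b), p in joint.items():
        pa[a] = pa.get(a, 0.0) + p  # marginal of A
        pb[b] = pb.get(b, 0.0) + p  # marginal of B
    return sum(p * math.log(p / (pa[a] * pb[b]))
               for (a, b), p in joint.items() if p > 0)

# A correlated joint distribution over two binary variables.
joint = {(0, 0): 0.4, (0, 1): 0.1, (1, 0): 0.1, (1, 1): 0.4}
# Swapping the roles of A and B leaves the value unchanged.
swapped = {(b, a): p for (a, b), p in joint.items()}
print(mutual_information(joint), mutual_information(swapped))
```

For general alpha, ent_alpha(B@A) need not have this swap invariance, which is exactly the asymmetry Hajek pointed out and the Symmetric Rényi notebook repairs.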