The Mathematical Theory of Information:
Reviews and Comments
A book by Hans Christian von Baeyer, Information: The New Language of Science, dedicates a whole chapter to The Mathematical Theory of Information, providing an accurate description of this theory. An excerpt: "In five hundred engaging, erudite and aggressively iconoclastic pages Kåhre achieves two principal goals: to demonstrate convincingly that Shannon's is only one of an infinite number of ways of measuring information, and to propose a fundamental law, a basic principle to which all legitimate information measures must conform."
Reviews by
- Raymond W. Yeung, Professor, Department of Information Engineering, The Chinese University of Hong Kong, author of A First Course in Information Theory.
- Hans Christian von Baeyer, Chancellor Professor of Physics, College of William and Mary in Virginia, author of Warmth Disperses and Time Passes - A History of Heat.
- B. Roy Frieden, Professor of Optical Sciences, University of Arizona, author of Physics from Fisher Information.
- Jan Hajek, The Netherlands, pioneer of automated verification/validation of communication/networking protocols.
- Jarl-Thure Eriksson, Professor of Electrical Engineering and President, Tampere University of Technology, Finland.
- Cristian S. Calude, The University of Auckland, author of Information and Randomness: An Algorithmic Perspective.
- Gregory Chaitin, IBM Watson Research Center in New York, author of The Limits of Mathematics, The Unknowable, Exploring Randomness and Conversations with a Mathematician.
- Mikhail Simkin, Research Engineer, UCLA Electrical Engineering Department.
- Edward Albee, author of Who's Afraid of Virginia Woolf?
- Emanuel H. Knill, Computer and Computational Sciences Division, Los Alamos National Laboratory.
It reads like a novel and allows the reader to appreciate a wide range of
intimately related subjects in a nutshell.
--- Raymond W. Yeung, Professor, Department of Information Engineering, The Chinese University of Hong Kong, author of A First Course in Information Theory.
In The Mathematical Theory of Information Jan Kåhre
presents a bold new approach to classical information theory. With
profound erudition, refreshing iconoclasm, and an appealing sense of
humor he leaps over half a century's received wisdom and starts
afresh. From common-sense observations such as "information is ABOUT something", "transmission tends to degrade information", and "noise can be beneficial", he
constructs a rigorous framework that is general enough to encompass a
great variety of conventional results in different fields, yet
discriminating enough to prune the accumulation of dead wood. His
most lasting accomplishment is the demonstration that Claude Shannon's
measure of information, which has dominated the subject since 1948, is
by no means the only candidate, and not always the most appropriate
one. This remarkable book is sure to stretch the horizon and inspire
the imagination of every physicist, computer scientist, mathematician,
engineer, and philosopher interested in the theory and practice of
information science.
--- Hans Christian von Baeyer, Chancellor Professor of Physics,
College of William and Mary in Virginia, author of Warmth
Disperses and Time Passes - A History of Heat.
The Mathematical Theory of Information offers a new
look at what is by now a well-worked field, that of information
theory. This field has been dominated to such an extent by the Shannon
form that many people are not aware that (i) it is not the first: the
Fisher variety of information long preceded it, and (ii) it is not
generally the best to use: many other forms of information exist and
are extensively used to solve real problems. The author develops a
new theory of information from a set of useful postulates, and takes
the reader on a trip through a realm of real-world applications. This
includes control theory, deterministic dynamics, information physics
and quantum mechanics. The approach is refreshing, iconoclastic and,
above all, interesting. I highly recommend it.
--- B. Roy Frieden, Professor of Optical Sciences, University of
Arizona, author of Physics from Fisher Information.
HITS STATT BITS, i.e. HITS OVER BITS or HITS BEFORE BITS
Unity in diversity of information measures is the main overall effect Jan Kåhre has succeeded in achieving in his unique book The Mathematical Theory of Information (MTI). A grand unification theory is what physicists would like to achieve, and the infoticians (a new word coined here) have now got one in MTI. While R.A. Fisher and R.V.L. Hartley (who in the 1920s introduced the first infomeasures, both logarithmic) may be called the Keplers of the classical infotheories (watch the plural), C.E. Shannon (1948) and R.A. Fisher (1922) can be called the Newtons of ITs (plural again). Jan Kåhre is likely to deserve to be called the Grand Unifier of Infotheories (note the small the, because never say never :-). Clearly this GUI is our guy, if only because MTI is bound to be a success: it has no competition in its class and fills a huge gap in its market segment.
The single and seemingly simple, though not simplistic, unifying element of MTI is Kåhre's Law of Diminishing Information (LDI), which expresses the fundamentally asymmetrical nature of information transfers. This sharply contrasts with Shannon's classical Mathematical Theory of Communication (MTC), which is merely a special (however important) case of the much more general MTI. Common technical sense tells us that the amount of info provided by, say, a set of symptoms @bout a set of diseases should not be the same as the info provided by a set of diseases @bout a set of symptoms. Such asymmetry is natural and obvious, and yet, unlike MTI, the classical MTC puts an equality sign between both amounts of info.
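To make the asymmetry argument concrete, here is a minimal numerical sketch (my own construction, not taken from MTI; the disease/symptom distribution is invented purely for illustration). It checks two Shannon-theoretic facts: mutual information is symmetric, I(X;Y) = I(Y;X), which is exactly the equality sign objected to above, and the data-processing inequality I(X;Z) <= I(X;Y) holds for a Markov chain X -> Y -> Z, Shannon's own special case of diminishing information.

```python
import numpy as np

def mutual_information(p_xy: np.ndarray) -> float:
    """Shannon mutual information, in bits, of a joint distribution p(x, y)."""
    p_x = p_xy.sum(axis=1, keepdims=True)   # marginal p(x)
    p_y = p_xy.sum(axis=0, keepdims=True)   # marginal p(y)
    mask = p_xy > 0
    return float((p_xy[mask] * np.log2(p_xy[mask] / (p_x * p_y)[mask])).sum())

# Invented joint distribution p(disease, symptom), illustration only.
p_ds = np.array([[0.30, 0.10],
                 [0.05, 0.55]])

# Symmetry: symptoms @bout diseases equals diseases @bout symptoms.
assert np.isclose(mutual_information(p_ds), mutual_information(p_ds.T))

# Degrade the symptom reading through a noisy channel p(z | symptom)...
noise = np.array([[0.8, 0.2],
                  [0.2, 0.8]])
p_dz = p_ds @ noise   # joint p(disease, z) for the chain disease -> symptom -> z
# ...and the information @bout the disease can only diminish.
assert mutual_information(p_dz) <= mutual_information(p_ds) + 1e-12
print(f"I(D;Y) = {mutual_information(p_ds):.3f} bits >= I(D;Z) = {mutual_information(p_dz):.3f} bits")
```

Kåhre's LDI promotes inequalities of this kind to the defining requirement that every legitimate information measure must satisfy; the symmetric Shannon measure then becomes one admissible instance rather than the definition of information.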
Having said all that, I want to add a few words of mild criticism of MTI, if only to make all I have said above more plausible. Like MTC, MTI does not tell us how to obtain the estimates of probabilities which are needed for the computations of infos. MTI semi-implicitly assumes that probabilities can be obtained simply by counting occurrences of the N-tuples of events in data. Alas, the amount of data required for good estimates grows exponentially with N; or, put the other way around, for the usually limited amount of data available (e.g. in medical applications), the quality of the estimates degrades exponentially fast with growing N, i.e. with the length of the data vectors, e.g. the number of symptoms employed for diagnosing a (wo)man or a (wo)machine. During the last years of my R&D I designed, semi-automatically tested and compared hundreds of estimators before I succeeded in inventing a few magic formulae. I say this mainly to underline the practical necessity for, and the importance of, better probability estimators in general, and of vector estimators in particular, for applications like pattern classification, identification, diagnosing, speech recognition, etc. The same GIGO rule holds for infotheories as for infotech: Garbage In, Garbage Out. In other words: the better the estimates, the better the information.
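A tiny simulation (mine, not one of the estimators alluded to above) shows why naive counting turns to garbage: with M samples over N binary symptoms the joint contingency table has 2^N cells, so the expected count per cell, M/2^N, shrinks exponentially with N and most joint probabilities get estimated as exactly zero.

```python
import numpy as np

rng = np.random.default_rng(0)
M = 1000  # a "usually limited" data set, e.g. one thousand patients

for N in (2, 5, 10, 20):
    # M patients, each described by N binary symptoms.
    data = rng.integers(0, 2, size=(M, N))
    # Count how many distinct N-tuples were ever observed.
    _, counts = np.unique(data, axis=0, return_counts=True)
    print(f"N={N:2d}: {2**N:7d} cells, {len(counts):4d} observed, "
          f"expected count per cell = {M / 2**N:.4f}")
```

At N = 20 there are over a million cells for a thousand patients, so nearly every joint probability is estimated as 0/M: Garbage In, Garbage Out.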
I had to say this explicitly in order to expose the pons asinorum between the theory and practice of (M)TI and IT, which you can find expressed succinctly in my Anglo-Saxon slogan "HITS STATT BITS", lost in a footnote in chapter 1 of MTI. Saxony is in Germany, and STATT means "in place of" or "rather than" in German. Jan Kåhre has translated my slogan as "HITS BEFORE BITS", which is not bad, as it expresses the temporal priority of hitting before bitting. My translation is "HITS OVER BITS", expressing the general supremacy of hitting over bitting (non-native English readers will kindly check their Websters :-).
Life is short and info endless. Even MTI had to be cut short, and so must this review.
--- Jan Hajek, The Netherlands, has pioneered automated verification/validation of communication/networking protocols (nowadays blooming as model checking; the current leading light is the freshly ACM-awarded Dr. Gerard Holzmann of Bell Labs, where the transistor was invented and Claude Shannon worked).
P.S.:
In 1977 my program APPROVER - the first automated and proven protocol verifier - found design bugs in the model of TCP = Transmission Control Protocol (then called TCProgram), which is the (hidden) workhorse of the Internet. The TCP modeling was done by Dr. Carl A. Sunshine, then with the Cold War think-tank Rand Corp. and working for ARPA (later DARPA = Defense Advanced Research Projects Agency of the DoD, aka the Pentagon, which so kindly sponsored high-risk projects like the then ARPANET = the mother of the Internet). Carl Sunshine was an ex-Stanford Ph.D. student of Prof. Vinton Cerf (now chair of ICANN = the "steering committee of the Internet"), who co-designed TCP with Dr. Robert Kahn of ARPA in the mid-1970s.

At that time I started to experiment with infotheories as possible heuristics for finding bugs quickly, before a combinatorial explosion would hit hard. However, it turned out that "The logic of events is always stronger than the logic of intentions", as the greatest mass murderer of all times, Iosif Vissarionovich Dzhugashvili aka Stalin, used to say, and it took many years before the Evil Empires (note the plural) were subdued, with the aid of the Internet, which still is a powerful freedom-fighting machine of the Free West. Thereafter it took ten more years before I could help Jan Kåhre with some chapters of his amazing MTI. However, all typos in MTI are his and not mine, also in my 2 final pages of MTI :-)

Not all people of the world want to be free and/or united, but information wants to be. Unity in diversity is the theme of Jan Kåhre's MTI, and should he ever be knighted for IT, he should adopt "E Unum Pluribus" as the motto for his coat of arms. By telling you that the IMPs (= interface message processors), i.e. the early front-ends of ARPANET, were called Pluribus (recall I/O-bus), and that E Pluribus Unum is the motto on the Great Seal of the US of A, also found on all one-dollar bills, we have come full circle. InfoSemantics has never been simple and will surely withstand any MTI.
There is an increasing need to bring a better understanding of the concept of information to science education. Telecommunications, along with the whole IT sector, underlines the need for a standardized approach to information processing and compression. During the last few decades, attempts to apply the methods of the exact sciences to economics, the social sciences and cognition have led to new paradigmatic approaches such as game theory, chaos theory and complex systems. These theories deal to a large extent with dynamic processes, kept in concert by information exchange.
The Mathematical Theory of Information by Jan Kåhre provides comprehensive coverage of all the essential aspects of information. The presentation has many dimensions. The introductions and discussions are made highly enjoyable by references to historical events and philosophical argumentation. Among the references, which cover the forefront publications of the last two decades, one can find both hard theory and popular presentations. Throughout the book the mathematical treatment is rigorous, and probably not completely accessible to a broader audience. However, this choice is necessary, considering that the aim is to inspire further development rather than to review an interesting scientific field.
The book will appeal to a wide cross-section of professionals in the fields of science and technology. Especially, it fills a gap as a persuasive textbook for a new generation of scientists and research workers. The book serves as an introduction to several hot subjects at the graduate level, such as computer science, signal processing, automation, telecommunications, knowledge management and, the hottest of them all, bioinformatics.
At its best the book will inspire the devoted reader to deepen the theory in certain parts, and to consolidate those parts that support, for instance, a better understanding of global economics.
--- Jarl-Thure Eriksson, Professor of Electrical Engineering and President, Tampere University of Technology, Finland
I am impressed with your book, especially with the law of diminishing information. I read only chapters 1 and 8 and browsed the others. It makes a lot of sense to try to incorporate your axiom into AIT [Algorithmic Information Theory], in one way or another.
--- Cristian S. Calude, The University of Auckland, author of Information and Randomness: An Algorithmic Perspective.
Looks like a labor of love! Has the right smell!
--- Gregory Chaitin, IBM Watson Research Center in New York, author of The Limits of Mathematics, The Unknowable, Exploring Randomness and Conversations with a Mathematician.
Your approach to information is interesting and intellectually stimulating.
--- Mikhail Simkin, Research Engineer, UCLA Electrical Engineering Department.
On being quoted in The Mathematical Theory of Information: "It's nice to finally start getting known!"
--- Edward Albee, author of Who's Afraid of Virginia Woolf?
Review MR1938240 at MathSciNet (American Mathematical Society). Access to the complete review requires a subscription to Mathematical Reviews. Excerpt:
The Mathematical Theory of Information is a philosophical as well as a mathematical treatment of information, in particular measures of information. ... of interest to those who enjoy discourses on a variety of topics related to the idea of information, such as knowledge, cybernetics and causality.
--- Emanuel H. Knill, Computer and Computational Sciences Division, Los Alamos National Laboratory.
Contact: jankahre (at) hotmail.com