Journal of Logic, Language and Information 12:
© 2003 Kluwer Academic Publishers.
Abstract. We compare the elementary theories of
Shannon information and Kolmogorov complexity,
the extent to which they have a common purpose, and
where they are fundamentally different.
We discuss and relate the basic notions of both
theories: Shannon entropy, Kolmogorov complexity,
Shannon mutual information and Kolmogorov
(“algorithmic”) mutual information. We show
how universal coding may be viewed as a middle ground
between the two theories. We consider
Shannon’s rate distortion theory, which quantifies
useful (in a certain sense) information. We use
the communication of information as our guiding motif,
and we explain how it relates to sequential question–answer sessions.
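As a concrete illustration of the two notions compared above (not part of the paper itself), the following sketch contrasts Shannon entropy, which bounds the achievable compressed length per symbol for a known source, with a compressor-based upper bound on Kolmogorov complexity: K(x) is uncomputable, but the output length of any real compressor such as zlib upper-bounds it up to an additive constant. The source parameter p and the use of zlib are illustrative assumptions.

```python
import math
import random
import zlib

def shannon_entropy(p):
    """Binary entropy H(p) in bits per symbol for a Bernoulli(p) source."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

# Shannon's source coding theorem: no lossless code can beat
# H(p) bits/symbol on average for this source.
p = 0.1
print(f"H({p}) = {shannon_entropy(p):.4f} bits/symbol")

# Kolmogorov complexity K(x) is uncomputable, but the length of
# any compressed encoding of x is an upper bound on K(x) + O(1).
random.seed(0)
n = 100_000
x = bytes(1 if random.random() < p else 0 for _ in range(n))
compressed_bits = 8 * len(zlib.compress(x, 9))
print(f"zlib upper bound: {compressed_bits / n:.4f} bits/symbol")
```

For a typical sample, the per-symbol zlib bound lands above the entropy H(0.1) ≈ 0.47, illustrating that a practical compressor witnesses an upper bound on complexity while the entropy gives the statistical limit.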
Key words: Algorithmic information theory, data
compression, Kolmogorov complexity, mutual
information, prefix codes, rate distortion theory,
Shannon information theory, universal codes.