Algorithmic information theory studies description complexity and randomness; it is now a well-established field of theoretical computer science and mathematical logic. Several textbooks and monographs are devoted to this theory (Calude, Information and Randomness: An Algorithmic Perspective, 2002; Downey and Hirschfeldt, Algorithmic Randomness and Complexity, 2010; Li and Vitányi, An Introduction to Kolmogorov Complexity and Its Applications, 2008; Nies, Computability and Randomness, 2009; Vereshchagin et al., Kolmogorov Complexity and Algorithmic Randomness, in Russian, 2013), where one can find detailed expositions of many difficult results as well as historical references. However, a short survey of its basic notions and of the main results relating these notions to each other seems to be missing. This chapter attempts to fill this gap and covers the basic notions of algorithmic information theory
: Kolmogorov complexity (plain, conditional, and prefix), Solomonoff's universal a priori probability, notions of randomness (Martin-Löf randomness, Mises–Church randomness), and effective Hausdorff dimension. We prove their basic properties (symmetry of information, the connection between a priori probability and prefix complexity, a complexity criterion for randomness, and a complexity characterization of effective dimension) and show some applications (the incompressibility method in computational complexity theory, and
incompleteness theorems). The chapter is based on lecture notes for a course given by the author at Uppsala University (Shen, Algorithmic Information Theory and Kolmogorov Complexity, Technical Report, 2000).
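For orientation, the complexity notions listed above can be sketched informally as follows; this is only a rough summary (the choice of the optimal machines $U$ and $V$ is left implicit), and the chapter's own definitions are authoritative:

```latex
% Plain complexity: the length of a shortest program p that makes a
% fixed optimal machine U print the string x.
C(x) = \min \{\, |p| : U(p) = x \,\}
% Conditional complexity: the string y is given to the machine as an
% additional input.
C(x \mid y) = \min \{\, |p| : U(p, y) = x \,\}
% Prefix complexity: the same minimization for an optimal machine V
% whose domain is prefix-free (no valid program is a prefix of another).
K(x) = \min \{\, |p| : V(p) = x \,\}
```

The prefix-free requirement on $V$ is what connects $K$ to the universal a priori probability discussed later in the chapter.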