
1993 | Book

Selected Works of A. N. Kolmogorov

Volume III: Information Theory and the Theory of Algorithms

Edited by: A. N. Shiryayev

Publisher: Springer Netherlands

Book series: Mathematics and Its Applications


Table of Contents

Frontmatter

Papers by A. N. Kolmogorov

1. On the Notion of Algorithm
Abstract
We start from the following intuitive considerations about algorithms:
1)
An algorithm Γ applied to any “condition” (“initial state”) A from some set G(Γ) (“domain of applicability” of the algorithm Γ) gives a “solution” (“concluding state”) B.
 
2)
The algorithmic process may be subdivided into separate steps of a priori bounded complexity; each step consists of an “immediate processing” of the state S (that occurs at this step) into the state S* = Ω_Γ(S).
 
3)
The processing of A_0 = A into A_1 = Ω_Γ(A_0), A_1 into A_2 = Ω_Γ(A_1), A_2 into A_3 = Ω_Γ(A_2), etc., continues until either a nonresultative stop occurs (if the operator Ω_Γ is not defined for the state that has just appeared) or until the signal saying that the “solution” has been obtained occurs. It is not excluded that the process will continue indefinitely (if the signal for the solution’s appearance never occurs).
 
4)
Immediate processing of S into S* = Ω_Γ(S) is carried out only on the basis of information on the form of an a priori limited “active part” of the state S and involves only this active part.
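Taken together, points 1)–4) describe a stepwise interpreter. The following is a minimal sketch in Python; the function names, the use of None to model an undefined operator, and the toy halving example are illustrative assumptions, not Kolmogorov's formalism:

```python
def run_algorithm(omega, is_solution, state, max_steps=10_000):
    """Iterate the immediate-processing operator omega (S -> S*) until
    the solution signal fires, a non-resultative stop occurs (omega
    undefined, modeled here as returning None), or the step budget runs
    out (the process may in principle continue indefinitely)."""
    for _ in range(max_steps):
        if is_solution(state):
            return state              # concluding state B
        state = omega(state)          # one step of bounded complexity
        if state is None:             # non-resultative stop
            return None
    raise RuntimeError("no solution signal within the step budget")

# toy instance: halve a number until an odd number appears
result = run_algorithm(
    omega=lambda n: n // 2,
    is_solution=lambda n: n % 2 == 1,
    state=48,
)
print(result)  # 3
```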
 
A. N. Shiryayev
2. On the General Definition of the Quantity of Information
Abstract
The definitions (2) and (4) and the properties I–IV of the expression I(ξ,η) presented below were given in A. N. Kolmogorov’s report at the meeting on probability theory and mathematical statistics (Leningrad, June 1954). Theorems 4 and 5 on the semicontinuity of I(ξ,η) under weak convergence of distributions were found by I. M. Gelfand and A. M. Yaglom. A. N. Kolmogorov then proposed the final version of this article, which stresses that the passage from the finite case to the general one, as well as the computation and estimation of the amount of information by passing to the limit, become essentially trivial when the exposition is carried out in terms of normed Boolean algebras. It would not be difficult, of course, to formulate more general principles of passage to the limit: not as n → ∞, but with respect to some partial ordering.
I. M. Gelfand, A. M. Yaglom
3. The Theory of Transmission of Information
Abstract
Speaking for the second time to the General Assembly of our Academy, I should like to begin by remarking that the topic of today’s communication is intimately connected with that of my report “Statistical Theory of Oscillations with Continuous Spectrum”, which I delivered at the General Assembly of the Academy in 1947.¹
A. N. Shiryayev
4. Amount of Information and Entropy for Continuous Distributions
Abstract
Suppose A is a certain trial which may have n different outcomes A_1, A_2, ..., A_n with probabilities P(A_1), P(A_2), ..., P(A_n); let B be another trial which has outcomes B_1, B_2, ..., B_m with probabilities P(B_1), P(B_2), ..., P(B_m). If by P(A_iB_k) we denote the probability of the simultaneous occurrence of the outcome A_i of the trial A and the outcome B_k of the trial B, then, according to Shannon [1], the mean amount of information contained in the outcomes of the trial A with respect to the outcomes of the trial B is equal to
$$I\left( {A,B} \right) = \sum\limits_{i,k} {P\left( {{A_i}{B_k}} \right)\log \frac{{P\left( {{A_i}{B_k}} \right)}}{{P\left( {{A_i}} \right)P\left( {{B_k}} \right)}}} $$
(1.1)
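Formula (1.1) can be evaluated directly from a finite joint distribution. A small sketch (the function name and the dict representation of P(A_iB_k) are my own choices for illustration):

```python
import math

def mutual_information(joint):
    """Shannon's mean amount of information I(A,B), formula (1.1),
    from a joint distribution given as a dict {(i, k): P(A_i B_k)};
    logarithm base 2, result in bits."""
    p_a, p_b = {}, {}
    for (i, k), p in joint.items():          # marginals P(A_i), P(B_k)
        p_a[i] = p_a.get(i, 0.0) + p
        p_b[k] = p_b.get(k, 0.0) + p
    return sum(p * math.log2(p / (p_a[i] * p_b[k]))
               for (i, k), p in joint.items() if p > 0)

# independent trials carry zero information about each other
indep = {(i, k): 0.25 for i in range(2) for k in range(2)}
print(mutual_information(indep))  # 0.0
```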
I. M. Gelfand, A. M. Yaglom
5. New Metric Invariant of Transitive Dynamical Systems and Automorphisms of Lebesgue Spaces
Abstract
It is well known that a considerable part of the metric theory of dynamical systems may be developed as an abstract theory of “flows” {S t} on “Lebesgue spaces” M with measure μ in terms invariant with respect to “isomorphisms modulo zero” (see V.A. Rokhlin’s survey [1], whose definitions and notations are used extensively here).
A. N. Shiryayev
6. To the Definition of Algorithms
Abstract
By an algorithm one usually means a system of computations that, for a certain class of mathematical problems, given a text
V. A. Uspenski
7. ε-Entropy and ε-Capacity of Sets In Functional Spaces
Abstract
The article is mainly devoted to the systematic exposition of results that were published in the years 1954–1958 by K. I. Babenko [1], A. G. Vitushkin [2,3], V. D. Yerokhin [4], A. N. Kolmogorov [5,6] and V. M. Tikhomirov [7]. It is natural that when these materials were systematically rewritten, several new theorems were proved and certain examples were computed in more detail. This review also incorporates results not published previously which go beyond the framework of such a systematization, and belong to V. I. Arnold (§6) and V. M. Tikhomirov (§§4,7 and §8).
V. M. Tikhomirov
8. Various Approaches to Estimating the Complexity of Approximate Representation and Calculation of Functions
Abstract
In order to avoid technical complications, we shall consider only real functions
$$y = f\left( x \right)$$
defined in the interval
$$\Delta = \{ x: - 1 \le x \le + 1\} .$$
A. N. Shiryayev
9. On Tables of Random Numbers
Abstract
The editors of Semiotika i Informatika have felt it appropriate to publish a Russian translation of my article, which reflects a certain stage of my attempts to give meaning to the frequency interpretation of probability due to R. von Mises. The article was written during my stay in India in 1962 and published by the Indian journal Sankhyā, a very authoritative periodical in mathematical statistics in English-speaking countries, but little known here in the USSR. The reader familiar with the work of von Mises will perhaps notice the wider definition of the algorithm A used for selecting samples. But the main difference from von Mises is the strictly finite character of the whole approach and the introduction of a quantitative estimate of the stability of frequencies. The problem of bringing together estimates (4.3) and (4.4), despite its elementary character, apparently still awaits its solution. Further stages of my search in this direction are reflected in two notes published in 1965 and 1969 in the journal Problemy Peredachy Informatsii. In Semiotika i Informatika, see V. V. Vyugin’s survey (1981, issue 16).
A. N. Shiryayev
10. Three Approaches to the Definition of the Notion of Amount of Information
Abstract
Suppose the variable x can assume values belonging to a finite set X which consists of N elements. We then say that the “entropy” of the variable x is equal to
$$H(x) = {\log _2}N$$
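This combinatorial ("Hartley") entropy depends only on the size of the set of possible values, not on any probabilities. A one-line illustration (the set contents are a toy example of my own):

```python
import math

# A variable x ranging over a finite set X of N elements has
# combinatorial entropy H = log2 N bits.
X = {"00", "01", "10", "11"}   # N = 4 possible values
H = math.log2(len(X))
print(H)  # 2.0
```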
A. N. Shiryayev
11. On the Realization of Networks in Three-Dimensional Space
Abstract
By a (d, n)-network we shall mean an oriented graph with n numbered vertices α_1, α_2, ..., α_n and dn marked edges such that precisely d edges are incident to each vertex, one of them marked by the weight x_1, another by the weight x_2, etc., and finally the last one by the weight x_d.
Ya. M. Barzdin
12. To the Logical Foundations of the Theory of Information and Probability Theory
Abstract
We shall mainly be concerned with the basic notions of information theory. Our starting point will be the notion of conditional entropy of the object x for a given object y, H(x|y), which can be interpreted as the amount of information necessary to determine the object x in the situation when the object y is already known. Denoting by Ø the “necessarily known object”, we obtain the unconditional entropy
$$H(x|\phi ) = H(x)$$
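The conditional entropy H(x|y) can be illustrated, very loosely, by a compression proxy: the extra compressed length needed to describe x once y is known. This is only a crude upper-bound heuristic of my own (using zlib), not Kolmogorov's definition itself:

```python
import zlib

def approx_cond_entropy(x: bytes, y: bytes = b"") -> int:
    """Crude stand-in for H(x|y): extra compressed bytes needed to
    describe x when y is already known. For y = b"" this plays the
    role of the unconditional entropy H(x) = H(x | necessarily known
    object)."""
    return len(zlib.compress(y + x)) - len(zlib.compress(y))

x = b"the quick brown fox " * 100
print(approx_cond_entropy(x) > 0)  # True: describing x costs some bits
```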
A. N. Shiryayev
13. The Combinatorial Foundations of Information Theory and the Probability Calculus
Abstract
I should like to begin with some considerations which go beyond the framework of the main topic of my report. Mathematics formalized according to Hilbert is nothing other than the theory of operations on schemes of special form consisting of a finite number of symbols disposed in a certain order and related by various connections. For example, according to N. Bourbaki’s approach, the entire theory of sets studies only expressions consisting of the signs
$$\tau ,V,\neg , = , \in , \supset $$
and “letters” which are related by “links” [diagram 1] as for example in the expression called.
A. N. Shiryayev

Comments and addenda

On Works in Information Theory and some of Its Applications
Abstract
The appearance of information theory as an independent science is related to the ideas of cybernetics. In its most general form, cybernetics studies control processes, and a rational system of control requires defining information and providing for its acquisition, storage, transmission along channels of communication, and processing. It is therefore natural that the cycle of my works on information theory at the end of the 50s and in the 60s was largely influenced by the publications of Norbert Wiener and Claude Shannon (1948).¹
A. N. Kolmogorov
Information Theory
Abstract
There is one trait that A. N. Kolmogorov’s cycle of papers on information theory has in common with any really important research work: after a certain amount of time goes by, their main ideas come to be regarded by experts as truisms, though the situation was hardly the same in the period when these papers were being created. One should remember the atmosphere of the 50s, when A. N. Kolmogorov became interested in questions of information theory. This was a time when, under the pressure of the approaching advent of computers, uncritical distrust of the ideas of cybernetics in our country was suddenly replaced by just as uncritical a glorification of the subject, and, as a result, many superficial and speculative ideas came to the forefront. This noisy eulogy in turn generated a natural negative reaction.
R. L. Dobrushin
Algorithmic Information Theory
Abstract
In this commentary we list the main results obtained by A. N. Kolmogorov and his pupils and followers in the domain of algorithmic information theory.
A. Kh. Shen
ε-Entropy and ε-Capacity
Abstract
The topic of the given commentary is directly related to the papers [1–5]. However, it is natural to consider a somewhat wider cycle in which we shall include [6–10]. The backbone bringing together the entire cycle of articles [1–10] is the notion of entropy.
V. M. Tikhomirov
Tables of Random Numbers
Abstract
From the point of view of classical probability theory, the occurrence of a given sequence of zeros and ones as the result of n tosses of a symmetric coin has probability 2⁻ⁿ. Nevertheless, the appearance of certain of these sequences (for example, the one consisting only of zeros, or the sequence 010101...) seems rather strange to us and leads us to suspect that something is wrong, while the appearance of others provokes no such reaction. In other words, some sequences seem less “random” than others, so one may try to classify finite sequences of zeros and ones according to their “degree of randomness”. This classification is the main topic of the article that we are commenting upon here.
A. Sh. Shen
Realization of Networks in 3-Dimensional Space
Abstract
In the original article one may read the following footnote, which was written by Andrei Nikolayevich himself:
“Several years ago A. N. Kolmogorov proved Theorem 1* for networks of bounded branching and gave the following first approximation to Theorem 2: there exists a network with n > 1 elements any realization of which has diameter greater than \(C\sqrt n /\log n\), where C is a certain constant independent of n. The final versions of all these theorems (and the very idea of raising the question of the property of “almost all” networks) belong to Ya. M. Barzdin.”
Ya. M. Barzdin
Ergodic Theory
Abstract
This paper by A. N. Kolmogorov played an outstanding role in the development of ergodic theory. First of all, it contained the solution of a well-known problem that had stood for more than 25 years, a success achieved by bringing into ergodic theory entirely new ideas and methods coming from information theory. Secondly, after this paper appeared, a previously unknown and extremely fruitful direction in ergodic theory opened up, a direction rich in deep intrinsic problems and in the variety of its applications to the analysis of specific dynamical systems. The appearance of the ideas of ergodic theory in physics, and the progressively wider use of these ideas in the analysis of systems with stochastic behavior, strange attractors, etc., doubtless has as one of its sources precisely the paper by A. N. Kolmogorov under consideration.
Ya. G. Sinai
Kolmogorov’s Algorithms or Machines
Abstract
At the beginning of 1951, V. A. Uspensky, then a fifth-year student of the mechanics and mathematics department of Moscow State University, was given by his scientific advisor A. N. Kolmogorov a page of typewritten text describing a certain abstract machine. The states of this machine were one-dimensional topological complexes or, more precisely, complexes carrying certain additional information: the vertices of a complex were supplied with functions taking values in a preliminarily chosen finite set, so that the complex turned out to be labeled (by the values of this function). The formulation proposed by A. N. Kolmogorov had the following two goals:
1)
to give a mathematical definition of the notion of algorithm that would be as general as possible (so general that it would directly include all algorithms);
 
2)
to verify that even this more general definition does not lead to an extension of the class of computable functions which was already clearly understood by this time on the basis of previous narrower definitions (due to Turing, Post, Markov, and others) and which, if one limits oneself to functions, coincides with the class of partially recursive functions.
 
V. A. Uspenski, A. L. Semyonov
Backmatter
Metadata
Title
Selected Works of A. N. Kolmogorov
Edited by
A. N. Shiryayev
Copyright year
1993
Publisher
Springer Netherlands
Electronic ISBN
978-94-017-2973-4
Print ISBN
978-90-481-8456-9
DOI
https://doi.org/10.1007/978-94-017-2973-4