Evidence, information, and surprise

Abstract

A numerical measure for “evidence” is defined in a probabilistic framework. The established mathematical concept of information or entropy (as defined in ergodic theory) can be obtained from this definition in a special case, although in general information is greater than evidence. In another, somewhat complementary, special case, a numerical measure for “surprise” is derived from the definition of evidence. Some applications of the new concept of evidence are discussed, concerning statistics in general and the special kind of statistics performed by neurophysiologists when they analyze the “response” of neurons, and perhaps by the neurons themselves.
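
The abstract appeals to the standard information-theoretic quantities: the surprisal of an outcome x with probability p(x) is -log p(x), and the Shannon entropy is its expectation over all outcomes. As background only (the paper's own definitions of “evidence” and “surprise” are given in the full text and are not reproduced here), a minimal Python sketch of these standard quantities:

import math

def surprisal(p: float) -> float:
    """Surprisal (in bits) of a single outcome occurring with probability p."""
    return -math.log2(p)

def entropy(dist) -> float:
    """Shannon entropy (in bits) of a discrete probability distribution,
    i.e. the expected surprisal over all outcomes."""
    return sum(p * surprisal(p) for p in dist if p > 0)

if __name__ == "__main__":
    dist = [0.5, 0.25, 0.25]      # toy distribution over three outcomes
    print(entropy(dist))          # 1.5 bits
    print(surprisal(0.25))        # 2.0 bits: rarer outcomes are more surprising

For the toy distribution (0.5, 0.25, 0.25) the entropy is 1.5 bits, and each of the two rarer outcomes carries a surprisal of 2 bits.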

Cite this article

Palm, G. Evidence, information, and surprise. Biol. Cybern. 42, 57–68 (1981). https://doi.org/10.1007/BF00335160
