Given the tremendous advances that have been made in our understanding of the mechanics of neural systems, there has been remarkably little progress in understanding how they process information. Here it is proposed that a major obstacle has been confusion about the concepts of information and probability. It is suggested that the correct definition of probabilities is strictly Bayesian, in which probabilities are always entirely conditional on information. It is further proposed that to best understand the brain, we should use Bayesian principles to describe what the brain knows about its world. Although such a “first-person” Bayesian approach has recently become prominent, its success has so far been almost entirely restricted to accounts of phenomena such as perception and cognition. The present work demonstrates how the Bayesian approach can be grounded in biophysics. Boltzmann’s distribution from statistical mechanics is used to derive probability distributions that are conditional entirely on the information held within single molecular sensors. By integrating information from a multitude of sensors within its membrane voltage, a neuron thereby reduces its uncertainty about the state of its world. A major virtue of this integrated view of information and biophysics is that it allows us to identify a single and general computational goal for the function of the nervous system, which is to minimize its uncertainty (about the biological goals of the animal). This computational goal has recently served as the basis for a general theory of information processing within the nervous system (Fiorillo, 2008).
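The core idea of the abstract can be sketched numerically. The following is an illustrative toy model, not taken from the chapter itself: a two-state molecular sensor whose “open” probability follows Boltzmann’s distribution, together with a Bayesian update showing how pooling many such sensors reduces uncertainty (Shannon entropy) about a binary world state. All parameter values and function names here are hypothetical.

```python
import math

def p_open(delta_e_kt: float) -> float:
    """Boltzmann probability of the 'open' state of a two-state sensor,
    given the energy difference (open minus closed) in units of kT."""
    return 1.0 / (1.0 + math.exp(delta_e_kt))

def entropy_bits(p: float) -> float:
    """Shannon entropy (in bits) of a Bernoulli(p) distribution."""
    if p <= 0.0 or p >= 1.0:
        return 0.0
    return -(p * math.log2(p) + (1.0 - p) * math.log2(1.0 - p))

def posterior_s1(observations, p_open_if_s1, p_open_if_s0, prior=0.5):
    """Posterior P(s = 1 | sensor states), treating the sensors as
    conditionally independent given the binary world state s."""
    like1, like0 = prior, 1.0 - prior
    for is_open in observations:
        like1 *= p_open_if_s1 if is_open else (1.0 - p_open_if_s1)
        like0 *= p_open_if_s0 if is_open else (1.0 - p_open_if_s0)
    return like1 / (like1 + like0)

# Hypothetical energies: ligand binding lowers the open-state energy,
# so the sensor is more likely open when the stimulus is present.
p1 = p_open(-1.0)   # P(open) given stimulus present  (~0.73)
p0 = p_open(+1.0)   # P(open) given stimulus absent   (~0.27)

one_sensor  = posterior_s1([True], p1, p0)
ten_sensors = posterior_s1([True] * 10, p1, p0)

# Integrating more sensors leaves less residual uncertainty.
assert entropy_bits(ten_sensors) < entropy_bits(one_sensor)
```

Each individual sensor is weakly informative (its state is probabilistic, conditional only on the local energies it “knows”), but multiplying the likelihoods of many sensors, as a membrane voltage might implicitly do, drives the posterior entropy toward zero.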
- A New Approach to the Information in Neural Systems
Christopher D. Fiorillo
- Springer Berlin Heidelberg