
2010 | Book

Mathematical Foundations of Neuroscience


Table of Contents

Frontmatter
Chapter 1. The Hodgkin–Huxley Equations
Abstract
All living cells have an electrical voltage, or potential difference, between their inside and outside. Since the cell’s membrane is what separates the inside from the outside, this potential difference is referred to as the membrane potential. In mathematical terms, the membrane potential $V_{\mathrm{M}}$ is defined as
$$V_{\mathrm{M}} = V_{\mathrm{in}} - V_{\mathrm{out}},$$
where $V_{\mathrm{in}}$ is the potential on the inside of the cell and $V_{\mathrm{out}}$ is the potential on the outside. This will change during an action potential, for example.
G. Bard Ermentrout, David H. Terman
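The abstract above only defines the membrane potential; the chapter goes on to develop the full Hodgkin–Huxley model of how $V_{\mathrm{M}}$ evolves during an action potential. As a rough illustration (a sketch, not code from the book), the following integrates the classical squid-axon equations with forward Euler; the parameter set is the standard one, while the applied current and step size are arbitrary choices.

```python
# Minimal forward-Euler simulation of the classical Hodgkin-Huxley squid-axon
# model (standard parameter set; voltages in mV, time in ms).  Illustrative
# only: the applied current I_app and the step size dt are arbitrary choices.
import numpy as np

C, g_Na, g_K, g_L = 1.0, 120.0, 36.0, 0.3        # uF/cm^2, mS/cm^2
E_Na, E_K, E_L = 50.0, -77.0, -54.4              # reversal potentials (mV)

def alpha_m(V): return 0.1 * (V + 40.0) / (1.0 - np.exp(-(V + 40.0) / 10.0))
def beta_m(V):  return 4.0 * np.exp(-(V + 65.0) / 18.0)
def alpha_h(V): return 0.07 * np.exp(-(V + 65.0) / 20.0)
def beta_h(V):  return 1.0 / (1.0 + np.exp(-(V + 35.0) / 10.0))
def alpha_n(V): return 0.01 * (V + 55.0) / (1.0 - np.exp(-(V + 55.0) / 10.0))
def beta_n(V):  return 0.125 * np.exp(-(V + 65.0) / 80.0)

dt, T, I_app = 0.01, 50.0, 10.0                  # ms, ms, uA/cm^2
V, m, h, n = -65.0, 0.05, 0.6, 0.32              # initial conditions near rest
trace = []
for _ in range(int(T / dt)):
    I_ion = (g_Na * m**3 * h * (V - E_Na)
             + g_K * n**4 * (V - E_K)
             + g_L * (V - E_L))
    V += dt * (I_app - I_ion) / C                # membrane equation: C dV/dt = I_app - I_ion
    m += dt * (alpha_m(V) * (1 - m) - beta_m(V) * m)
    h += dt * (alpha_h(V) * (1 - h) - beta_h(V) * h)
    n += dt * (alpha_n(V) * (1 - n) - beta_n(V) * n)
    trace.append(V)

print(f"peak membrane potential: {max(trace):.1f} mV")   # spikes overshoot toward +40 mV
```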
Chapter 2. Dendrites
Abstract
We now present mathematical theories for describing dendrites. Dendrites are very important for many reasons. Indeed, the majority of the total membrane area of many neurons is occupied by the dendritic tree. Dendrites enable neurons to connect to thousands of other cells, far more than would be possible with a soma alone, because the dendritic tree provides a large membrane area on which to make connections. Dendrites direct many subthreshold postsynaptic potentials toward the soma, which summates these inputs and determines whether the neuron will fire an action potential. In addition to the treelike structure of dendrites, many dendrites have additional fine structures at the ends of the branches called spines. During development, animals that are raised in rich sensory environments have more extensive dendritic trees and more spines.
G. Bard Ermentrout, David H. Terman
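The mathematical theory referred to here is cable theory, which the chapter develops. Purely as an illustration (not the authors' formulation or code), the sketch below discretizes a passive cable into compartments with sealed ends and injects a steady current at one end; every number in it is an invented example value.

```python
# Rough sketch of a passive (linear) cable discretized into N compartments:
#     tau * dV/dt = lambda^2 * d^2V/dx^2 - V + R_m * I(x, t)
# Sealed-end (zero-flux) boundary conditions; steady injection at one end.
# All numbers are arbitrary illustrative choices, not values from the book.
import numpy as np

N, dx = 100, 0.01                 # compartments, spatial step (cm)
tau, lam, R_m = 10.0, 0.1, 1.0    # ms, cm (space constant), arbitrary units
dt, T = 0.01, 100.0               # ms

V = np.zeros(N)
I = np.zeros(N); I[0] = 1.0       # steady injection at the proximal end
for _ in range(int(T / dt)):
    # second spatial difference with sealed ends
    Vpad = np.concatenate(([V[0]], V, [V[-1]]))
    d2V = (Vpad[2:] - 2 * Vpad[1:-1] + Vpad[:-2]) / dx**2
    V += dt / tau * (-V + lam**2 * d2V + R_m * I)

# the steady-state voltage falls off roughly like exp(-x / lambda) along the cable
print(f"V(0) = {V[0]:.3f}, V(L/2) = {V[N // 2]:.3g}, V(L) = {V[-1]:.3g}")
```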
Chapter 3. Dynamics
Abstract
Dynamical systems theory provides a powerful tool for analyzing nonlinear systems of differential equations, including those that arise in neuroscience. This theory allows us to interpret solutions geometrically as curves in a phase space. By studying the geometric structure of phase space, we are often able to classify the types of solutions that a model may exhibit and determine how solutions depend on the model’s parameters. For example, we can often predict if a model neuron will generate an action potential, determine for which values of the parameters the model will produce oscillations, and compute how the frequency of oscillations depends on the parameters.
G. Bard Ermentrout, David H. Terman
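As a concrete illustration of this kind of phase-plane reasoning, the sketch below uses the two-variable FitzHugh–Nagumo equations purely as a stand-in model (the chapter works with other examples as well): it locates the rest state at the intersection of the nullclines and tests its stability from the Jacobian, one way to predict whether increasing the injected current leads to oscillations. The model choice and parameter values are illustrative assumptions.

```python
# Phase-plane sketch for a generic two-variable excitable model, using the
# FitzHugh-Nagumo equations as a stand-in example:
#     dv/dt = v - v^3/3 - w + I
#     dw/dt = eps * (v + a - b * w)
import numpy as np

a, b, eps = 0.7, 0.8, 0.08

def fixed_point(I):
    # Intersection of the nullclines w = v - v^3/3 + I and w = (v + a) / b,
    # i.e., the real root of the resulting cubic in v.
    roots = np.roots([-1.0 / 3.0, 0.0, 1.0 - 1.0 / b, I - a / b])
    r = roots[np.argmin(np.abs(roots.imag))]
    v = float(r.real)
    return v, (v + a) / b

def is_stable(v):
    # Jacobian of the vector field at the fixed point; stable if trace < 0 and det > 0
    J = np.array([[1.0 - v**2, -1.0], [eps, -eps * b]])
    return np.trace(J) < 0 and np.linalg.det(J) > 0

for I in (0.0, 0.5, 1.0):
    v, w = fixed_point(I)
    print(f"I = {I:.1f}: rest state v = {v:+.2f}, stable = {is_stable(v)}")
# Loss of stability as I grows signals the onset of repetitive firing
# (a Hopf bifurcation in this reduced model).
```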
Chapter 4. The Variety of Channels
Abstract
We have discussed several types of active (voltage-gated) channels for specific neuron models. The Hodgkin–Huxley model for the squid axon consisted of three different ion channels: a passive leak, a transient sodium channel, and the delayed rectifier potassium channel. Similarly, the Morris–Lecar model has a delayed rectifier and a simple calcium channel (with no dynamics). Hodgkin and Huxley were smart and supremely lucky that they used the squid axon as a model to analyze the action potential, as it turns out that most neurons have dozens of different ion channels. In this chapter, we briefly describe a number of them, provide some instances of their formulas, and describe how they influence a cell’s firing properties. The reader who is interested in finding out about other channels and other models for the channels described here should consult http://senselab.med.yale.edu/modeldb/default.asp, which is a database for neural models.
G. Bard Ermentrout, David H. Terman
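Many of the channels surveyed in this chapter can be written in a common "steady state and time constant" gating form. The sketch below shows that generic form; the half-activation voltage, slope, time constant, exponent, and conductance are made-up numbers, not a fit to any particular channel in the book.

```python
# Generic voltage-gated current in the standard gating form
#     I = g_max * x^p * (V - E_rev),   dx/dt = (x_inf(V) - x) / tau_x(V),
# with a Boltzmann steady-state function x_inf.  All parameter values are
# invented for illustration, not a fit to any particular channel.
import numpy as np

def x_inf(V, V_half=-30.0, k=8.0):
    return 1.0 / (1.0 + np.exp(-(V - V_half) / k))

def tau_x(V, tau_max=5.0, V_half=-30.0, sigma=20.0):
    return tau_max * np.exp(-((V - V_half) / sigma) ** 2) + 0.5   # ms

def channel_current(V, x, g_max=1.0, E_rev=-90.0, p=4):
    return g_max * x**p * (V - E_rev)

# Relax the gate toward its steady state at a clamped voltage.
V, x, dt = -20.0, 0.0, 0.05
for _ in range(2000):
    x += dt * (x_inf(V) - x) / tau_x(V)

print(f"gate x -> {x:.3f}, current = {channel_current(V, x):.2f} (arbitrary units)")
```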
Chapter 5. Bursting Oscillations
Abstract
Many neurons exhibit much more complicated firing patterns than simple repetitive firing. A common mode of firing in many neurons and other excitable cells is bursting oscillations. This is characterized by a silent phase of near-steady-state resting behavior alternating with an active phase of rapid, spikelike oscillations. Examples of bursting behavior are shown in Fig. 5.1. Note that bursting arises in neuronal structures throughout the central nervous system. Bursting activity in certain thalamic cells, for example, is implicated in the generation of sleep rhythms, whereas patients with parkinsonian tremor exhibit increased bursting activity in neurons within the basal ganglia. Cells involved in the generation of respiratory rhythms within the pre-Bötzinger complex also display bursting oscillations.
G. Bard Ermentrout, David H. Terman
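To make the alternation between a silent phase and an active phase concrete, the sketch below simulates a well-known bursting model, the Hindmarsh–Rose equations, used here purely as a stand-in (the chapter builds its analysis around other bursting models); the parameters are commonly quoted values for its bursting regime.

```python
# Sketch of bursting in a fast-slow model.  The Hindmarsh-Rose equations serve
# only as a convenient stand-in burster; forward Euler integration.
#     x' = y - x^3 + 3x^2 - z + I     (fast, voltage-like)
#     y' = 1 - 5x^2 - y               (fast recovery)
#     z' = r * (4 * (x + 1.6) - z)    (slow adaptation)
import numpy as np

r, I, dt, T = 0.001, 2.0, 0.02, 2000.0
x, y, z = -1.6, 0.0, 2.0
xs, zs = [], []
for _ in range(int(T / dt)):
    dx = y - x**3 + 3 * x**2 - z + I
    dy = 1 - 5 * x**2 - y
    dz = r * (4 * (x + 1.6) - z)
    x, y, z = x + dt * dx, y + dt * dy, z + dt * dz
    xs.append(x); zs.append(z)

xs, zs = np.array(xs), np.array(zs)
spikes = int(np.sum((xs[1:] >= 1.0) & (xs[:-1] < 1.0)))   # upward crossings of x = 1
print(f"{spikes} spikes; slow variable z drifts over [{zs.min():.2f}, {zs.max():.2f}]")
# Plotting x(t) shows clusters of spikes (active phases) separated by quiescent
# stretches, paced by the slow rise and fall of z.
```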
Chapter 6. Propagating Action Potentials
Abstract
Neurons need to communicate over long distances. This is accomplished by electrical signals, or action potentials, that propagate along the axon. We have seen that linear cables cannot transmit information very far; neural signals are able to reach long distances because there exist voltage-gated channels in the cell membrane. The combination of ions diffusing along the axon together with the nonlinear flow of ions across the membrane allows for the existence of an action potential that propagates along the axon with a constant shape and velocity.
G. Bard Ermentrout, David H. Terman
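In outline (with generic notation, which need not match the chapter's exact symbols), the propagating pulse is found by a traveling-wave reduction. The active cable is
$$C_{\mathrm{m}}\,\frac{\partial V}{\partial t} = \frac{a}{2R_{\mathrm{i}}}\,\frac{\partial^{2} V}{\partial x^{2}} - I_{\mathrm{ion}}(V,w), \qquad \frac{\partial w}{\partial t} = g(V,w),$$
where $w$ collects the gating variables. Seeking a solution of constant shape moving with speed $c$, substitute $\xi = x - ct$, $V(x,t) = U(\xi)$, $w(x,t) = W(\xi)$, which gives
$$\frac{a}{2R_{\mathrm{i}}}\,U'' + c\,C_{\mathrm{m}}\,U' - I_{\mathrm{ion}}(U,W) = 0, \qquad -c\,W' = g(U,W).$$
The partial differential equation thus collapses to a system of ordinary differential equations in $\xi$, and the wave speed $c$ is singled out as the value for which this system has a solution decaying to the resting state as $\xi \to \pm\infty$.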
Chapter 7. Synaptic Channels
Abstract
So far, we have restricted our modeling and analysis efforts to single neurons. To begin to develop networks and the theoretical background for networks, we need to introduce an additional class of membrane channels. We have already looked at voltage- and ion-gated channels. However, there are many other channels on the surface of nerve cells which respond to various substances. Among the most important of these, at least in computational neuroscience, are synaptic channels.
G. Bard Ermentrout, David H. Terman
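As an illustration of what a synaptic channel model can look like, the sketch below uses a simple first-order kinetic scheme in which a gating variable opens while transmitter is present and closes otherwise; the rates, conductance, reversal potential, and presynaptic spike train are all invented example values.

```python
# Sketch of a simple synaptic conductance with first-order kinetics:
#     ds/dt = alpha * T(V_pre) * (1 - s) - beta * s,
#     I_syn = g_syn * s * (V_post - E_syn).
# All parameter values are arbitrary illustrative choices.
import numpy as np

alpha, beta = 1.0, 0.2          # 1/ms : opening / closing rates
g_syn, E_syn = 0.5, 0.0         # mS/cm^2, mV (E_syn = 0 -> excitatory)

def T(V_pre, V_thresh=0.0, K=2.0):
    # transmitter concentration as a sigmoidal function of presynaptic voltage
    return 1.0 / (1.0 + np.exp(-(V_pre - V_thresh) / K))

dt, V_post, s, s_peak = 0.05, -65.0, 0.0, 0.0
for step in range(4000):                             # 200 ms of simulated time
    t = step * dt
    V_pre = 20.0 if (t % 50.0) < 1.0 else -65.0      # 1-ms presynaptic "spikes" every 50 ms
    s += dt * (alpha * T(V_pre) * (1.0 - s) - beta * s)
    s_peak = max(s_peak, s)

print(f"peak s = {s_peak:.2f}; peak synaptic current = "
      f"{g_syn * s_peak * (V_post - E_syn):.2f} uA/cm^2 at V_post = {V_post} mV")
```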
Chapter 8. Neural Oscillators: Weak Coupling
Abstract
This chapter begins the second part of the book. By now, we hope that the reader has a thorough knowledge of single-cell dynamics and is ready to move on to networks. There are two main approaches to the analysis and modeling of networks of neurons. In one approach, the details of the action potentials (spikes) matter a great deal. In the second approach, we do not care about the timing of individual spikes; rather, we are concerned only with the firing rates of populations. This division is reflected in the sometimes acrimonious battles between those who believe that actual spike times matter and those who believe that the rates are all that the brain cares about. On these issues, we have our own opinions, but for the sake of the reader, we will remain agnostic and try to present both sorts of models.
G. Bard Ermentrout, David H. Terman
Chapter 9. Neuronal Networks: Fast/Slow Analysis
Abstract
In this chapter, we consider a very different approach to studying networks of neurons from that presented in Chap. 8. In Chap. 8, we assumed each cell is an intrinsic oscillator, the coupling is weak, and the details of the spikes are not important. By assuming weak coupling, we were able to exploit powerful analytic techniques such as the phase response curve and the method of averaging. In this chapter, we do not, in general, assume weak coupling or that the cells are intrinsic oscillators. The main mathematical tool used in this chapter is geometric singular perturbation theory. Here, we assume the model has multiple timescales so that we can dissect the full system of equations into fast and slow subsystems. This allows us to reduce the complexity of the full model to a lower-dimensional system of equations. We have, in fact, introduced this approach in earlier chapters when we discussed bursting oscillations and certain aspects of the Morris–Lecar model.
G. Bard Ermentrout, David H. Terman
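Schematically, and in generic notation rather than any particular model's variables, the decomposition assumes a singularly perturbed system
$$\frac{dx}{dt} = f(x,y), \qquad \frac{dy}{dt} = \epsilon\, g(x,y), \qquad 0 < \epsilon \ll 1,$$
where $x$ denotes the fast variables and $y$ the slow ones. Setting $\epsilon = 0$ gives the fast subsystem, in which $y$ is frozen and acts as a parameter; rescaling time by $\tau = \epsilon t$ and then setting $\epsilon = 0$ gives the slow subsystem
$$f(x,y) = 0, \qquad \frac{dy}{d\tau} = g(x,y),$$
which evolves along the manifold of equilibria of the fast subsystem. Trajectories of the full model are then assembled by concatenating pieces of these two limiting systems.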
Chapter 10. Noise
Abstract
Neurons live in a noisy environment; that is, they are subjected to many sources of noise. For example, we treat ion channels deterministically, but in reality, the opening and closing of channels is a probabilistic event. Similarly, there is spontaneous release of neurotransmitter, which leads to random bombardment of small depolarizations and hyperpolarizations. In vivo, there is increasing evidence that cortical neurons live in a high-conductance state due to the asynchronous firing of the cells which are presynaptic to them. Noise in neural and other excitable systems has been the subject of research since the early 1960s. There are a number of good books and reviews about the subject. We single out the extensive review [179] and the books [274, 169].
G. Bard Ermentrout, David H. Terman
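As one minimal illustration of noise in a neuron model (an assumption made for the example, not necessarily one of the specific models treated in the chapter), the sketch below drives a leaky integrate-and-fire cell with white-noise current and integrates it with the Euler–Maruyama scheme; all parameter values are invented.

```python
# Leaky integrate-and-fire neuron driven by white noise, Euler-Maruyama scheme:
#     tau dV = (-(V - V_rest) + R_I) dt + sigma dW
# Parameter values (and the diffusive-noise description itself) are
# illustrative choices.
import numpy as np

rng = np.random.default_rng(0)
tau, V_rest, V_thresh, V_reset = 10.0, -65.0, -50.0, -65.0   # ms, mV
R_I, sigma = 14.0, 5.0          # mean drive (mV) and noise amplitude (mV * ms^(1/2))
dt, T = 0.1, 10_000.0           # ms

V, spike_times = V_rest, []
for step in range(int(T / dt)):
    dW = rng.normal(0.0, np.sqrt(dt))
    V += (-(V - V_rest) + R_I) * dt / tau + sigma * dW / tau
    if V >= V_thresh:           # threshold crossing: record a spike and reset
        spike_times.append(step * dt)
        V = V_reset

rate = 1000.0 * len(spike_times) / T    # spikes per second
print(f"{len(spike_times)} spikes in {T:.0f} ms  ->  {rate:.1f} Hz")
```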
Chapter 11. Firing Rate Models
Abstract
One of the most common ways to model large networks of neurons is to use a simplification called a firing rate model. Rather than track the spiking of every neuron, one tracks the averaged behavior of the spike rates of groups of neurons within the circuit. These models are also called population models since they can represent whole populations of neurons rather than single cells. In this book, we will call them rate models, although their physical meaning may not be the actual firing rate of a neuron; in general, there will be some invertible relationship between the firing rate of the neuron and the variable at hand. We derive the individual model equation in several different ways: some of the derivations are rigorous and directly related to a biophysical model, while others are ad hoc. After deriving the rate models, we apply them to a number of interesting phenomena, including working memory, hallucinations, binocular rivalry, optical illusions, and traveling waves. We also describe a number of theorems about asymptotic states as well as some of the now classical work on attractor networks.
G. Bard Ermentrout, David H. Terman
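As a minimal sketch of what a rate model looks like, the code below integrates a two-population excitatory–inhibitory network of the Wilson–Cowan type; the weights, gains, and inputs are illustrative guesses rather than values from the chapter.

```python
# Two-population excitatory-inhibitory rate model of the Wilson-Cowan type:
#     tau_e dE/dt = -E + S(w_ee*E - w_ei*I + P)
#     tau_i dI/dt = -I + S(w_ie*E - w_ii*I + Q)
# with a sigmoidal gain S.  All numbers are invented illustrative choices.
import numpy as np

def S(x, a, theta):
    return 1.0 / (1.0 + np.exp(-a * (x - theta)))

w_ee, w_ei, w_ie, w_ii = 16.0, 12.0, 15.0, 3.0     # connection weights
a_e, th_e, a_i, th_i = 1.3, 4.0, 2.0, 3.7          # gains and thresholds
P, Q, tau_e, tau_i = 1.25, 0.0, 1.0, 1.0           # external drive, time constants
dt, T = 0.01, 200.0

E, I = 0.1, 0.05
trace = []
for _ in range(int(T / dt)):
    dE = (-E + S(w_ee * E - w_ei * I + P, a_e, th_e)) / tau_e
    dI = (-I + S(w_ie * E - w_ii * I + Q, a_i, th_i)) / tau_i
    E, I = E + dt * dE, I + dt * dI
    trace.append(E)

late = np.array(trace[len(trace) // 2:])           # discard the initial transient
print(f"E stays within [{late.min():.2f}, {late.max():.2f}] after transients")
# Whether the network settles to a steady rate or to a limit cycle (an
# oscillation) depends on the weights and the drives P, Q.
```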
Chapter 12. Spatially Distributed Networks
Abstract
In the previous chapters, we focused generally on single neurons, small populations of neurons, and the occasional array of neurons. With the advent of multielectrode recording, intrinsic imaging, calcium imaging, and even functional magnetic resonance imaging, it is becoming possible to explore spatiotemporal patterns of neural activity. This leads to a wealth of interesting fodder for the mathematically inclined, and it is the goal of this chapter to provide some examples of this type of analysis. In Chap. 6, we looked at the propagation of action potentials down an axon; this is modeled as a partial differential equation. By looking for traveling waves, we were able to reduce the equations to a set of ordinary differential equations. When neurons are coupled together with chemical synapses, the natural form of coupling is not through partial derivatives with respect to space, but rather through nonlocal spatial interactions such as integral equations. In Sect. 8.4, we also looked at such models under the assumption that there is a single spike wave, much like an action potential. As with the partial differential equations, it is possible to look for specific forms of a solution (such as traveling waves or stationary patterns), but the resulting simplified equations do not reduce to ordinary differential equations. Thus, new techniques must be developed for solving these equations and (if desired) proving the existence and stability of the solutions.
G. Bard Ermentrout, David H. Terman
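To make the nonlocal coupling concrete, the sketch below discretizes a one-dimensional neural field whose interaction is an integral (convolution) over space on a ring; the kernel, gain, and drive are assumptions made for the example, not the chapter's specific model.

```python
# Sketch of nonlocal (integral) coupling in a one-dimensional neural field,
#     du/dt (x, t) = -u(x, t) + integral of w(x - y) f(u(y, t)) dy + h,
# discretized on a ring.  The difference-of-Gaussians kernel, the gain, and
# the drive h are arbitrary illustrative choices.
import numpy as np

N, L = 256, 2 * np.pi
x = np.linspace(0.0, L, N, endpoint=False)
dx = L / N

def kernel(d, a_e=1.0, s_e=0.5, a_i=0.5, s_i=1.5):
    d = np.minimum(d, L - d)                      # distance measured around the ring
    return a_e * np.exp(-(d / s_e) ** 2) - a_i * np.exp(-(d / s_i) ** 2)

W = kernel(np.abs(x[:, None] - x[None, :])) * dx  # matrix approximating the integral operator

def f(u, gain=5.0, theta=0.2):
    return 1.0 / (1.0 + np.exp(-gain * (u - theta)))

rng = np.random.default_rng(1)
u, h, dt = 0.05 * rng.standard_normal(N), 0.1, 0.05
for _ in range(4000):
    u += dt * (-u + W @ f(u) + h)                 # forward Euler step of the field equation

print(f"u ranges over [{u.min():.2f}, {u.max():.2f}] across space")
# Depending on the kernel, gain, and drive, the activity profile stays nearly
# uniform or develops spatial structure (bumps or periodic patterns); such
# stationary and traveling solutions are what the chapter analyzes.
```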
Backmatter
Metadata
Title
Mathematical Foundations of Neuroscience
Authors
G. Bard Ermentrout
David H. Terman
Copyright Year
2010
Publisher
Springer New York
Electronic ISBN
978-0-387-87708-2
Print ISBN
978-0-387-87707-5
DOI
https://doi.org/10.1007/978-0-387-87708-2
