
About this Book

This book is devoted to an analysis of general weakly connected neural networks (WCNNs) that can be written in the form
$$ \dot X_i = f_i \left( {X_i } \right) + \varepsilon g_i \left( {X_1 , \ldots ,X_n } \right),\quad i = 1, \ldots ,n. $$
(0.1)
Here, each X_i ∈ ℝ^m is a vector that summarizes all physiological attributes of the ith neuron, n is the number of neurons, f_i describes the dynamics of the ith neuron, and g_i describes the interactions between neurons. The small parameter ε indicates the strength of connections between the neurons. Weakly connected systems have attracted much attention since the second half of the seventeenth century, when Christiaan Huygens noticed that a pair of pendulum clocks synchronize when they are attached to a lightweight beam instead of a wall. The pair of clocks is among the first weakly connected systems to have been studied. Systems of the form (0.1) arise in formal perturbation theories developed by Poincaré, Liapunov, and Malkin, and in averaging theories developed by Bogoliubov and Mitropolsky.
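As a concrete illustration, a system of the form (0.1) can be integrated numerically. The sketch below uses hypothetical choices not taken from the book: scalar neurons (m = 1) with bistable dynamics f(x) = x − x³ and diffusive coupling g_i(X) = Σ_j (X_j − X_i).

```python
import numpy as np

def simulate_wcnn(f, g, X0, eps, dt=0.01, steps=1000):
    """Euler integration of the weakly connected form (0.1):
    dX_i/dt = f(X_i) + eps * g_i(X_1, ..., X_n)."""
    X = np.array(X0, dtype=float)
    for _ in range(steps):
        X = X + dt * (f(X) + eps * g(X))
    return X

# Hypothetical single-neuron dynamics: bistable f(x) = x - x^3,
# with diffusive coupling g_i(X) = sum_j (X_j - X_i).
f = lambda X: X - X**3
g = lambda X: X.sum() - X.size * X

# Uncoupled (eps = 0), each neuron settles at the equilibrium x = +1 or
# x = -1 nearest its initial condition.
rest = simulate_wcnn(f, g, X0=[0.9, -0.8, 1.1], eps=0.0)
```

With a small ε > 0 the trajectories shift only by O(ε), which is exactly the regime the book's perturbation methods exploit.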

Table of Contents

Frontmatter

Introduction

Frontmatter

1. Introduction

Abstract
In this chapter we give definitions and explanations of basic neurophysiological terminology that we use in the book. We do not intend to provide a comprehensive background on various topics.
Frank C. Hoppensteadt, Eugene M. Izhikevich

2. Bifurcations in Neuron Dynamics

Abstract
Bifurcations play important roles in analysis of dynamical systems; they express qualitative changes in behavior. A typical example of a bifurcation in neuron dynamics is the transition from rest to periodic spiking activity. Generation of a single spike is a bifurcation phenomenon too: It can result when the rest potential is close to a bifurcation point, which is often referred to as its threshold value. We discuss these and similar issues in this chapter.
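The idea that the rest potential near its threshold value is close to a bifurcation point can be illustrated with the quadratic model dV/dt = I + V², a standard normal form for the saddle-node bifurcation (an illustrative choice, not a model from this chapter):

```python
import numpy as np

def equilibria(I):
    """Equilibria of dV/dt = I + V^2: V = -sqrt(-I) (rest, stable, since
    d/dV (I + V^2) = 2V < 0 there) and V = +sqrt(-I) (threshold, unstable)
    when I <= 0; no equilibria when I > 0."""
    if I > 0:
        return []                # no rest state: the model fires repeatedly
    r = float(np.sqrt(-I))
    return sorted({-r, r})

below = equilibria(-1.0)   # → [-1.0, 1.0]: rest and threshold coexist
above = equilibria(1.0)    # → []: the saddle-node bifurcation has occurred
```

Crossing I = 0 the rest and threshold equilibria coalesce and disappear, which is the transition from rest to periodic spiking described above.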
Frank C. Hoppensteadt, Eugene M. Izhikevich

3. Neural Networks

Abstract
Consider a neuron with its membrane potential near a threshold value. When an external input drives the potential to the threshold, the neuron’s activity experiences a bifurcation: The equilibrium corresponding to the rest potential loses stability or disappears, and the neuron fires. This bifurcation is local, but it results in a nonlocal event — the generation of an action potential, or spike. These are observable global phenomena. To model them requires knowledge about global features of neuron dynamics, which usually is not available. However, to predict the onset of such phenomena, one needs only local information about the behavior of the neuron near the rest potential. Thus, one can obtain some global information about behavior of a system by performing local analysis. Our nonhyperbolic neural network approach uses this observation.
Frank C. Hoppensteadt, Eugene M. Izhikevich

4. Introduction to Canonical Models

Abstract
Theoretical studies of the human brain require mathematical models. Unfortunately, mathematical analysis of brain models is of limited value, since the results can depend on particulars of an underlying model; that is, various models of the same brain structure could produce different results. This could discourage biologists from using mathematics and/or mathematicians. A reasonable way to circumvent this problem is to derive results that are largely independent of the model and that can be observed in a broad class of models. For example, if one modifies a model by adding more parameters and variables, similar results should hold.
Frank C. Hoppensteadt, Eugene M. Izhikevich

Derivation of Canonical Models

Frontmatter

5. Local Analysis of WCNNs

Abstract
The local activity of a weakly connected neural network (WCNN) is described by the dynamical system
$$ \dot X_i = F_i \left( {X_i ,\lambda } \right) + \varepsilon G_i \left( {X_1 , \ldots ,X_n ,\lambda ,\rho ,\varepsilon } \right),\quad i = 1, \ldots ,n, $$
(5.1)
where X_i ∈ ℝ^m, λ ∈ Λ, and ρ ∈ R, near an equilibrium point. First we use the Hartman-Grobman theorem to show that the network’s local activity is not interesting from the neurocomputational point of view unless the equilibrium corresponds to a bifurcation point. In biological terms such neurons are said to be near a threshold. Then we use the center manifold theorem to prove the fundamental theorem of WCNN theory, which says that neurons should be close to thresholds in order to participate nontrivially in brain dynamics. Using center manifold reduction we can substantially reduce the dimension of the network. After that, we apply suitable changes of variables, rescaling, or averaging to simplify the WCNN further. All these reductions yield simple dynamical systems that are canonical models according to the definition given in Chapter 4. The models depend on the type of bifurcation encountered. In this chapter we derive canonical models for multiple saddle-node, cusp, pitchfork, Andronov-Hopf, and Bogdanov-Takens bifurcations. Other bifurcations and critical regimes are considered in subsequent chapters, and we analyze canonical models in the third part of the book.
Frank C. Hoppensteadt, Eugene M. Izhikevich

6. Local Analysis of Singularly Perturbed WCNNs

Abstract
In this chapter we study the local dynamics of singularly perturbed weakly connected neural networks of the form
$$ \left\{ {\begin{array}{*{20}{l}} {\mu X_i ^\prime = F_i \left( {X_i ,Y_i ,\lambda ,\mu } \right) + \varepsilon P_i \left( {X,Y,\lambda ,\rho ,\mu ,\varepsilon } \right)} \\ {Y_i ^\prime = G_i \left( {X_i ,Y_i ,\lambda ,\mu } \right) + \varepsilon Q_i \left( {X,Y,\lambda ,\rho ,\mu ,\varepsilon } \right)} \\ \end{array} } \right.\quad \varepsilon ,\mu \ll 1, $$
(6.1)
where X_i ∈ ℝ^k and Y_i ∈ ℝ^m are fast and slow variables, respectively; τ is a slow time; and ′ = d/dτ. The parameters ε and μ are small, representing the strength of synaptic connections and the ratio of time scales, respectively. The parameters λ ∈ Λ and ρ ∈ R have the same meaning as in the previous chapter: They represent a multidimensional bifurcation parameter and external input from sensor organs, respectively. As before, we assume that all functions F_i and G_i, which represent the dynamics of each neuron, and all P_i and Q_i, which represent connections between the neurons, are as smooth as necessary for our computations.
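The role of μ can be seen in a toy scalar instance of the form (6.1) with ε = 0 (the right-hand sides below are hypothetical, chosen only so the behavior is easy to check): the fast variable relaxes onto a slow manifold and then tracks the slow variable.

```python
def fast_slow(mu, x0=0.0, y0=1.0, dt=0.001, T=2.0):
    """Euler integration in slow time tau of a toy fast-slow pair:
    mu * x' = -(x - y)   (fast variable relaxes toward the slow one)
         y' = -y         (slow variable drifts on the tau time scale)."""
    x, y = x0, y0
    for _ in range(int(T / dt)):
        x += dt * (-(x - y) / mu)
        y += dt * (-y)
    return x, y

x, y = fast_slow(mu=0.01)
# After an O(mu) transient, x shadows the slow manifold x ≈ y.
```

The smaller μ is, the tighter x hugs the curve x = y; this separation of time scales is what the singular-perturbation analysis of the chapter formalizes.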
Frank C. Hoppensteadt, Eugene M. Izhikevich

7. Local Analysis of Weakly Connected Maps

Abstract
In previous chapters we studied dynamics of weakly connected networks governed by a system of ordinary differential equations. It is also feasible to consider weakly connected networks of difference equations, or mappings, of the form
$$ X_i \mapsto F_i \left( {X_i ,\lambda } \right) + \varepsilon G_i \left( {X_i ,\lambda ,\rho ,\varepsilon } \right),\quad i = 1, \ldots ,n,\quad \varepsilon \ll 1, $$
(7.1)
where the variables X_i ∈ ℝ^m, the parameters λ ∈ Λ, ρ ∈ R, and the functions F_i and G_i have the same meaning as in previous chapters. The weakly connected mapping (7.1) can also be written in the form
$$ X_i^{k + 1} = F_i \left( {X_i^k ,\lambda } \right) + \varepsilon G_i \left( {X^k ,\lambda ,\rho ,\varepsilon } \right),\quad i = 1, \ldots ,n,\quad \varepsilon \ll 1, $$
where X k is the kth iteration of the variable X. In this chapter we use form (7.1) unless we explicitly specify otherwise.
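A weakly connected map of the form (7.1) is straightforward to iterate. The sketch below uses hypothetical ingredients not taken from the chapter: logistic local maps F(x) = a x(1 − x) with a = 2.0, whose fixed point x* = 0.5 is strongly attracting, and mean-field coupling G_i(X) = mean(X) − X_i.

```python
import numpy as np

def wcm_step(X, F, eps):
    """One iteration of the weakly connected map form (7.1), with
    mean-field coupling G_i(X) = mean(X) - X_i (a hypothetical choice)."""
    return F(X) + eps * (X.mean() - X)

# Hypothetical local dynamics: logistic map with a = 2.0.
a = 2.0
F = lambda X: a * X * (1.0 - X)

X = np.array([0.2, 0.5, 0.8])
for _ in range(100):
    X = wcm_step(X, F, eps=0.05)
# The uniform state X_i = 0.5 is a fixed point of both F and the coupling,
# so the weakly coupled iterates converge to it.
```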
Frank C. Hoppensteadt, Eugene M. Izhikevich

8. Saddle-Node on a Limit Cycle

Abstract
In this chapter we study a weakly connected network
$$ \dot X_i = F_i \left( {X_i ,\lambda } \right) + \varepsilon G_i \left( {X,\lambda ,\rho ,\varepsilon } \right),\quad X_i \in \mathbb{R}^m ,\quad i = 1, \ldots ,n, $$
of neurons each having a saddle-node bifurcation on a limit cycle. Such neurons are said to have Class 1 neural excitability. This bifurcation provides an important example of when local analysis at an equilibrium renders global information about the system behavior.
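The saddle-node on a limit cycle admits a well-known canonical phase description, the Ermentrout-Kopell "theta neuron" \( \dot \vartheta = (1 - \cos \vartheta) + (1 + \cos \vartheta) r \), in which r plays the role of input. A minimal sketch (parameter values hypothetical) shows the Class 1 behavior: no spikes below the bifurcation (r < 0), repetitive spiking above it (r > 0).

```python
import numpy as np

def theta_neuron(r, theta0=0.0, dt=0.001, steps=50000):
    """Integrate d(theta)/dt = (1 - cos theta) + (1 + cos theta) * r and
    count spikes, i.e. passages of theta through pi (mod 2*pi)."""
    theta, spikes = theta0, 0
    for _ in range(steps):
        dtheta = dt * ((1 - np.cos(theta)) + (1 + np.cos(theta)) * r)
        if (theta % (2 * np.pi)) < np.pi <= (theta % (2 * np.pi)) + dtheta:
            spikes += 1
        theta += dtheta
    return spikes

spiking = theta_neuron(0.1)    # r > 0: periodic spiking (several spikes)
quiet = theta_neuron(-0.1)     # r < 0: rest and threshold equilibria exist
```

For r > 0 the firing frequency grows like √r from zero, the hallmark of Class 1 excitability mentioned above.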
Frank C. Hoppensteadt, Eugene M. Izhikevich

9. Weakly Connected Oscillators

Abstract
In this chapter we study weakly connected networks
$$ \dot X_i = F_i \left( {X_i ,\lambda } \right) + \varepsilon G_i \left( {X,\lambda ,\rho ,\varepsilon } \right),\quad i = 1, \ldots ,n, $$
(9.1)
of oscillatory neurons. Our basic assumption is that there is a value of λ ∈ Λ such that every equation in the uncoupled system (ε = 0)
$$ \dot X_i = F_i \left( {X_i ,\lambda } \right),\quad X_i \in \mathbb{R}^m , $$
(9.2)
has a hyperbolic stable limit cycle attractor γ ⊂ ℝ^m. The activity on the limit cycle can be described in terms of its phase \( \theta_i \in \mathbb{S}^1 \) of oscillation
$$ \dot \theta _i = \Omega _i \left( \lambda \right), $$
where Ωi(λ) is the natural frequency of oscillations. The dynamics of the oscillatory weakly connected system (9.1) can also be described in terms of phase variables:
$$ \dot \theta _i = \Omega _i \left( \lambda \right) + \varepsilon g_i \left( {\theta ,\lambda ,\rho ,\varepsilon } \right),\quad i = 1, \ldots ,n. $$
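The phase description can be simulated directly. The sketch below uses a hypothetical sinusoidal interaction g_i(θ) = Σ_j sin(θ_j − θ_i), in the spirit of Kuramoto, for a pair of oscillators with slightly mismatched natural frequencies; since the mismatch 0.05 is smaller than 2ε = 0.2, the pair phase-locks with sin(θ_2 − θ_1) = 0.05 / (2ε) = 0.25.

```python
import numpy as np

def simulate_phases(omega, eps, dt=0.01, steps=10000):
    """Euler integration of the phase model
    d(theta_i)/dt = Omega_i + eps * sum_j sin(theta_j - theta_i),
    with a hypothetical sinusoidal g_i."""
    theta = np.zeros(len(omega))
    for _ in range(steps):
        # pairwise matrix [i, j] = sin(theta_j - theta_i), summed over j
        coupling = np.sin(theta[None, :] - theta[:, None]).sum(axis=1)
        theta = theta + dt * (np.asarray(omega) + eps * coupling)
    return theta

theta = simulate_phases(omega=[1.0, 1.05], eps=0.1)
phi = theta[1] - theta[0]   # locked phase difference, sin(phi) ≈ 0.25
```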
Frank C. Hoppensteadt, Eugene M. Izhikevich

Analysis of Canonical Models

Frontmatter

10. Multiple Andronov-Hopf Bifurcation

Abstract
A WCNN of the form
$$ \dot X_i = F_i \left( {X_i ,\lambda } \right) + \varepsilon G_i \left( {X,\lambda ,\rho ,\varepsilon } \right) $$
near a multiple Andronov-Hopf bifurcation point was shown (Theorem 5.8) to have a canonical model of the form
$$ z'_i = b_i z_i + d_i z_i \left| {z_i } \right|^2 + \sum\limits_{j \ne i}^n {c_{ij} z_j ,} $$
(10.1)
where ′ = d/dτ, τ is slow time, and b_i, c_ij, d_i, z_i ∈ ℂ. In this chapter we study general properties of this canonical model. In particular, we are interested in the stability of the origin z_1 = … = z_n = 0 and in the possibility of in-phase and anti-phase locking.
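The behavior of a single uncoupled unit of (10.1) (all c_ij = 0) is easy to check numerically; the parameter values below are hypothetical. When Re b > 0 > Re d the origin is unstable and the unit settles onto a limit cycle of radius √(Re b / −Re d), here √(1/1) = 1.

```python
def integrate_unit(b, d, z0, dt=0.002, steps=20000):
    """Euler integration of one uncoupled unit of (10.1), c_ij = 0:
    z' = b z + d z |z|^2, with complex b, d, z."""
    z = complex(z0)
    for _ in range(steps):
        z = z + dt * (b * z + d * z * abs(z) ** 2)
    return z

# Re b = 1 > 0 > Re d = -1: the amplitude |z| approaches 1, while Im b
# sets the rotation rate on the cycle.
z = integrate_unit(b=1 + 2j, d=-1 + 0j, z0=0.1 + 0.1j)
```

Reversing the sign of Re b makes the origin asymptotically stable, which is the stability question studied in this chapter.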
Frank C. Hoppensteadt, Eugene M. Izhikevich

11. Multiple Cusp Bifurcation

Abstract
In Section 5.3.3 we showed that any weakly connected neural network of the form
$$ \dot X_i = F_i \left( {X_i ,\lambda } \right) + \varepsilon G_i \left( {X,\lambda ,\rho ,\varepsilon } \right) $$
at a multiple cusp bifurcation is governed by the canonical model
$$ x'_i = r_i + b_i x_i + \sigma _i x_i^3 + \sum\limits_{j = 1}^n {c_{ij} x_j } ,\quad i = 1, \ldots ,n, $$
(11.1)
where ′ = d/dτ, τ is slow time, x_i, r_i, b_i, c_ij are real variables, and σ_i = ±1. In this chapter we study some neurocomputational properties of this canonical model. In particular, we use Hirsch’s theorem to prove that the canonical model can work as a globally asymptotically stable neural network (GAS-type NN) for certain choices of the parameters b_1, …, b_n. We use Cohen and Grossberg’s convergence theorem to show that the canonical model can also operate as a multiple attractor neural network (MA-type NN).
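The MA-type behavior can be illustrated numerically. The two-neuron parameter values below are hypothetical: with σ_i = −1, r_i = 0, b_i = 1, and symmetric excitatory coupling, the network has two symmetric attractors, the patterns (+, +) and (−, −) with |x_i| = √1.5, and an initial condition is pulled to the pattern it resembles.

```python
import numpy as np

def integrate_canonical(x0, r, b, sigma, C, dt=0.01, steps=5000):
    """Euler integration of the canonical model (11.1):
    x_i' = r_i + b_i x_i + sigma_i x_i^3 + sum_j c_ij x_j."""
    x = np.array(x0, dtype=float)
    for _ in range(steps):
        x = x + dt * (r + b * x + sigma * x**3 + C @ x)
    return x

# Hypothetical two-neuron MA-type example with sigma_i = -1 and
# symmetric excitatory coupling c_12 = c_21 = 0.5.
r = np.zeros(2)
b = np.ones(2)
sigma = -np.ones(2)
C = np.array([[0.0, 0.5], [0.5, 0.0]])

x = integrate_canonical([0.6, 0.4], r, b, sigma, C)
# Both components converge to +sqrt(1.5) ≈ 1.2247 (the (+, +) pattern).
```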
Frank C. Hoppensteadt, Eugene M. Izhikevich

12. Quasi-Static Bifurcations

Abstract
In this chapter we analyze the canonical models (6.17) and (6.24) for singularly perturbed WCNNs at quasi-static saddle-node and quasi-static cusp bifurcations. In particular, we consider the canonical models in the special case
$$ \alpha _1 = \ldots = \alpha _n = \alpha > 0. $$
Frank C. Hoppensteadt, Eugene M. Izhikevich

13. Synaptic Organizations of the Brain

Abstract
In this chapter we study the relationship between synaptic organizations and dynamical properties of networks of neural oscillators. In particular, we are interested in which synaptic organizations can memorize and reproduce phase information. Most of our results are obtained for neural oscillators near multiple Andronov-Hopf bifurcations.
Frank C. Hoppensteadt, Eugene M. Izhikevich

Backmatter
