
2019 | Book

Models of Neurons and Perceptrons: Selected Problems and Challenges


About this Book

This book describes models of the neuron and multilayer neural structures, with a particular focus on mathematical models. It also discusses electronic circuits used as models of the neuron and the synapse, and analyses the relations between the circuits and mathematical models in detail. The first part describes the biological foundations and provides a comprehensive overview of artificial neural networks. The second part then presents the mathematical foundations, reviewing elementary topics as well as lesser-known problems such as topological conjugacy of dynamical systems and the shadowing property. The final two parts describe the models of the neuron and the mathematical analysis of the properties of artificial multilayer neural networks. Combining biological, mathematical and electronic approaches, this multidisciplinary book is useful for mathematicians interested in artificial neural networks and models of the neuron, for computer scientists interested in the formal foundations of artificial neural networks, and for biologists interested in mathematical and electronic models of neural structures and processes.

Table of Contents

Frontmatter
Chapter 1. Introduction
Abstract
In the contemporary natural sciences, strong mutual relations can be observed. Biology, for instance, is associated with chemistry and physics. It is connected, furthermore, with computer science, electronics, mathematics, cybernetics and philosophy, which contribute significantly to biological studies. Let us specify the aforementioned relations in detail - see Fig. 1.1.
Andrzej Bielecki

Preliminaries

Frontmatter
Chapter 2. Biological Foundations
Abstract
Every type of biological cell, including the simplest bacteria, receives stimuli from its environment and processes the obtained signals. Nevertheless, only the metazoans are the group whose representatives are equipped with neurons - cells highly specialized in signal processing and transmission. In particular, only neurons are able to transmit signals over long distances. Neurons constitute multicellular structures, the most complex of which is the brain, which is built not only from neurons but also from glial cells. The evolutionary development of the brain enables animals to perform intentional, not merely reflexive, movements. Each consciously and intentionally initiated movement has its origin in the centers of movement control.
Andrzej Bielecki
Chapter 3. Foundations of Artificial Neural Networks
Abstract
The rapid growth of the computational power of computers is one of the basic qualities of the development of computer science. Therefore, informatics is applied to solving more and more complex problems and, consequently, a demand for bigger and more complex software arises. It is not always possible, however, to use classical algorithmic methods to create such software. There are two reasons for this. First of all, a good model of the relation between the input and output parameters often either does not exist at all or cannot be created at the present level of scientific knowledge. It is worth mentioning that the algorithmic approach requires knowledge of the explicit form of the mapping between the aforementioned sets of parameters. Secondly, even if the model is given, the algorithmic approach can be impossible because of over-complexity: both the complexity of the task at the stage of creating the algorithm and the implemented system working too slowly. The latter is a critical parameter, especially in on-line systems. Therefore, alternatives to the classical algorithmic approach are being developed intensively. Artificial neural networks belong to this group of methods.
Andrzej Bielecki

Mathematical Foundations

Frontmatter
Chapter 4. General Foundations
Abstract
In this chapter the general mathematical foundations necessary for the presented studies are specified. Most of them are of a basic character.
Andrzej Bielecki
Chapter 5. Foundations of Dynamical Systems Theory
Abstract
In this chapter the issues of dynamical systems theory that are used in the further analysis are presented. In the first section some basic facts are recalled, and in the subsequent sections more advanced topics are considered. The Euler method on a manifold is discussed, and then linear, weakly nonlinear and gradient dynamical systems are elaborated. In the last three sections of this chapter, topological conjugacy of cascades, the pseudo-orbit tracing property and dynamical systems with control are presented. It should be mentioned that both topological conjugacy and the shadowing property are topics that are far from being worked out completely, and many problems in this field remain open.
Andrzej Bielecki
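The relation between a flow and the cascade its Euler discretization generates, which this chapter studies on manifolds, can be illustrated in the simplest scalar setting. The following sketch is my own illustration (not taken from the chapter) using the flow x' = -x, whose exact solution is x(t) = x0 * exp(-t):

```python
import math

# Euler discretization of the scalar flow x' = -x.  The Euler method
# with step h generates the cascade x_{k+1} = x_k + h*(-x_k) = (1-h)*x_k,
# which approximates the exact flow x(t) = x0 * exp(-t).

def euler_orbit(x0, h, steps):
    xs = [x0]
    for _ in range(steps):
        xs.append(xs[-1] + h * (-xs[-1]))   # one Euler step for f(x) = -x
    return xs

h, steps = 0.01, 100          # integrate up to t = h * steps = 1
orbit = euler_orbit(1.0, h, steps)
exact = math.exp(-1.0)        # exact value x(1) for x0 = 1

print(orbit[-1], exact, abs(orbit[-1] - exact))
```

The cascade reproduces the qualitative behaviour of the flow (monotone decay to the fixed point 0) and the quantitative error shrinks with the step size; the chapter's questions of conjugacy and shadowing ask when such agreement can be made rigorous.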

Mathematical Models of the Neuron

Frontmatter
Chapter 6. Models of the Whole Neuron
Abstract
A neuron, as discussed in Chap. 2, is a biological cell with a complex structure. Furthermore, numerous processes occur within it. Therefore, at the present level of scientific knowledge it is impossible to create a formal model that covers all the structural and dynamical aspects of the neuron. In such a situation two approaches can be applied: either a very simplified model of the whole neuron is created, or a model is created that describes only a part of the neuron's structures or processes. The first group of models is widely used as the basis for artificial neural networks. The second group is frequently embodied as electronic circuits. Such an approach creates good prospects for using electronic circuits in the future as components of more holistic models of the neuron and, as a consequence, as the basis of artificial neural networks. In this chapter both groups of models are discussed. Connections with electronic circuits are presented as well.
Andrzej Bielecki
Chapter 7. Models of Parts of the Neuron
Abstract
As mentioned in the previous chapter, the models of the whole neuron are too simplified to reflect all the crucial aspects of the signal processing performed by the nervous system. Therefore, models of parts of the neuron are created. Both mathematical models and electronic circuits are used for modelling the parts of neural cells. These two approaches are related to each other: if a circuit model is given, then the ordinary differential equation that describes the dynamics of the potential or current in the circuit can be formulated.
Andrzej Bielecki
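The circuit-to-ODE passage described above can be sketched with the textbook leaky RC membrane model (my own toy example, not a circuit from the book): a capacitance C in parallel with a leak resistance R driven by a current I yields C * dv/dt = -(v - v_rest)/R + I, which the explicit Euler scheme integrates directly. All parameter values below are hypothetical.

```python
# Leaky RC model of a membrane patch:  C * dv/dt = -(v - v_rest)/R + I.
# The steady state of this ODE is v_inf = v_rest + R * I.

R, C = 10.0, 1.0        # hypothetical leak resistance and capacitance
v_rest = 0.0            # resting potential
I = 2.0                 # constant input current
h = 0.01                # Euler time step

v = v_rest
for _ in range(10000):  # integrate to t = 100, i.e. 10 time constants R*C
    dv = (-(v - v_rest) / R + I) / C
    v += h * dv

print(v, v_rest + R * I)   # numerical value vs. analytical steady state
```

The same translation works in the other direction: given the ODE, a circuit realizing it can be proposed, which is the correspondence the chapter exploits.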

Mathematical Models of the Perceptron

Frontmatter
Chapter 8. General Model of the Perceptron
Abstract
The general model of the perceptron is presented in this chapter. The model consists of two parts. The first is a mathematical description of the structure of the artificial neural network. The description is based on graph theory and is very general: it is valid for every type of neural network, not only for the perceptron. The formal basis of the training process of the perceptron is presented in Sect. 8.2. Next, in Sect. 8.3, the training process of the perceptron is discussed in the context of dynamical systems theory.
Andrzej Bielecki
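The graph-theoretic view of network structure can be rendered concretely: neurons are vertices, weighted connections are directed edges, and a feed-forward pass follows the edges in topological order. The sketch below is my own toy rendering of this idea, not the book's formalism, with arbitrary weights.

```python
import math

# A 2-2-1 feed-forward network described purely as a weighted digraph:
# vertices are neuron names, edges are (source, destination, weight).
edges = [
    ("x1", "h1", 0.5), ("x2", "h1", -0.3),
    ("x1", "h2", 0.8), ("x2", "h2", 0.1),
    ("h1", "y", 1.0), ("h2", "y", -1.0),
]
order = ["h1", "h2", "y"]          # topological order of non-input vertices

def forward(inputs):
    values = dict(inputs)
    for v in order:
        # weighted sum over incoming edges, then the activation function
        s = sum(w * values[u] for u, d, w in edges if d == v)
        values[v] = math.tanh(s)
    return values["y"]

print(forward({"x1": 1.0, "x2": 0.0}))
```

Because the description is just a labelled digraph plus an evaluation order, it covers arbitrary architectures, which is why the book's structural model is not restricted to the perceptron.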
Chapter 9. Linear Perceptrons
Abstract
In this chapter linear perceptrons are considered. In Sect. 9.1 some basic properties of the structures of linear perceptrons are discussed, whereas in Sect. 9.2 the dynamics of the training process of linear perceptrons is analyzed. The stability of the training process is studied in Sect. 9.3.
Andrzej Bielecki
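A toy analogue of the training dynamics studied here (my own sketch, not the book's analysis): gradient descent for a single linear neuron on a quadratic error. Each weight update is an affine map of the weights, so the training process is a linear dynamical system whose stability hinges on the learning rate.

```python
import random

# Gradient descent for a linear neuron y = w1*x1 + w2*x2 on synthetic,
# noiseless data; the iteration converges to the generating weights
# provided the learning rate keeps the affine update map contractive.

random.seed(0)
target = (2.0, -1.0)                       # weights the data is generated from
samples = []
for _ in range(100):
    x = (random.uniform(-1, 1), random.uniform(-1, 1))
    samples.append((x, target[0] * x[0] + target[1] * x[1]))

w = [0.0, 0.0]
eta = 0.1                                  # learning rate; too large => divergence
for _ in range(500):
    for x, y in samples:
        err = w[0] * x[0] + w[1] * x[1] - y
        w[0] -= eta * err * x[0]           # gradient of err**2 / 2 w.r.t. w1
        w[1] -= eta * err * x[1]           # gradient of err**2 / 2 w.r.t. w2

print(w)                                   # approaches the target weights
```

Raising `eta` past the stability threshold makes the same iteration diverge, which is the kind of stability question Sect. 9.3 treats rigorously.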
Chapter 10. Weakly Nonlinear Perceptrons
Abstract
The character of the dynamics of both linear flows and cascades is well investigated. It is known, among other things, that the cascades generated by a linear flow preserve the regular dynamics of the flow. In particular, the problems connected with topological conjugacy and the shadowing property are resolved. The regular dynamics of linear dynamical systems is preserved if they are weakly distorted - the so-called weakly nonlinear dynamical systems. This is formulated strictly in the Grobman and Hartman theorems and the Fečkan theorem - see Sect. 5.6.
Andrzej Bielecki
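The linearization principle invoked here is classical; a standard statement of the Hartman-Grobman theorem for cascades (quoted from the general literature, not from the book) reads:

```latex
\textbf{Theorem (Hartman--Grobman).}
Let $f:\mathbb{R}^n\to\mathbb{R}^n$ be a $C^1$ diffeomorphism with a
hyperbolic fixed point $p$, i.e.\ the spectrum of $Df(p)$ is disjoint
from the unit circle. Then there exist neighbourhoods $U\ni p$ and
$V\ni p$ and a homeomorphism $h:U\to V$ such that
\[
  h\circ f = Df(p)\circ h \quad \text{on } U,
\]
i.e.\ $f$ is locally topologically conjugate to its linearization $Df(p)$.
```

Weak nonlinearity strengthens this local picture: for sufficiently small perturbations of a hyperbolic linear system, the conjugacy can be obtained globally, which is the setting used for the perceptrons of this chapter.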
Chapter 11. Nonlinear Perceptrons
Abstract
In this chapter the training process of the most general class of perceptrons - the nonlinear ones - is considered. Runge-Kutta methods, first of all the gradient descent method (the Euler method), which are used as numerical training algorithms, are studied in the context of their stability and robustness. It should be stressed that the continuous model of the training process is considered in the Euclidean space \(\mathbb {R}^n.\) The training algorithm is implemented as an iterative numerical rule in \(\mathbb {R}^n\) as well. However, the theoretical analysis presented in this chapter concerns numerical schemata on the \(n\)-dimensional compact manifold \(\mathcal{M}_S^n,\) which is homeomorphic to the sphere \(\mathcal{S}^n.\) This is possible thanks to a specific compactification procedure, which is described in detail in Step 1 of the proof of Theorem 11.1. Such an approach allows us to apply results concerning numerical dynamics on compact manifolds.
Andrzej Bielecki
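The identification of gradient descent with the Euler method can be made concrete in one dimension (a minimal sketch of my own, with a hypothetical error function): for the gradient flow w'(t) = -E'(w(t)) with E(w) = (w - 3)^2 / 2, the exact solution is w(t) = 3 + (w0 - 3) * exp(-t), and one gradient-descent step with learning rate h is exactly one explicit Euler step.

```python
import math

# Gradient descent == explicit Euler for the gradient flow
#   w'(t) = -E'(w(t)),   E(w) = (w - 3)**2 / 2.

def grad_E(w):
    return w - 3.0                 # E'(w) for the toy error function

w0, h, steps = 0.0, 0.001, 1000    # learning rate h, integrate to t = 1
w = w0
for _ in range(steps):
    w -= h * grad_E(w)             # one Euler step = one training step

exact = 3.0 + (w0 - 3.0) * math.exp(-1.0)   # gradient-flow solution at t = 1
print(w, exact)
```

This is why results on the numerical dynamics of one-step schemes transfer to statements about the training process, once the state space is compactified as in the chapter.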
Chapter 12. Concluding Remarks and Comments
Abstract
In this monograph three mutually complementary topics are studied: modelling of the neuron, modelling of the processes that take place in the neuron, and the mathematical analysis of the dynamical properties of gradient training processes of perceptrons.
Andrzej Bielecki

Appendix

Frontmatter
Chapter 13. Approximation Properties of Perceptrons
Abstract
As mentioned in Sect. 8.1, an untrained perceptron can be treated as a family of functions \(\mathbb {R}^n\rightarrow \mathbb {R}^m\) indexed by the vectors of all its weights. A given training set, in turn, can be regarded as a set of points that the mapping should approximate as well as possible. Investigations of the approximation abilities of neural networks focus on the existence of an arbitrarily close approximation, and on how the accuracy depends on the complexity of the perceptron. In this chapter a few basic theorems that concern the approximation properties of perceptrons are discussed. The presented theorems are classical results; in this monograph they are given without proofs, which can be found in the literature.
Andrzej Bielecki
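The flavour of these density results can be illustrated constructively (a toy of my own, not one of the chapter's theorems): sums of the form sum_k c_k * sigma(a*(x - b_k)) with a steep sigmoid sigma form a soft staircase that tracks a continuous target function, here sin on [0, pi].

```python
import math

def sigma(t):
    # logistic sigmoid, clamped to avoid math.exp overflow for large |t|
    if t < -60.0:
        return 0.0
    if t > 60.0:
        return 1.0
    return 1.0 / (1.0 + math.exp(-t))

def f(x):
    return math.sin(x)                 # target function on [0, pi]

a = 1000.0                             # steepness of each sigmoid step
n = 200                                # number of hidden units / steps
knots = [math.pi * i / n for i in range(n + 1)]

def net(x):
    # start at f(0), then add a soft step of height f(b_k) - f(b_{k-1})
    # at each knot b_k; refining the grid shrinks the uniform error
    y = f(knots[0])
    for k in range(1, n + 1):
        y += (f(knots[k]) - f(knots[k - 1])) * sigma(a * (x - knots[k]))
    return y

grid = [math.pi * j / 1000 for j in range(1001)]
max_err = max(abs(net(x) - f(x)) for x in grid)
print(max_err)
```

The classical theorems of the chapter assert far more (density for essentially arbitrary non-polynomial activations, and quantitative rates in terms of network size), but the staircase construction shows why one hidden layer already suffices in principle.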
Chapter 14. Proofs
Abstract
Two proofs that are not commonly known are presented in this chapter.
Andrzej Bielecki
Backmatter
Metadata
Title
Models of Neurons and Perceptrons: Selected Problems and Challenges
Written by
Prof. Andrzej Bielecki
Copyright Year
2019
Electronic ISBN
978-3-319-90140-4
Print ISBN
978-3-319-90139-8
DOI
https://doi.org/10.1007/978-3-319-90140-4