
1994 | Book

Introduction to Neural Networks

Author: Phil Picton

Publisher: Macmillan Education UK


Table of Contents

Frontmatter
1. What is a Neural Network?
Abstract
This is the first question that everybody asks. It can be answered more easily if the question is broken down into two parts.
Phil Picton
2. ADALINE
Abstract
In the previous chapter, the function of the ADALINE was briefly outlined. As yet, the method used to adjust the weights has not been described. This chapter will explain how weight adjustment is achieved, and will illustrate the method using some simple examples.
Phil Picton
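The weight-adjustment method in question is the Widrow-Hoff (LMS, or delta) rule. A minimal sketch in Python, with the learning rate, epoch count and AND-function example chosen for illustration rather than taken from the book:

```python
import numpy as np

def adaline_train(X, d, lr=0.1, epochs=50):
    """Widrow-Hoff (LMS) rule: nudge the weights in proportion to the
    error between the desired value and the *linear* output."""
    X = np.hstack([X, np.ones((len(X), 1))])    # append a constant bias input
    w = np.zeros(X.shape[1])
    for _ in range(epochs):
        for x, target in zip(X, d):
            y = w @ x                            # un-thresholded output
            w += lr * (target - y) * x           # delta-rule weight adjustment
    return w

# Example: learn the AND function with bipolar (-1/+1) targets.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]])
d = np.array([-1, -1, -1, 1])
w = adaline_train(X, d)
outputs = np.sign(np.hstack([X, np.ones((4, 1))]) @ w)
print(outputs)    # the signs should reproduce the AND targets
```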
3. Perceptrons
Abstract
Perceptrons are the most widely used and best understood of all the different neural networks. The term was first used by Frank Rosenblatt (Rosenblatt, 1958) to describe a number of different types of neural networks. It was his idea to take the functional description of how a neuron works and to implement it as an algorithm in software rather than trying to build a physical model of a neuron like the ADALINE.
Phil Picton
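To illustrate the "neuron as an algorithm" idea, here is a minimal perceptron unit with error-correcting training, sketched in Python; the data, learning rate and function names are illustrative assumptions, not the book's:

```python
import numpy as np

def step(a):
    """Hard-limiting threshold: fire (1) if the activation is non-negative."""
    return 1 if a >= 0 else 0

def train_perceptron(X, targets, lr=0.2, epochs=25):
    """Error-correcting perceptron learning: adjust the weights only
    when the thresholded output disagrees with the target."""
    w = np.zeros(X.shape[1])
    b = 0.0
    for _ in range(epochs):
        for x, t in zip(X, targets):
            y = step(w @ x + b)
            w += lr * (t - y) * x
            b += lr * (t - y)
    return w, b

# Example: the (linearly separable) OR function.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]])
t = np.array([0, 1, 1, 1])
w, b = train_perceptron(X, t)
print([step(w @ x + b) for x in X])   # expected: [0, 1, 1, 1]
```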
4. Boolean Neural Networks
Abstract
Boolean neural networks differ from networks such as the ADALINE and perceptron by using Boolean logic elements as the basic components. The simplest Boolean logic elements are logic gates such as the AND gate. However, more complex devices such as memories can also be described as Boolean logic devices since they could be built from these simpler gates. This means that the basic action of the elements is logical rather than involving the summation of weighted inputs. This gives rise to another term which is sometimes used to describe them, namely weightless neural networks.
Phil Picton
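To give a flavour of the logical (weightless) action described above, here is a sketch of a RAM-based Boolean node, in which the binary inputs form an address into a lookup table and training simply writes to the addressed location; the class and method names are illustrative, not taken from the book:

```python
class RAMNode:
    """A Boolean (weightless) node: an n-input lookup table addressed by
    the input pattern, so its action is logical rather than a summation
    of weighted inputs."""

    def __init__(self, n_inputs):
        self.table = [0] * (2 ** n_inputs)   # one bit per possible input pattern

    def _address(self, bits):
        # Treat the tuple of 0/1 inputs as a binary address into the table.
        addr = 0
        for b in bits:
            addr = (addr << 1) | b
        return addr

    def train(self, bits):
        self.table[self._address(bits)] = 1  # remember that this pattern was seen

    def recall(self, bits):
        return self.table[self._address(bits)]

node = RAMNode(3)
node.train((1, 0, 1))
print(node.recall((1, 0, 1)), node.recall((1, 1, 1)))   # 1 0
```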
5. Associative Memory and Feedback Networks
Abstract
First, a review of what was said in chapter 1: neural networks are basically pattern classifiers, and the previous chapters showed how this is achieved. There is another aspect of neural networks that needs including, however, as a consequence of this pattern-classifying ability, namely the ability to associate one pattern with another.
Phil Picton
6. Probabilistic Networks
Abstract
One of the problems with the Hopfield network is its tendency to settle at local energy minima rather than at the global minimum. Some of the local minima represent points in the ‘energy landscape’ where the stored patterns can be found. When an input vector is applied to the network, if it lies close to one of these minima, it will move towards that minimum because of the nature of the changes in the neurons, which always lower the energy in the network. So if a pattern is close to one of the stored patterns, it should produce that stored pattern at the output once it has settled in the local minimum.
Phil Picton
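The settling behaviour described above follows from the Hopfield energy function E = -(1/2) * sum_ij w_ij * s_i * s_j: with symmetric weights and a zero diagonal, updating a neuron to the sign of its net input can never raise the energy, so the state slides into a nearby minimum. A minimal sketch in Python, using the standard outer-product storage rule as an assumption:

```python
import numpy as np

def store(patterns):
    """Outer-product (Hebbian) storage of bipolar patterns, giving a
    symmetric weight matrix with a zero diagonal."""
    W = sum(np.outer(p, p) for p in patterns).astype(float)
    np.fill_diagonal(W, 0)
    return W

def energy(W, s):
    return -0.5 * s @ W @ s

def settle(W, s, steps=200, seed=0):
    """Asynchronous updates: each neuron takes the sign of its net input,
    which never increases the energy, so the state slides into a minimum."""
    s = s.copy()
    rng = np.random.default_rng(seed)
    for _ in range(steps):
        i = rng.integers(len(s))
        s[i] = 1 if W[i] @ s >= 0 else -1
    return s

stored = np.array([[1, -1, 1, -1, 1, -1],
                   [1, 1, 1, -1, -1, -1]])
W = store(stored)
noisy = np.array([1, -1, 1, -1, 1, 1])          # first pattern with one bit flipped
recalled = settle(W, noisy)
print(recalled)                                  # should match the first stored pattern
print(energy(W, noisy), energy(W, recalled))     # the energy falls as the network settles
```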
7. Self-Organizing Networks
Abstract
Another variation on the neural network is the self-organizing network. What this means is that the network is trained by showing it examples of the patterns that are to be classified, and it is allowed to produce its own output code for the classification.
Phil Picton
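One common way such a self-produced output code arises is through competitive (winner-take-all) learning of the kind used in Kohonen's networks: the unit whose weight vector lies closest to the input is pulled towards it, so each unit comes to stand for a cluster of inputs. A minimal sketch in Python; the data, learning rate and initialisation are illustrative assumptions:

```python
import numpy as np

def competitive_learning(X, n_units=2, lr=0.3, epochs=30, seed=1):
    """Winner-take-all training: only the unit whose weight vector is
    nearest the input moves, so each unit ends up coding one cluster."""
    rng = np.random.default_rng(seed)
    W = X[rng.choice(len(X), size=n_units, replace=False)].astype(float)
    for _ in range(epochs):
        for x in rng.permutation(X):
            winner = np.argmin(np.linalg.norm(W - x, axis=1))
            W[winner] += lr * (x - W[winner])    # pull the winner towards the input
    return W

# Two obvious clusters of points; each unit should settle near one of them.
X = np.array([[0.1, 0.2], [0.2, 0.1], [0.9, 0.8], [0.8, 0.9]])
W = competitive_learning(X)
print(W)
# The classification code the network has produced for itself:
print([int(np.argmin(np.linalg.norm(W - x, axis=1))) for x in X])
```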
8. Neural Networks in Control Engineering
Abstract
In the design of classical control systems there are three distinct steps:
  • Step 1: find a mathematical model of the system that you wish to control;
  • Step 2: if the model is linear then move on to step 3, otherwise try to alter the system or its model so that it becomes linear;
  • Step 3: apply some technique from control theory to produce the desired system responses.
Phil Picton
9. Threshold Logic
Abstract
A great deal of research has gone into threshold logic, with some useful theoretical results (Dertouzos, 1965; Muroga, 1971; Hurst, 1978). The main aim of threshold logic is to take what are essentially McCulloch-Pitts neurons, and use them to represent completely specified binary logic functions.
Phil Picton
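A McCulloch-Pitts neuron in this context is simply a threshold element: it outputs 1 when the weighted sum of its binary inputs reaches a threshold, so a single unit can realise simple completely specified Boolean functions directly. A short Python illustration; the particular weight and threshold values are the usual textbook choices, not necessarily those used in the chapter:

```python
def threshold_unit(inputs, weights, theta):
    """McCulloch-Pitts style element: output 1 iff the weighted sum
    of the binary inputs reaches the threshold theta."""
    return 1 if sum(w * x for w, x in zip(weights, inputs)) >= theta else 0

# Two-input AND: both inputs are needed to reach the threshold.
AND = lambda a, b: threshold_unit((a, b), weights=(1, 1), theta=2)
# Two-input OR: a single active input is enough.
OR = lambda a, b: threshold_unit((a, b), weights=(1, 1), theta=1)
# NOT: a negative weight and a zero threshold invert the input.
NOT = lambda a: threshold_unit((a,), weights=(-1,), theta=0)

for a in (0, 1):
    for b in (0, 1):
        print(a, b, AND(a, b), OR(a, b))
print(NOT(0), NOT(1))   # 1 0
```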
10. Implementation
Abstract
Historically, neural networks have been built using all kinds of technology — mechanical, electrical and more recently optical. However, many of the neural networks that have been described in this book have never actually been implemented other than as simulations on conventional computers.
Phil Picton
11. Conclusions
Abstract
This book started off by trying to answer what, at first, seemed like a very straightforward question — what is a neural network? What followed was an attempt at answering that question. The first chapter introduced some of the concepts that would be used throughout the book, such as the basic models for a neuron, and some of the architectures in which they are found. It soon became apparent that there is no simple answer, and that this is a research area that is still growing. Each new chapter introduced yet another architecture, and within that there would be several different varieties of network, each slightly different to the others. So many networks have been included, and yet there are still more that have been omitted for lack of space.
Phil Picton
Backmatter
Metadata
Title
Introduction to Neural Networks
Author
Phil Picton
Copyright Year
1994
Publisher
Macmillan Education UK
Electronic ISBN
978-1-349-13530-1
Print ISBN
978-0-333-61832-5
DOI
https://doi.org/10.1007/978-1-349-13530-1
