
2010 | Book

Process Neural Networks

Theory and Applications


About this Book

"Process Neural Network: Theory and Applications" proposes the concept and model of a process neural network for the first time, showing how it expands the mapping relationship between the input and output of traditional neural networks and enhances the expression capability for practical problems, with broad applicability to solving problems relating to processes in practice. Some theoretical problems such as continuity, functional approximation capability, and computing capability, are closely examined. The application methods, network construction principles, and optimization algorithms of process neural networks in practical fields, such as nonlinear time-varying system modeling, process signal pattern recognition, dynamic system identification, and process forecast, are discussed in detail. The information processing flow and the mapping relationship between inputs and outputs of process neural networks are richly illustrated.

Xingui He is a member of the Chinese Academy of Engineering and a professor at the School of Electronics Engineering and Computer Science, Peking University, China, where Shaohua Xu also serves as a professor.

Table of Contents

Frontmatter
1. Introduction
Abstract
As an introduction to this book, we will review the development history of artificial intelligence and neural networks, and then give a brief introduction to and analysis of some important problems in the fields of current artificial intelligence and intelligent information processing. This book will begin with the broad topic of “artificial intelligence”, next examine “computational intelligence”, then gradually turn to “neural computing”, namely, “artificial neural networks”, and finally explain “process neural networks”, of which the theories and applications will be discussed in detail.
2. Artificial Neural Networks
Abstract
The modern computer has strong computing and information-processing capabilities, and in raw computation it has already exceeded the human brain. It plays an important role in human daily life, production, and scientific research. However, current computer hardware and software systems are still based on the von Neumann architecture. They can only mechanically solve actual problems by executing predefined programs, and their capability falls short of humans on certain problems, such as adaptive pattern recognition, behavior perception, logical thinking, analysis and processing of incomplete and fuzzy information, and independent decision-making in complex environments. Moreover, they lack a mechanism for adaptive learning from, and active adaptation to, their environment.
3. Process Neurons
Abstract
In this chapter, we begin to discuss in detail the process neural network (PNN), the subject of this book. First, the concept of the process neuron is introduced. The process neuron is the basic information-processing unit of the PNN, and its model and operating mechanism determine the properties and information-processing ability of the PNN. We mainly introduce a general definition and the basic properties of the process neuron, and its relationship to mathematical concepts such as composite functions and functionals.
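The temporal accumulation that distinguishes a process neuron from a classical neuron can be sketched numerically. The following Python fragment is a minimal sketch, not the book's notation: the function names, the trapezoidal discretization, and the choice of tanh as the activation are all assumptions for illustration.

```python
import numpy as np

def process_neuron(x, w, T=1.0, n=1000, theta=0.0):
    """Discretized process neuron: y = f( int_0^T w(t) x(t) dt - theta ).

    x, w  : callables giving the time-varying input and weight functions,
            evaluated on an array of time points.
    theta : activation threshold.
    The integral is approximated by the trapezoidal rule on n points.
    """
    t = np.linspace(0.0, T, n)
    s = w(t) * x(t)
    dt = T / (n - 1)
    z = dt * (s.sum() - 0.5 * (s[0] + s[-1])) - theta   # temporal accumulation
    return float(np.tanh(z))                            # sigmoid-type activation

# Example: a decaying weight function applied to a sinusoidal input signal
y = process_neuron(lambda t: np.sin(2 * np.pi * t), lambda t: np.exp(-t))
```

A classical time-invariant neuron is recovered as the special case where x and w are constant over [0, T].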
4. Feedforward Process Neural Networks
Abstract
A feedforward process neural network is the basic process neural network model: an information-forward-propagation network that consists of process neurons (including the traditional time-invariant neuron as a special case) arranged in certain topological structures. The inputs/outputs, the connection weights between neuron nodes, and the activation thresholds of a feedforward process neural network can all be time-varying functions. Through training, the network can memorize the structure parameters (or functions) and character parameters (or functions) learned from the environment, and embody the procedural pattern characteristics and the transformation mechanism of the system. It has strong time-varying information-processing ability and a strong nonlinear mapping capability between the inputs and outputs of time-varying systems, with broad applicability to modeling and solving practical problems such as pattern recognition of process signals, dynamic system simulation, and process control. In this chapter, a general feedforward process neural network model and a process neural network model based on weight-function basis expansion are introduced, and the basic properties of the network, such as continuity, functional approximation capability, and computing capability, are studied.
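The structure described above, with spatial aggregation over the inputs followed by temporal accumulation in each hidden node, can be sketched as follows. This is an assumed minimal form, not the book's definition; all names and the tanh activation are illustrative.

```python
import numpy as np

def feedforward_pnn(xs, W, V, T=1.0, n=1000):
    """One-hidden-layer feedforward process neural network sketch.

    xs : list of input functions x_i(t).
    W  : nested list; W[j][i] is the weight function w_ji(t) connecting
         input i to hidden process neuron j.
    V  : ordinary (time-invariant) output weights, one per hidden node.
    """
    t = np.linspace(0.0, T, n)
    dt = T / (n - 1)
    hidden = []
    for row in W:
        # spatial aggregation over the inputs, then temporal accumulation
        s = sum(w(t) * x(t) for w, x in zip(row, xs))
        z = dt * (s.sum() - 0.5 * (s[0] + s[-1]))
        hidden.append(np.tanh(z))
    return float(np.dot(V, hidden))
```

With constant inputs and weights, each hidden node reduces to a conventional time-invariant neuron, matching the special case noted in the abstract.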
5. Learning Algorithms for Process Neural Networks
Abstract
There are already many mature learning algorithms for training traditional neural networks, e.g., the back-propagation (BP) algorithm[1], particle swarm optimization (PSO)[2], the genetic algorithm (GA)[3], the GA-PSO algorithm[4], the quantum genetic (QG) algorithm[5], etc. Among these, the most widely applied and effective is the error back-propagation algorithm based on gradient descent and its various improved forms. In training process neural networks, the inputs and the connection weights of the network can be time-varying functions, the process neuron includes spatial aggregation operators and temporal accumulation operators, and the network can include different types of neurons with different operation rules, i.e., each neuron processes the input information according to its own algorithm. All of these make the mapping mechanism and learning course of the process neural network quite different from those of the traditional neural network. Furthermore, because of the arbitrariness of the form and the parameter positions of the network's connection weight functions, if the function class is not restricted or fixed in advance, it is difficult to determine these complex parameters by learning from practical samples through network training. In mathematical terms, continuous function space admits a variety of basis function systems, so that under certain conditions the functions in the space can be expressed, to a given precision, as finite-term expansions in the basis functions.
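The basis-expansion idea can be illustrated with a minimal sketch (an assumed setup, not the book's algorithm): if each weight function is written as w(t) = sum_k c_k b_k(t) for a fixed basis {b_k}, then int w(t)x(t)dt = sum_k c_k int b_k(t)x(t)dt, so training reduces to gradient descent on the finitely many coefficients c_k.

```python
import numpy as np

def train_process_neuron(samples, basis, T=1.0, n=2000, lr=1.0, epochs=20000):
    """Fit w(t) = sum_k c_k b_k(t) for a single linear process neuron.

    samples : list of (x, d) pairs; x is a callable input function and d is
              the desired scalar output int_0^T w(t) x(t) dt.
    basis   : list of callable basis functions b_k(t).
    Because the integral is linear in c, each sample reduces to a feature
    vector a with a_k = int b_k(t) x(t) dt, and training is plain gradient
    descent on the squared error of the linear model y = a . c.
    """
    t = np.linspace(0.0, T, n)
    dt = T / (n - 1)

    def integrate(vals):                      # trapezoidal rule, last axis
        return dt * (vals.sum(axis=-1) - 0.5 * (vals[..., 0] + vals[..., -1]))

    B = np.array([b(t) for b in basis])                       # K x n basis values
    A = np.array([integrate(B * x(t)) for x, _ in samples])   # per-sample features
    d = np.array([target for _, target in samples])
    c = np.zeros(len(basis))
    for _ in range(epochs):
        residual = A @ c - d
        c -= lr * (A.T @ residual) / len(samples)             # gradient step
    return c
```

Restricting w(t) to a finite basis is exactly what makes the otherwise infinite-dimensional weight-function search tractable: the unknown function becomes a finite coefficient vector.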
6. Feedback Process Neural Networks
Abstract
A feedback neural network is an artificial neural network model that has been widely applied to signal processing[1], optimal computation[2], convex nonlinear programming[3], seismic data filtering[4], etc. A traditional feedback neural network model generally has time-invariant inputs. However, when a biological neural organization processes information, it actually feeds back time-delayed information, and the input of external signals lasts for a period of time. Its current outputs depend not only on current inputs but also on the accumulation of all previous inputs, i.e., a temporal accumulation effect. Many practical systems also contain feedback control terms: in a real-time process control system, for example, the control inputs usually need adjusting according to the current outputs of the system; in some multi-objective optimization problems, the system needs to adjust its search strategy dynamically according to its current state. The feedback process neural network is a process neural network model with information feedback, in which all neuron nodes are connected according to the information flow direction of the system. Information can be passed back to nodes in each previous layer by certain rules, and output information can be fed back to the nodes themselves. There are many forms of the feedback process neural network model. In this chapter, we mainly introduce a three-layer network model and its learning algorithm, then analyze the stability of the model. In addition, several other forms of the feedback process neural network model will be given. When a feedback process neural network transports information, there are both forward flows, as in a feedforward neural network, and time-delayed feedback information passed from later-layer nodes back to earlier-layer nodes.
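A minimal numerical sketch of the feedback idea (an assumed single-node form, not the book's three-layer model): the output at each step combines the temporal accumulation of the current input function with the node's own previous output, fed back through a gain v as a one-step time delay.

```python
import numpy as np

def feedback_step(x, w, v, y_prev, T=1.0, n=1000):
    """One update of a self-feedback process neuron: the previous output
    y_prev re-enters the aggregation through the feedback gain v."""
    t = np.linspace(0.0, T, n)
    s = w(t) * x(t)
    z = (T / (n - 1)) * (s.sum() - 0.5 * (s[0] + s[-1])) + v * y_prev
    return float(np.tanh(z))

def run_feedback(inputs, w, v, y0=0.0):
    """Drive the node with a sequence of input functions, carrying the state."""
    y = y0
    outputs = []
    for x in inputs:
        y = feedback_step(x, w, v, y)
        outputs.append(y)
    return outputs
```

Because tanh has slope at most 1, the update is a contraction whenever |v| < 1, so iterating with a fixed input settles to a fixed point; this is the kind of behavior a stability analysis of feedback models examines.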
7. Multi-aggregation Process Neural Networks
Abstract
The inputs to the process neural networks introduced in the previous chapters are only time-dependent unary functions. In fact, the input/output functions of process neural networks need not depend on time alone; they may depend on multiple other process factors, i.e., they may be arbitrary multivariate functions. For example, the output of a practical system whose inputs depend on both a spatial position (x,y,z) and time t is the joint result of several inputs depending on these process factors, as in debris flow formation[1], crop growth prediction[2], earthquake magnitude prediction[3], chemical action in a chemical reaction tower[4], etc. The input functions of these systems (such as rainfall, degree of erosion, pressure, temperature, etc.) take the form u_i(x,y,z,t) (i=1,2,...,n) and are all multivariate functions (or processes). If neural networks are used to simulate and model these dynamic systems, then multi-factor aggregation and accumulation must be considered when the neurons process the input information. Therefore, process neural networks with only a time dimension can be extended to multi-aggregation process neural networks that can process several multivariate functions (or processes). In this chapter, we give several models of multi-aggregation process neural networks and derive a gradient descent learning algorithm based on the expansion of a multivariate function.
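The multi-factor accumulation described above can be sketched by replacing the one-dimensional time integral with an integral over a spatio-temporal box. This is an illustrative sketch under assumed names; the midpoint-rule discretization and the tanh activation are not the book's formulation.

```python
import numpy as np

def multi_aggregation_neuron(u, w, bounds, n=12, theta=0.0):
    """Multi-aggregation process neuron sketch.

    u, w   : callables taking an (m, k) array of points, one row per point
             with coordinates such as (x, y, z, t), returning m values.
    bounds : list of (lo, hi) pairs, one per coordinate.
    The spatio-temporal accumulation int w*u over the box is approximated
    with a midpoint rule on a regular grid.
    """
    k = len(bounds)
    axes = [np.linspace(lo + (hi - lo) / (2 * n), hi - (hi - lo) / (2 * n), n)
            for lo, hi in bounds]
    pts = np.stack(np.meshgrid(*axes, indexing="ij"), axis=-1).reshape(-1, k)
    cell = np.prod([(hi - lo) / n for lo, hi in bounds])
    z = float(np.sum(w(pts) * u(pts)) * cell) - theta
    return float(np.tanh(z))
```

Setting bounds to a single time interval recovers the purely time-dependent process neuron as the one-dimensional special case.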
8. Design and Construction of Process Neural Networks
Abstract
As a functional approximator, process-pattern associative memory machine, and time-varying signal classifier, the process neural network has broad applications in modeling and solving practical problems involving time or multivariate processes. For example, Ding and Zhong used a wavelet process neural network to solve time series prediction problems[1],[2], and a parallel process neural network for aircraft engine health condition monitoring[3]. Zhong et al. used a continuous wavelet process neural network to monitor an aero-engine lubricating oil system[4]. Xu et al. used a process neural network with a quantum genetic algorithm to predict oil recovery ratios[5]; Song et al. used a mixed process neural network to predict customer churn in mobile communications[6]. To solve practical application problems, we must design and construct the corresponding process neural network for the concrete problem at hand, including the choice of the network model, the determination of the number of hidden layers and hidden nodes, the selection or design of the neuron type in each node layer (including the choice of activation function), and the design of the corresponding learning algorithms and parameters.
9. Application of Process Neural Networks
Abstract
Process neural networks have broad applications in practical problems relating to time-varying processes. Some examples have already been given in previous chapters when introducing specific process neural networks, such as time series prediction[1]-[3], soft sensing in sewage disposal systems[4], simulation and forecasting in the process of reservoir exploitation, sedimentary facies and reservoir water-flooding identification[5],[6], rotating machinery fault diagnosis[7],[8], etc. In this chapter, we describe more applications of process neural networks in various fields, for example, process modeling, nonlinear system identification, process control, classification and clustering, process optimization, forecasting and prediction, evaluation and decision-making, and macroscopic control. We also discuss possible applications of process neural networks in further fields.
Backmatter
Metadata
Title
Process Neural Networks
Authored by
Prof. Xingui He
Prof. Shaohua Xu
Copyright Year
2010
Publisher
Springer Berlin Heidelberg
Electronic ISBN
978-3-540-73762-9
Print ISBN
978-3-540-73761-2
DOI
https://doi.org/10.1007/978-3-540-73762-9