About this Book

This book not only provides a comprehensive introduction to neural-network-based PCA methods in control science, but also presents many novel PCA algorithms together with their extensions and generalizations, e.g., dual-purpose algorithms, coupled PCA, generalized eigen decomposition (GED), and neural-network-based SVD algorithms. It also discusses in detail various methods for analyzing the convergence, stability, and self-stabilizing properties of these algorithms, and introduces the deterministic discrete-time (DDT) system method for analyzing the convergence of PCA/MCA algorithms. Readers should be familiar with numerical analysis and the fundamentals of statistics, such as the basics of least squares and stochastic algorithms. Although the book focuses on neural networks, it presents only their learning laws, which are simply iterative algorithms; therefore, no prior knowledge of neural networks is required. This book will be of interest, and serve as a reference source, to researchers and students in applied mathematics, statistics, engineering, and other related fields.

Table of Contents

Frontmatter

Chapter 1. Introduction

Abstract
Pattern recognition and data compression are two applications that rely critically on efficient data representation.
Xiangyu Kong, Changhua Hu, Zhansheng Duan

Chapter 2. Matrix Analysis Basics

Abstract
In this chapter, we review some basic concepts, properties, and theorems concerning the singular value decomposition (SVD), the eigenvalue decomposition (ED), and the Rayleigh quotient of a matrix, and we also introduce some basics of matrix analysis. These are important and useful for our theoretical analysis in subsequent chapters.
Xiangyu Kong, Changhua Hu, Zhansheng Duan
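The Rayleigh quotient reviewed in this chapter underpins the later convergence analyses. A minimal numerical illustration of its two key properties, using a small hypothetical symmetric matrix:

```python
import numpy as np

# r(A, w) = (w^T A w) / (w^T w) is bounded by the extreme eigenvalues
# of a symmetric A and attains them at the corresponding eigenvectors.
A = np.array([[4.0, 1.0, 0.0],
              [1.0, 3.0, 1.0],
              [0.0, 1.0, 2.0]])   # hypothetical example matrix

def rayleigh(A, w):
    return (w @ A @ w) / (w @ w)

vals, vecs = np.linalg.eigh(A)    # eigenvalues in ascending order

# At the eigenvectors, the quotient equals the eigenvalues ...
print(np.isclose(rayleigh(A, vecs[:, -1]), vals[-1]))   # True
# ... and for arbitrary vectors it stays inside [lambda_min, lambda_max].
rng = np.random.default_rng(0)
for _ in range(100):
    r = rayleigh(A, rng.normal(size=3))
    assert vals[0] - 1e-12 <= r <= vals[-1] + 1e-12
```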

Chapter 3. Neural Networks for Principal Component Analysis

Abstract
PCA is a statistical method directly related to eigenvalue decomposition (EVD) and SVD. Neural-network-based PCA methods estimate the principal components online from the input data sequence. They are especially suited to high-dimensional data, because they avoid computing the large covariance matrix, and to tracking nonstationary data, whose covariance matrix changes slowly over time.
Xiangyu Kong, Changhua Hu, Zhansheng Duan
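The best-known example of such an online rule is Oja's learning law. A minimal sketch on hypothetical 2-D data (the learning rate and sample count are assumed values, not from the book): each sample updates the weight vector directly, and no covariance matrix is ever formed.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical anisotropic 2-D data, rotated so the principal
# direction is not axis-aligned.
X = rng.normal(size=(2000, 2)) * np.array([2.0, 0.5])
theta = np.pi / 4
R = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])
X = X @ R.T

w = rng.normal(size=2)
w /= np.linalg.norm(w)
eta = 0.01                       # assumed learning rate

# Oja's rule: w <- w + eta * y * (x - y * w), with y = w^T x.
for x in X:
    y = w @ x
    w += eta * y * (x - y * w)

# For comparison only: the principal eigenvector of the sample covariance.
vals, vecs = np.linalg.eigh(np.cov(X.T))
v1 = vecs[:, -1]
print(abs(w @ v1))               # close to 1 once converged
```

Note that the rule also keeps the weight vector near unit length, which is the self-stabilizing behavior the book analyzes.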

Chapter 4. Neural Networks for Minor Component Analysis

Abstract
The minor subspace (MS) is the subspace spanned by the eigenvectors associated with the minor eigenvalues of the autocorrelation matrix of a high-dimensional vector sequence. The MS, also called the noise subspace (NS), has been used extensively in array signal processing. NS tracking is a primary requirement in many real-time signal processing applications, such as adaptive direction-of-arrival (DOA) estimation, data compression in data communications, the solution of total least squares problems in adaptive signal processing, and feature extraction for high-dimensional data sequences.
Xiangyu Kong, Changhua Hu, Zhansheng Duan
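A minimal sketch of the minor-component idea, assuming stochastic Rayleigh-quotient descent with explicit renormalization (a generic scheme for illustration, not a specific algorithm from this chapter; the data and learning rate are hypothetical):

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical 2-D data: the second coordinate is the low-variance
# ("noise") direction, so the minor eigenvector is close to [0, 1].
X = rng.normal(size=(5000, 2)) * np.array([2.0, 0.5])

w = rng.normal(size=2)
w /= np.linalg.norm(w)
eta = 0.005                      # assumed learning rate

for x in X:
    y = w @ x
    w -= eta * (y * x - y * y * w)   # anti-Hebbian step toward the MS
    w /= np.linalg.norm(w)           # keep the weight on the unit sphere

# For comparison only: the minor eigenvector of the sample covariance.
vals, vecs = np.linalg.eigh(np.cov(X.T))
v_minor = vecs[:, 0]
print(abs(w @ v_minor))          # close to 1 when the MS is found
```

The explicit renormalization is what keeps this naive sign-flipped rule from diverging; the self-stabilizing MCA algorithms discussed in the book avoid that extra step.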

Chapter 5. Dual Purpose for Principal and Minor Component Analysis

Abstract
The principal subspace (PS) is the subspace spanned by the eigenvectors associated with the principal eigenvalues of the autocorrelation matrix of a high-dimensional vector sequence; the subspace spanned by the eigenvectors associated with the minor eigenvalues is called the minor subspace (MS).
Xiangyu Kong, Changhua Hu, Zhansheng Duan
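The dual-purpose idea can be sketched in averaged (deterministic) form: one update rule in which only the sign of the step selects Rayleigh-quotient ascent (PCA) or descent (MCA). This is a generic illustration under assumed parameters, not a specific algorithm from the chapter:

```python
import numpy as np

def extreme_eigvec(C, sign, eta=0.05, iters=2000, seed=0):
    """Averaged gradient iteration on the Rayleigh quotient of C.

    sign=+1 ascends toward the principal eigenvector;
    sign=-1 descends toward the minor eigenvector.
    """
    rng = np.random.default_rng(seed)
    w = rng.normal(size=C.shape[0])
    w /= np.linalg.norm(w)
    for _ in range(iters):
        w += sign * eta * (C @ w - (w @ C @ w) * w)
        w /= np.linalg.norm(w)
    return w

C = np.array([[3.0, 1.0],
              [1.0, 2.0]])              # hypothetical correlation matrix
vals, vecs = np.linalg.eigh(C)

w_pca = extreme_eigvec(C, sign=+1)      # principal component
w_mca = extreme_eigvec(C, sign=-1)      # minor component
print(abs(w_pca @ vecs[:, -1]), abs(w_mca @ vecs[:, 0]))
```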

Chapter 6. Deterministic Discrete-Time System for the Analysis of Iterative Algorithms

Abstract
The convergence of neural-network-based PCA or MCA learning algorithms is difficult to study and analyze directly. Traditionally, based on the stochastic approximation theorem, their convergence is analyzed indirectly via the corresponding deterministic continuous-time (DCT) systems. However, the stochastic approximation theorem requires that some restrictive conditions be satisfied, e.g., a learning rate that decreases to zero.
Xiangyu Kong, Changhua Hu, Zhansheng Duan
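The DDT approach replaces the stochastic term x·xᵀ in a learning rule by the fixed correlation matrix C and studies the resulting deterministic iteration with a constant learning rate. A minimal sketch for the DDT system of Oja's rule, with a hypothetical C and an assumed step size:

```python
import numpy as np

# DDT system of Oja's rule:
#   w(k+1) = w(k) + eta * (C w(k) - (w(k)^T C w(k)) w(k)),
# iterated with a *constant* learning rate eta.
C = np.array([[3.0, 1.0],
              [1.0, 2.0]])             # hypothetical correlation matrix
eta = 0.05                             # assumed constant learning rate

w = np.array([1.0, -1.0])
w /= np.linalg.norm(w)
for _ in range(500):
    w = w + eta * (C @ w - (w @ C @ w) * w)

vals, vecs = np.linalg.eigh(C)
v1 = vecs[:, -1]                       # principal eigenvector of C
print(abs(w @ v1), np.linalg.norm(w))  # alignment -> 1, norm -> 1
```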

Chapter 7. Generalized Principal Component Analysis

Abstract
Recently, as a powerful feature extraction technique, generalized eigen decomposition (GED) has attracted great attention and has been widely used in many fields, e.g., spectral estimation (Huanqun et al. IEEE Trans Acoust Speech Signal Process 34(2), 272–284, 1986), blind source separation (Chang et al. IEEE Trans Acoust Speech Signal Process 48(3), 900–907, 2000), digital mobile communications (Comon and Golub Proc IEEE 78(8), 1327–1343, 1990), and antenna array processing (Choi et al. IEEE Trans Veh Technol 51(5), 808–816, 2002; Morgan IEEE Trans Commun 51(3), 476–488, 2003).
Xiangyu Kong, Changhua Hu, Zhansheng Duan
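For reference, the batch GED problem A·v = λ·B·v (with B symmetric positive definite) reduces to an ordinary symmetric eigenproblem via a Cholesky factorization of B. A minimal sketch with hypothetical matrices:

```python
import numpy as np

# Generalized eigen decomposition A v = lambda B v, reduced via
#   B = L L^T,  C = L^{-1} A L^{-T},  C u = lambda u,  v = L^{-T} u.
rng = np.random.default_rng(1)
M = rng.normal(size=(4, 4))
A = M + M.T                      # hypothetical symmetric "signal" matrix
N = rng.normal(size=(4, 4))
B = N @ N.T + 4 * np.eye(4)      # hypothetical SPD "noise" matrix

L = np.linalg.cholesky(B)
Linv = np.linalg.inv(L)
Cmat = Linv @ A @ Linv.T         # symmetric, same generalized eigenvalues
lam, U = np.linalg.eigh(Cmat)
V = Linv.T @ U                   # generalized eigenvectors, B-orthonormal

# Check: A V = B V diag(lam) and V^T B V = I.
print(np.allclose(A @ V, B @ V @ np.diag(lam)))
print(np.allclose(V.T @ B @ V, np.eye(4)))
```

The neural GED algorithms in this chapter instead estimate these extreme generalized eigenvectors online, without forming or factoring A and B explicitly.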

Chapter 8. Coupled Principal Component Analysis

Abstract
Among neural-network-based PCA or MCA algorithms, most of those reviewed in the preceding chapters do not use eigenvalue estimates in the update equations of the weights, apart from attempts to control the learning rate based on eigenvalue estimates.
Xiangyu Kong, Changhua Hu, Zhansheng Duan
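A minimal averaged sketch of the coupled idea (a generic illustration, not the book's exact coupled algorithm): the weight vector w and an eigenvalue estimate lam are updated together, and lam also scales the weight step so that convergence speed is less sensitive to the eigenvalue magnitude:

```python
import numpy as np

C = np.array([[5.0, 1.0, 0.0],
              [1.0, 3.0, 1.0],
              [0.0, 1.0, 1.0]])   # hypothetical correlation matrix
eta = 0.05                        # assumed learning rate

rng = np.random.default_rng(0)
w = rng.normal(size=3)
w /= np.linalg.norm(w)
lam = 1.0                         # initial eigenvalue estimate

for _ in range(2000):
    w += (eta / lam) * (C @ w - (w @ C @ w) * w)  # eigenvalue-scaled Oja step
    lam += eta * (w @ C @ w - lam)                # track the Rayleigh quotient

vals, vecs = np.linalg.eigh(C)
print(abs(w @ vecs[:, -1]), lam)  # w -> principal eigenvector, lam -> lambda_1
```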

Chapter 9. Singular Feature Extraction and Its Neural Networks

Abstract
From the preceding chapters, we have seen that, in the wake of the pioneering work of Oja and Sanger, many neural network learning algorithms for PCA have been developed.
Xiangyu Kong, Changhua Hu, Zhansheng Duan
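Singular feature extraction extends these ideas from symmetric eigenproblems to the SVD of a (cross-correlation) matrix. A minimal averaged sketch of a cross-coupled Hebbian-style iteration, with a hypothetical matrix and assumed parameters: u and v are driven toward the top left/right singular vector pair of A.

```python
import numpy as np

A = np.array([[3.0, 0.0],
              [0.0, 1.0],
              [1.0, 1.0]])        # hypothetical cross-correlation matrix
eta = 0.05                        # assumed learning rate

rng = np.random.default_rng(0)
u = rng.normal(size=3); u /= np.linalg.norm(u)
v = rng.normal(size=2); v /= np.linalg.norm(v)

for _ in range(2000):
    s = u @ A @ v                 # running estimate of the singular value
    u_new = u + eta * (A @ v - s * u)
    v_new = v + eta * (A.T @ u - s * v)
    u, v = u_new, v_new

# For comparison only: the batch SVD of A.
U, S, Vt = np.linalg.svd(A)
print(abs(u @ U[:, 0]), abs(v @ Vt[0]), s)
```

At the fixed point, A v = s u and Aᵀ u = s v with unit-norm u and v, i.e., exactly the defining equations of a singular triplet.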