
2020 | Book

A Matrix Algebra Approach to Artificial Intelligence

About this book

Matrix algebra plays an important role in many core artificial intelligence (AI) areas, including machine learning, neural networks, support vector machines (SVMs) and evolutionary computation. This book offers a comprehensive and in-depth discussion of matrix algebra theory and methods for these four core areas of AI, while also approaching AI from a theoretical matrix algebra perspective.

The book consists of two parts: the first discusses the fundamentals of matrix algebra in detail, while the second focuses on the applications of matrix algebra approaches in AI. Highlighting matrix algebra in graph-based learning and embedding, network embedding, convolutional neural networks and Pareto optimization theory, and discussing recent topics and advances, the book offers a valuable resource for scientists, engineers, and graduate students in various disciplines, including, but not limited to, computer science, mathematics and engineering.

Table of Contents

Frontmatter

Introduction to Matrix Algebra

Frontmatter
Chapter 1. Basic Matrix Computation
Abstract
In science and engineering, we often encounter the problem of solving a system of linear equations. Matrices provide the most basic and useful mathematical tool for describing and solving such systems. As an introduction to matrix algebra, this chapter presents the basic operations and properties of matrices, followed by a description of the vectorization of matrices and the matricization of vectors.
Xian-Da Zhang
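As an illustration of the vectorization and matricization operations the chapter describes, here is a minimal NumPy sketch (our own, not from the book; column-major ordering is assumed, as is conventional for the vec operator):

```python
import numpy as np

A = np.array([[1, 2],
              [3, 4]])                    # a 2 x 2 matrix

# vec(A): stack the columns of A into one long vector (column-major order)
vec_A = A.flatten(order="F")              # -> [1, 3, 2, 4]

# matricization (unvec): fold the vector back into a 2 x 2 matrix
A_back = vec_A.reshape(2, 2, order="F")
assert np.array_equal(A_back, A)
```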
Chapter 2. Matrix Differential
Abstract
The matrix differential is a generalization of the differential of a multivariate function. The matrix differential (including the matrix partial derivative and gradient) is an important tool in matrix algebra and in optimization for machine learning, neural networks, support vector machines, and evolutionary computation. This chapter is concerned with the theory and methods of the matrix differential.
Xian-Da Zhang
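For instance, a standard identity from matrix differential calculus is ∇ₓ(xᵀAx) = (A + Aᵀ)x; a small NumPy check of this identity against a finite-difference approximation (an illustrative sketch of ours, not taken from the book):

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((3, 3))
x = rng.standard_normal(3)

analytic = (A + A.T) @ x    # gradient of f(x) = x^T A x

# central finite-difference approximation of the same gradient
eps = 1e-6
numeric = np.array([
    ((x + eps * e) @ A @ (x + eps * e)
     - (x - eps * e) @ A @ (x - eps * e)) / (2 * eps)
    for e in np.eye(3)
])

assert np.allclose(analytic, numeric, atol=1e-5)
```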
Chapter 3. Gradient and Optimization
Abstract
Many machine learning methods involve solving complex optimization problems, e.g., neural networks, support vector machines, and evolutionary computation. Optimization theory mainly considers (1) the existence conditions for an extremum (gradient analysis) and (2) the design of optimization algorithms and their convergence analysis. This chapter focuses on convex optimization theory and methods, covering gradient/subgradient methods for smooth and nonsmooth convex optimization as well as constrained convex optimization.
Xian-Da Zhang
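As a minimal sketch of the gradient method on a smooth convex problem, consider minimizing f(x) = ½‖Ax − b‖² with a fixed step size (made-up data and our own variable names; the step size 1/L uses the Lipschitz constant L = ‖A‖₂² of the gradient):

```python
import numpy as np

rng = np.random.default_rng(1)
A = rng.standard_normal((20, 5))
b = rng.standard_normal(20)

x = np.zeros(5)
step = 1.0 / np.linalg.norm(A, 2) ** 2    # step <= 1/L guarantees convergence

for _ in range(5000):
    grad = A.T @ (A @ x - b)              # gradient of 0.5 * ||Ax - b||^2
    x = x - step * grad

# compare with the closed-form least-squares solution
x_star, *_ = np.linalg.lstsq(A, b, rcond=None)
assert np.allclose(x, x_star, atol=1e-6)
```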
Chapter 4. Solution of Linear Systems
Abstract
One of the most common problems in science and engineering is to solve the matrix equation Ax = b for the unknown vector x when the data matrix A and the data vector b are given.
Xian-Da Zhang
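For a square nonsingular A this is a one-line NumPy call; for an over-determined system the least-squares solution applies instead. A minimal sketch with made-up data:

```python
import numpy as np

# square, nonsingular system: exact solution
A = np.array([[3.0, 1.0],
              [1.0, 2.0]])
b = np.array([9.0, 8.0])
x = np.linalg.solve(A, b)                 # x = [2., 3.]
assert np.allclose(A @ x, b)

# over-determined system (more equations than unknowns): least-squares solution
A_tall = np.array([[1.0, 1.0],
                   [1.0, 2.0],
                   [1.0, 3.0]])
b_tall = np.array([1.0, 2.0, 2.0])
x_ls, residuals, rank, sv = np.linalg.lstsq(A_tall, b_tall, rcond=None)
```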
Chapter 5. Eigenvalue Decomposition
Abstract
This chapter is devoted to another core subject of matrix algebra: the eigenvalue decomposition (EVD) of matrices, including various generalizations of EVD such as the generalized eigenvalue decomposition, the Rayleigh quotient, and the generalized Rayleigh quotient.
Xian-Da Zhang
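A small NumPy illustration of the EVD of a symmetric matrix and the Rayleigh quotient R(x) = (xᵀAx)/(xᵀx), which equals the corresponding eigenvalue when x is an eigenvector (our own sketch, not from the book):

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 3.0]])                # symmetric matrix

# eigenvalue decomposition A = U diag(w) U^T
w, U = np.linalg.eigh(A)                  # eigh: EVD for symmetric/Hermitian matrices
assert np.allclose(U @ np.diag(w) @ U.T, A)

def rayleigh(A, x):
    """Rayleigh quotient R(x) = (x^T A x) / (x^T x)."""
    return (x @ A @ x) / (x @ x)

# at an eigenvector, the Rayleigh quotient equals the corresponding eigenvalue
assert np.isclose(rayleigh(A, U[:, -1]), w[-1])   # largest eigenvalue
```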

Artificial Intelligence

Frontmatter
Chapter 6. Machine Learning
Abstract
Machine learning is a subset of artificial intelligence. This chapter first presents a machine learning tree, and then focuses on matrix algebra methods in machine learning, including single-objective optimization, feature selection, principal component analysis, and canonical correlation analysis, together with supervised, unsupervised, semi-supervised, and active learning. More importantly, this chapter highlights selected topics and advances in machine learning: graph machine learning, reinforcement learning, Q-learning, and transfer learning.
Xian-Da Zhang
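Among the methods listed, principal component analysis is perhaps the most directly matrix-algebraic: it reduces to the EVD of the sample covariance matrix. A minimal sketch (synthetic data and our own variable names, not the book's code):

```python
import numpy as np

rng = np.random.default_rng(2)
scales = np.diag([3.0, 1.0, 0.1])         # give the 3 features very different variances
X = rng.standard_normal((100, 3)) @ scales   # 100 samples, 3 features

# PCA via eigenvalue decomposition of the sample covariance matrix
Xc = X - X.mean(axis=0)                   # center the data
C = Xc.T @ Xc / (len(X) - 1)              # sample covariance matrix
w, V = np.linalg.eigh(C)                  # eigenvalues in ascending order

# project onto the two leading principal components
W = V[:, ::-1][:, :2]                     # top-2 eigenvectors
Z = Xc @ W                                # reduced 100 x 2 representation
```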
Chapter 7. Neural Networks
Abstract
Neural networks can be viewed as a kind of cognitive intelligence. This chapter first presents a neural network tree, and then deals with the optimization problem in neural networks, activation functions, and basic neural networks from the perspective of matrix algebra. The chapter then highlights selected topics and advances in neural networks: convolutional neural networks (CNNs), dropout learning, autoencoders, extreme learning machines (ELMs), graph embedding, manifold learning, network embedding, graph neural networks (GNNs), batch normalization networks, and generative adversarial networks (GANs).
Xian-Da Zhang
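In matrix-algebra terms, the forward pass of a basic fully connected layer is simply an affine map followed by an elementwise activation; a minimal sketch (the shapes and initialization are our own illustrative choices):

```python
import numpy as np

rng = np.random.default_rng(3)
X = rng.standard_normal((4, 8))           # batch of 4 inputs, 8 features each

# one fully connected layer: 8 inputs -> 5 hidden units
W = rng.standard_normal((8, 5)) * 0.1     # weight matrix
b = np.zeros(5)                           # bias vector

def relu(z):
    """ReLU activation, applied elementwise."""
    return np.maximum(z, 0.0)

H = relu(X @ W + b)                       # hidden activations, shape (4, 5)
```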
Chapter 8. Support Vector Machines
Abstract
Supervised regression/classification methods learn a model of the relation between the target vectors \(\{y_i \}_{i=1}^N\) and the corresponding input vectors \(\{{\mathbf {x}}_i\}_{i=1}^N\) consisting of N training samples, and utilize this model to predict/classify target values for previously unseen inputs.
Xian-Da Zhang
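As a minimal classification sketch in this notation, using scikit-learn's SVC on a toy data set of our own making (not an example from the book):

```python
import numpy as np
from sklearn.svm import SVC

# toy training set: N = 8 input vectors x_i with binary targets y_i
X = np.array([[0.0, 0.0], [0.2, 0.1], [0.1, 0.3], [0.3, 0.2],   # class 0
              [1.0, 1.0], [0.9, 1.2], [1.1, 0.8], [1.2, 1.1]])  # class 1
y = np.array([0, 0, 0, 0, 1, 1, 1, 1])

clf = SVC(kernel="rbf")                   # RBF-kernel support vector classifier
clf.fit(X, y)

# predict the classes of previously unseen inputs
print(clf.predict([[0.15, 0.15], [1.05, 0.95]]))   # expected: [0 1]
```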
Chapter 9. Evolutionary Computation
Abstract
From the perspective of artificial intelligence, evolutionary computation belongs to computational intelligence. The origins of evolutionary computation can be traced back to the late 1950s (see, e.g., the influential works (Friedberg, IBM J 2(1):2–13, 1958; Friedberg et al., IBM J 3(7):282–287, 1959; Box, Appl Stat VI(2):81–101, 1957; Bremermann, Optimization through evolution and recombination. In: Yovits MC et al (eds) Self-organizing systems. Spartan, Washington, 1962)), and the field started to receive significant attention during the 1970s (see, e.g., Fogel (Ind Res 4:14–19, 1962); Holland (J Assoc Comput Mach 3:297–314, 1962); Rechenberg (Cybernetic solution path of an experimental problem. Royal Aircraft Establishment, Library translation No. 1122, Farnborough, Hants, 1965)).
Xian-Da Zhang
Backmatter
Metadata
Title
A Matrix Algebra Approach to Artificial Intelligence
Author
Prof. Xian-Da Zhang
Copyright Year
2020
Publisher
Springer Singapore
Electronic ISBN
978-981-15-2770-8
Print ISBN
978-981-15-2769-2
DOI
https://doi.org/10.1007/978-981-15-2770-8
