
2024 | Book

Linear Algebra in Data Science


About this Book

This textbook explores applications of linear algebra in data science at an introductory level, showing readers how the two are deeply connected. The authors accomplish this by offering exercises that escalate in complexity, many of which incorporate MATLAB. Practice projects are also included to help students understand the real-world applications of the material covered in a standard linear algebra course. Topics covered include singular value decomposition, convolution, frequency filtering, and neural networks. Linear Algebra in Data Science is suitable as a supplement to a standard linear algebra course.

Table of Contents

Frontmatter
1. Introduction
Abstract
In the introduction we briefly motivate the value of matrices and the definitions of matrix addition and multiplication.
Peter Zizler, Roberta La Haye
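The definitions motivated in the introduction can be illustrated in a few lines. This is a minimal NumPy sketch (the book's exercises use MATLAB; the matrices here are our own small examples, not from the text):

```python
import numpy as np

A = np.array([[1, 2], [3, 4]])
B = np.array([[5, 6], [7, 8]])

# Matrix addition is entrywise.
S = A + B  # [[6, 8], [10, 12]]

# Matrix multiplication: entry (i, j) is the dot product of
# row i of A with column j of B.
P = A @ B  # [[19, 22], [43, 50]]
```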
2. Projections
Abstract
The concept of projection is a fundamental tool in data science; it is the idea behind data reduction and compression. We provide a careful introduction to projections and projection matrices; in particular, we emphasize the foundational rank-one projection. We provide matrix expressions for projections onto subspaces, with applications to correlation and partial correlation, linear regression and partial linear regression.
Peter Zizler, Roberta La Haye
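The two projection constructions named in the abstract, the rank-one projection onto a line and the projection onto a column space, can be sketched in NumPy (an illustration under our own small example, not the authors' code). The link to regression is that the fitted values are exactly the projection of the response onto the column space of the design matrix:

```python
import numpy as np

# Rank-one projection onto the line spanned by a unit vector u:
# P1 = u u^T.
u = np.array([1.0, 2.0, 2.0])
u = u / np.linalg.norm(u)
P1 = np.outer(u, u)

# Projection onto the column space of A: P = A (A^T A)^{-1} A^T.
A = np.array([[1.0, 0.0], [1.0, 1.0], [1.0, 2.0]])
P = A @ np.linalg.inv(A.T @ A) @ A.T

# Least-squares regression: the fitted values y_hat are the
# projection of y onto the column space of the design matrix A,
# so the residual y - y_hat is orthogonal to every column of A.
y = np.array([1.0, 2.0, 2.0])
y_hat = P @ y
```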
3. Matrix Algebra
Abstract
The need to understand matrix algebra in the context of its applications is paramount. Rank-one projections are the building blocks, and they provide a deeper understanding of matrix algebra. We provide worked examples of matrix decompositions and some applications.
Peter Zizler, Roberta La Haye
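One way rank-one projections act as building blocks is the spectral decomposition: a symmetric matrix is a weighted sum of rank-one projections onto its orthonormal eigenvectors. A short NumPy check (our own example matrix, as an illustration):

```python
import numpy as np

# Spectral decomposition: A = sum_i lam_i v_i v_i^T, where the v_i
# are orthonormal eigenvectors and each v_i v_i^T is a rank-one
# projection.
A = np.array([[2.0, 1.0], [1.0, 2.0]])
lam, V = np.linalg.eigh(A)

rebuilt = sum(lam[i] * np.outer(V[:, i], V[:, i]) for i in range(len(lam)))
```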
4. Rotations and Quaternions
Abstract
Rotations and quaternions are not data science concepts per se; however, they are an excellent vehicle for examining the utility of matrix transformations. The normed division algebra of quaternions is introduced, motivated by rotations in three dimensions. The Euler–Rodrigues formula is carefully presented and derived with worked examples, and connections are made to Euler angles. Octonions are introduced as well in a similar context.
Peter Zizler, Roberta La Haye
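The Euler–Rodrigues formula expresses a 3D rotation matrix through the four quaternion parameters (a, b, c, d) built from the rotation angle and a unit axis. A NumPy sketch (the function name `euler_rodrigues` is ours, for illustration):

```python
import numpy as np

def euler_rodrigues(axis, theta):
    """Rotation matrix for angle theta about a unit axis, via the
    Euler-Rodrigues parameters a = cos(theta/2),
    (b, c, d) = sin(theta/2) * axis."""
    axis = np.asarray(axis, dtype=float)
    axis = axis / np.linalg.norm(axis)
    a = np.cos(theta / 2.0)
    b, c, d = np.sin(theta / 2.0) * axis
    return np.array([
        [a*a + b*b - c*c - d*d, 2*(b*c - a*d),         2*(b*d + a*c)],
        [2*(b*c + a*d),         a*a - b*b + c*c - d*d, 2*(c*d - a*b)],
        [2*(b*d - a*c),         2*(c*d + a*b),         a*a - b*b - c*c + d*d],
    ])

# A quarter turn about the z-axis sends e_x to e_y.
R = euler_rodrigues([0, 0, 1], np.pi / 2)
```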
5. Haar Wavelets
Abstract
An important example of an orthonormal basis is the Haar basis. We provide an introduction to Haar wavelets using matrix algebra, involving projections and Kronecker products of matrices. We develop this basis with Haar wavelet decomposition techniques in mind.
Peter Zizler, Roberta La Haye
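A Kronecker-product construction of a Haar-type orthonormal matrix can be sketched as follows. This is one standard recursive construction, written by us as an illustration, not necessarily the exact ordering used in the book:

```python
import numpy as np

def haar_matrix(n):
    """Orthonormal Haar-type matrix of size n x n (n a power of 2),
    built recursively with Kronecker products."""
    if n == 1:
        return np.array([[1.0]])
    h = haar_matrix(n // 2)
    top = np.kron(h, [1.0, 1.0])                    # coarse (averaging) rows
    bottom = np.kron(np.eye(n // 2), [1.0, -1.0])   # detail (differencing) rows
    H = np.vstack([top, bottom])
    # Normalize each row to unit length, giving an orthonormal basis.
    return H / np.linalg.norm(H, axis=1, keepdims=True)

H4 = haar_matrix(4)
```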
6. Singular Value Decomposition
Abstract
Singular value decomposition is a fundamental result in linear algebra that has far-reaching applications in applied mathematics, statistics, and data science. The readily available proof of this theorem, presented in linear algebra classes, might hide the tremendous depth of this result and its usefulness in mathematical sciences. We present an intuitive exposition of this result along with some worked examples of relevant applications.
Peter Zizler, Roberta La Haye
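The data-science payoff of the SVD is low-rank approximation: truncating to the k largest singular values gives the best rank-k approximation in the spectral norm (Eckart–Young). A NumPy sketch with our own random test matrix:

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((6, 4))

# Thin SVD: A = U diag(s) V^T with orthonormal columns in U, V and
# singular values s sorted in decreasing order.
U, s, Vt = np.linalg.svd(A, full_matrices=False)

# Best rank-k approximation: keep only the k largest singular
# values and the corresponding singular vectors.
k = 2
A_k = (U[:, :k] * s[:k]) @ Vt[:k, :]
```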
7. Convolution
Abstract
Convolution and the resulting circulant matrices are fundamental tools in signal processing and data analysis. They are a natural entry point for complex numbers in data science, motivating the introduction of complex exponentials as a basis. We provide a detailed exposition of the diagonalization of circulant matrices over complex eigenspaces. Frequency filtering is one of the main motivating examples.
Peter Zizler, Roberta La Haye
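The diagonalization result can be checked numerically: every circulant matrix has the complex exponentials as eigenvectors, with eigenvalues given by the DFT of its first column. A NumPy sketch with our own example column:

```python
import numpy as np

# A circulant matrix is determined by its first column; each later
# column is a cyclic shift of the previous one.
c = np.array([4.0, 1.0, 0.0, 1.0])
n = len(c)
C = np.column_stack([np.roll(c, k) for k in range(n)])

# The DFT diagonalizes every circulant matrix: the eigenvalues are
# the DFT of the first column, and the eigenvectors are the complex
# exponentials f_k[j] = exp(2*pi*i*j*k/n).
eigvals = np.fft.fft(c)
k = 1
f_k = np.exp(2j * np.pi * k * np.arange(n) / n)
```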
8. Frequency Filtering
Abstract
Frequency filtering is a vast area in applied mathematics and electrical engineering. We introduce the foundations of Fourier analysis and touch on the discrete cosine transform and the Hilbert transform. The Z-transform and transfer functions are explained in this context. We develop the main ideas behind the fast Fourier transform, leading to concepts such as subsampling and aliasing, which are carefully explained using worked examples.
Peter Zizler, Roberta La Haye
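The basic frequency-filtering recipe is: transform the signal, zero the unwanted frequency bins, and transform back. A NumPy sketch with our own synthetic two-tone signal (a low-pass filter removing the high-frequency component exactly, since both tones fall on DFT bins):

```python
import numpy as np

n = 64
t = np.arange(n) / n
# Two pure tones: 3 and 20 cycles per record.
signal = np.sin(2 * np.pi * 3 * t) + 0.5 * np.sin(2 * np.pi * 20 * t)

spectrum = np.fft.fft(signal)
freqs = np.fft.fftfreq(n, d=1.0 / n)  # integer frequencies in cycles/record

# Low-pass filter: zero every bin at or above 10 cycles per record,
# then invert the transform.
spectrum[np.abs(freqs) >= 10] = 0.0
filtered = np.fft.ifft(spectrum).real
```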
9. Neural Networks
Abstract
Neural networks and machine learning are prominent applications of linear algebra, real analysis, and advances in computing. We present the foundations of the backpropagation technique for neural network layers, with straightforward derivations and worked examples using partial derivatives, and simple examples with detailed exposition along the way.
Peter Zizler, Roberta La Haye
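The chain-rule computation at the heart of backpropagation can be sketched for a single sigmoid unit with squared-error loss, and validated against a finite-difference estimate. All the names and numbers below are our own illustration, not the book's worked examples:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# One unit: y_hat = sigmoid(w . x + b), loss L = (1/2)(y_hat - y)^2.
rng = np.random.default_rng(1)
x = rng.standard_normal(3)
w = rng.standard_normal(3)
b = 0.1
y = 1.0

# Forward pass.
z = w @ x + b
y_hat = sigmoid(z)
loss = 0.5 * (y_hat - y) ** 2

# Backward pass via the chain rule:
# dL/dw = (y_hat - y) * sigmoid'(z) * x, with sigmoid'(z) = y_hat (1 - y_hat).
delta = (y_hat - y) * y_hat * (1 - y_hat)
grad_w = delta * x
grad_b = delta

# Finite-difference check of one partial derivative.
eps = 1e-6
w_pert = w.copy()
w_pert[0] += eps
loss_pert = 0.5 * (sigmoid(w_pert @ x + b) - y) ** 2
numeric = (loss_pert - loss) / eps
```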
10. Some Wavelet Transforms
Abstract
The Haar wavelet transform is a foundational technique and a first example of wavelet methods for decomposing data into its coarse and detail components. We present the foundations of the algorithm, with worked examples in one- and two-dimensional settings. We introduce computationally viable lifting schemes in this context, along with similar considerations for the orthogonal Daubechies wavelets.
Peter Zizler, Roberta La Haye
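One level of the Haar decomposition splits a signal into coarse averages and detail differences of adjacent pairs, and the split is exactly invertible. A NumPy sketch (function names and the sample signal are ours):

```python
import numpy as np

def haar_step(x):
    """One level of the Haar wavelet transform: coarse averages and
    detail differences of adjacent sample pairs."""
    x = np.asarray(x, dtype=float)
    coarse = (x[0::2] + x[1::2]) / np.sqrt(2.0)
    detail = (x[0::2] - x[1::2]) / np.sqrt(2.0)
    return coarse, detail

def haar_step_inverse(coarse, detail):
    """Exact inverse of haar_step."""
    x = np.empty(2 * len(coarse))
    x[0::2] = (coarse + detail) / np.sqrt(2.0)
    x[1::2] = (coarse - detail) / np.sqrt(2.0)
    return x

# A constant pair (5, 5) produces a zero detail coefficient.
x = np.array([4.0, 2.0, 5.0, 5.0])
c, d = haar_step(x)
```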
Backmatter
Metadata
Title
Linear Algebra in Data Science
Authors
Peter Zizler
Roberta La Haye
Copyright Year
2024
Electronic ISBN
978-3-031-54908-3
Print ISBN
978-3-031-54907-6
DOI
https://doi.org/10.1007/978-3-031-54908-3
