
About this book

Linear Algebra and Linear Models provides a concise and rigorous introduction to the linear algebra required for statistics, followed by the basic aspects of the theory of linear estimation and hypothesis testing. The emphasis is on the approach using generalized inverses. Topics such as the multivariate normal distribution and the distribution of quadratic forms are included.

For this third edition, the material has been reorganised to develop the linear algebra in the first six chapters, to serve as a first course on linear algebra that is especially suitable for students of statistics or for those looking for a matrix theoretic approach to the subject. Other key features include:

coverage of topics such as rank additivity, inequalities for eigenvalues and singular values;

a new chapter on linear mixed models;

over seventy additional problems on rank: the matrix rank is an important and rich topic with connections to many aspects of linear algebra such as generalized inverses, idempotent matrices and partitioned matrices.

This text is aimed primarily at advanced undergraduate and first-year graduate students taking courses in linear algebra, linear models, multivariate analysis and design of experiments. A wealth of exercises, complete with hints and solutions, help to consolidate understanding. Researchers in mathematics and statistics will also find the book a useful source of results and problems.

Table of Contents

Frontmatter

Chapter 1. Vector Spaces and Subspaces

Abstract
Preliminaries concerning matrices and matrix operations are reviewed. Properties of determinant are recalled without proof. Vector spaces, linear independence, basis and dimension are introduced. It is shown that any two bases of a vector space have the same cardinality and that two vector spaces are isomorphic if and only if they have the same dimension. Extension of a linearly independent set to a basis is considered.
R. B. Bapat

Chapter 2. Rank, Inner Product and Nonsingularity

Abstract
It is shown that the column rank of a matrix equals its row rank. Basic properties of the rank of a product and of a sum are proved. Inner product is defined and the Gram–Schmidt process is described. Orthogonal projection of a vector and orthogonal complement of a subspace are introduced. The rank plus nullity theorem is proved. Various criteria for nonsingularity are given. The Frobenius inequality for rank is proved.
R. B. Bapat
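The Gram–Schmidt process mentioned in this abstract can be sketched numerically. The following is a minimal illustration (not taken from the book), assuming the input columns are linearly independent:

```python
import numpy as np

def gram_schmidt(X):
    """Orthonormalize the columns of X (assumed linearly independent)
    by the classical Gram-Schmidt process."""
    Q = np.zeros_like(X, dtype=float)
    for j in range(X.shape[1]):
        v = X[:, j].astype(float)
        for i in range(j):
            # subtract the projection of the j-th column on earlier q's
            v = v - (Q[:, i] @ X[:, j]) * Q[:, i]
        Q[:, j] = v / np.linalg.norm(v)
    return Q

X = np.array([[1.0, 1.0], [0.0, 1.0], [1.0, 2.0]])
Q = gram_schmidt(X)
# The columns of Q are orthonormal: Q^T Q = I
```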

Chapter 3. Eigenvalues and Positive Definite Matrices

Abstract
Eigenvalues and eigenvectors are defined. The Spectral Theorem for symmetric matrices is proved. Positive definite matrices are introduced. Idempotent matrices are defined and it is shown that for such matrices the rank equals the trace. Schur complement is defined and its application to proving characterizations of positive definite matrices is shown.
R. B. Bapat
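The rank-equals-trace property of idempotent matrices stated in this abstract is easy to verify numerically; here is a small sketch (not from the book) using an orthogonal projection matrix, which is idempotent:

```python
import numpy as np

# An orthogonal projection matrix is idempotent (P @ P = P);
# for idempotent matrices the rank equals the trace.
X = np.array([[1.0, 0.0], [1.0, 1.0], [1.0, 2.0]])
P = X @ np.linalg.inv(X.T @ X) @ X.T  # projection onto the column space of X

assert np.allclose(P @ P, P)          # idempotent
rank = np.linalg.matrix_rank(P)
trace = np.trace(P)
# rank and trace both equal 2, the dimension of the column space of X
```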

Chapter 4. Generalized Inverses

Abstract
Basic properties of generalized inverses are proved. A computational procedure is given. It is shown that a generalized inverse is reflexive if and only if it has the same rank as the matrix. Least squares and minimum norm inverses are defined and their characterization in terms of linear equations is provided. The Moore–Penrose inverse is introduced and its existence and uniqueness are established.
R. B. Bapat
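The uniqueness of the Moore–Penrose inverse mentioned above rests on the four Penrose conditions. A quick numerical check (not from the book, using NumPy's `pinv`) for a singular matrix:

```python
import numpy as np

A = np.array([[1.0, 2.0], [2.0, 4.0], [3.0, 6.0]])  # rank 1, so no ordinary inverse
G = np.linalg.pinv(A)                               # Moore-Penrose inverse

# The four Penrose conditions characterize G uniquely:
assert np.allclose(A @ G @ A, A)      # (1) AGA = A  (G is a generalized inverse)
assert np.allclose(G @ A @ G, G)      # (2) GAG = G  (G is reflexive)
assert np.allclose((A @ G).T, A @ G)  # (3) AG is symmetric
assert np.allclose((G @ A).T, G @ A)  # (4) GA is symmetric
```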

Chapter 5. Inequalities for Eigenvalues and Singular Values

Abstract
Extremal representations for the maximum and minimum eigenvalues of a symmetric matrix are proved. Singular values are defined and the Singular Value Decomposition is obtained. The Courant–Fischer Minimax Theorem, the Cauchy Interlacing Principle and majorization of the diagonal elements by the eigenvalues of a symmetric matrix are proved. The volume of a matrix is defined as the product of the nonzero singular values, equivalently the positive square root of the product of the nonzero eigenvalues of \(A^{T}A\). Some basic properties of volume are proved. Minimality properties of the Moore–Penrose inverse involving singular values are established.
R. B. Bapat
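The extremal representation of the maximum eigenvalue referred to in this abstract says that for a symmetric matrix \(A\), \(\lambda_{\max} = \max_{x \neq 0} x^{T}Ax / x^{T}x\). A small numerical sketch of this (not from the book):

```python
import numpy as np

rng = np.random.default_rng(0)
A = np.array([[2.0, 1.0], [1.0, 3.0]])   # symmetric; eigenvalues (5 +/- sqrt(5))/2
lam_max = np.linalg.eigvalsh(A)[-1]      # eigvalsh returns eigenvalues in ascending order

# Extremal representation: lambda_max is the maximum Rayleigh quotient x'Ax / x'x.
vals = []
for _ in range(1000):
    x = rng.standard_normal(2)
    vals.append(x @ A @ x / (x @ x))
# No Rayleigh quotient exceeds lambda_max:
assert max(vals) <= lam_max + 1e-9
```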

Chapter 6. Rank Additivity and Matrix Partial Orders

Abstract
If A, B are m×n matrices then \(\operatorname{rank} B \le \operatorname{rank} A + \operatorname{rank}(B-A)\). When does equality hold in this inequality? This question is related to many notions such as g-inverses, minus partial order and parallel sum. The phenomenon is known as rank additivity. We first prove a characterization result which brings together several conditions equivalent to \(\operatorname{rank} B = \operatorname{rank} A + \operatorname{rank}(B-A)\). We then introduce the star order, a partial order on matrices which is a refinement of the minus order. Basic properties of the star order bringing out its relation with the Moore–Penrose inverse are proved.
R. B. Bapat
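The inequality and the equality case discussed in this abstract can be seen in a two-line numerical example. The matrices below are illustrative choices, not taken from the book:

```python
import numpy as np

# Rank subadditivity: rank(B) <= rank(A) + rank(B - A).
# Equality ("rank additivity") holds in this example.
A = np.array([[1.0, 0.0], [0.0, 0.0]])
B = np.array([[1.0, 0.0], [0.0, 1.0]])
rA = np.linalg.matrix_rank(A)        # 1
rB = np.linalg.matrix_rank(B)        # 2
rBA = np.linalg.matrix_rank(B - A)   # 1

assert rB <= rA + rBA
assert rB == rA + rBA                # rank additivity holds here
```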

Chapter 7. Linear Estimation

Abstract
Random vectors, their expectation and dispersion matrix are reviewed. The concept of estimability in a linear model is introduced and the form of the best linear unbiased estimate of an estimable function is derived. The full rank case of the result, which is the Gauss–Markov Theorem, is stated. The Hadamard inequality for the determinant of a positive semidefinite matrix is proved and its application to weighing designs is discussed. An unbiased estimate for the error variance in terms of the residual sum of squares is obtained. The special case of one-way classification is described in greater detail. A general linear model with a possibly singular variance-covariance matrix is considered and the best linear unbiased estimate of an estimable function as well as an estimate of the error variance are obtained.
R. B. Bapat
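The form of the best linear unbiased estimate mentioned above reduces, in the full-rank case of the Gauss–Markov Theorem, to the least squares estimate \(\hat\beta = (X^{T}X)^{-1}X^{T}y\). A minimal sketch (not from the book), written with a generalized inverse in keeping with the book's emphasis, and using noiseless data so the estimate recovers the true coefficients exactly:

```python
import numpy as np

# Linear model y = X @ beta + error; here error = 0 so beta_hat == beta.
X = np.column_stack([np.ones(10), np.arange(10.0)])  # intercept and slope design
beta = np.array([2.0, 0.5])
y = X @ beta

# Least squares estimate via a generalized inverse of X'X;
# pinv also covers the rank-deficient case for estimable functions.
beta_hat = np.linalg.pinv(X.T @ X) @ X.T @ y
assert np.allclose(beta_hat, beta)
```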

Chapter 8. Tests of Linear Hypotheses

Abstract
Basic properties of the multivariate normal distribution are proved. The independence of quadratic forms and the independence of a quadratic form and a linear form in a multivariate normal are characterized. The matrix version of Cochran's Theorem is formulated with a succinct proof. The results are applied to the one-way and two-way classification, with and without interaction. The general linear hypothesis is considered and the corresponding F-test is developed. Maximum likelihood estimates of the parameters in a linear model are obtained. The multiple correlation coefficient is defined and its relation with the F-statistic for the significance of the regression coefficients is proved.
R. B. Bapat

Chapter 9. Linear Mixed Models

Abstract
Fixed effects and random effects models are introduced with examples. Maximum likelihood (ML), restricted maximum likelihood (REML) and ANOVA methods of estimation of variance components are described, illustrating with the examples of one-way and two-way classification. Prediction of random effects is considered and the estimated best linear unbiased predictor (EBLUP) or the shrinkage estimator is developed.
R. B. Bapat

Chapter 10. Miscellaneous Topics

Abstract
Principal components and canonical correlations are discussed briefly, illustrating the fact that these are important applications of eigenvalues and singular values. Reduced normal equations are described and applied to the case of design of experiments model. Properties of the C-matrix are discussed. The E-, A- and D-optimality are defined and the balanced incomplete block design (BIBD) is shown to be optimal according to these criteria.
R. B. Bapat
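The link between principal components and eigenvalues noted in this abstract can be illustrated in a few lines; this sketch (not from the book) checks that the variance of the data projected on the first principal component equals the largest eigenvalue of the sample covariance matrix:

```python
import numpy as np

rng = np.random.default_rng(2)
# Data with more spread along the first coordinate
Z = rng.standard_normal((200, 2)) @ np.array([[3.0, 0.0], [0.0, 1.0]])

S = np.cov(Z, rowvar=False)          # sample covariance matrix
evals, evecs = np.linalg.eigh(S)     # ascending eigenvalues
pc1 = evecs[:, -1]                   # first principal component direction

# Variance along pc1 equals the largest eigenvalue of S.
assert np.isclose(np.var(Z @ pc1, ddof=1), evals[-1])
```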

Chapter 11. Additional Exercises on Rank

Abstract
The matrix rank is a very important and rich topic with connections to many aspects of linear algebra such as generalized inverses, idempotent matrices, partitioned matrices and positive semidefinite matrices. We collect more than seventy-five problems on rank which are generally not found in linear algebra texts. Complete solutions or hints are provided.
R. B. Bapat

Chapter 12. Hints and Solutions to Selected Exercises

Abstract
Solutions or hints, which sketch the solution in many cases, are provided to several exercises in the book. In particular, solutions or hints are provided to all the exercises in Chap. 11 consisting of additional exercises on rank.
R. B. Bapat

Chapter 13. Notes

Abstract
References are given, consisting of books for further reading and, in the case of results not normally covered in other texts, original papers. Brief comments about some of the results are included.
R. B. Bapat

Backmatter
