
About this Book

Based on lectures given at Claremont McKenna College, this text constitutes a substantial, abstract introduction to linear algebra. The presentation emphasizes the structural elements over the computational, for example by connecting matrices to linear transformations from the outset, and prepares the student for further study of abstract mathematics. Uniquely among algebra texts at this level, it introduces group theory early in the discussion, as an example of the rigorous development of informal axiomatic systems.

Table of Contents

1. Sets and Functions

Abstract
We begin with an elementary review of the language of functions and, more importantly, the classification of functions according to purely set-theoretic criteria. We assume that the reader has seen the formal definition of a function elsewhere; in any case, we shall need nothing beyond the provisional definition implicit in the following paragraph.
Robert J. Valenza
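As an illustration of the set-theoretic classification of functions this chapter reviews, here is a minimal Python sketch (not from the book; all names are invented for the example) that tests injectivity, surjectivity, and bijectivity by brute force on finite sets:

```python
# Classify a function on finite sets according to the purely
# set-theoretic criteria: injective, surjective, bijective.

def is_injective(f, domain):
    images = [f(x) for x in domain]
    return len(images) == len(set(images))   # no two inputs share an image

def is_surjective(f, domain, codomain):
    return set(f(x) for x in domain) == set(codomain)

def is_bijective(f, domain, codomain):
    return is_injective(f, domain) and is_surjective(f, domain, codomain)

# x -> 2x mod 5 is a bijection on Z/5, while x -> x^2 mod 5 is not injective
# (1 and 4 both map to 1).
Z5 = range(5)
assert is_bijective(lambda x: (2 * x) % 5, Z5, Z5)
assert not is_injective(lambda x: (x * x) % 5, Z5)
```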

2. Groups and Group Homomorphisms

Abstract
This chapter introduces the notion of an abstract group, one of the fundamental objects of both algebra and geometry and very much at the heart of linear algebra. In the early stages, the student will perhaps see only a rather arbitrary-looking (but attractive!) informal axiomatic system. This is a gross deception. The definition distills millennia of mathematical experience. Another theme also emerges: objects are not nearly so interesting in themselves as the relationships they bear to one another. In the case of groups, these relationships are expressed by group homomorphisms.
Robert J. Valenza
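A concrete instance of the chapter's two central notions can be checked mechanically; the following Python sketch (an illustration assumed for this summary, not taken from the text) verifies the group axioms for Z/6 under addition and confirms that reduction mod 3 is a homomorphism into Z/3:

```python
# Verify the group axioms by brute force on a finite group, then check
# the homomorphism condition phi(a + b) = phi(a) + phi(b).

def check_group_axioms(elems, op, e):
    elems = list(elems)
    assert all(op(a, b) in elems for a in elems for b in elems)      # closure
    assert all(op(op(a, b), c) == op(a, op(b, c))
               for a in elems for b in elems for c in elems)         # associativity
    assert all(op(e, a) == a and op(a, e) == a for a in elems)       # identity
    assert all(any(op(a, b) == e for b in elems) for a in elems)     # inverses

add_mod6 = lambda a, b: (a + b) % 6
add_mod3 = lambda a, b: (a + b) % 3
check_group_axioms(range(6), add_mod6, 0)

# phi(x) = x mod 3 is a group homomorphism (Z/6, +) -> (Z/3, +):
phi = lambda x: x % 3
assert all(phi(add_mod6(a, b)) == add_mod3(phi(a), phi(b))
           for a in range(6) for b in range(6))
```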

3. Vector Spaces and Linear Transformations

Abstract
Building on our work with groups and group homomorphisms, we now define vector spaces and linear transformations. Again the axioms may at first look arbitrary, but as we shall see in subsequent chapters, they are a masterpiece of abstraction—general enough to admit a vast range of diverse particular instances, but restrictive enough to capture the fundamental geometric notion of dimension.
Robert J. Valenza
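The linearity axioms for a transformation, T(u + v) = T(u) + T(v) and T(cu) = cT(u), can be spot-checked on a familiar map; the sketch below (an invented illustration, not the book's) tests them for reflection of R^2 across the line y = x:

```python
# Spot-check the two linearity axioms for a concrete map R^2 -> R^2.

def T(v):
    """Reflection across the line y = x: (x, y) -> (y, x)."""
    x, y = v
    return (y, x)

def add(u, v):
    return (u[0] + v[0], u[1] + v[1])

def scale(c, u):
    return (c * u[0], c * u[1])

u, v, c = (1.0, 2.0), (3.0, -1.0), 4.0
assert T(add(u, v)) == add(T(u), T(v))   # additivity
assert T(scale(c, u)) == scale(c, T(u))  # homogeneity
```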

4. Dimension

Abstract
This chapter covers the fundamental structure theory for vector spaces. In particular, we shall show that all vector spaces admit coordinate systems (called bases) and that the number of coordinates is intrinsic to the space. These results in turn allow us to define the dimension of a vector space and thus to recast this most basic of geometric notions in purely algebraic terms. In so doing, we extend the application of this concept to many settings that have no obvious a priori geometric interpretation.
Robert J. Valenza

5. Matrices

Abstract
This chapter formally introduces matrices and matrix algebra. First, we take a superficial look at matrices simply as arrays of scalars, divorced from any extrinsic meaning. We define the elements of matrix arithmetic and show that it is formally well behaved, which is surprising since the definition of matrix multiplication looks somewhat unnatural at this point. Second, we consider the connection between matrices and linear systems of equations. This begins to build a bridge between vector space theory and matrix theory that will be completed in the following chapter. Finally, we survey two powerful solution techniques for linear systems and a related method for matrix inversion.
Robert J. Valenza
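One standard solution technique for linear systems of the kind surveyed here is Gaussian elimination with back substitution; the following Python sketch (an assumed illustration, not the book's presentation) solves Ax = b for a small invertible system:

```python
# Solve Ax = b by Gaussian elimination with partial pivoting,
# followed by back substitution.

def solve(A, b):
    n = len(A)
    M = [row[:] + [bi] for row, bi in zip(A, b)]        # augmented matrix [A | b]
    for col in range(n):
        pivot = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[pivot] = M[pivot], M[col]             # swap in the largest pivot
        for r in range(col + 1, n):
            factor = M[r][col] / M[col][col]
            for c in range(col, n + 1):
                M[r][c] -= factor * M[col][c]           # eliminate below the pivot
    x = [0.0] * n
    for r in range(n - 1, -1, -1):                      # back substitution
        x[r] = (M[r][n] - sum(M[r][c] * x[c] for c in range(r + 1, n))) / M[r][r]
    return x

# 2x + y = 5 and x + 3y = 10 have the solution x = 1, y = 3.
x = solve([[2.0, 1.0], [1.0, 3.0]], [5.0, 10.0])
assert all(abs(xi - yi) < 1e-12 for xi, yi in zip(x, [1.0, 3.0]))
```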

6. Representation of Linear Transformations

Abstract
We saw in the previous chapter that a matrix in Mat_{m×n}(k) acts by left multiplication as a linear transformation from k^n to k^m. In this chapter we shall see that in a strong sense every linear transformation of finite-dimensional vector spaces over k may be thus realized. (We say that the associated matrix represents the transformation.) In passing, we introduce the notion of a k-algebra, a rich structure that is a hybrid of both vector space and ring. We show that the set of linear transformations from an n-dimensional vector space to itself is in fact isomorphic as a k-algebra to the familiar matrix algebra M_n(k).
Robert J. Valenza
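The correspondence described in this chapter runs in both directions: a matrix acts as a linear map by left multiplication, and conversely the matrix representing a map can be read off from the images of the standard basis vectors. A minimal Python sketch of that recipe (names invented for the example):

```python
# Build the matrix representing a linear map T : R^n -> R^n with respect
# to the standard basis: column j is T(e_j).

def matvec(A, v):
    return [sum(a * x for a, x in zip(row, v)) for row in A]

def matrix_of(T, n):
    cols = [T([1.0 if i == j else 0.0 for i in range(n)]) for j in range(n)]
    return [list(row) for row in zip(*cols)]   # transpose columns into rows

# Rotation of R^2 by 90 degrees, given abstractly...
T = lambda v: [-v[1], v[0]]
A = matrix_of(T, 2)                            # ...is represented by [[0, -1], [1, 0]]
assert A == [[0.0, -1.0], [1.0, 0.0]]
assert matvec(A, [3.0, 5.0]) == T([3.0, 5.0])  # the matrix acts as T does
```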

7. Inner Product Spaces

Abstract
So far we have seen that the definition of an abstract vector space captures the fundamental geometric notion of dimension. There remain, however, at least two other basic geometric ideas that we have not yet addressed: length and angle. To encompass them in the abstract we need to introduce a bit more structure, and in consequence we shall require that our ground field manifest some notion of order. Hence we no longer operate over some abstract field k but rather, for the most part, over the field R of real numbers. In the final section we shall generalize the results to the complex numbers C.
Robert J. Valenza
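Over R, the two geometric notions this chapter adds, length and angle, are both recovered from the standard inner product; the sketch below (an assumed illustration, not from the text) computes them for vectors in R^2:

```python
# Length and angle recovered from the standard inner product on R^n:
# |u| = sqrt(<u, u>) and cos(theta) = <u, v> / (|u| |v|).
import math

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def norm(u):
    return math.sqrt(dot(u, u))

def angle(u, v):
    return math.acos(dot(u, v) / (norm(u) * norm(v)))

u, v = [1.0, 0.0], [1.0, 1.0]
assert abs(norm(v) - math.sqrt(2)) < 1e-12       # length of (1, 1)
assert abs(angle(u, v) - math.pi / 4) < 1e-12    # 45 degrees between them
```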

8. Determinants

Abstract
The determinant is an amazing function that in some sense measures the invertibility of an n×n matrix. In this chapter we first give a formal, functional description of the determinant, showing its existence via a recursive definition. We then exhibit a direct formula, which leads at once to a statement of uniqueness and a surprising multiplicative property. [The determinant turns out to be a group homomorphism from GL_n(k) to k*.] Finally, from this multiplicativity one easily deduces that a square matrix is invertible if and only if its determinant is not zero.
Robert J. Valenza
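The recursive description of the determinant can be realized as cofactor expansion along the first row; the Python sketch below (an illustration assumed for this summary, not the book's development) also spot-checks the multiplicative property det(AB) = det(A)det(B):

```python
# Determinant via recursive cofactor expansion along the first row,
# plus a numerical check of multiplicativity.

def det(A):
    n = len(A)
    if n == 1:
        return A[0][0]
    total = 0.0
    for j in range(n):
        minor = [row[:j] + row[j + 1:] for row in A[1:]]   # delete row 0, column j
        total += (-1) ** j * A[0][j] * det(minor)
    return total

def matmul(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(len(B)))
             for j in range(len(B[0]))] for i in range(len(A))]

A = [[2.0, 1.0], [1.0, 3.0]]
B = [[0.0, -1.0], [1.0, 0.0]]
assert abs(det(matmul(A, B)) - det(A) * det(B)) < 1e-12    # multiplicativity
assert det(A) != 0                                         # hence A is invertible
```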

9. Eigenvalues and Eigenvectors

Abstract
This chapter introduces and, to a limited extent, solves one of the classical problems associated with linear processes: their decomposition into well-behaved, independent component subprocesses. What is especially noteworthy and exciting about the material is that it uses all of the major concepts introduced so far, including the representation of linear transformations, real and complex inner product spaces, and the theory of determinants. The final theorem of Section 9.3 is as exquisite as any work of art.
Robert J. Valenza
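For a 2×2 matrix, eigenvalues are the roots of the characteristic polynomial t^2 - trace(A)t + det(A); the sketch below (an invented numerical illustration, not the chapter's treatment) finds them for a symmetric matrix and verifies Av = λv for each eigenvector:

```python
# Eigenvalues of a 2 x 2 matrix from its characteristic polynomial,
# with a check that each stated eigenvector satisfies A v = lambda v.
import math

A = [[2.0, 1.0], [1.0, 2.0]]
trace = A[0][0] + A[1][1]
detA = A[0][0] * A[1][1] - A[0][1] * A[1][0]
disc = math.sqrt(trace ** 2 - 4 * detA)
eigenvalues = [(trace + disc) / 2, (trace - disc) / 2]   # here: 3.0 and 1.0

# Eigenvectors for this symmetric A: (1, 1) for 3 and (1, -1) for 1.
for lam, v in zip(eigenvalues, [[1.0, 1.0], [1.0, -1.0]]):
    Av = [sum(a * x for a, x in zip(row, v)) for row in A]
    assert all(abs(ax - lam * x) < 1e-12 for ax, x in zip(Av, v))
```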

10. Triangulation and Decomposition of Endomorphisms

Abstract
In Section 9.3 we saw that Hermitian and complex unitary transformations always admit an orthogonal basis of eigenvectors, with respect to which these mappings are represented by simple diagonal matrices. This chapter presses the attack to find out what in general can be said about an arbitrary endomorphism T of a finite-dimensional vector space over an abstract field k. We first establish an astounding property of the characteristic polynomial; this is the content of the Cayley-Hamilton Theorem (10-1). Next, using similar techniques, we show that T is representable at least by an upper triangular matrix, provided that the roots of its characteristic polynomial all lie in k. This leads to the introduction of so-called nilpotent mappings. Maintaining our previous assumption on the characteristic polynomial, we then show that T is expressible as the sum of a diagonal map (easily built out of its eigenvalues) and a nilpotent map. Finally, further analysis of nilpotent endomorphisms yields a special matrix representation called the Jordan normal form.
Robert J. Valenza
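The Cayley-Hamilton Theorem mentioned here says that a matrix satisfies its own characteristic polynomial; for a 2×2 matrix that polynomial is t^2 - trace(A)t + det(A), and the Python sketch below (a numerical spot check assumed for illustration, not the text's proof) verifies that A^2 - trace(A)·A + det(A)·I vanishes:

```python
# Numerical spot check of the Cayley-Hamilton Theorem in the 2 x 2 case:
# A^2 - trace(A)*A + det(A)*I should be the zero matrix.

def matmul(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

A = [[1.0, 2.0], [3.0, 4.0]]
trace = A[0][0] + A[1][1]
detA = A[0][0] * A[1][1] - A[0][1] * A[1][0]

A2 = matmul(A, A)
residual = [[A2[i][j] - trace * A[i][j] + (detA if i == j else 0.0)
             for j in range(2)] for i in range(2)]
assert all(abs(residual[i][j]) < 1e-12 for i in range(2) for j in range(2))
```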
