
## About this Book

This monograph brings together my work in mathematical statistics as I have viewed it through the lens of Jordan algebras. Three technical domains are treated: applications to random quadratic forms (sums of squares), the investigation of algebraic simplifications of maximum likelihood estimation of patterned covariance matrices, and a more wide-open mathematical exploration of the algebraic arena from which I have drawn the results used in the statistical problems just mentioned. Chapters 1, 2, and 4 present the statistical outcomes I have developed using the algebraic results that appear, for the most part, in Chapter 3. As a less daunting, yet quite efficient, point of entry into this material, one avoiding most of the abstract algebraic issues, the reader may use the first half of Chapter 4. Here I present a streamlined, but still fully rigorous, definition of a Jordan algebra (as it is used in that chapter) and its essential properties. These facts are then immediately applied to simplifying the M-step of the EM algorithm for multivariate normal covariance matrix estimation, in the presence of linear constraints and data missing completely at random. The results presented essentially resolve a practical statistical quest begun by Rubin and Szatrowski [1982], and continued, sometimes implicitly, by many others. After this, one could then return to Chapters 1 and 2 to see how I have attempted to generalize the work of Cochran, Rao, Mitra, and others, on important and useful properties of sums of squares.

## Table of Contents

### Chapter 1. Introduction

Abstract
In statistics it frequently occurs that we need to know whether two sums of squares are independent and whether they are each chi-square in distribution. In terms of necessary and sufficient conditions, the answers to these questions have been worked out, by Cochran, Craig, Rao, Mitra, and others, over a period of several decades. For multivariate normal data, these results are usually expressed in terms of rather complex equations involving the true data mean vector and the true covariance matrix (assumed known at least up to a constant).
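The classical conditions alluded to here have a compact matrix form in the zero-mean case, which can be checked numerically. The sketch below (not drawn from the chapter's own proofs, only from the standard Craig–Sakamoto and chi-square criteria) verifies that for x ~ N(0, Σ) with symmetric A and B, the form x′Ax is chi-square exactly when AΣ is idempotent, and x′Ax, x′Bx are independent exactly when AΣB = 0:

```python
import numpy as np

# Classical criteria for x ~ N(0, Sigma), A and B symmetric (zero-mean case):
#   x'Ax ~ chi-square         <=>  A @ Sigma is idempotent, df = trace(A @ Sigma)
#   x'Ax, x'Bx independent    <=>  A @ Sigma @ B = 0   (Craig-Sakamoto)

def is_chi_square(A, Sigma, tol=1e-10):
    M = A @ Sigma
    return np.allclose(M @ M, M, atol=tol)

def are_independent(A, Sigma, B, tol=1e-10):
    return np.allclose(A @ Sigma @ B, 0.0, atol=tol)

# Example: Sigma = I, A = projection onto the first two coordinates,
# B = the complementary projection (an ANOVA-style decomposition).
n = 4
Sigma = np.eye(n)
A = np.diag([1.0, 1.0, 0.0, 0.0])
B = np.eye(n) - A

print(is_chi_square(A, Sigma))       # True: A @ Sigma = A is idempotent
print(are_independent(A, Sigma, B))  # True: A @ Sigma @ B = A @ B = 0
print(np.trace(A @ Sigma))           # 2.0 -> the degrees of freedom
```

When Σ is not the identity, the same two conditions apply unchanged, which is why the general results involve the "rather complex equations" in Σ mentioned above.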
James D. Malley

### Chapter 2. Jordan Algebras and the Mixed General Linear Model

Abstract
Jordan algebras are not commonly a standard technical tool for even many researchers in pure algebra. It is important therefore to begin by defining and illustrating them in a statistically accessible context. Hence the required definitions, theorems and proofs will be introduced with the minimum of mathematical embellishment. Some of the abstract algebra thus set aside will then appear as Remarks, some deferred to Chapter 3, and some even further removed to literature citations.
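For readers who want a concrete first picture before the formal definitions, the motivating example of a (special) Jordan algebra is the space of real symmetric matrices under the product A∘B = ½(AB + BA). The sketch below, a minimal numerical illustration rather than anything taken from the chapter, checks the defining behavior: closure on symmetric matrices, commutativity, failure of associativity, and the Jordan identity (A∘B)∘(A∘A) = A∘(B∘(A∘A)):

```python
import numpy as np

# The Jordan product on matrices: A o B = (AB + BA) / 2.
def jprod(A, B):
    return 0.5 * (A @ B + B @ A)

rng = np.random.default_rng(0)
X, Y = rng.standard_normal((2, 3, 3))
A, B = X + X.T, Y + Y.T          # two generic symmetric matrices

AB = jprod(A, B)
print(np.allclose(AB, AB.T))                   # True: symmetric matrices are closed
print(np.allclose(jprod(A, B), jprod(B, A)))   # True: the product is commutative

# The product is NOT associative in general: (A o B) o C vs A o (B o C)
C = np.diag([1.0, 2.0, 3.0])
print(np.allclose(jprod(jprod(A, B), C), jprod(A, jprod(B, C))))

# But the Jordan identity holds: (A o B) o (A o A) = A o (B o (A o A))
Asq = jprod(A, A)
print(np.allclose(jprod(AB, Asq), jprod(A, jprod(B, Asq))))  # True
```

The statistical relevance, developed in this chapter, is that spaces of covariance matrices closed under this product carry enough algebraic structure to simplify estimation in the mixed general linear model.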
James D. Malley

### Chapter 3. Further Technical Results on Jordan Algebras

Abstract
In this chapter we fill in details of some of the proofs given in Chapter 2. Also, new results are presented about subspaces and ideals in Jordan algebras that should help further the application of Jordan algebras in statistical problems generally. To assist in this latter task, our treatment of the material is designed to make the chapter largely self-contained; hence many of the definitions and results first appearing in Chapter 2 are re-introduced, though more compactly. This chapter is unabashedly much more mathematical than any of the others, and unavoidably so, since the complete proofs of many of the results (previously just stated) are comparatively non-trivial. Many, but not all, of the results appearing here first appeared in Malley [1987].
James D. Malley

### Chapter 4. Jordan Algebras, the EM Algorithm, and Covariance Matrices

Abstract
Using Jordan algebras, Galois field theory and the EM algorithm we show how to obtain either essentially closed-form or simplified solutions to the ML equations for estimation of the covariance matrix for multivariate normal data in the following situations:
(a)
Data having a patterned covariance matrix; equivalently, data with a linear covariance structure. This case includes the problems of variance and variance-covariance components estimation; unbalanced repeated measures designs; and some time series models;

(b)
Data vectors with values missing completely at random; in particular, retrospective analyses of long term, clinical trials, incomplete repeated measures, and time series;

(c)
The intersection of (a) and (b): multivariate normal data assumed to have a linear covariance structure but with some values missing (completely at random);

James D. Malley

### Backmatter
