
2020 | Book

Non-Asymptotic Analysis of Approximations for Multivariate Statistics

Authors: Prof. Yasunori Fujikoshi, Prof. Vladimir V. Ulyanov

Publisher: Springer Singapore

Book Series: SpringerBriefs in Statistics


About this book

This book presents recent non-asymptotic results for approximations in multivariate statistical analysis. The book is unique in its focus on results with the correct error structure for all the parameters involved. Firstly, it discusses the computable error bounds on correlation coefficients, MANOVA tests and discriminant functions studied in recent papers. It then introduces new areas of research in high-dimensional approximations for bootstrap procedures, Cornish–Fisher expansions, power-divergence statistics and approximations of statistics based on observations with random sample size. Lastly, it proposes a general approach for the construction of non-asymptotic bounds, providing relevant examples for several complicated statistics. It is a valuable resource for researchers with a basic understanding of multivariate statistics.

Table of Contents

Frontmatter
Chapter 1. Non-Asymptotic Bounds
Abstract
Most asymptotic results in statistical inference rest on error estimates that hold when the sample size n and the dimension p of the observations are large. More precisely, such statistical statements are evaluated as n and/or p tend to infinity. On the other hand, “non-asymptotic” results are derived under the condition that n, p, and the parameters involved are fixed. In this chapter, we explain non-asymptotic error bounds, illustrating them with the Edgeworth expansion, Berry–Esseen bounds, and high-dimensional approximations for the linear discriminant function.
Yasunori Fujikoshi, Vladimir V. Ulyanov
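As a point of reference for this chapter (a standard fact, not a quotation from the book): the classical Berry–Esseen bound is the prototype of a non-asymptotic error bound. For i.i.d. random variables \(X_1,\ldots ,X_n\) with \(E X_1 = 0\), \(E X_1^2 = \sigma^2 > 0\) and \(E|X_1|^3 < \infty \),
\[ \sup_{x \in \mathbb{R}} \left| P\!\left( \frac{X_1 + \cdots + X_n}{\sigma \sqrt{n}} \le x \right) - \Phi(x) \right| \;\le\; \frac{C\, E|X_1|^3}{\sigma^3 \sqrt{n}}, \]
where \(\Phi \) is the standard normal distribution function and \(C\) is an absolute constant. The bound holds for every fixed \(n\), which is precisely the “non-asymptotic” character explained in the chapter.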
Chapter 2. Scale-Mixed Distributions
Abstract
In this chapter we present a general theory of approximation for scale-mixed distributions, i.e., distributions of scale mixtures, including the simple examples of Student’s t-distribution and the F-distribution as scale mixtures of the normal and chi-square distributions, respectively. Such scale mixtures appear as sampling distributions of various statistics, such as studentized versions of some estimators. Errors of the approximation are evaluated in the \(\sup \)- and \(L_1\)-norms. The extension to multivariate scale mixtures, with error bounds evaluated in the \(L_1\)-norm, is discussed in Chap. 3.
Yasunori Fujikoshi, Vladimir V. Ulyanov
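To fix notation (a standard formulation; the chapter’s own setting may be more general): a scale mixture of the normal distribution is a random variable of the form
\[ X \overset{d}{=} S^{1/2} Z, \qquad Z \sim N(0,1), \quad S > 0 \ \text{independent of}\ Z. \]
Taking \(S = m/\chi_m^2\) gives Student’s t-distribution with \(m\) degrees of freedom, and \(F_{p,m} \overset{d}{=} S \cdot \chi_p^2/p\) with the same \(S\) exhibits the F-distribution as a scale mixture of a chi-square distribution.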
Chapter 3. MANOVA Test Statistics
Abstract
The main purpose of this chapter is to give a method for obtaining error bounds for asymptotic expansions of the null distributions of Hotelling’s \(T^2\) (or the Lawley–Hotelling criterion \(T_{LH}\)), the likelihood-ratio criterion \(T_{LR}\), and the Bartlett–Nanda–Pillai criterion \(T_{BNP}\) in the MANOVA model when the sample size is large. The results for \(T_{LH}\) and \(T_{LR}\) are obtained by expressing these statistics in terms of a multivariate scale mixture and using error bounds evaluated in the \(L_1\)-norm. The error bound for the limiting distribution of \(T_{BNP}\) is given by using a relationship between \(T_{BNP}\) and \(T_{LH}\). Further, we give error bounds for these criteria when both the sample size and the dimension are large.
Yasunori Fujikoshi, Vladimir V. Ulyanov
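For readers unfamiliar with the three criteria: they are built from the hypothesis and error sums-of-squares matrices \(S_h\) and \(S_e\), and, up to the sample-size normalizations fixed in the chapter, one common convention is
\[ T_{LH} = \mathrm{tr}\,\bigl( S_h S_e^{-1} \bigr), \qquad T_{LR} = -\log \frac{|S_e|}{|S_e + S_h|}, \qquad T_{BNP} = \mathrm{tr}\,\bigl( S_h (S_h + S_e)^{-1} \bigr). \]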
Chapter 4. Linear and Quadratic Discriminant Functions
Abstract
This chapter is concerned with theoretical accuracies of asymptotic approximations for the expected probabilities of misclassification (EPMC) when the linear discriminant function or the quadratic discriminant function is used. The method in this chapter is based on error bounds for asymptotic approximations of a location and scale mixture. The asymptotic approximations considered in detail are those in which both the sample size and the dimension are large, and those in which only the sample size is large.
Yasunori Fujikoshi, Vladimir V. Ulyanov
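For concreteness (standard definitions; the chapter fixes the exact normalizations): given training samples from populations \(\Pi_1\) and \(\Pi_2\) with sample means \(\bar X_1, \bar X_2\) and pooled covariance estimator \(S\), the linear discriminant function is
\[ W(X) = \bigl( \bar X_1 - \bar X_2 \bigr)' S^{-1} \Bigl( X - \tfrac{1}{2} \bigl( \bar X_1 + \bar X_2 \bigr) \Bigr), \]
an observation \(X\) being assigned to \(\Pi_1\) when \(W(X) > 0\); one EPMC is then \(P\bigl( W(X) \le 0 \mid X \in \Pi_1 \bigr)\).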
Chapter 5. Cornish–Fisher Expansions
Abstract
First, we focus on Cornish–Fisher expansions for quantiles of statistics whose distributions admit Edgeworth-type expansions. Then, when the Edgeworth-type expansions have computable error bounds, we give computable error bounds for the Cornish–Fisher expansions. Some of the results depend on Bartlett-type corrections. The results are illustrated by examples.
Yasunori Fujikoshi, Vladimir V. Ulyanov
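Schematically (sign conventions may differ from those in the chapter): if a statistic \(T_n\) admits the Edgeworth-type expansion
\[ P( T_n \le x ) = \Phi(x) + \frac{p_1(x)}{\sqrt{n}}\, \phi(x) + O(n^{-1}), \]
then its \(\alpha \)-quantile admits the Cornish–Fisher expansion
\[ x_\alpha = u_\alpha - \frac{p_1(u_\alpha)}{\sqrt{n}} + O(n^{-1}), \qquad u_\alpha = \Phi^{-1}(\alpha), \]
and a computable bound on the remainder of the first expansion translates into a computable bound on the remainder of the second.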
Chapter 6. Likelihood Ratio Tests with Box-Type Moments
Abstract
In this chapter we consider statistics whose moments are of Box type. Such statistics arise as various likelihood ratio statistics for multivariate normal populations. First, the large-sample approximation method for these statistics is explained. Then, we derive their high-dimensional asymptotic expansions. Further, it is noted that an error bound for the high-dimensional asymptotic expansions can be derived for some statistics, including those with the lambda distribution.
Yasunori Fujikoshi, Vladimir V. Ulyanov
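Schematically, following Box’s classical form (the constants \(K, c\) and the parameters \(x_k, \xi_k, y_j, \eta_j\) depend on the particular statistic; details as in the chapter), the statistics considered have moments
\[ E\bigl[ W^h \bigr] = K\, c^h\, \frac{\prod_{k=1}^{a} \Gamma\bigl( x_k (1+h) + \xi_k \bigr)}{\prod_{j=1}^{b} \Gamma\bigl( y_j (1+h) + \eta_j \bigr)}, \]
and both the large-sample and the high-dimensional expansions are obtained by expanding such gamma-function ratios.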
Chapter 7. Bootstrap Confidence Sets
Abstract
A sample \( X_{1},\ldots ,X_{n} \) consisting of independent identically distributed random vectors in \( \mathbb {R}^{p} \) with zero mean and covariance matrix \( \mathbf {\Sigma }\) is considered. The recovery of spectral projectors of high-dimensional covariance matrices from a sample of observations is a key problem in statistics arising in numerous applications. This chapter describes a bootstrap procedure for constructing confidence sets for the spectral projector \( \mathbf {P}_{r} \) related to the rth eigenvalue of the covariance matrix \(\mathbf {\Sigma }\) from given data, on the basis of the corresponding spectral projector \(\widehat{\mathbf {P}}_{r}\) of the sample covariance matrix \(\widehat{\mathbf {\Sigma }}\). This approach does not use the asymptotic distribution of \( \Vert \mathbf {P}_{r} - \widehat{\mathbf {P}}_{r} \Vert _{2} \) and does not require the computation of its moment characteristics. The performance of the bootstrap approximation procedure is analyzed.
Yasunori Fujikoshi, Vladimir V. Ulyanov
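The resampling idea can be sketched in code. The sketch below is a minimal illustration under simplifying assumptions, not the chapter’s exact procedure: the function names are hypothetical, the rth eigenvalue is assumed simple, the Frobenius (Schatten-two) norm stands in for \( \Vert \cdot \Vert _{2} \), and the chapter’s bootstrap scheme and its theoretical guarantees differ in important details.

import numpy as np

def spectral_projector(S, r):
    # Projector onto the eigenspace of the r-th largest eigenvalue of S
    # (the eigenvalue is assumed to be simple).
    vals, vecs = np.linalg.eigh(S)        # eigenvalues in ascending order
    v = vecs[:, -r]                       # eigenvector of the r-th largest eigenvalue
    return np.outer(v, v)

def bootstrap_radius(X, r, B=1000, alpha=0.05, seed=None):
    # Bootstrap (1 - alpha)-quantile of || P*_r - P_hat_r ||_F for
    # zero-mean data X of shape (n, p), resampling rows with replacement.
    rng = np.random.default_rng(seed)
    n = X.shape[0]
    S_hat = X.T @ X / n                   # sample covariance (zero mean assumed)
    P_hat = spectral_projector(S_hat, r)
    dists = np.empty(B)
    for b in range(B):
        Xb = X[rng.integers(0, n, size=n)]
        dists[b] = np.linalg.norm(spectral_projector(Xb.T @ Xb / n, r) - P_hat)
    return np.quantile(dists, 1 - alpha)

# A confidence set for P_r is then the ball
# { P : || P - P_hat_r ||_F <= bootstrap_radius(X, r) }.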
Chapter 8. Gaussian Comparison and Anti-concentration
Abstract
We derive tight non-asymptotic bounds for the Kolmogorov distance between the probabilities of two Gaussian elements hitting a ball in a Hilbert space. The key property of these bounds is that they are dimension-free and depend on the nuclear (Schatten-one) norm of the difference between the covariance operators of the elements and on the norm of the mean shift. The obtained bounds significantly improve the bound based on the Pinsker inequality via the Kullback–Leibler divergence. We also establish an anti-concentration bound for the squared norm of a non-centered Gaussian element in a Hilbert space. A number of examples are provided, motivating the results and their applications to statistical inference and the high-dimensional CLT.
Yasunori Fujikoshi, Vladimir V. Ulyanov
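In symbols: with Gaussian elements \(X \sim N(a, \Sigma_X)\) and \(Y \sim N(b, \Sigma_Y)\) in a Hilbert space, the quantity bounded is the Kolmogorov distance over centered balls,
\[ \sup_{x \ge 0} \bigl|\, P\bigl( \Vert X \Vert \le x \bigr) - P\bigl( \Vert Y \Vert \le x \bigr) \,\bigr|, \]
which the chapter controls by a dimension-free multiple of the nuclear norm \( \Vert \Sigma_X - \Sigma_Y \Vert_1 \) plus a mean-shift term in \( \Vert a - b \Vert \), with constants depending only on the largest eigenvalues of the covariance operators.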
Chapter 9. Approximations for Statistics Based on Random Sample Sizes
Abstract
In practice, we often encounter situations where the sample size is not defined in advance and may itself be random. It is known that the asymptotic properties of statistics can change radically when a non-random sample size is replaced by a random one. In this chapter, we consider second-order Chebyshev–Edgeworth-type and Cornish–Fisher-type expansions based on Student’s t- and Laplace distributions and their quantiles for samples with random sizes of a special kind. This is accomplished using the general transfer theorem, which allows constructing asymptotic expansions for distributions of randomly normalized statistics from the distributions of the corresponding non-randomly normalized statistics and of the random size of the underlying sample.
Yasunori Fujikoshi, Vladimir V. Ulyanov
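The starting point of the transfer approach is elementary conditioning: when the random size \(N_n\) is independent of the observations,
\[ P\bigl( T_{N_n} \le x \bigr) = \sum_{k=1}^{\infty} P( N_n = k )\, P\bigl( T_k \le x \bigr), \]
so expansions for the distributions of the non-randomly indexed statistics \(T_k\) and of \(N_n\) combine into an expansion for the randomly indexed statistic; the transfer theorem makes this precise, including the appropriate normalizations.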
Chapter 10. Power-Divergence Statistics
Abstract
In this chapter, we focus on approximation problems motivated by studies of the asymptotic behavior of the power-divergence family of statistics. These are goodness-of-fit test statistics and include, in particular, the Pearson chi-squared statistic, the Freeman–Tukey statistic, and the log-likelihood ratio statistic. The distributions of these statistics converge to the chi-squared distribution as the sample size \(n\) tends to \(\infty \). We show that the rate of convergence is of order \(n^{-\alpha } \) with \(1/2< \alpha < 1 \); under some conditions, \(\alpha \) is close to 1. The proofs are based on fundamental number-theoretic results on approximating the number of integer points in convex sets by the Lebesgue measure of the set.
Yasunori Fujikoshi, Vladimir V. Ulyanov
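For reference, the power-divergence (Cressie–Read) family for cell counts \(X_1,\ldots ,X_k\) with \(\sum_{j} X_j = n\) and cell probabilities \(p_1,\ldots ,p_k\) is
\[ T_\lambda = \frac{2}{\lambda (\lambda + 1)} \sum_{j=1}^{k} X_j \left[ \left( \frac{X_j}{n p_j} \right)^{\!\lambda} - 1 \right], \qquad \lambda \in \mathbb{R}, \]
where \(\lambda = 1\) gives the Pearson chi-squared statistic, the limit \(\lambda \to 0\) the log-likelihood ratio statistic, and \(\lambda = -1/2\) the Freeman–Tukey statistic.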
Chapter 11. General Approach to Constructing Non-Asymptotic Bounds
Abstract
In this chapter, we consider asymptotic expansions for a class of sequences of symmetric functions of many variables. This yields a general approach to obtaining non-asymptotic bounds on the accuracy of approximation of nonlinear forms in random elements in terms of Lyapunov-type ratios. Applications to classical and free probability theory are discussed. In particular, we apply the general results to the central limit theorem for weighted sums, including the case of dependent summands and the case when the distributions of weighted sums are approximated by the normal distribution with accuracy of order \(O(n^{-1})\). We also consider applications to distributions of U-statistics of the second and higher orders.
Yasunori Fujikoshi, Vladimir V. Ulyanov
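Schematically, for the weighted-sum application: with \(S_n = \sum_{i=1}^{n} w_i X_i\) normalized so that \(\sum_{i=1}^{n} w_i^2 = 1\), the accuracy of the normal approximation is controlled by Lyapunov-type ratios such as
\[ L_n = \sum_{i=1}^{n} |w_i|^3\, E|X_i|^3, \]
with higher-order ratios of the same kind governing the \(O(n^{-1})\) regime mentioned above.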
Metadata
Title
Non-Asymptotic Analysis of Approximations for Multivariate Statistics
Authors
Prof. Yasunori Fujikoshi
Prof. Vladimir V. Ulyanov
Copyright Year
2020
Publisher
Springer Singapore
Electronic ISBN
978-981-13-2616-5
Print ISBN
978-981-13-2615-8
DOI
https://doi.org/10.1007/978-981-13-2616-5
