Open Access
Estimation in high-dimensional linear models with deterministic design matrices
Jun Shao, Xinwei Deng
Ann. Statist. 40(2): 812-831 (April 2012). DOI: 10.1214/12-AOS982

Abstract

Because of the advance in technologies, modern statistical studies often encounter linear models with the number of explanatory variables much larger than the sample size. Estimation and variable selection in these high-dimensional problems with deterministic design points are very different from those in the case of random covariates, due to the identifiability of the high-dimensional regression parameter vector. We show that a reasonable approach is to focus on the projection of the regression parameter vector onto the linear space generated by the design matrix. In this work, we consider the ridge regression estimator of the projection vector and propose to threshold the ridge regression estimator when the projection vector is sparse in the sense that many of its components are small. The proposed estimator has an explicit form and is easy to use in applications. Asymptotic properties such as the consistency of variable selection and estimation and the convergence rate of the prediction mean squared error are established under some sparsity conditions on the projection vector. A simulation study is also conducted to examine the performance of the proposed estimator.
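The estimator described in the abstract combines a ridge fit with hard thresholding. The following is a minimal illustrative sketch of that idea, not the paper's exact procedure: the function name `thresholded_ridge` and the tuning constants `lam` and `tau` are placeholders chosen for the example, whereas the article derives the appropriate thresholding rule and tuning under its sparsity conditions.

```python
import numpy as np

def thresholded_ridge(X, y, lam=1.0, tau=0.5):
    """Ridge estimate of the projection vector, followed by hard thresholding
    of small components (illustrative sketch; lam and tau are placeholder
    tuning constants, not the paper's recommended choices)."""
    n, p = X.shape
    # Ridge solution in dual form, convenient when p >> n:
    # beta_ridge = X' (X X' + lam I_n)^{-1} y  ==  (X'X + lam I_p)^{-1} X' y
    alpha = np.linalg.solve(X @ X.T + lam * np.eye(n), y)
    beta_ridge = X.T @ alpha
    # Hard thresholding: keep only components with |beta_j| > tau
    beta_thr = np.where(np.abs(beta_ridge) > tau, beta_ridge, 0.0)
    return beta_ridge, beta_thr

# Usage example with simulated data in the p > n regime
rng = np.random.default_rng(0)
n, p = 50, 200
X = rng.standard_normal((n, p))
beta = np.zeros(p)
beta[:5] = 3.0                      # sparse truth: 5 active variables
y = X @ beta + rng.standard_normal(n)
b_ridge, b_thr = thresholded_ridge(X, y, lam=1.0, tau=0.5)
print("selected variables:", np.flatnonzero(b_thr))
```

The dual form of the ridge solution is used here only because it requires solving an n × n rather than a p × p system when p is much larger than n; it is algebraically identical to the usual ridge estimator.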

Citation


Jun Shao, Xinwei Deng. "Estimation in high-dimensional linear models with deterministic design matrices." Ann. Statist. 40(2): 812-831, April 2012. https://doi.org/10.1214/12-AOS982

Information

Published: April 2012
First available in Project Euclid: 17 May 2012

zbMATH: 1273.62177
MathSciNet: MR2933667
Digital Object Identifier: 10.1214/12-AOS982

Subjects:
Primary: 62J07
Secondary: 62G20, 62J05

Keywords: Identifiability, Projection, Ridge regression, Sparsity, Thresholding, Variable selection

Rights: Copyright © 2012 Institute of Mathematical Statistics
