2012 | Original Paper | Book Chapter
Random Projection
Author: Prof. Jianzhong Wang
Published in: Geometric Structure of High-Dimensional Data and Dimensionality Reduction
Publisher: Springer Berlin Heidelberg
Principal component analysis (PCA) is an important linear method for dimensionality reduction. It measures data distortion globally, via the Frobenius norm of the data-difference matrix, and the reduced data it produces are derived from several leading eigenvectors of the covariance matrix of the data set. Hence, PCA may not preserve the local separation of the original data. To respect local properties of data in dimensionality reduction (DR), we employ Lipschitz embedding. Random projection is a powerful method for constructing Lipschitz mappings that realize dimensionality reduction with high probability. It does not introduce significant distortion when both the dimension and the cardinality of the data are large: it randomly projects the original high-dimensional data into a lower-dimensional subspace. Because the projection costs only linear computational time, the method is computationally efficient, yet it achieves sufficient accuracy with high probability. In Section 7.1, we give a review of Lipschitz embedding. In Section 7.2, we introduce random matrices and random projection algorithms. In Section 7.3, the justification of the validity of random projection is presented in detail; in particular, the Johnson–Lindenstrauss lemma is proved there. Applications of random projection are given in Section 7.4.
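The chapter's own constructions appear in Section 7.2; as a minimal sketch of the basic idea only, assuming a Gaussian random matrix with entries of variance 1/k (one common choice, not necessarily the construction used in the book), the following Python snippet projects the rows of a data matrix into k dimensions. The function name `random_projection` and all parameter choices here are illustrative.

```python
import numpy as np


def random_projection(X, k, seed=None):
    """Project the rows of X (n x d) into k dimensions with a Gaussian random matrix.

    Entries of R are drawn i.i.d. from N(0, 1/k); with this scaling the expected
    squared norm of a projected vector equals the squared norm of the original.
    """
    rng = np.random.default_rng(seed)
    n, d = X.shape
    R = rng.normal(loc=0.0, scale=1.0 / np.sqrt(k), size=(d, k))  # d x k random matrix
    return X @ R  # n x k reduced data; costs O(n * d * k) time


# Usage: reduce 1000 points in R^500 to R^50 and compare one pairwise distance.
if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.normal(size=(1000, 500))
    Y = random_projection(X, k=50, seed=1)
    d_orig = np.linalg.norm(X[0] - X[1])
    d_proj = np.linalg.norm(Y[0] - Y[1])
    print(d_orig, d_proj)  # the two values are typically close
```

With this scaling, the Johnson–Lindenstrauss lemma guarantees that all pairwise distances among n points are preserved within a factor of 1 ± ε with high probability once k is on the order of ε⁻² log n, which is the high-probability accuracy referred to in the abstract.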