DOI: 10.1145/1102351.1102450
Article

Analysis and extension of spectral methods for nonlinear dimensionality reduction

Published: 07 August 2005

ABSTRACT

Many unsupervised algorithms for nonlinear dimensionality reduction, such as locally linear embedding (LLE) and Laplacian eigenmaps, are derived from the spectral decompositions of sparse matrices. While these algorithms aim to preserve certain proximity relations on average, their embeddings are not explicitly designed to preserve local features such as distances or angles. In this paper, we show how to construct a low dimensional embedding that maximally preserves angles between nearby data points. The embedding is derived from the bottom eigenvectors of LLE and/or Laplacian eigenmaps by solving an additional (but small) problem in semidefinite programming, whose size is independent of the number of data points. The solution obtained by semidefinite programming also yields an estimate of the data's intrinsic dimensionality. Experimental results on several data sets demonstrate the merits of our approach.
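
The abstract describes a concrete two-stage pipeline, so a brief sketch may help fix ideas: first compute the bottom eigenvectors of a spectral method such as Laplacian eigenmaps, then solve a small semidefinite program, over a matrix whose size depends only on the number of retained eigenvectors, for a transformation that best preserves angles within each neighborhood. The Python below is a minimal illustration of that pipeline, not the paper's implementation: the function names (laplacian_eigenmaps, conformal_sdp, conformal_embedding), the least-squares objective with per-point scales, and the trace normalization are assumptions consistent with the abstract rather than the authors' exact formulation.

```python
import numpy as np
import cvxpy as cp
from scipy.linalg import eigh
from scipy.sparse.csgraph import laplacian
from sklearn.neighbors import kneighbors_graph


def laplacian_eigenmaps(X, n_neighbors=8, m=10):
    """Stage 1: bottom m nontrivial eigenvectors of the graph Laplacian."""
    W = kneighbors_graph(X, n_neighbors, mode="connectivity").toarray()
    W = np.maximum(W, W.T)                 # symmetrize the k-NN graph
    vals, vecs = eigh(laplacian(W))
    return vecs[:, 1:m + 1], W             # drop the constant eigenvector


def conformal_sdp(X, Y, W):
    """Stage 2: solve a small SDP for an m x m matrix P = L^T L so that the
    transformed eigenvectors act on each neighborhood as a scaled isometry,
    i.e. local angles are preserved.  The SDP variable is m x m, so the
    problem size is independent of the number of data points n."""
    n, m = Y.shape
    P = cp.Variable((m, m), PSD=True)      # P = L^T L, positive semidefinite
    s = cp.Variable(n)                     # one scale factor per neighborhood
    residuals = []
    for i in range(n):
        nbrs = np.flatnonzero(W[i])
        for a, j in enumerate(nbrs):
            for k in nbrs[a:]:
                dyj, dyk = Y[j] - Y[i], Y[k] - Y[i]
                gram_x = float((X[j] - X[i]) @ (X[k] - X[i]))
                # (y_j - y_i)^T P (y_k - y_i) is linear in P, so each
                # squared mismatch below is convex in (P, s).
                residuals.append(dyj @ P @ dyk - s[i] * gram_x)
    cost = cp.sum_squares(cp.hstack(residuals))
    # An assumed trace normalization rules out the trivial solution
    # P = 0, s = 0 (loops written for clarity rather than speed).
    cp.Problem(cp.Minimize(cost), [cp.trace(P) == 1]).solve()
    return P.value


def conformal_embedding(Y, P, tol=1e-3):
    """Read the embedding and a dimensionality estimate off the SDP solution:
    the number of dominant eigenvalues of P estimates the intrinsic
    dimensionality, and Z = Y U sqrt(Lambda) gives the final coordinates."""
    vals, U = eigh(P)                      # eigenvalues in ascending order
    d_hat = int(np.sum(vals > tol * vals.max()))
    Z = Y @ U @ np.diag(np.sqrt(np.clip(vals, 0.0, None)))
    return Z[:, -d_hat:], d_hat            # keep the dominant directions
```

As a usage check under these assumptions: for points sampled from a two-dimensional manifold embedded in a higher-dimensional space, one would hope to see the spectrum of P concentrate on two dominant eigenvalues, so that d_hat = 2, matching the abstract's claim that the semidefinite program also yields an estimate of the data's intrinsic dimensionality.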

Published in

ICML '05: Proceedings of the 22nd international conference on Machine learning
August 2005, 1113 pages
ISBN: 1595931805
DOI: 10.1145/1102351

      Copyright © 2005 ACM


      Publisher

      Association for Computing Machinery

      New York, NY, United States



Acceptance Rates

Overall acceptance rate: 140 of 548 submissions, 26%
