DOI: 10.1145/1102351.1102484 — ICML Conference Proceedings
Article

Harmonic mixtures: combining mixture models and graph-based methods for inductive and scalable semi-supervised learning

Published: 07 August 2005

ABSTRACT

Graph-based methods for semi-supervised learning have recently been shown to be promising for combining labeled and unlabeled data in classification problems. However, inference for graph-based methods often does not scale well to very large data sets, since it requires inversion of a large matrix or solution of a large linear program. Moreover, such approaches are inherently transductive, giving predictions for only those points in the unlabeled set, and not for an arbitrary test point. In this paper a new approach is presented that preserves the strengths of graph-based semi-supervised learning while overcoming the limitations of scalability and non-inductive inference, through a combination of generative mixture models and discriminative regularization using the graph Laplacian. Experimental results show that this approach preserves the accuracy of purely graph-based transductive methods when the data has "manifold structure," and at the same time achieves inductive learning with significantly reduced computational cost.
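The scalability limitation the abstract describes can be made concrete with the harmonic-function formulation of Zhu et al. (2003): the transductive predictions for the unlabeled points are obtained by solving one linear system in the unlabeled-by-unlabeled block of the graph Laplacian, which for n unlabeled points is the O(n³) matrix inversion this paper seeks to avoid. A minimal NumPy sketch on a toy graph (the affinity matrix W and the labels below are illustrative, not from the paper):

```python
import numpy as np

# Toy weighted graph over 6 points; the first 2 are labeled, the rest unlabeled.
# W is a symmetric affinity matrix (in practice, e.g., a Gaussian kernel on inputs).
W = np.array([
    [0, 1, 1, 0, 0, 0],
    [1, 0, 0, 1, 0, 0],
    [1, 0, 0, 1, 1, 0],
    [0, 1, 1, 0, 0, 1],
    [0, 0, 1, 0, 0, 1],
    [0, 0, 0, 1, 1, 0],
], dtype=float)
f_l = np.array([1.0, 0.0])   # known labels for the first 2 points
n_l = len(f_l)

D = np.diag(W.sum(axis=1))   # degree matrix
L = D - W                    # combinatorial graph Laplacian

# Partition L into labeled (l) and unlabeled (u) blocks. The harmonic solution
# satisfies L_uu f_u = W_ul f_l, i.e. each unlabeled score is the weighted
# average of its neighbors' scores. Solving this system is the step whose
# cost grows cubically with the number of unlabeled points.
L_uu = L[n_l:, n_l:]
W_ul = W[n_l:, :n_l]
f_u = np.linalg.solve(L_uu, W_ul @ f_l)

print(f_u)   # soft labels in [0, 1] for each unlabeled point
```

By the maximum principle for harmonic functions on a connected graph, each entry of `f_u` lies between the extreme labeled values, here 0 and 1; thresholding at 0.5 gives the transductive classification. Note this gives no prediction for a point outside the graph, which is the non-inductive limitation the abstract also addresses.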


  • Published in

    ICML '05: Proceedings of the 22nd international conference on Machine learning
    August 2005, 1113 pages
    ISBN: 1595931805
    DOI: 10.1145/1102351

    Copyright © 2005 ACM


    Publisher

    Association for Computing Machinery, New York, NY, United States



    Acceptance Rates

    Overall Acceptance Rate: 140 of 548 submissions, 26%
