Article
DOI: 10.1145/1143844.1143889

Nightmare at test time: robust learning by feature deletion

Published: 25 June 2006

ABSTRACT

When constructing a classifier from labeled data, it is important not to assign too much weight to any single input feature, in order to increase the robustness of the classifier. This is particularly important in domains with nonstationary feature distributions or with input sensor failures. A common approach to achieving such robustness is to introduce regularization which spreads the weight more evenly between the features. However, this strategy is very generic, and cannot induce robustness specifically tailored to the classification task at hand. In this work, we introduce a new algorithm for avoiding single feature over-weighting by analyzing robustness using a game theoretic formalization. We develop classifiers which are optimally resilient to deletion of features in a minimax sense, and show how to construct such classifiers using quadratic programming. We illustrate the applicability of our methods on spam filtering and handwritten digit recognition tasks, where feature deletion is indeed a realistic noise model.
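
The paper constructs these classifiers by solving a quadratic program; the listing below is only a rough illustration of the underlying deletion game, not that QP. It is a minimal NumPy sketch under the assumption of a linear classifier sign(w.x + b), where an adversary may zero out up to K features of each example and the learner runs stochastic subgradient descent on the resulting worst-case hinge loss. All names and hyperparameters (worst_case_margin, train_robust_linear, K, C, lr) are illustrative and not taken from the paper.

    import numpy as np

    def worst_case_margin(w, b, x, y, K):
        # Per-feature contribution of example (x, y) to its margin under weights w.
        contrib = y * w * x
        # The adversary deletes (zeroes) up to K features; deleting a feature
        # only helps the adversary when its contribution is positive.
        top_k = np.sort(contrib)[::-1][:K]
        return y * (np.dot(w, x) + b) - np.sum(np.clip(top_k, 0.0, None))

    def train_robust_linear(X, Y, K, C=1.0, lr=0.01, epochs=50, seed=0):
        # Stochastic subgradient descent on
        #   sum_i max(0, 1 - worst_case_margin_i) + ||w||^2 / (2 C)
        rng = np.random.default_rng(seed)
        n, d = X.shape
        w, b = np.zeros(d), 0.0
        for _ in range(epochs):
            for i in rng.permutation(n):
                x, y = X[i], Y[i]
                if worst_case_margin(w, b, x, y, K) < 1.0:
                    contrib = y * w * x
                    # Features the adversary would delete for this example.
                    top_k = np.argsort(contrib)[::-1][:K]
                    deleted = top_k[contrib[top_k] > 0]
                    mask = np.ones(d)
                    mask[deleted] = 0.0
                    # Subgradient step: only surviving features push the margin up.
                    w += lr * (y * x * mask - w / (C * n))
                    b += lr * y
                else:
                    w -= lr * w / (C * n)
        return w, b

On data where a few features carry most of the signal, setting K > 0 in this sketch discourages concentrating weight on any single feature, which is the qualitative behaviour the paper's minimax formulation guarantees exactly via its quadratic program.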

References

  1. Bovik, A. C., Gibson, J. D., & Bovik, A. (Eds.). (2000). Handbook of image and video processing. Orlando, FL, USA: Academic Press, Inc. Google ScholarGoogle ScholarDigital LibraryDigital Library
  2. Boyd, S., & Vandenberghe, L. (2004). Convex optimization. New York, NY, USA: Cambridge University Press. Google ScholarGoogle ScholarCross RefCross Ref
  3. Cohen, S., Ruppin, E., & Dror, G. (2005). Feature selection based on the shapley value. IJCAI (pp. 665--670). Professional Book Center. Google ScholarGoogle ScholarDigital LibraryDigital Library
  4. El Ghaoui, L., Lanckriet, G., & Natsoulis, G. (2003). Robust classification with interval data (Technical Report UCB/CSD-03-1279). EECS Department, University of California, Berkeley.Google ScholarGoogle Scholar
  5. Gilad-Bachrach, R., Navot, A., & Tishby, N. (2004). Margin based feature selection - theory and algorithms. ICML 21 (pp. 43--50). ACM Press. Google ScholarGoogle ScholarDigital LibraryDigital Library
  6. Kim, S., Magnani, A., & Boyd, S. (2006). Robust fisher discriminant analysis. NIPS 18 (pp. 659--666). MIT Press.Google ScholarGoogle Scholar
  7. Krupka, E., & Tishby, N. (2006). Generalization in clustering with unobserved features. NIPS 18 (pp. 683--690). MIT Press.Google ScholarGoogle Scholar
  8. Lanckriet, G., Cristianini, N., Bartlett, P., El Ghaoui, L., & Jordan, M. (2004). Learning the kernel matrix with semidefinite programming. Journal of Machine Learning Research, 5, 27--72. Google ScholarGoogle ScholarDigital LibraryDigital Library
  9. LeCun, Y., Jackel, L., Bottou, L., Brunot, A., Cortes, C., Denker, J., Drucker, H., Guyon, I., Müller, U., Sackinger, E., Simard, P., & Vapnik, V. (1995). Comparison of learning algorithms for handwritten digit recognition. ICANN (pp. 53--60).Google ScholarGoogle Scholar
  10. Schölkopf, B., & Smola, A. J. (Eds.). (2002). Learning with kernels. Cambridge, MA: MIT Press.Google ScholarGoogle Scholar
  11. Weston, J., Mukherjee, S., Chapelle, O., Pontil, M., Poggio, T., & Vapnik, V. (2000). Feature selection for SVMs. NIPS 13 (pp. 668--674). MIT Press.Google ScholarGoogle Scholar
  12. Yang, Y., & Pedersen, J. O. (1997). A comparative study on feature selection in text categorization. ICML 14 (pp. 412--420). Morgan Kaufmann. Google ScholarGoogle ScholarDigital LibraryDigital Library

      Published in

      ICML '06: Proceedings of the 23rd international conference on Machine learning
      June 2006
      1154 pages
      ISBN: 1595933832
      DOI: 10.1145/1143844

      Copyright © 2006 ACM

      Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than ACM must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from [email protected]

      Publisher

      Association for Computing Machinery

      New York, NY, United States

      Publication History

      • Published: 25 June 2006

      Qualifiers

      • Article

      Acceptance Rates

      ICML '06 Paper Acceptance Rate: 140 of 548 submissions, 26%
      Overall Acceptance Rate: 140 of 548 submissions, 26%
