
Winning the KDD99 classification cup: bagged boosting

Published: 01 January 2000

Abstract

We briefly describe our approach to the KDD99 Classification Cup. The solution is essentially a combination of bagging and boosting. Additionally, asymmetric error costs are taken into account by minimizing the so-called conditional risk. Furthermore, the standard sampling-with-replacement methodology of bagging was modified to put extra focus on the smaller classes that are expensive to misclassify.
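The cost-sensitive part of the approach can be illustrated with a small sketch. Assuming a cost matrix C where C[i, j] is the cost of predicting class i when the true class is j, the conditional risk of predicting i given the model's class-probability estimates is R(i|x) = Σ_j C[i, j] · P(j|x), and the minimum-risk class is chosen instead of the most probable one. The class count and cost values below are purely illustrative, not the actual KDD99 cup cost matrix:

```python
import numpy as np

# Hypothetical 3-class cost matrix (illustrative values, not from the paper):
# cost[i, j] = cost incurred when class i is predicted but class j is true.
cost = np.array([
    [0.0, 2.0, 10.0],
    [1.0, 0.0,  5.0],
    [1.0, 1.0,  0.0],
])

def predict_min_risk(class_probs, cost):
    """Return the class minimizing the conditional risk
    R(i|x) = sum_j cost[i, j] * P(j|x)."""
    risk = cost @ class_probs  # risk[i] = expected cost of predicting class i
    return int(np.argmin(risk))

# A point whose most probable class (0) is NOT the minimum-risk choice,
# because confusing true class 2 with class 0 is expensive.
probs = np.array([0.5, 0.2, 0.3])
print(predict_min_risk(probs, cost))  # → 2, although argmax(probs) is 0
```

With a zero-one cost matrix this reduces to ordinary argmax prediction; the asymmetric costs are what shift predictions toward the expensive-to-miss classes.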


Published in: ACM SIGKDD Explorations Newsletter, Volume 1, Issue 2 (January 2000), 115 pages.

ISSN: 1931-0145; EISSN: 1931-0153; DOI: 10.1145/846183

Copyright © 2000 Author

Publisher: Association for Computing Machinery, New York, NY, United States
