Open Access
April 2007
Fast learning rates for plug-in classifiers
Jean-Yves Audibert, Alexandre B. Tsybakov
Ann. Statist. 35(2): 608-633 (April 2007). DOI: 10.1214/009053606000001217

Abstract

It has recently been shown that, under the margin (or low noise) assumption, there exist classifiers attaining fast rates of convergence of the excess Bayes risk, that is, rates faster than n^{-1/2}. The work on this subject has suggested the following two conjectures: (i) the best achievable fast rate is of the order n^{-1}, and (ii) plug-in classifiers generally converge more slowly than classifiers based on empirical risk minimization. We show that neither conjecture is correct. In particular, we construct plug-in classifiers that can achieve not only fast, but also super-fast rates, that is, rates faster than n^{-1}. We establish minimax lower bounds showing that the obtained rates cannot be improved.
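For orientation, a plug-in classifier first estimates the regression function eta(x) = P(Y = 1 | X = x) from the sample and then classifies by comparing the estimate to the Bayes threshold 1/2. The sketch below illustrates this idea with a simple k-nearest-neighbor estimate of eta; it is only a toy stand-in for the estimators analyzed in the paper, and the function names, the choice of k, and the simulated data are ours.

    import numpy as np

    def knn_regression_estimate(X_train, y_train, x, k=15):
        # Distances from the query point x to every training point.
        dists = np.linalg.norm(X_train - x, axis=1)
        # Average the labels of the k nearest neighbors: a crude
        # estimate of eta(x) = P(Y = 1 | X = x).
        nearest = np.argsort(dists)[:k]
        return y_train[nearest].mean()

    def plugin_classifier(X_train, y_train, x, k=15):
        # Plug-in rule: predict 1 iff the estimated regression
        # function is at least the Bayes threshold 1/2.
        return int(knn_regression_estimate(X_train, y_train, x, k) >= 0.5)

    # Toy usage: labels drawn with P(Y = 1 | X) given by a logistic eta.
    rng = np.random.default_rng(0)
    X = rng.uniform(-1, 1, size=(500, 2))
    eta = 1 / (1 + np.exp(-4 * X[:, 0]))
    y = (rng.uniform(size=500) < eta).astype(int)
    print(plugin_classifier(X, y, np.array([0.5, 0.0])))  # likely 1: eta > 1/2 here

The rates studied in the paper concern how fast the excess risk of such rules, i.e., the gap between their Bayes risk and the optimal Bayes risk, decreases with the sample size n.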

Citation


Jean-Yves Audibert, Alexandre B. Tsybakov. "Fast learning rates for plug-in classifiers." Ann. Statist. 35(2): 608-633, April 2007. https://doi.org/10.1214/009053606000001217

Information

Published: April 2007
First available in Project Euclid: 5 July 2007

zbMATH: 1118.62041
MathSciNet: MR2336861
Digital Object Identifier: 10.1214/009053606000001217

Subjects:
Primary: 62G07
Secondary: 62G08, 62H05, 68T10

Keywords: classification, excess risk, fast rates of convergence, minimax lower bounds, plug-in classifiers, statistical learning

Rights: Copyright © 2007 Institute of Mathematical Statistics
