Open Access
Optimal regression rates for SVMs using Gaussian kernels
Mona Eberts, Ingo Steinwart
Electron. J. Statist. 7: 1-42 (2013). DOI: 10.1214/12-EJS760

Abstract

Support vector machines (SVMs) using Gaussian kernels are one of the standard and state-of-the-art learning algorithms. In this work, we establish new oracle inequalities for such SVMs when applied to either least squares or conditional quantile regression. With the help of these oracle inequalities we then derive learning rates that are (essentially) minimax optimal under standard smoothness assumptions on the target function. We further utilize the oracle inequalities to show that these learning rates can be adaptively achieved by a simple data-dependent parameter selection method that splits the data set into a training and a validation set.
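For illustration, the split-based parameter selection described in the abstract can be sketched as follows for the least squares case, where the Gaussian-kernel SVM coincides with kernel ridge regression. All names, the toy data, the kernel-width parameterization, and the candidate grids below are illustrative assumptions, not taken from the paper:

```python
import numpy as np

def gaussian_kernel(X1, X2, width):
    # Gaussian (RBF) kernel matrix with width parameter `width`
    # (the paper's kernel-width role; exact parameterization assumed here).
    d2 = ((X1[:, None, :] - X2[None, :, :]) ** 2).sum(axis=-1)
    return np.exp(-d2 / width**2)

def fit_ls_svm(X, y, lam, width):
    # Least squares SVM / kernel ridge: solve (K + lam * n * I) alpha = y.
    n = len(X)
    K = gaussian_kernel(X, X, width)
    return np.linalg.solve(K + lam * n * np.eye(n), y)

def predict(X_train, alpha, width, X_new):
    return gaussian_kernel(X_new, X_train, width) @ alpha

# Toy data: noisy sine curve, split into a training and a validation half.
rng = np.random.default_rng(0)
X = rng.uniform(0, 1, size=(200, 1))
y = np.sin(2 * np.pi * X[:, 0]) + 0.1 * rng.normal(size=200)
X_tr, y_tr = X[:100], y[:100]
X_val, y_val = X[100:], y[100:]

# Train one estimator per candidate (lambda, width) pair on the training
# half; keep the pair with the smallest validation mean squared error.
best = None
for lam in [1e-4, 1e-3, 1e-2]:
    for width in [0.05, 0.1, 0.5]:
        alpha = fit_ls_svm(X_tr, y_tr, lam, width)
        mse = np.mean((predict(X_tr, alpha, width, X_val) - y_val) ** 2)
        if best is None or mse < best[0]:
            best = (mse, lam, width)

print("validation MSE, lambda, width:", best)
```

The paper's adaptivity result concerns exactly this kind of rule: the pair minimizing validation error attains (essentially) the optimal learning rate without knowing the target's smoothness in advance.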

Citation


Mona Eberts, Ingo Steinwart. "Optimal regression rates for SVMs using Gaussian kernels." Electron. J. Statist. 7: 1-42, 2013. https://doi.org/10.1214/12-EJS760

Information

Published: 2013
First available in Project Euclid: 11 January 2013

zbMATH: 1337.62073
MathSciNet: MR3020412
Digital Object Identifier: 10.1214/12-EJS760

Subjects:
Primary: 62G08
Secondary: 62G05, 68Q32, 68T05

Keywords: least squares regression, quantile estimation, support vector machines

Rights: Copyright © 2013 The Institute of Mathematical Statistics and the Bernoulli Society
