
2003 | Original Paper | Chapter

Learning with Rigorous Support Vector Machines

Authors: Jinbo Bi, Vladimir N. Vapnik

Published in: Learning Theory and Kernel Machines

Publisher: Springer Berlin Heidelberg


We examine the so-called rigorous support vector machine (RSVM) approach proposed by Vapnik (1998). The RSVM formulation is derived by explicitly implementing the structural risk minimization principle, with a parameter H that directly controls the VC dimension of the set of separating hyperplanes. By optimizing the dual problem, RSVM finds the optimal separating hyperplane from a set of functions with VC dimension approximately H² + 1. For appropriate parameter choices, RSVM produces classifiers equivalent to those obtained by classic SVMs, but the parameter H facilitates model selection, so VC bounds on the generalization risk can be minimized more effectively. In our empirical studies, good models are achieved for an appropriate H² ∈ [5% ℓ, 30% ℓ], where ℓ is the size of the training data.
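
To make the capacity-control idea concrete, here is a minimal sketch of an RSVM-style training problem. It assumes a primal form in which the norm bound ||w|| ≤ H is imposed as an explicit constraint and the total hinge slack is minimized; the chapter itself works through the dual problem, and the helper name rsvm_fit, the use of cvxpy as a generic convex solver, and the toy data are illustrative assumptions, not the authors' implementation.

import numpy as np
import cvxpy as cp

def rsvm_fit(X, y, H):
    # Hypothetical RSVM-style primal: minimize total slack while the
    # capacity of the hyperplane set is controlled directly by ||w|| <= H,
    # rather than by trading margin against error via a soft-margin C.
    n, d = X.shape
    w = cp.Variable(d)
    b = cp.Variable()
    xi = cp.Variable(n, nonneg=True)
    constraints = [
        cp.multiply(y, X @ w + b) >= 1 - xi,  # margin constraints with slack
        cp.norm(w, 2) <= H,                   # explicit capacity constraint
    ]
    cp.Problem(cp.Minimize(cp.sum(xi)), constraints).solve()
    return w.value, b.value

# Toy usage: two Gaussian clusters scaled into the unit ball, so the
# abstract's heuristic "VC dimension ~ H^2 + 1" is meaningful.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(-1, 0.3, (20, 2)), rng.normal(1, 0.3, (20, 2))])
X /= np.linalg.norm(X, axis=1).max()
y = np.array([-1.0] * 20 + [1.0] * 20)
w, b = rsvm_fit(X, y, H=2.0)

Under this reading, model selection amounts to scanning H so that H² falls in the abstract's empirically good range of roughly 5% to 30% of the training-set size ℓ, instead of searching over a soft-margin parameter C.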

Metadata
Title: Learning with Rigorous Support Vector Machines
Authors: Jinbo Bi, Vladimir N. Vapnik
Copyright Year: 2003
Publisher: Springer Berlin Heidelberg
DOI: https://doi.org/10.1007/978-3-540-45167-9_19
