Published in: Neural Processing Letters 2/2014

01.04.2014

Generalization Bounds of Regularization Algorithm with Gaussian Kernels

Authors: Feilong Cao, Yufang Liu, Weiguo Zhang


Abstract

In many practical applications, the performance of a learning algorithm is not determined by a single factor such as the complexity of the hypothesis space, the stability of the algorithm, or the quality of the data. This paper addresses the performance of the regularization algorithm associated with Gaussian kernels. The main purpose is to provide a framework for evaluating the generalization performance of the algorithm jointly in terms of hypothesis space complexity, algorithmic stability, and data quality. New bounds on the generalization error of the algorithm, measured by the regularization error and the sample error, are established. It is shown that the regularization error decays polynomially under certain conditions, and that the new bounds draw on the uniform stability of the algorithm, the covering number of the hypothesis space, and the data information simultaneously. As an application, the obtained results are applied to several special regularization algorithms, and some new results for these algorithms are deduced.
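The regularization algorithm the abstract refers to is, in its standard form, Tikhonov-regularized least squares in the reproducing kernel Hilbert space of a Gaussian kernel (kernel ridge regression). A minimal sketch of that setup follows; the function and parameter names (`fit_regularized`, `lam`, `sigma`) are illustrative choices, and the paper's exact loss and parameter schedule may differ:

```python
import numpy as np

def gaussian_kernel(X, Z, sigma=1.0):
    """Gaussian kernel matrix: K[i, j] = exp(-||x_i - z_j||^2 / sigma^2)."""
    d2 = ((X[:, None, :] - Z[None, :, :]) ** 2).sum(axis=2)
    return np.exp(-d2 / sigma**2)

def fit_regularized(X, y, lam=1e-2, sigma=1.0):
    """Tikhonov regularization in the RKHS of the Gaussian kernel:
        minimize (1/n) sum_i (f(x_i) - y_i)^2 + lam * ||f||_K^2.
    By the representer theorem f(x) = sum_i alpha_i K(x, x_i),
    with alpha solving (K + lam * n * I) alpha = y."""
    n = len(X)
    K = gaussian_kernel(X, X, sigma)
    return np.linalg.solve(K + lam * n * np.eye(n), y)

def predict(X_train, alpha, X_new, sigma=1.0):
    """Evaluate the regularized estimator at new inputs."""
    return gaussian_kernel(X_new, X_train, sigma) @ alpha
```

In the paper's terms, `lam` controls the trade-off that the regularization error (bias from penalizing the RKHS norm) and the sample error (variance from finite data) quantify, while `sigma` fixes the Gaussian hypothesis space whose covering number enters the bounds.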


Metadata
Title
Generalization Bounds of Regularization Algorithm with Gaussian Kernels
Authors
Feilong Cao
Yufang Liu
Weiguo Zhang
Publication date
01.04.2014
Publisher
Springer US
Published in
Neural Processing Letters / Issue 2/2014
Print ISSN: 1370-4621
Electronic ISSN: 1573-773X
DOI
https://doi.org/10.1007/s11063-013-9298-5
