Published in: Neural Processing Letters 2/2014

01.04.2014

Approximation and Estimation Bounds for Subsets of Reproducing Kernel Kreǐn Spaces

Author: Giorgio Gnecco



Abstract

Reproducing kernel Kreǐn spaces are used in learning from data via kernel methods when the kernel is indefinite. In this paper, a characterization of a subset of the unit ball in such spaces is provided. Conditions are given under which upper bounds on the estimation error and the approximation error apply simultaneously to such a subset. Finally, it is shown that the hyperbolic-tangent kernel and other indefinite kernels satisfy these conditions.
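The indefiniteness of the hyperbolic-tangent kernel mentioned in the abstract can be checked numerically: for some parameter choices, its Gram matrix has a negative eigenvalue, so the kernel is not positive semidefinite. A minimal sketch (the kernel parameters `a` and `b` below are illustrative choices, not values taken from the paper):

```python
import numpy as np

# Hyperbolic-tangent (sigmoid) kernel k(x, y) = tanh(a * x * y + b).
# The parameters a and b are illustrative, not taken from the paper.
def tanh_kernel(x, y, a=1.0, b=-1.0):
    return np.tanh(a * x * y + b)

# Gram matrix on a few one-dimensional sample points.
points = np.array([0.0, 1.0, 2.0])
gram = tanh_kernel(points[:, None], points[None, :])

# For a positive-semidefinite kernel, every Gram matrix has only
# nonnegative eigenvalues; here k(0, 0) = tanh(-1) < 0 already forces
# a negative eigenvalue, so this kernel is indefinite.
print(np.linalg.eigvalsh(gram).min() < 0)  # → True
```

With other parameter choices (e.g. `a > 0`, `b >= 0` on suitable point sets) the Gram matrix can be positive semidefinite, which is why conditions such as those studied in the paper are needed rather than a blanket classification.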


Footnotes
1
By a domain, we mean the closure of an open and connected set.
 
2
For \(d=1\), such a set \(\varOmega \) is a closed and bounded interval.
 
3
Note that [43, Theorem 8] actually provides an upper bound of order \(O(d^4)\) on the VC dimension of the family of binary-valued functions obtained by thresholding the functions in \(\mathcal{F}\). However, an upper bound of the same order \(O(d^4)\) holds for the VC dimension of the real-valued family \(\mathcal{F}\) itself, by the relationship between the VC dimension of a family of real-valued functions and that of the family of binary-valued functions obtained by adding a bias unit and thresholding the resulting output [6, Sect. 3.6].
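The correspondence invoked in this footnote can be sketched as follows (a standard formalization in the spirit of [6, Sect. 3.6], not quoted from the paper): the VC dimension of a real-valued family \(\mathcal{F}\) is defined via the binary-valued family obtained by adding a bias and thresholding,

\[
\mathrm{VCdim}(\mathcal{F}) \;=\; \mathrm{VCdim}\bigl(\{\, (x,b) \mapsto \mathrm{sign}(f(x)+b) \;:\; f \in \mathcal{F},\ b \in \mathbb{R} \,\}\bigr),
\]

so an \(O(d^4)\) bound on the thresholded class transfers to an \(O(d^4)\) bound for \(\mathcal{F}\).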
 
References
1. Shawe-Taylor J, Cristianini N (2004) Kernel methods for pattern analysis. Cambridge University Press, Cambridge
3. Schölkopf B, Smola A (2002) Learning with kernels. MIT Press, Cambridge
4. Bishop CM (1995) Neural networks for pattern recognition. Oxford University Press, Oxford
5. Vapnik VP (1998) Statistical learning theory. Springer, Heidelberg
6.
7. Mendelson S (2003) A few notes on statistical learning theory. Advanced lectures on machine learning. Springer, New York, pp 1–40
8. Smola AJ, Óvári ZL, Williamson RC (2001) Regularization with dot-product kernels. In: Leen T, Dietterich T, Tresp V (eds) Proceedings of neural information processing systems 13, pp 308–314
9. Lin HT, Lin CJ (2003) A study on sigmoid kernels for SVM and the training of non-PSD kernels by SMO-type methods. Technical report, National Taiwan University
10. Ramon J, Gärtner T (2003) Expressivity versus efficiency of graph kernels. In: Washio T, De Raedt L (eds) Proceedings of 1st international workshop on mining graphs, trees and sequences, pp 65–74
11. Borgwardt KM, Kriegel HP (2005) Shortest path kernels on graphs. In: Proceedings of 5th IEEE international conference on data mining, Washington, pp 74–81
12. Haasdonk B, Keysers D (2002) Tangent distance kernels for support vector machines. In: Proceedings of 16th international conference on pattern recognition, pp 864–868
13. Saigo H, Vert J, Ueda N, Akutsu T (2004) Protein homology detection using string alignment kernels. Bioinformatics 20:1682–1689
14. Wu G, Chang EY, Zhang Z (2005) An analysis of transformation on non-positive semidefinite similarity matrix for kernel machines. In: Proceedings of 22nd international conference on machine learning, pp 315–322
15. Haasdonk B, Bahlmann C (2004) Learning with distance substitution kernels. In: Proceedings of 26th DAGM symposium on pattern recognition, pp 220–227
16. Luss R, d’Aspremont A (2007) Support vector machine classification with indefinite kernels. In: Proceedings of NIPS 2007, pp 1–9
17. Chen J, Ye J (2008) Training SVM with indefinite kernels. In: Proceedings of 25th international conference on machine learning, New York, pp 136–143
18. Ying Y, Campbell C, Girolami M (2009) Analysis of SVM with indefinite kernels. In: Proceedings of NIPS 2009, Vancouver, pp 2205–2213
19.
20. Haasdonk B (2005) Feature space interpretation of SVMs with indefinite kernels. IEEE Trans Pattern Anal Mach Intell 27(4):482–492
21. Liwicki S, Zafeiriou S, Tzimiropoulos G, Pantic M (2012) Efficient online subspace learning with an indefinite kernel for visual tracking and recognition. IEEE Trans Neural Netw Learn Syst 3:1624–1636
22. Bartlett PL, Mendelson S (2002) Rademacher and Gaussian complexities: risk bounds and structural results. J Mach Learn Res 3:463–482
23. Ong CS, Mary X, Canu S, Smola AJ (2004) Learning with non positive kernels. In: Proceedings of 21st international conference on machine learning, pp 639–646
24. Gnecco G, Sanguineti M (2008) Approximation error bounds via Rademacher’s complexity. Appl Math Sci 2:153–176
25. Gnecco G, Sanguineti M (2008) Estimates of the approximation error via Rademacher complexity: learning vector-valued functions. J Inequal Appl 2008(640758):16
26. Gnecco G, Sanguineti M (2010) Regularization techniques and suboptimal solutions to optimization problems in learning from data. Neural Comput 22:793–829
27. Gnecco G, Gori M, Sanguineti M (2013) Learning with boundary conditions. Neural Comput 25 (in press)
28. Anguita D, Ghio A, Oneto L, Ridella S (2011) Maximal discrepancy vs. Rademacher complexity for error estimation. In: Proceedings of ESANN 2011, pp 257–262
29. Anguita D, Ghio A, Oneto L, Ridella S (2012) In-sample model selection for trimmed hinge loss support vector machine. Neural Process Lett 36:275–283
30. Friedman A (1992) Foundations of modern analysis. Dover, New York
32. Birman MS, Solomjak MZ (1987) Spectral theory of self-adjoint operators in Hilbert space. D. Reidel Publishing Company, Dordrecht
33. Tricomi FG (1985) Integral equations. Dover, New York
34. Smirnov VI (1964) A course of higher mathematics: integration and functional analysis, vol 5. Addison-Wesley, Reading
35. Girosi F (1995) Approximation error bounds that use VC-bounds. In: Proceedings of international conference on artificial neural networks, pp 295–302
36. Barron AR (1992) Neural net approximation. In: Proceedings of 7th Yale workshop on adaptive and learning systems, New Haven, pp 69–72
37. Barron AR (1993) Universal approximation bounds for superpositions of a sigmoidal function. IEEE Trans Inf Theory 39:930–945
38. Kůrková V, Kainen PC, Kreinovich V (1997) Estimates of the number of hidden units and variation with respect to half-spaces. Neural Netw 10:1061–1068
39. Tsybakov AB (2008) Introduction to nonparametric estimation. Springer, New York
40. Cucker F, Zhou DX (2007) Learning theory: an approximation theory viewpoint. Cambridge University Press, Cambridge
41. Loosli G, Canu S (2011) Non positive SVM. In: OPT NIPS Workshop, pp 1–6
42. Schwabik S, Ye G (2005) Topics in Banach space integration. World Scientific, Singapore
43. Bartlett PL, Maass W (2003) Vapnik–Chervonenkis dimension of neural nets. In: Arbib MA (ed) The handbook of brain theory and neural networks, 2nd edn. MIT Press, Cambridge, pp 1188–1192
Metadata
Title
Approximation and Estimation Bounds for Subsets of Reproducing Kernel Kreǐn Spaces
Author
Giorgio Gnecco
Publication date
01.04.2014
Publisher
Springer US
Published in
Neural Processing Letters / Issue 2/2014
Print ISSN: 1370-4621
Electronic ISSN: 1573-773X
DOI
https://doi.org/10.1007/s11063-013-9294-9
