Published in: Neural Processing Letters 2/2014

01-04-2014

Approximation and Estimation Bounds for Subsets of Reproducing Kernel Kreǐn Spaces

Author: Giorgio Gnecco

Abstract

Reproducing kernel Kreǐn spaces are used in learning from data via kernel methods when the kernel is indefinite. In this paper, a characterization of a subset of the unit ball in such spaces is provided. Conditions are given, under which upper bounds on the estimation error and the approximation error can be applied simultaneously to such a subset. Finally, it is shown that the hyperbolic-tangent kernel and other indefinite kernels satisfy such conditions.
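As an illustration of the abstract's last claim, the following sketch (not from the paper; the scalar sample points are arbitrary illustrative choices) checks numerically that the hyperbolic-tangent kernel \(k(x,y)=\tanh(xy)\) can generate a Gram matrix with a negative eigenvalue, i.e. that it is indefinite and hence does not induce a reproducing kernel Hilbert space:

```python
# Minimal sketch (assumption: k(x, y) = tanh(x * y) on scalar inputs,
# with sample points chosen for illustration, not taken from the paper).
import numpy as np

def tanh_kernel(x, y):
    """Hyperbolic-tangent (sigmoid) kernel."""
    return np.tanh(x * y)

x = np.array([0.1, 10.0])                  # two scalar sample points
G = tanh_kernel(x[:, None], x[None, :])    # 2x2 Gram matrix
eigs = np.linalg.eigvalsh(G)               # eigenvalues in ascending order

print(eigs)  # the smallest eigenvalue is negative: k is not PSD
```

Here the Gram matrix has trace \(\approx 1.01\) but determinant \(\tanh(0.01)\tanh(100)-\tanh(1)^2 \approx -0.57 < 0\), so one eigenvalue is negative; such a kernel therefore lives in a reproducing kernel Kreǐn space rather than a Hilbert space.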


Footnotes
1
By a domain, we mean the closure of an open and connected set.
 
2
For \(d=1\), such a set \(\varOmega \) is a closed and bounded interval.
 
3
Note that [43, Theorem 8] actually provides an upper bound of order \(O(d^4)\) on the VC dimension of the family of binary-valued functions obtained by thresholding the functions in \(\mathcal{F}\). However, an upper bound of the same order \(O(d^4)\) holds for the VC dimension of the real-valued family \(\mathcal{F}\) itself, by the known relationship between the VC dimension of a family of real-valued functions and that of the family of binary-valued functions obtained by adding a bias unit and thresholding the resulting output [6, Sect. 3.6].
 
Literature
1. Shawe-Taylor J, Cristianini N (2004) Kernel methods for pattern analysis. Cambridge University Press, Cambridge
3. Schölkopf B, Smola A (2002) Learning with kernels. MIT Press, Cambridge
4. Bishop CM (1995) Neural networks for pattern recognition. Oxford University Press, Oxford
5. Vapnik VP (1998) Statistical learning theory. Springer, Heidelberg
7. Mendelson S (2003) A few notes on statistical learning theory. Advanced lectures on machine learning. Springer, New York, pp 1–40
8. Smola AJ, Óvári ZL, Williamson RC (2001) Regularization with dot-product kernels. In: Leen T, Dietterich T, Tresp V (eds) Proceedings of neural information processing systems 13, pp 308–314
9. Lin HT, Lin CJ (2003) A study on sigmoid kernels for SVM and the training of non-PSD kernels by SMO-type methods. Technical report, National Taiwan University
10. Ramon J, Gärtner T (2003) Expressivity versus efficiency of graph kernels. In: Washio T, De Raedt L (eds) Proceedings of 1st international workshop on mining graphs, trees and sequences, pp 65–74
11. Borgwardt KM, Kriegel HP (2005) Shortest path kernels on graphs. In: Proceedings of 5th IEEE international conference on data mining, Washington, pp 74–81
12. Haasdonk B, Keysers D (2002) Tangent distance kernels for support vector machines. In: Proceedings of 16th international conference on pattern recognition, pp 864–868
13. Saigo H, Vert J, Ueda N, Akutsu T (2004) Protein homology detection using string alignment kernels. Bioinformatics 20:1682–1689
14. Wu G, Chang EY, Zhang Z (2005) An analysis of transformation on non-positive semidefinite similarity matrix for kernel machines. In: Proceedings of 22nd international conference on machine learning, pp 315–322
15. Haasdonk B, Bahlmann C (2004) Learning with distance substitution kernels. In: Proceedings of 26th DAGM symposium on pattern recognition, pp 220–227
16. Luss R, d'Aspremont A (2007) Support vector machine classification with indefinite kernels. In: Proceedings of NIPS 2007, pp 1–9
17. Chen J, Ye J (2008) Training SVM with indefinite kernels. In: Proceedings of 25th international conference on machine learning, New York, pp 136–143
18. Ying Y, Campbell C, Girolami M (2009) Analysis of SVM with indefinite kernels. In: Proceedings of NIPS 2009, Vancouver, pp 2205–2213
19.
20. Haasdonk B (2005) Feature space interpretation of SVMs with indefinite kernels. IEEE Trans Pattern Anal Mach Intell 27(4):482–492
21. Liwicki S, Zafeiriou S, Tzimiropoulos G, Pantic M (2012) Efficient online subspace learning with an indefinite kernel for visual tracking and recognition. IEEE Trans Neural Netw Learn Syst 3:1624–1636
22. Bartlett PL, Mendelson S (2002) Rademacher and Gaussian complexities: risk bounds and structural results. J Mach Learn Res 3:463–482
23. Ong CS, Mary X, Canu S, Smola AJ (2004) Learning with non-positive kernels. In: Proceedings of 21st international conference on machine learning, pp 639–646
24. Gnecco G, Sanguineti M (2008) Approximation error bounds via Rademacher's complexity. Appl Math Sci 2:153–176
25. Gnecco G, Sanguineti M (2008) Estimates of the approximation error via Rademacher complexity: learning vector-valued functions. J Inequal Appl 2008(640758):16
26. Gnecco G, Sanguineti M (2010) Regularization techniques and suboptimal solutions to optimization problems in learning from data. Neural Comput 22:793–829
27. Gnecco G, Gori M, Sanguineti M (2013) Learning with boundary conditions. Neural Comput 25 (in press)
28. Anguita D, Ghio A, Oneto L, Ridella S (2011) Maximal discrepancy vs. Rademacher complexity for error estimation. In: Proceedings of ESANN 2011, pp 257–262
29. Anguita D, Ghio A, Oneto L, Ridella S (2012) In-sample model selection for trimmed hinge loss support vector machine. Neural Process Lett 36:275–283
30. Friedman A (1992) Foundations of modern analysis. Dover, New York
32. Birman MS, Solomjak MZ (1987) Spectral theory of self-adjoint operators in Hilbert space. D. Reidel Publishing Company, Dordrecht
33. Tricomi FG (1985) Integral equations. Dover, New York
34. Smirnov VI (1964) A course of higher mathematics: integration and functional analysis, vol 5. Addison-Wesley, Reading
35. Girosi F (1995) Approximation error bounds that use VC-bounds. In: Proceedings of international conference on artificial neural networks, pp 295–302
36. Barron AR (1992) Neural net approximation. In: Proceedings of 7th Yale workshop on adaptive and learning systems, New Haven, pp 69–72
37.
38. Kůrková V, Kainen PC, Kreinovich V (1997) Estimates of the number of hidden units and variation with respect to half-spaces. Neural Netw 10:1061–1068
39. Tsybakov AB (2008) Introduction to nonparametric estimation. Springer, New York
40. Cucker F, Zhou DX (2007) Learning theory: an approximation theory viewpoint. Cambridge University Press, Cambridge
41. Loosli G, Canu S (2011) Non positive SVM. In: OPT NIPS workshop, pp 1–6
42. Schwabik S, Ye G (2005) Topics in Banach space integration. World Scientific, Singapore
43. Bartlett PL, Maass W (2003) Vapnik–Chervonenkis dimension of neural nets. In: Arbib MA (ed) The handbook of brain theory and neural networks, 2nd edn. MIT Press, Cambridge, pp 1188–1192
Metadata
Title
Approximation and Estimation Bounds for Subsets of Reproducing Kernel Kreǐn Spaces
Author
Giorgio Gnecco
Publication date
01-04-2014
Publisher
Springer US
Published in
Neural Processing Letters / Issue 2/2014
Print ISSN: 1370-4621
Electronic ISSN: 1573-773X
DOI
https://doi.org/10.1007/s11063-013-9294-9
