
01-12-2012 | Original Article

Learning rates of least-square regularized regression with strongly mixing observation

Authors: Yongquan Zhang, Feilong Cao, Canwei Yan

Published in: International Journal of Machine Learning and Cybernetics | Issue 4/2012


Abstract

This paper considers the regularized learning algorithm associated with the least-square loss, strongly mixing observations, and reproducing kernel Hilbert spaces. We first bound the sample error for exponentially strongly mixing observations and establish the rate of approximation by means of a Jackson-type approximation theorem for exponentially strongly mixing sequences. The generalization error of least-square regularized regression is then obtained by combining the estimates of the sample error and the regularization error.
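
For concreteness, the regularization scheme described in the abstract is, in its standard form, the Tikhonov-regularized least-squares estimator over a reproducing kernel Hilbert space. The sketch below uses the usual notation (sample z, hypothesis space H_K, regularization parameter λ, regression function f_ρ), which is assumed here rather than quoted from the paper:

% Regularized least-squares estimator over an RKHS \mathcal{H}_K
% (standard formulation; notation assumed, not taken from the paper).
% \mathbf{z} = \{(x_i, y_i)\}_{i=1}^{m} is the (strongly mixing) sample,
% \lambda > 0 is the regularization parameter.
f_{\mathbf{z},\lambda}
  = \operatorname*{arg\,min}_{f \in \mathcal{H}_K}
    \left\{ \frac{1}{m} \sum_{i=1}^{m} \bigl( f(x_i) - y_i \bigr)^2
            + \lambda \, \lVert f \rVert_K^2 \right\}

The analysis outlined in the abstract then follows the usual decomposition of the excess risk into a sample error and a regularization (approximation) error,

\mathcal{E}(f_{\mathbf{z},\lambda}) - \mathcal{E}(f_\rho)
  \le \text{(sample error)} + D(\lambda),

where the mixing condition enters through the bound on the sample error and D(λ) denotes the regularization error.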

Metadata
Title: Learning rates of least-square regularized regression with strongly mixing observation
Authors: Yongquan Zhang, Feilong Cao, Canwei Yan
Publication date: 01-12-2012
Publisher: Springer-Verlag
Published in: International Journal of Machine Learning and Cybernetics, Issue 4/2012
Print ISSN: 1868-8071
Electronic ISSN: 1868-808X
DOI: https://doi.org/10.1007/s13042-011-0058-4
