Published in: Foundations of Computational Mathematics | Issue 4/2018

20 June 2017

Optimal Rates for Regularization of Statistical Inverse Learning Problems

Authors: Gilles Blanchard, Nicole Mücke


Abstract

We consider a statistical inverse learning (also called inverse regression) problem, where we observe the image of a function f through a linear operator A at i.i.d. random design points \(X_i\), superposed with additive noise. The distribution of the design points is unknown and can be very general. We simultaneously analyze the direct (estimation of Af) and inverse (estimation of f) learning problems. In this general framework, we obtain strong and weak minimax optimal rates of convergence (as the number of observations n grows large) for a large class of spectral regularization methods over regularity classes defined through appropriate source conditions. This improves on or completes previous results obtained in related settings. The optimality of the obtained rates is shown not only in the exponent in n but also in the explicit dependence of the constant factor on the noise variance and on the radius of the source condition set.
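
To make the setting concrete, the observation model described above can be written out as follows. This is a standard formulation consistent with the abstract; the noise symbols \(\varepsilon_i\), \(\sigma^2\) and the power-type form of the source condition below are our illustrative notation, not quotations from the paper:

\[ Y_i = (A f)(X_i) + \varepsilon_i, \qquad i = 1, \dots, n, \]

where the pairs \((X_i, \varepsilon_i)\) are i.i.d., the design points \(X_i\) follow an unknown distribution \(\nu\), \(\mathbb{E}[\varepsilon_i \mid X_i] = 0\), and the noise variance is bounded by \(\sigma^2\). The direct problem asks for an estimate of \(Af\); the inverse problem asks for an estimate of \(f\) itself. A regularity class of the kind the abstract refers to is commonly encoded by a source condition of the form

\[ \Omega(r, R) := \bigl\{ f = B^{r} u \,:\, \Vert u \Vert \le R \bigr\}, \]

where \(B\) is a self-adjoint positive operator built from \(A\) and \(\nu\) (for instance \(B = A^{*}A\)), \(r > 0\) measures smoothness, and \(R > 0\) is the radius of the source condition set which, together with \(\sigma^2\), enters the constant factor of the minimax rates.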


Footnotes
1
This can be extended to the case where g is only approximated in \(L^2(\nu )\) by a sequence of functions in \({\mathcal H}_K\). For the sake of the present discussion, only the case where \(g \in {\mathcal H}_K\) is assumed is of interest.
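
Spelled out (in our notation, not the paper's), the weaker approximation assumption mentioned in this footnote would require only that \(g\) lie in the \(L^2(\nu)\)-closure of \({\mathcal H}_K\), i.e. that there exist \(g_m \in {\mathcal H}_K\) with

\[ \Vert g_m - g \Vert_{L^2(\nu)} \xrightarrow[m \to \infty]{} 0, \]

which is strictly weaker than membership \(g \in {\mathcal H}_K\) whenever \({\mathcal H}_K\) is not closed in \(L^2(\nu)\).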
 
Metadata
Title
Optimal Rates for Regularization of Statistical Inverse Learning Problems
Authors
Gilles Blanchard
Nicole Mücke
Publication date
20 June 2017
Publisher
Springer US
Published in
Foundations of Computational Mathematics / Issue 4/2018
Print ISSN: 1615-3375
Electronic ISSN: 1615-3383
DOI
https://doi.org/10.1007/s10208-017-9359-7
