
2019 | Original Paper | Book Chapter

Deep Learning for Trivial Inverse Problems

Author: Peter Maass

Published in: Compressed Sensing and Its Applications

Publisher: Springer International Publishing


Abstract

Deep learning has produced remarkable results when applied to some of the toughest large-scale nonlinear problems, such as classification tasks in computer vision or speech recognition. Recently, deep learning has also been applied to inverse problems, in particular in medical imaging. Some of these applications are motivated by mathematical reasoning, but a solid and at least partially complete mathematical theory for understanding neural networks and deep learning is still missing. In this paper, we do not address large-scale problems but instead aim at understanding neural networks for solving some small and rather naive inverse problems. Nevertheless, the results highlight the particular complications of inverse problems; e.g., we show that a natural network design for mimicking Tikhonov regularization fails when applied to even the most trivial inverse problems. The proofs utilize basic and well-known results from the theory of statistical inverse problems. We include them in order to provide material ready to be used in student projects or in general mathematical courses on data analysis. We only assume that the reader is familiar with the standard definitions of feedforward networks, e.g., the backpropagation algorithm for training such networks. We also include, without proof, numerical experiments analyzing the influence of the network design, including comparisons with the learned iterative soft-thresholding algorithm (LISTA).
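The connection between learned linear reconstructions and Tikhonov regularization that underlies the abstract can be illustrated with a toy computation. The sketch below (illustrative only; the diagonal operator, noise level, and sample size are arbitrary choices, not taken from the chapter) trains a single linear layer by least squares on noisy data from a trivial linear inverse problem. For this Gaussian data model, the learned map approaches the classical Tikhonov inverse with regularization parameter equal to the noise variance — a basic fact from statistical inverse problems of the kind the chapter builds on.

```python
import numpy as np

rng = np.random.default_rng(0)

# Trivial linear inverse problem: recover x from y = A x + noise,
# with an ill-conditioned diagonal forward operator A.
n = 10
A = np.diag(1.0 / np.arange(1, n + 1))
sigma = 0.1  # noise level (an arbitrary illustrative choice)

# Training data: standard-normal ground truths, noisy measurements.
m = 20000
X = rng.normal(size=(m, n))
Y = X @ A.T + sigma * rng.normal(size=(m, n))

# "Network": a single linear layer W with reconstruction x_hat = y @ W,
# trained by minimizing the mean squared error ||Y W - X||^2
# (solved here in closed form instead of by gradient descent).
W = np.linalg.lstsq(Y, X, rcond=None)[0]

# Classical Tikhonov reconstruction with alpha = sigma^2; for this
# data model the best linear reconstruction coincides with it.
alpha = sigma**2
T = np.linalg.solve(A.T @ A + alpha * np.eye(n), A.T)

rel_diff = np.linalg.norm(W - T) / np.linalg.norm(T)
print(f"relative difference, learned map vs. Tikhonov: {rel_diff:.3f}")
```

With enough training samples the learned linear map is essentially the Tikhonov inverse; the phenomena analyzed in the chapter concern what happens once the network design departs from this naive linear picture.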


References
1.
J. Adler, O. Öktem, Solving ill-posed inverse problems using iterative deep neural networks. Inverse Probl. 33(12), 124007 (2017)
2.
J. Bioucas-Dias, M. Figueiredo, A new TwIST: two-step iterative shrinkage/thresholding algorithms for image restoration. IEEE Trans. Image Process. 16, 2992–3004 (2008)
4.
T. Bonesky, K. Bredies, D.A. Lorenz, P. Maass, A generalized conditional gradient method for nonlinear operator equations with sparsity constraints. Inverse Probl. 23(5), 2041 (2007)
5.
R.H. Byrd, G.M. Chin, J. Nocedal, W. Yuchen, Sample size selection in optimization methods for machine learning. Math. Program. 134(1), 127–155 (2012)
6.
Y. Chen, T. Pock, Trainable nonlinear reaction diffusion: a flexible framework for fast and effective image restoration. IEEE Trans. Pattern Anal. Mach. Intell. 39(6), 1256–1272 (2017)
7.
C. Chung Van, J.C. De los Reyes, C.B. Schoenlieb, Learning optimal spatially-dependent regularization parameters in total variation image denoising. Inverse Probl. 33(7), 074005 (2017)
9.
I. Daubechies, M. Defrise, C. De Mol, An iterative thresholding algorithm for linear inverse problems with a sparsity constraint. Commun. Pure Appl. Math. 57(11), 1413–1457 (2004)
10.
A. Edelman, B.D. Sutton, Y. Wang, Random Matrix Theory, Numerical Computation and Applications
11.
A. Edelman, N.R. Rao, Random matrix theory. Acta Numer. 14, 233–297 (2005)
13.
K. Gregor, Y. LeCun, Learning fast approximations of sparse coding, in Proceedings of the 27th International Conference on Machine Learning, ICML'10 (Omnipress, USA, 2010), pp. 399–406
14.
A. Hauptmann, F. Lucka, M. Betcke, N. Huynh, J. Adler, B. Cox, P. Beard, S. Ourselin, S. Arridge, Model based learning for accelerated, limited-view 3D photoacoustic tomography. IEEE Trans. Med. Imaging (2018). In press
15.
B. Jin, P. Maass, Sparsity regularization for parameter identification problems. Inverse Probl. 28(12), 123001 (2012)
16.
J. Kaipio, E. Somersalo, Statistical and Computational Inverse Problems (Springer, 2005)
17.
18.
Y. LeCun, Y. Bengio, G. Hinton, Deep learning. Nature 521(7553), 436–444 (2015)
20.
J. Martens, I. Sutskever, Training Deep and Recurrent Networks with Hessian-Free Optimization (Springer, Berlin, Heidelberg, 2012), pp. 479–535
21.
J.L. Mueller, S. Siltanen, Linear and Nonlinear Inverse Problems with Practical Applications (SIAM, 2012)
22.
D.E. Rumelhart, G.E. Hinton, R.J. Williams, Neurocomputing: Foundations of Research. Chapter Learning Representations by Back-propagating Errors (MIT Press, Cambridge, MA, USA, 1988), pp. 696–699
23.
M. Unser, A representer theorem for deep neural networks. ArXiv e-prints (2018)
24.
R. van Handel, On the spectral norm of Gaussian random matrices. ArXiv e-prints (2015)
Metadata
Title
Deep Learning for Trivial Inverse Problems
Author
Peter Maass
Copyright Year
2019
DOI
https://doi.org/10.1007/978-3-319-73074-5_6