
2021 | OriginalPaper | Book Chapter

Uncertainty Quantification in Data Fitting Neural and Hilbert Networks


Abstract

We analyze the uncertainties affecting Neural and Hilbert networks used for data fitting. The first source of uncertainty is the variability induced by the random partitioning of the data into subsets for training, testing and validation. The analysis examines the variability of both the performance and the weights of the networks. The effects of additive and multiplicative noise in the data are also studied, and the results obtained with Neural and Hilbert Networks are compared. Hilbert Networks appear to be more robust, but at a higher computational cost. The distributions of the outputs are studied, and it appears that their means and modes may be used to improve the estimates furnished by the networks. Finally, the extension of Hilbert Networks to Element Based Networks is considered.
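To make the kind of analysis summarized above concrete, the following is a minimal sketch, assuming a generic regression setting with a standard scikit-learn multilayer perceptron rather than the chapter's own Neural or Hilbert network implementation: the data are repeatedly repartitioned at random into training, testing and validation subsets, additive and multiplicative noise is injected, and the mean and mode of the resulting output distribution at a query point are computed. The target function, noise levels, network size and query point are illustrative assumptions, not values taken from the chapter.

```python
# Minimal illustrative sketch (not the chapter's code): variability induced by the random
# partitioning of the data into training/testing/validation subsets and by additive and
# multiplicative noise, for a small data-fitting neural network.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)

# Synthetic data-fitting problem (illustrative target function).
x = np.linspace(0.0, 1.0, 200).reshape(-1, 1)
y_clean = np.sin(2.0 * np.pi * x).ravel()

sigma_add, sigma_mul = 0.05, 0.02      # assumed additive / multiplicative noise levels
x_query = np.array([[0.3]])            # point at which the output distribution is examined

outputs, val_errors = [], []
for rep in range(50):                  # repeat the random partition many times
    # Perturb the observations with multiplicative and additive noise.
    y = y_clean * (1.0 + sigma_mul * rng.standard_normal(y_clean.shape)) \
        + sigma_add * rng.standard_normal(y_clean.shape)

    # 60% training, 20% testing, 20% validation.
    x_tr, x_rest, y_tr, y_rest = train_test_split(x, y, test_size=0.4, random_state=rep)
    x_te, x_val, y_te, y_val = train_test_split(x_rest, y_rest, test_size=0.5, random_state=rep)

    net = MLPRegressor(hidden_layer_sizes=(20,), max_iter=5000, random_state=rep)
    net.fit(x_tr, y_tr)

    outputs.append(net.predict(x_query)[0])
    val_errors.append(np.sqrt(np.mean((net.predict(x_val) - y_val) ** 2)))

outputs = np.asarray(outputs)
# Mean and (histogram-based) mode of the output distribution as alternative point estimates.
counts, edges = np.histogram(outputs, bins=10)
mode_estimate = 0.5 * (edges[np.argmax(counts)] + edges[np.argmax(counts) + 1])
print(f"output at x=0.3: mean={outputs.mean():.4f}, std={outputs.std():.4f}, mode~{mode_estimate:.4f}")
print(f"validation RMSE: mean={np.mean(val_errors):.4f}, std={np.std(val_errors):.4f}")
```

Repeating the loop yields empirical distributions of both the performance measure and the network output; summaries of such distributions (means, modes, spreads) are the kind of information the chapter uses to compare the variability and robustness of Neural and Hilbert Networks.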


Metadata
Title
Uncertainty Quantification in Data Fitting Neural and Hilbert Networks
Authors
Leila Khalij
Eduardo Souza de Cursi
Copyright year
2021
DOI
https://doi.org/10.1007/978-3-030-53669-5_17
