
2016 | Original Paper | Book Chapter

6. Regression

Author: Thomas A. Runkler

Published in: Data Analytics

Publisher: Springer Fachmedien Wiesbaden

Abstract

Regression estimates functional dependencies between features. Linear regression models can be computed efficiently from covariances but are restricted to linear dependencies. Substitution allows us to identify specific nonlinear dependencies by linear regression. Robust regression finds models that are robust against outliers. Universal approximators form a popular family of nonlinear regression methods. We present two well-known universal approximators from the field of artificial neural networks: the multilayer perceptron and radial basis function networks. Universal approximators can realize arbitrarily small training errors, but cross-validation is required to find models with low validation errors that generalize well to other data sets. Feature selection allows us to include only the relevant features in regression models, leading to more accurate models.
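The first two ideas of the abstract — computing a linear regression model from covariances, and handling a specific nonlinear dependency by substituting a transformed feature — can be sketched in a few lines of NumPy. This is a minimal illustration under our own naming, not the chapter's code:

```python
import numpy as np

def linear_regression(x, y):
    """Fit y ≈ a*x + b from covariances:
    a = cov(x, y) / var(x), b = mean(y) - a * mean(x)."""
    a = np.cov(x, y, bias=True)[0, 1] / np.var(x)
    b = y.mean() - a * x.mean()
    return a, b

rng = np.random.default_rng(0)
x = np.linspace(0.0, 10.0, 100)

# Linear dependency y = 2x + 1 plus noise: recovered directly.
y = 2.0 * x + 1.0 + rng.normal(0.0, 0.1, x.size)
a, b = linear_regression(x, y)

# Nonlinear dependency y = 0.5*x^2 + 3: substituting z = x^2
# makes it linear in z, so the same estimator applies.
z = x ** 2
y2 = 0.5 * z + 3.0
c, d = linear_regression(z, y2)
```

The covariance formula is equivalent to least squares for a single input feature; for multiple features the same idea generalizes to solving the normal equations with the covariance matrix.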


Metadata
Title
Regression
Author
Thomas A. Runkler
Copyright year
2016
DOI
https://doi.org/10.1007/978-3-658-14075-5_6