Abstract
This chapter introduces the topic of optimization and outlines its key ideas in an intuitive fashion. It also explains the distinction between supervised and unsupervised learning and discusses their respective strengths and weaknesses. For both learning approaches, the choice of the loss function and its implications are discussed in detail.
Notes
Some “exotic” distributions (e.g., the Cauchy distribution) exist that do not meet the so-called Lindeberg condition, and therefore their sum is not asymptotically Gaussian distributed. But these exceptions are without practical significance in the context of this book.
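The note above can be made concrete with a small numerical experiment. For Gaussian variables, the sample mean of n values concentrates at a rate of roughly 1/√n, as the central limit theorem suggests; for Cauchy variables, the sample mean of n values is itself Cauchy-distributed with the same scale, so averaging does not help at all. The following sketch (an illustration added here, not code from the chapter) compares the spread of sample means for both distributions using NumPy:

```python
import numpy as np

rng = np.random.default_rng(0)

def spread_of_means(sampler, n, trials=2000):
    """Interquartile range of the sample mean of n draws, over many trials."""
    means = np.array([sampler(n).mean() for _ in range(trials)])
    q75, q25 = np.percentile(means, [75, 25])
    return q75 - q25

# Gaussian: the spread of the mean shrinks like 1/sqrt(n).
gauss_small = spread_of_means(lambda n: rng.standard_normal(n), 10)
gauss_large = spread_of_means(lambda n: rng.standard_normal(n), 1000)

# Cauchy: the mean of n samples is again standard Cauchy, so the
# spread stays essentially constant no matter how large n is.
cauchy_small = spread_of_means(lambda n: rng.standard_cauchy(n), 10)
cauchy_large = spread_of_means(lambda n: rng.standard_cauchy(n), 1000)

print(f"Gaussian IQR of mean: n=10 -> {gauss_small:.3f}, n=1000 -> {gauss_large:.3f}")
print(f"Cauchy   IQR of mean: n=10 -> {cauchy_small:.3f}, n=1000 -> {cauchy_large:.3f}")
```

The Gaussian spread drops by about a factor of ten between n = 10 and n = 1000, while the Cauchy spread stays near 2 (the interquartile range of a standard Cauchy distribution), illustrating why such distributions fall outside the asymptotic Gaussian regime.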
Copyright information
© 2020 The Editor(s) (if applicable) and The Author(s), under exclusive license to Springer Nature Switzerland AG
Cite this chapter
Nelles, O. (2020). Introduction to Optimization. In: Nonlinear System Identification. Springer, Cham. https://doi.org/10.1007/978-3-030-47439-3_2
Print ISBN: 978-3-030-47438-6
Online ISBN: 978-3-030-47439-3