2013 | OriginalPaper | Book Chapter
Newton-Based Smoothed Functional Algorithms
Authors: S. Bhatnagar, H. Prasad, L. Prashanth
Published in: Stochastic Recursive Algorithms for Optimization
Publisher: Springer London
Gaussian SF estimates of the Hessian are derived by taking the convolution of the Hessian of the objective function with a multi-variate Gaussian density. Through an integration-by-parts argument applied twice, this convolution is seen to equal the convolution of the objective function itself with a scaled multi-variate Gaussian density. This yields a one-simulation estimate of the Hessian. The same simulation also furnishes a one-simulation gradient estimate (see Chapter 6), and so one obtains a one-simulation Newton-based SF algorithm. A two-simulation estimate of the Hessian is also derived that uses the same two simulations as the two-simulation gradient estimate of Chapter 6, resulting in a two-simulation Newton SF algorithm. We limit the discussion in this chapter to Gaussian-based SF estimates only.
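To illustrate the idea, here is a minimal sketch of one-simulation Gaussian SF estimates, using the standard forms from the smoothed-functional literature: with a single perturbed evaluation y = f(θ + βη), η ~ N(0, I), the gradient is estimated by (η/β)·y and the Hessian by (ηηᵀ − I)/β²·y. The function and variable names are illustrative, and this is a per-sample estimator only, not the chapter's full two-timescale Newton recursion.

```python
import numpy as np

def one_sim_sf_estimates(f, theta, beta, rng):
    """One-simulation Gaussian SF estimates of the gradient and Hessian.

    A sketch under standard SF assumptions: a single noisy evaluation
    y = f(theta + beta * eta) with eta ~ N(0, I) is reused for both
    estimates. Averaging many such samples (for small beta) approximates
    the true gradient and Hessian of f at theta.
    """
    d = theta.shape[0]
    eta = rng.standard_normal(d)          # Gaussian perturbation
    y = f(theta + beta * eta)             # the single simulation
    grad_est = (eta / beta) * y           # gradient estimate
    h_bar = np.outer(eta, eta) - np.eye(d)  # scaled-Gaussian Hessian factor
    hess_est = (h_bar / beta**2) * y      # Hessian estimate
    return grad_est, hess_est
```

For a quadratic f(θ) = ½θᵀAθ, averaging the Hessian estimate over many perturbations recovers A, which is a quick sanity check on the form of the estimator.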