1997 | Original Paper | Book Chapter
Computing Sparse Hessian and Jacobian Approximations with Optimal Hereditary Properties
Authors: Roger Fletcher, Andreas Grothey, Sven Leyffer
Published in: Large-Scale Optimization with Applications
Publisher: Springer New York
Included in: Professional Book Archive
In nonlinear optimization it is often important to estimate large sparse Hessian or Jacobian matrices, to be used for example in a trust region method. We propose an algorithm for computing a matrix B with a given sparsity pattern from a bundle of the m most recent difference vectors $$\Delta = \left[ \delta^{k-m+1} \ldots \delta^{k} \right], \quad \Gamma = \left[ \gamma^{k-m+1} \ldots \gamma^{k} \right],$$ where B should approximately map $\Delta$ into $\Gamma$. In this paper B is chosen so that it satisfies the m quasi-Newton conditions $B\Delta = \Gamma$ in the least squares sense. We show that B can always be computed by solving a positive semi-definite system of equations in the nonzero components of B. We give necessary and sufficient conditions under which this system is positive definite and indicate how B can be computed efficiently using a conjugate gradient method. In the case of unconstrained optimization we use the technique to determine a Hessian approximation for use in a trust region method. Numerical results are presented for a range of unconstrained test problems.
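To illustrate the idea, here is a minimal sketch of fitting a sparse matrix B to the conditions $B\Delta \approx \Gamma$ in the least squares sense. It is an assumption-laden simplification, not the paper's algorithm: it ignores any symmetry constraint (so it corresponds to the Jacobian case, where the rows of B decouple) and solves each row's small normal-equations problem directly with `numpy.linalg.lstsq` rather than with a conjugate gradient method. The function name `sparse_lsq_approx` and the boolean `pattern` argument are hypothetical.

```python
import numpy as np

def sparse_lsq_approx(pattern, Delta, Gamma):
    """Least-squares fit of a sparse matrix B to secant conditions.

    pattern : (n, n) boolean array giving the sparsity pattern of B.
    Delta   : (n, m) array whose columns are the difference vectors delta.
    Gamma   : (n, m) array whose columns are the difference vectors gamma.

    Without a symmetry constraint the rows of B decouple: row i only
    involves the columns j with pattern[i, j] set, so each row solves
    min_b || Delta[S].T @ b - Gamma[i] ||_2 over its nonzero entries b.
    The associated normal equations (Delta[S] @ Delta[S].T) b =
    Delta[S] @ Gamma[i] are positive semi-definite.
    """
    n, m = Delta.shape
    B = np.zeros((n, n))
    for i in range(n):
        S = np.flatnonzero(pattern[i])  # nonzero columns of row i
        if S.size == 0:
            continue
        # Solve the row-wise least squares problem in the nonzeros of row i.
        b, *_ = np.linalg.lstsq(Delta[S].T, Gamma[i], rcond=None)
        B[i, S] = b
    return B
```

When m is at least the number of nonzeros in a row and the restricted difference vectors are linearly independent, the row subproblem has a unique solution; this mirrors the paper's question of when the semi-definite system is positive definite.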