
01.06.2022

Accelerated memory-less SR1 method with generalized secant equation for unconstrained optimization

By: Neculai Andrei

Published in: Calcolo | Issue 2/2022


Abstract

The memory-less SR1 method with a generalized secant equation (MM-SR1gen) is presented and developed, together with its numerical performance on a collection of 800 unconstrained optimization problems with the number of variables in the range [1000, 10,000]. Convergence of the MM-SR1gen method is proved under the classical assumptions. Comparisons of MM-SR1gen against the memory-less SR1 method, the memory-less BFGS method, and the BFGS method in the CONMIN implementation of Shanno and Phua show that MM-SR1gen is more efficient and more robust than these algorithms. Solving five applications from the MINPACK-2 collection, each with 40,000 variables, provides computational evidence that MM-SR1gen is more efficient than both memory-less SR1 and memory-less BFGS. The conclusion of this study is that the memory-less SR1 method with a generalized secant equation is a rapid and reliable method for solving large-scale minimization problems. Moreover, it is shown that the accuracy of the Hessian approximations along the iterations of quasi-Newton methods is not as crucial as is commonly believed.
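The abstract does not reproduce the update formulas, so the following is only a minimal sketch of the classical memory-less SR1 search direction (the SR1 update applied to the identity matrix), not the paper's generalized-secant variant or its acceleration scheme; the function name, the safeguard constant r, and the fallback to steepest descent are illustrative assumptions.

```python
import numpy as np

def memoryless_sr1_direction(g_new, s, y, r=1e-8):
    """Classical memory-less SR1 direction (sketch, not the paper's MM-SR1gen).

    With H_k = I, the SR1 update gives
        H_{k+1} = I + (s - y)(s - y)^T / ((s - y)^T y),
    and the search direction is d_{k+1} = -H_{k+1} g_new.
    """
    u = s - y                       # equals s - H*y, since H is the identity
    denom = u @ y
    # Standard SR1 safeguard: skip the update when the denominator is tiny.
    if abs(denom) < r * np.linalg.norm(u) * np.linalg.norm(y):
        return -g_new               # fall back to the steepest-descent direction
    return -g_new - ((u @ g_new) / denom) * u
```

Here s = x_{k+1} - x_k and y = g_{k+1} - g_k are the step and gradient differences from the previous iteration; the paper's generalized secant equation modifies this relation in a way not detailed in the abstract, and the resulting direction is combined with a line search (e.g. backtracking, as in the acceleration scheme of references 1 and 2).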
References
1.
Andrei, N.: An acceleration of gradient descent algorithm with backtracking for unconstrained optimization. Numer. Algorithms 42(1), 63–73 (2006)
2.
Andrei, N.: Acceleration of conjugate gradient algorithms for unconstrained optimization. Appl. Math. Comput. 213(2), 361–369 (2009)
3.
Andrei, N.: Nonlinear conjugate gradient methods for unconstrained optimization. Springer Optimization and Its Applications. Springer (2020)
4.
Aris, R.: The mathematical theory of diffusion and reaction in permeable catalysts. Clarendon Press, Oxford (1975)
5.
Averick, B.M., Carter, R.G., Moré, J.J., Xue, G.L.: The MINPACK-2 test problem collection. Mathematics and Computer Science Division, Argonne National Laboratory, Preprint MCS-P153-0692 (1992)
6.
Bebernes, J., Eberly, D.: Mathematical problems from combustion theory. Applied Mathematical Sciences, vol. 83. Springer-Verlag (1989)
7.
Bongartz, I., Conn, A.R., Gould, N.I.M., Toint, Ph.L.: CUTE: constrained and unconstrained testing environments. ACM Trans. Math. Softw. 21, 123–160 (1995)
8.
Cimatti, G.: On a problem of the theory of lubrication governed by a variational inequality. Appl. Math. Optim. 3, 227–242 (1977)
9.
Conn, A.R., Gould, N.I.M., Toint, Ph.L.: Convergence of quasi-Newton matrices generated by the symmetric rank one update. Math. Program. 50(1–3), 177–195 (1991)
10.
Dai, Y.H., Liao, L.Z.: New conjugacy conditions and related nonlinear conjugate gradient methods. Appl. Math. Optim. 43, 87–101 (2001)
11.
Dolan, E.D., Moré, J.J.: Benchmarking optimization software with performance profiles. Math. Program. 91, 201–213 (2002)
12.
Fiacco, A.V., McCormick, G.P.: Nonlinear programming: sequential unconstrained minimization techniques. Research Analysis Corporation, McLean, Virginia (1968). Republished in 1990 by SIAM, Philadelphia
13.
Glowinski, R.: Numerical methods for nonlinear variational problems. Springer-Verlag, Berlin (1984)
14.
Goodman, J., Kohn, R., Reyna, L.: Numerical study of a relaxed variational problem from optimal design. Comput. Methods Appl. Mech. Eng. 57, 107–127 (1986)
15.
Kelley, C.T., Sachs, E.W.: Local convergence of the symmetric rank one iteration. Comput. Optim. Appl. 9, 43–63 (1998)
16.
Khalfan, H.F., Byrd, R.H., Schnabel, R.B.: A theoretical and experimental study of the symmetric rank-one update. SIAM J. Optim. 3(1), 1–24 (1993)
17.
Nitsche, J.C.C.: Lectures on minimal surfaces, vol. 1. Cambridge University Press, Cambridge (1989)
19.
20.
Shanno, D.F.: On the convergence of a new conjugate gradient algorithm. SIAM J. Numer. Anal. 15, 1247–1257 (1978)
21.
Shanno, D.F.: CONMIN—a Fortran subroutine for minimizing an unconstrained nonlinear scalar valued function of a vector variable x either by the BFGS variable metric algorithm or by a Beale restarted conjugate gradient algorithm. Private communication (1983)
22.
Shanno, D.F., Phua, K.H.: Algorithm 500: Minimization of unconstrained multivariable functions. ACM Trans. Math. Softw. 2, 87–94 (1976)
24.
Metadata
Title
Accelerated memory-less SR1 method with generalized secant equation for unconstrained optimization
Author
Neculai Andrei
Publication date
01.06.2022
Publisher
Springer International Publishing
Published in
Calcolo / Issue 2/2022
Print ISSN: 0008-0624
Electronic ISSN: 1126-5434
DOI
https://doi.org/10.1007/s10092-022-00460-x
