Published in: Calcolo 2/2022

01-06-2022

Accelerated memory-less SR1 method with generalized secant equation for unconstrained optimization

Author: Neculai Andrei

Abstract

The memory-less SR1 method with generalized secant equation (MM-SR1gen) is presented and developed, together with its numerical performance on a collection of 800 unconstrained optimization problems with between 1000 and 10,000 variables. The convergence of the MM-SR1gen method is proved under the classical assumptions. Comparisons of MM-SR1gen with the memory-less SR1 method, the memory-less BFGS method, and the BFGS method in the implementation of Shanno and Phua from CONMIN show that MM-SR1gen is more efficient and more robust than these algorithms. By solving five applications from the MINPACK-2 collection, each with 40,000 variables, we obtain computational evidence that MM-SR1gen is more efficient than both memory-less SR1 and memory-less BFGS. The conclusion of this study is that the memory-less SR1 method with generalized secant equation is a rapid and reliable method for solving large-scale minimization problems. Moreover, it is shown that the accuracy of the Hessian approximations along the iterations of quasi-Newton methods is not as crucial as is commonly believed.
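As background on the classical building block the abstract refers to, the following is a minimal, matrix-free sketch of a memory-less SR1 search direction: the approximate inverse Hessian is the SR1 update of the identity using only the most recent step s and gradient difference y. The function name and safeguard tolerance are illustrative, and the sketch uses the standard secant equation only; the paper's generalized secant equation and acceleration scheme are not reproduced here.

```python
import numpy as np

def memoryless_sr1_direction(g, s, y, eps=1e-8):
    """Return d = -H g, where H = I + u u^T / (u^T y) with u = s - y
    is the SR1 update of the identity inverse-Hessian approximation.

    Memory-less: H is rebuilt from (s, y) each iteration and never
    stored as a matrix, so the cost is O(n) storage and work.
    """
    u = s - y          # u = s - H y with H = I
    denom = u @ y      # SR1 denominator u^T y
    # Standard SR1 safeguard: skip the rank-one correction when the
    # denominator is negligible, falling back to steepest descent.
    if abs(denom) < eps * np.linalg.norm(u) * np.linalg.norm(y):
        return -g
    return -(g + u * (u @ g) / denom)
```

One can verify that the implied H satisfies the secant equation H y = s, since H y = y + u (uᵀy)/(uᵀy) = y + u = s.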
Literature
1. Andrei, N.: An acceleration of gradient descent algorithm with backtracking for unconstrained optimization. Numer. Algorithms 42(1), 63–73 (2006)
2. Andrei, N.: Acceleration of conjugate gradient algorithms for unconstrained optimization. Appl. Math. Comput. 213(2), 361–369 (2009)
3. Andrei, N.: Nonlinear conjugate gradient methods for unconstrained optimization. Springer Optimization and Its Applications (2020)
4. Aris, R.: The mathematical theory of diffusion and reaction in permeable catalysts. Clarendon Press, Oxford (1975)
5. Averick, B.M., Carter, R.G., Moré, J.J., Xue, G.L.: The MINPACK-2 test problem collection. Mathematics and Computer Science Division, Argonne National Laboratory, Preprint MCS-P153-0692 (1992)
6. Bebernes, J., Eberly, D.: Mathematical problems from combustion theory. In: Applied Mathematical Sciences, vol. 83. Springer-Verlag (1989)
7. Bongartz, I., Conn, A.R., Gould, N.I.M., Toint, Ph.L.: CUTE: constrained and unconstrained testing environments. ACM Trans. Math. Softw. 21, 123–160 (1995)
8. Cimatti, G.: On a problem of the theory of lubrication governed by a variational inequality. Appl. Math. Optim. 3, 227–242 (1977)
9. Conn, A.R., Gould, N.I.M., Toint, Ph.L.: Convergence of quasi-Newton matrices generated by the symmetric rank one update. Math. Program. 50(1–3), 177–195 (1991)
10. Dai, Y.H., Liao, L.Z.: New conjugacy conditions and related nonlinear conjugate gradient methods. Appl. Math. Optim. 43, 87–101 (2001)
11. Dolan, E.D., Moré, J.J.: Benchmarking optimization software with performance profiles. Math. Program. 91, 201–213 (2002)
12. Fiacco, A.V., McCormick, G.P.: Nonlinear programming: sequential unconstrained minimization techniques. Research Analysis Corporation, McLean, Virginia (1968). Republished in 1990 by SIAM, Philadelphia
13. Glowinski, R.: Numerical methods for nonlinear variational problems. Springer-Verlag, Berlin (1984)
14. Goodman, J., Kohn, R., Reyna, L.: Numerical study of a relaxed variational problem from optimal design. Comput. Methods Appl. Mech. Eng. 57, 107–127 (1986)
15. Kelley, C.T., Sachs, E.W.: Local convergence of the symmetric rank one iteration. Comput. Optim. Appl. 9, 43–63 (1998)
16. Khalfan, H.F., Byrd, R.H., Schnabel, R.B.: A theoretical and experimental study of the symmetric rank-one update. SIAM J. Optim. 3(1), 1–24 (1993)
17. Nitsche, J.C.C.: Lectures on minimal surfaces, vol. 1. Cambridge University Press, Cambridge (1989)
20. Shanno, D.F.: On the convergence of a new conjugate gradient algorithm. SIAM J. Numer. Anal. 15, 1247–1257 (1978)
21. Shanno, D.F.: CONMIN—A Fortran subroutine for minimizing an unconstrained nonlinear scalar valued function of a vector variable x either by the BFGS variable metric algorithm or by a Beale restarted conjugate gradient algorithm. Private communication (1983)
22. Shanno, D.F., Phua, K.H.: Algorithm 500: minimization of unconstrained multivariable functions. ACM Trans. Math. Softw. 2, 87–94 (1976)
Metadata
Title
Accelerated memory-less SR1 method with generalized secant equation for unconstrained optimization
Author
Neculai Andrei
Publication date
01-06-2022
Publisher
Springer International Publishing
Published in
Calcolo / Issue 2/2022
Print ISSN: 0008-0624
Electronic ISSN: 1126-5434
DOI
https://doi.org/10.1007/s10092-022-00460-x
