Abstract
This paper presents a new supermemory gradient method for unconstrained optimization problems. It can be regarded as a combination of ODE-based methods, line search, and subspace techniques. One main characteristic of the method is that, at each iteration, a lower-dimensional system of linear equations is solved only once to obtain a trial step, so no quadratic trust-region subproblem needs to be solved. Another is that, when a trial step is not accepted, the method generates an iterative point whose step length satisfies the Armijo line search rule, so the linear system of equations need not be solved again. Under some reasonable assumptions, the method is proven to be globally convergent. Numerical results show the efficiency of the proposed method in practical computation.
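The abstract describes a supermemory-gradient iteration safeguarded by an Armijo line search. As a rough illustration of the general idea (the search direction combines the negative gradient with several stored previous directions, and the step length is found by Armijo backtracking), a minimal Python sketch follows. The memory coefficients `beta` and the descent safeguard are illustrative assumptions; the paper's actual trial step comes from solving a lower-dimensional linear system, which is not reproduced here.

```python
import numpy as np

def armijo_step(f, x, g, d, sigma=1e-4, rho=0.5, alpha0=1.0):
    """Backtracking Armijo rule: shrink alpha until
    f(x + alpha*d) <= f(x) + sigma * alpha * g.T @ d."""
    alpha = alpha0
    fx, gd = f(x), g @ d
    while f(x + alpha * d) > fx + sigma * alpha * gd and alpha > 1e-12:
        alpha *= rho
    return alpha

def supermemory_gradient(f, grad, x0, m=3, tol=1e-6, max_iter=500):
    """Generic supermemory gradient iteration (illustrative only):
    d_k = -g_k + sum_i beta_i * d_{k-i} over the last m directions,
    with a simple safeguarded choice of the coefficients beta_i."""
    x = np.asarray(x0, dtype=float)
    memory = []  # previous search directions d_{k-1}, ..., d_{k-m}
    for _ in range(max_iter):
        g = grad(x)
        if np.linalg.norm(g) < tol:
            break
        d = -g
        for dp in memory:
            # Hypothetical safeguarded coefficient; keeps the memory
            # terms from dominating the steepest-descent component.
            beta = 0.5 * np.linalg.norm(g) / (np.linalg.norm(g) + np.linalg.norm(dp))
            d = d + beta * dp
        if g @ d >= 0:  # fall back to steepest descent if not a descent direction
            d = -g
        alpha = armijo_step(f, x, g, d)
        x = x + alpha * d
        memory = ([d] + memory)[:m]
    return x
```

On a smooth convex test function this sketch converges to a stationary point; it is meant only to make the abstract's two ingredients (memory-based direction, Armijo fallback) concrete.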
Ou, Y.G., Wang, G.S.: A new supermemory gradient method for unconstrained optimization problems. Optim. Lett. 6, 975–992 (2012). https://doi.org/10.1007/s11590-011-0328-9