
01-10-2014 | Original Research

A nonmonotone supermemory gradient algorithm for unconstrained optimization

Authors: Yigui Ou, Yuanwen Liu

Published in: Journal of Applied Mathematics and Computing | Issue 1-2/2014


Abstract

This paper presents a nonmonotone supermemory gradient algorithm for unconstrained optimization problems. At each iteration, the proposed method makes full use of iterative information from the previous multiple steps and avoids storing or computing any matrices associated with the Hessian of the objective function; it is therefore well suited to large-scale optimization problems and converges stably. Under suitable assumptions, the convergence properties of the proposed algorithm are analyzed. Numerical results are also reported to show the efficiency of the proposed method.
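For orientation only, the sketch below illustrates the general scheme the abstract describes: a search direction that augments the negative gradient with the previous m search directions (a supermemory gradient direction), combined with a nonmonotone Armijo test of Grippo-Lampariello-Lucidi type that compares against the maximum of the last M function values rather than the most recent one. The weight formula for beta, the parameter values, and the function name supermemory_gradient_nonmonotone are illustrative assumptions, not the specific rules proposed in the paper.

```python
import numpy as np

def supermemory_gradient_nonmonotone(f, grad, x0, m=3, M=10,
                                     sigma=1e-4, rho=0.5,
                                     tol=1e-6, max_iter=10000):
    """Generic supermemory gradient iteration with a GLL-type
    nonmonotone Armijo line search (illustrative sketch only).

    Direction:  d_k = -g_k + sum_i beta_i * d_{k-i}   (last m directions)
    Acceptance: f(x_k + t d_k) <= max(last M f-values) + sigma * t * g_k^T d_k
    """
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    dirs = []                      # memory of the last m search directions
    f_hist = [f(x)]                # recent f-values for the nonmonotone test
    for _ in range(max_iter):
        gnorm = np.linalg.norm(g)
        if gnorm <= tol:
            break
        # Supermemory direction: steepest descent plus a small multiple of
        # each stored direction; the weights keep g^T d <= -0.7 * ||g||^2,
        # so d is always a descent direction.
        d = -g.copy()
        for d_old in dirs:
            beta = (0.3 / len(dirs)) * gnorm / (np.linalg.norm(d_old) + 1e-12)
            d += beta * d_old
        # Nonmonotone Armijo backtracking against the worst recent f-value.
        f_ref = max(f_hist[-M:])
        gd = g @ d
        t = 1.0
        for _ in range(60):        # safety cap on backtracking
            if f(x + t * d) <= f_ref + sigma * t * gd:
                break
            t *= rho
        x = x + t * d
        g = grad(x)
        f_hist.append(f(x))
        dirs = (dirs + [d])[-m:]   # keep only the last m directions
    return x

# Example usage: minimize the quadratic f(x) = ||x||^2 / 2.
x_star = supermemory_gradient_nonmonotone(
    lambda x: 0.5 * float(x @ x), lambda x: x, np.ones(100))
```

No Hessian or other matrix is stored in this sketch; the memory consists only of the last m direction vectors, which is why methods of this type scale to large problems.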


Metadata
Title
A nonmonotone supermemory gradient algorithm for unconstrained optimization
Authors
Yigui Ou
Yuanwen Liu
Publication date
01-10-2014
Publisher
Springer Berlin Heidelberg
Published in
Journal of Applied Mathematics and Computing / Issue 1-2/2014
Print ISSN: 1598-5865
Electronic ISSN: 1865-2085
DOI
https://doi.org/10.1007/s12190-013-0747-0
