Published in: Journal of Applied Mathematics and Computing, Issue 1-2/2014

01.10.2014 | Original Research

A nonmonotone supermemory gradient algorithm for unconstrained optimization

Authors: Yigui Ou, Yuanwen Liu



Abstract

This paper presents a nonmonotone supermemory gradient algorithm for unconstrained optimization problems. At each iteration, the proposed method makes full use of the iterative information from the previous several steps and avoids storing or computing any matrices associated with the Hessian of the objective function; it is therefore well suited to large-scale optimization problems and converges stably. Under mild assumptions, the convergence properties of the algorithm are analyzed. Numerical results are reported to show the efficiency of the proposed method.
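
The paper's exact direction and step-size formulas are not reproduced on this page, so the Python sketch below only illustrates the class of method the abstract describes: a search direction that combines the negative gradient with the last few steps (a supermemory-style direction), accepted by a Grippo–Lampariello–Lucidi-type nonmonotone Armijo test. The function name, the weight `beta`, the memory sizes `m` and `M`, and the safeguards are illustrative assumptions, not the authors' algorithm.

```python
import numpy as np

def nonmonotone_supermemory_gradient(f, grad, x0, m=3, M=5, sigma=1e-4,
                                     rho=0.5, beta=0.4, tol=1e-6, max_iter=1000):
    """Sketch of a supermemory-gradient iteration with a nonmonotone
    Armijo line search. Parameters m (direction memory), M (line-search
    memory) and beta (weight on previous steps) are illustrative."""
    x = np.asarray(x0, dtype=float)
    steps = []                      # last m steps x_k - x_{k-1}
    f_hist = [f(x)]                 # recent function values for the nonmonotone test
    for _ in range(max_iter):
        g = grad(x)
        if np.linalg.norm(g) <= tol:
            break
        # supermemory direction: steepest descent plus a combination of previous steps
        d = -g
        for s in steps:
            d += beta * s / max(len(steps), 1)
        # safeguard: fall back to steepest descent if d is not a descent direction
        if g @ d > -1e-12 * np.linalg.norm(g) * np.linalg.norm(d):
            d = -g
        # nonmonotone Armijo: compare against the max of the last M function values
        f_ref = max(f_hist[-M:])
        alpha = 1.0
        while alpha > 1e-16 and f(x + alpha * d) > f_ref + sigma * alpha * (g @ d):
            alpha *= rho
        x_new = x + alpha * d
        steps = (steps + [x_new - x])[-m:]
        x = x_new
        f_hist.append(f(x))
    return x

# usage: minimize a simple convex quadratic (illustrative only)
A = np.diag([1.0, 10.0, 100.0])
f = lambda x: 0.5 * x @ A @ x
g = lambda x: A @ x
x_star = nonmonotone_supermemory_gradient(f, g, np.ones(3))
```

Comparing the trial point against the maximum of the last M function values, rather than against f(x_k) alone, is what makes the line search nonmonotone: occasional increases in f are tolerated, which often helps gradient-type methods that use no Hessian information traverse narrow curved valleys more stably.
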


Metadata
Title
A nonmonotone supermemory gradient algorithm for unconstrained optimization
Authors
Yigui Ou
Yuanwen Liu
Publication date
01.10.2014
Publisher
Springer Berlin Heidelberg
Published in
Journal of Applied Mathematics and Computing / Issue 1-2/2014
Print ISSN: 1598-5865
Electronic ISSN: 1865-2085
DOI
https://doi.org/10.1007/s12190-013-0747-0
