Published in: Numerical Algorithms 3/2020

02-05-2019 | Original Paper

Two families of scaled three-term conjugate gradient methods with sufficient descent property for nonconvex optimization

Authors: S. Bojari, M. R. Eslahchi

Abstract

In this paper, we present two families of modified three-term conjugate gradient methods for solving large-scale unconstrained smooth optimization problems. We show that the new families satisfy the Dai-Liao conjugacy condition and the sufficient descent condition under any line search technique that guarantees the positivity of \(y_{k}^{T} s_{k}\). For uniformly convex functions, we prove that our families are globally convergent under the weak Wolfe-Powell line search and standard conditions on the objective function. We also establish a weaker global convergence theorem for general smooth functions under similar assumptions. Numerical experiments on 260 standard test problems, comparing against seven other recently developed conjugate gradient methods, illustrate that the members of our families are numerically efficient and effective.
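For readers without access to the full text, the two conditions named in the abstract have standard forms. With \(s_{k} = x_{k+1} - x_{k}\) and \(y_{k} = g_{k+1} - g_{k}\), the Dai-Liao conjugacy condition reads \(d_{k+1}^{T} y_{k} = -t\, g_{k+1}^{T} s_{k}\) for a parameter \(t \geq 0\), and the sufficient descent condition requires \(g_{k}^{T} d_{k} \leq -c\, \|g_{k}\|^{2}\) for some constant \(c > 0\) and all \(k\); the paper's scaled three-term families are particular choices of the search direction designed to satisfy both.

The sketch below is not the authors' method (their scaled families are defined only in the full text). It is a minimal, generic three-term conjugate gradient loop in Python that pairs the well-known Zhang-Zhou-Li style direction, which satisfies \(g_{k}^{T} d_{k} = -\|g_{k}\|^{2}\) by construction, with a bisection search for a step length satisfying the weak Wolfe-Powell conditions. All names (three_term_cg, weak_wolfe_line_search) and parameter values are illustrative assumptions, not the paper's.

    import numpy as np

    def weak_wolfe_line_search(f, grad, x, d, c1=1e-4, c2=0.9, max_iter=50):
        # Bracketing/bisection search for a step a satisfying the weak Wolfe-Powell
        # conditions: f(x + a d) <= f(x) + c1 a g^T d  and  grad(x + a d)^T d >= c2 g^T d.
        fx, gtd = f(x), grad(x) @ d                 # gtd < 0 for a descent direction
        lo, hi, a = 0.0, np.inf, 1.0
        for _ in range(max_iter):
            if f(x + a * d) > fx + c1 * a * gtd:    # Armijo condition fails: shrink
                hi = a
            elif grad(x + a * d) @ d < c2 * gtd:    # curvature condition fails: grow
                lo = a
            else:
                return a                            # both weak Wolfe conditions hold
            a = 0.5 * (lo + hi) if np.isfinite(hi) else 2.0 * lo
        return a

    def three_term_cg(f, grad, x0, tol=1e-6, max_iter=1000):
        # Generic three-term CG: d_{k+1} = -g_{k+1} + beta_k d_k - theta_k y_k.
        # The beta/theta below give g_{k+1}^T d_{k+1} = -||g_{k+1}||^2 exactly,
        # i.e. sufficient descent with c = 1 (the paper's scaled families differ here).
        x = np.asarray(x0, dtype=float)
        g = grad(x)
        d = -g
        for _ in range(max_iter):
            if np.linalg.norm(g) <= tol:
                break
            alpha = weak_wolfe_line_search(f, grad, x, d)
            x_new = x + alpha * d
            g_new = grad(x_new)
            y = g_new - g                           # y_k = g_{k+1} - g_k
            gg = g @ g
            beta = (g_new @ y) / gg
            theta = (g_new @ d) / gg
            d = -g_new + beta * d - theta * y
            x, g = x_new, g_new
        return x

    # Usage example: minimize the Rosenbrock function in 100 dimensions.
    if __name__ == "__main__":
        def rosen(x):
            return np.sum(100.0 * (x[1:] - x[:-1] ** 2) ** 2 + (1.0 - x[:-1]) ** 2)

        def rosen_grad(x):
            g = np.zeros_like(x)
            g[:-1] = -400.0 * x[:-1] * (x[1:] - x[:-1] ** 2) - 2.0 * (1.0 - x[:-1])
            g[1:] += 200.0 * (x[1:] - x[:-1] ** 2)
            return g

        x_star = three_term_cg(rosen, rosen_grad, np.zeros(100))
        print(np.linalg.norm(rosen_grad(x_star)))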


Metadata
Title
Two families of scaled three-term conjugate gradient methods with sufficient descent property for nonconvex optimization
Authors
S. Bojari
M. R. Eslahchi
Publication date
02-05-2019
Publisher
Springer US
Published in
Numerical Algorithms / Issue 3/2020
Print ISSN: 1017-1398
Electronic ISSN: 1572-9265
DOI
https://doi.org/10.1007/s11075-019-00709-7
