
On optimality of two adaptive choices for the parameter of Dai–Liao method

Short Communication · Optimization Letters

Abstract

Orthonormal matrices form a class of well-conditioned matrices with the least possible spectral condition number. Here, it is first shown that a recently proposed choice for the parameter of the Dai–Liao nonlinear conjugate gradient method makes the search direction matrix as close as possible to an orthonormal matrix in the Frobenius norm. Then, by conducting a brief singular value analysis, it is shown that another recently proposed choice for the Dai–Liao parameter improves the spectral condition number of the search direction matrix. Thus, the theoretical justifications of the two choices for the Dai–Liao parameter are enhanced. Finally, some comparative numerical results are reported.
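
As a point of reference, the sketch below (Python with NumPy) illustrates the objects the abstract refers to: the Dai–Liao search direction for a generic parameter t, the spectral condition number of a matrix (the ratio of its extreme singular values), and the Frobenius distance from a matrix to its nearest orthonormal matrix. It is a minimal sketch assuming only the standard Dai–Liao update d_{k+1} = -g_{k+1} + beta_k d_k with beta_k = g_{k+1}^T (y_k - t s_k) / (d_k^T y_k), where s_k = x_{k+1} - x_k and y_k = g_{k+1} - g_k; the two adaptive choices of t studied in the paper are not reproduced here, and none of the function names below come from the paper.

import numpy as np


def dai_liao_direction(g_new, d, s, y, t):
    """Dai-Liao search direction d_new = -g_new + beta * d for a given
    parameter t, with beta = g_new^T (y - t*s) / (d^T y) (Dai & Liao, 2001)."""
    beta = g_new @ (y - t * s) / (d @ y)
    return -g_new + beta * d


def spectral_condition_number(A):
    """Ratio of the largest to the smallest singular value of A."""
    sigma = np.linalg.svd(A, compute_uv=False)  # sorted in descending order
    return sigma[0] / sigma[-1]


def frobenius_distance_to_orthonormal(A):
    """Frobenius distance from A to its nearest orthonormal matrix.

    The nearest orthonormal matrix in the Frobenius norm is the polar factor
    U V^T of A, so the distance equals sqrt(sum_i (sigma_i - 1)^2)."""
    U, sigma, Vt = np.linalg.svd(A)
    Q = U @ Vt
    return np.linalg.norm(A - Q, "fro")

An orthonormal matrix gives a condition number of 1 and a Frobenius distance of 0 under both measures, which is why the two parameter choices are assessed against these criteria.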



Acknowledgments

This research was supported by the Research Council of Semnan University. The author is grateful to Professor William W. Hager for providing the line search code. He also thanks the anonymous reviewers for their valuable comments and suggestions, which helped to improve the presentation.

Author information


Corresponding author

Correspondence to Saman Babaie-Kafaki.

About this article

Cite this article

Babaie-Kafaki, S. On optimality of two adaptive choices for the parameter of Dai–Liao method. Optim Lett 10, 1789–1797 (2016). https://doi.org/10.1007/s11590-015-0965-5

