Journal of the Operations Research Society of Japan
Online ISSN : 2188-8299
Print ISSN : 0453-4514
ISSN-L : 0453-4514
A NEW NONLINEAR CONJUGATE GRADIENT METHOD FOR UNCONSTRAINED OPTIMIZATION
Hiroshi Yabe, Naoki Sakaiwa

2005 Volume 48 Issue 4 Pages 284-296

Abstract

Conjugate gradient methods are widely used for large-scale unconstrained optimization problems. Most conjugate gradient methods do not always generate a descent search direction, so the descent condition is usually assumed in analyses and implementations. Dai and Yuan (1999) proposed a conjugate gradient method that generates a descent search direction at every iteration and converges globally to the solution if the Wolfe conditions are satisfied within the line search strategy. In this paper, we give a new conjugate gradient method based on the study of Dai and Yuan, and show that our method always produces a descent search direction and converges globally if the Wolfe conditions are satisfied. Moreover, our method incorporates second-order curvature information with higher precision by using the modified secant condition proposed by Zhang, Deng and Chen (1999) and Zhang and Xu (2001). Our numerical results show that the method is very efficient on standard test problems, provided a good choice is made for the parameter included in the method.
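For orientation, the formulas below restate the two ingredients named in the abstract as they commonly appear in the cited literature; they are background for the referenced works, not the paper's new update formula. The Dai and Yuan (1999) direction update is

d_{k+1} = -g_{k+1} + \beta_k^{DY} d_k, \qquad \beta_k^{DY} = \frac{\|g_{k+1}\|^2}{d_k^{\top} y_k}, \qquad y_k = g_{k+1} - g_k,

and the modified secant condition of Zhang, Deng and Chen (1999) replaces y_k by

z_k = y_k + \frac{\theta_k}{s_k^{\top} u_k}\, u_k, \qquad \theta_k = 6\bigl(f(x_k) - f(x_{k+1})\bigr) + 3\,(g_k + g_{k+1})^{\top} s_k,

where s_k = x_{k+1} - x_k and u_k is any vector with s_k^{\top} u_k \neq 0, so that the quasi-Newton equation B_{k+1} s_k = z_k carries curvature information of higher precision.

As a concrete illustration of the Dai-Yuan scheme the paper builds on, here is a minimal Python sketch using scipy.optimize.line_search, which enforces the strong Wolfe conditions. It is not the Yabe-Sakaiwa method itself: the modified-secant-based beta and its parameter are omitted, and f, grad, and x0 are placeholders for the objective, its gradient, and a starting point.

    import numpy as np
    from scipy.optimize import line_search

    def dai_yuan_cg(f, grad, x0, tol=1e-6, max_iter=500):
        """Dai-Yuan nonlinear conjugate gradient with a Wolfe line search."""
        x = np.asarray(x0, dtype=float)
        g = grad(x)
        d = -g                              # first direction: steepest descent
        for _ in range(max_iter):
            if np.linalg.norm(g) < tol:
                break
            # Step length satisfying the (strong) Wolfe conditions.
            alpha = line_search(f, grad, x, d, gfk=g)[0]
            if alpha is None:               # line search failed: restart
                d = -g
                alpha = line_search(f, grad, x, d, gfk=g)[0]
                if alpha is None:
                    break
            x_new = x + alpha * d
            g_new = grad(x_new)
            y = g_new - g
            # The Wolfe conditions give d^T y > 0, so beta is well defined
            # and d_{k+1} = -g_{k+1} + beta * d_k is a descent direction.
            beta = (g_new @ g_new) / (d @ y)
            d = -g_new + beta * d
            x, g = x_new, g_new
        return x

For example, dai_yuan_cg(rosen, rosen_der, np.zeros(2)) with rosen and rosen_der from scipy.optimize minimizes the two-dimensional Rosenbrock function.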

© 2005 The Operations Research Society of Japan