Published in: Neural Computing and Applications 1/2022

13-08-2021 | Original Article

A conjugate gradient algorithm based on double parameter scaled Broyden–Fletcher–Goldfarb–Shanno update for optimization problems and image restoration

Authors: Dan Luo, Yong Li, Junyu Lu, Gonglin Yuan


Abstract

In this paper, we present a three-term conjugate gradient algorithm that combines three techniques: (i) a modified weak Wolfe-Powell (MWWP) line search is introduced to obtain the step size \(\alpha _k\); (ii) the search direction \(d_k\) is generated by a symmetric Perry matrix containing two positive parameters, and the sufficient descent property of the generated directions holds independently of the MWWP line search; (iii) a parabola is proposed and regarded as the projection surface, and the next iterate \(x_{k+1}\) is generated by a new projection technique. Global convergence of the new algorithm under the MWWP line search is established for general functions. Numerical experiments show that the given algorithm is promising.
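To illustrate the kind of iteration the abstract describes, the sketch below implements a generic three-term conjugate gradient loop driven by a standard weak Wolfe-Powell line search. It is a minimal illustration only: the paper's MWWP line search adds modifications not reproduced here, the direction update uses an illustrative Hestenes-Stiefel-type \(\beta_k\) and \(\theta_k\) rather than the paper's two-parameter symmetric Perry matrix, and the parabolic projection step is omitted. All function names and parameter values are assumptions for the sketch.

```python
import numpy as np

def weak_wolfe_powell(f, grad, x, d, c1=1e-4, c2=0.9, max_iter=50):
    """Bisection/expansion search for a step satisfying the weak
    Wolfe-Powell conditions (standard WWP, not the paper's MWWP variant)."""
    lo, hi = 0.0, np.inf
    alpha = 1.0
    f0, g0d = f(x), grad(x) @ d
    for _ in range(max_iter):
        if f(x + alpha * d) > f0 + c1 * alpha * g0d:
            hi = alpha                          # sufficient-decrease test failed
        elif grad(x + alpha * d) @ d < c2 * g0d:
            lo = alpha                          # curvature test failed
        else:
            return alpha                        # both WWP conditions hold
        alpha = 2.0 * lo if hi == np.inf else 0.5 * (lo + hi)
    return alpha

def three_term_cg(f, grad, x0, tol=1e-6, max_iter=200):
    """Generic three-term CG: d_{k+1} = -g_{k+1} + beta_k d_k - theta_k y_k,
    with a steepest-descent fallback to keep d_k a descent direction."""
    x, g = np.asarray(x0, dtype=float).copy(), grad(x0)
    d = -g
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        alpha = weak_wolfe_powell(f, grad, x, d)
        x_new = x + alpha * d
        g_new = grad(x_new)
        y = g_new - g                           # gradient difference y_k
        denom = d @ y
        if abs(denom) > 1e-12:
            beta = (g_new @ y) / denom          # Hestenes-Stiefel-type beta_k
            theta = (g_new @ d) / denom         # illustrative third-term weight
            d = -g_new + beta * d - theta * y
        else:
            d = -g_new
        if d @ g_new >= 0:                      # safeguard: restart if not descent
            d = -g_new
        x, g = x_new, g_new
    return x
```

On a strongly convex quadratic this loop reduces the gradient norm below the tolerance in a handful of iterations; on general functions the descent safeguard plays the role that the paper's Perry-matrix construction provides by design.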


Metadata
Title
A conjugate gradient algorithm based on double parameter scaled Broyden–Fletcher–Goldfarb–Shanno update for optimization problems and image restoration
Authors
Dan Luo
Yong Li
Junyu Lu
Gonglin Yuan
Publication date
13-08-2021
Publisher
Springer London
Published in
Neural Computing and Applications / Issue 1/2022
Print ISSN: 0941-0643
Electronic ISSN: 1433-3058
DOI
https://doi.org/10.1007/s00521-021-06383-y
