
2020 | Original Paper | Book Chapter

9. Beyond First Order: \(\mathcal {V}\mathcal {U}\)-Decomposition Methods

Authors: Shuai Liu, Claudia Sagastizábal

Published in: Numerical Nonsmooth Optimization

Publisher: Springer International Publishing


Abstract

When minimizing a nonsmooth convex function, bundle methods are well known for their robustness and reliability. While these features relate to the global convergence properties of the algorithm, speed of convergence is a different matter: fast rates cannot be expected when the objective function lacks smoothness.
In the typical bundle dichotomy that splits iterates between null and serious steps, the serious-step subsequence converges at an R-linear rate. Moving from the realm of first-order methods to the world of superlinear speed becomes possible once one realizes that nonsmoothness often appears in a structured manner. This is the basis of the \({\mathcal {V}\mathcal {U}}\)-decomposition approach presented in this chapter. Thanks to this decomposition, it is possible to make a Newton-like move in a certain \({\mathcal {U}}\)-subspace, where all of the function's "smoothness" concentrates, at least locally. On its orthogonal complement, the "sharp" \({\mathcal {V}}\)-subspace, an intermediate iterate is defined such that the overall convergence is driven by the \({\mathcal {U}}\)-step. As a result, the serious-step subsequence converges with superlinear speed.
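As an illustration (not taken from the chapter), the \({\mathcal {V}}\)/\({\mathcal {U}}\) split at a nonsmooth point can be computed numerically: \({\mathcal {V}}\) is spanned by differences of subgradients at the point, and \({\mathcal {U}}\) is its orthogonal complement. The example function \(f(x) = |x_1| + \tfrac{1}{2}x_2^2\), the point \(x = 0\), and the rank tolerance are hypothetical choices for this sketch:

```python
import numpy as np

# Hypothetical example: f(x) = |x[0]| + 0.5 * x[1]**2 at the kink x = 0.
# The subdifferential there is the segment {(s, 0) : -1 <= s <= 1};
# V = span(subdiff(x) - g) for any fixed g in the subdifferential,
# and U is the orthogonal complement of V.

subgradients = np.array([[-1.0, 0.0],   # extreme subgradients at x = 0
                         [ 1.0, 0.0]])
g = subgradients[0]
V_dirs = subgradients - g               # differences span the V-subspace

# Orthonormal bases via SVD: leading left singular vectors span V,
# the remaining ones span U.
U_full, s, _ = np.linalg.svd(V_dirs.T)
rank = int(np.sum(s > 1e-12))           # hypothetical rank tolerance
V_basis = U_full[:, :rank]              # "sharp" directions: here e1
U_basis = U_full[:, rank:]              # "smooth" directions: here e2
```

On this toy function the computation recovers the expected split: the kink direction \(e_1\) spans \({\mathcal {V}}\), while \(f\) restricted to \({\mathcal {U}} = \mathrm{span}\{e_2\}\) is the smooth quadratic on which a Newton-like step makes sense.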
By focusing on the proximal variants of bundle methods, this chapter gradually introduces the \({\mathcal {V}\mathcal {U}}\)-theory and the ingredients needed to build superlinearly convergent algorithms in nonsmooth optimization. For functions whose proximal points are easy to compute, as in machine learning, an implementable quasi-Newton \({\mathcal {V}\mathcal {U}}\)-algorithm is given. For functions whose proximal points are not readily available, the bundle machinery is incorporated; in this case, superlinear speed can be achieved by solving two quadratic programming problems per iteration.
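A standard example of a function with an easy proximal point, of the kind the abstract alludes to in machine learning, is the \(\ell_1\)-norm: \(\operatorname{prox}_{\lambda\|\cdot\|_1}(x) = \arg\min_y \lambda\|y\|_1 + \tfrac{1}{2}\|y - x\|^2\) has the closed-form soft-thresholding solution. This sketch is a generic illustration, not the chapter's algorithm:

```python
import numpy as np

def prox_l1(x, lam):
    """Proximal point of f = lam * ||.||_1: componentwise soft-thresholding."""
    return np.sign(x) * np.maximum(np.abs(x) - lam, 0.0)

# Components with |x_i| <= lam are set exactly to zero; the rest shrink by lam.
p = prox_l1(np.array([3.0, -0.5, 1.2]), 1.0)   # -> [2.0, 0.0, 0.2]
```

Because such a prox is available in closed form, a prox-based \({\mathcal {V}\mathcal {U}}\) method can use it directly, with no inner bundle iterations needed to approximate the proximal point.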


Metadata
Title
Beyond First Order: \(\mathcal {V}\mathcal {U}\)-Decomposition Methods
Authors
Shuai Liu
Claudia Sagastizábal
Copyright Year
2020
DOI
https://doi.org/10.1007/978-3-030-34910-3_9