2019 | OriginalPaper | Chapter

1. Convergence Rate of Proximal Inertial Algorithms Associated with Moreau Envelopes of Convex Functions

Authors: Hedy Attouch, Juan Peypouquet

Published in: Splitting Algorithms, Modern Operator Theory, and Applications

Publisher: Springer International Publishing

Abstract

In a Hilbert space setting \({\mathcal H}\), we develop new inertial proximal-based algorithms that aim to rapidly minimize a convex, lower-semicontinuous, proper function \(\varPhi : \mathcal H \rightarrow {\mathbb R} \cup \{+\infty \}\). The guiding idea is to use an accelerated proximal scheme where, at each step, Φ is replaced by its Moreau envelope with a varying approximation parameter. This leads us to consider a Relaxed Inertial Proximal Algorithm (RIPA) with variable parameters that take into account the effects of inertia, relaxation, and approximation. (RIPA) was first introduced to solve general maximally monotone inclusions, in which case a judicious adjustment of the parameters makes it possible to obtain the convergence of the iterates towards equilibria. In the case of convex minimization problems, the convergence analysis of (RIPA) was initially addressed by Attouch and Cabot, based on its formulation as an inertial gradient method with varying potential functions. We propose a new approach to this algorithm, along with further developments, based on its formulation as a proximal algorithm associated with varying Moreau envelopes. For convenient choices of the parameters, we show fast convergence of the function values, of order \(o(k^{-2})\), and the weak convergence of the iterates. This is in line with the recent studies of Su-Boyd-Candès, Chambolle-Dossal, and Attouch-Peypouquet. We study the impact of geometric assumptions on the convergence rates, and the stability of the results with respect to perturbations and errors. Finally, in the case of structured minimization problems of the form smooth + nonsmooth, based on this approach, we introduce new inertial proximal-gradient algorithms for which similar convergence rates are shown.
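
As classical background (standard convex analysis, not material specific to this chapter), recall that the Moreau envelope of \(\varPhi\) with parameter \(\lambda > 0\) and the associated proximal mapping are defined by

\[
\varPhi_{\lambda}(x) = \min_{y \in \mathcal H} \Big\{ \varPhi(y) + \tfrac{1}{2\lambda}\|x-y\|^2 \Big\},
\qquad
\operatorname{prox}_{\lambda \varPhi}(x) = \operatorname*{argmin}_{y \in \mathcal H} \Big\{ \varPhi(y) + \tfrac{1}{2\lambda}\|x-y\|^2 \Big\},
\]

and \(\varPhi_{\lambda}\) is continuously differentiable with \(\nabla \varPhi_{\lambda}(x) = \frac{1}{\lambda}\big(x - \operatorname{prox}_{\lambda \varPhi}(x)\big)\). In particular, a gradient step of size \(\lambda\) on \(\varPhi_{\lambda}\) coincides with a proximal step on \(\varPhi\), which connects the "inertial gradient method with varying potential functions" viewpoint and the "proximal algorithm with varying Moreau envelopes" viewpoint mentioned above.

As an illustration only, the Python sketch below shows a generic relaxed inertial proximal iteration in the spirit of (RIPA), applied to the toy objective \(\varPhi(x) = \|x\|_1\). The Nesterov-type inertial coefficient k/(k + alpha), the constant relaxation parameter rho, and the constant proximal parameter lam are placeholder assumptions made for this sketch; they are not the parameter rules analyzed in the chapter.

import numpy as np

def prox_l1(y, lam):
    # Proximal map of lam * ||.||_1 (componentwise soft-thresholding).
    return np.sign(y) * np.maximum(np.abs(y) - lam, 0.0)

def relaxed_inertial_prox(x0, prox, lam=0.1, alpha=3.0, rho=1.0, n_iter=200):
    # Generic relaxed inertial proximal iteration (illustrative sketch):
    #   y_k     = x_k + (k / (k + alpha)) * (x_k - x_{k-1})    inertial extrapolation
    #   x_{k+1} = (1 - rho) * y_k + rho * prox(y_k, lam)       relaxed proximal step
    x_prev = x0.copy()
    x = x0.copy()
    for k in range(1, n_iter + 1):
        beta = k / (k + alpha)                  # Nesterov-type inertial coefficient
        y = x + beta * (x - x_prev)
        x_prev = x
        x = (1.0 - rho) * y + rho * prox(y, lam)
    return x

if __name__ == "__main__":
    x0 = np.random.randn(50)
    x_final = relaxed_inertial_prox(x0, prox_l1)
    # The minimizer of ||.||_1 is 0, so the iterates should shrink to 0.
    print("final l1 norm:", np.abs(x_final).sum())

With rho = 1 and lam held fixed, this reduces to a standard accelerated proximal iteration; the point of the chapter is precisely to let the approximation and relaxation parameters vary along the iterations, together with the inertial coefficients.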


Literature
1. Álvarez, F.: On the minimizing property of a second-order dissipative system in Hilbert spaces. SIAM J. Control Optim. 38, 1102–1119 (2000)
2. Álvarez, F., Attouch, H., Bolte, J., Redont, P.: A second-order gradient-like dissipative dynamical system with Hessian-driven damping. Application to optimization and mechanics. J. Math. Pures Appl. 81, 747–779 (2002)
3. Apidopoulos, V., Aujol, J.-F., Dossal, Ch.: Convergence rate of inertial Forward-Backward algorithm beyond Nesterov's rule. Math. Prog. (Ser. A), 1–20 (2018)
4. Attouch, H.: Variational Convergence for Functions and Operators. Pitman (1984)
5. Attouch, H., Bolte, J., Redont, P.: Optimizing properties of an inertial dynamical system with geometric damping. Link with proximal methods. Control Cybernet. 31, 643–657 (2002)
6. Attouch, H., Cabot, A.: Asymptotic stabilization of inertial gradient dynamics with time-dependent viscosity. J. Differential Equations 263, 5412–5458 (2017)
7. Attouch, H., Cabot, A.: Convergence rates of inertial forward-backward algorithms. SIAM J. Optim. 28, 849–874 (2018)
8. Attouch, H., Cabot, A.: Convergence of damped inertial dynamics governed by regularized maximally monotone operators. J. Differential Equations, to appear. HAL-01648383v2 (2018)
9. Attouch, H., Cabot, A.: Convergence of a relaxed inertial proximal algorithm for maximally monotone operators. HAL-01708905 (2018)
10. Attouch, H., Cabot, A.: Convergence rate of a relaxed inertial proximal algorithm for convex minimization. HAL-01807041 (2018)
11. Attouch, H., Cabot, A., Redont, P.: The dynamics of elastic shocks via epigraphical regularization of a differential inclusion. Adv. Math. Sci. Appl. 12, 273–306 (2002)
12. Attouch, H., Cabot, A., Chbani, Z., Riahi, H.: Accelerated forward-backward algorithms with perturbations. Application to Tikhonov regularization. J. Optim. Th. Appl. 179, 1–36 (2018)
13. Attouch, H., Chbani, Z., Peypouquet, J., Redont, P.: Fast convergence of inertial dynamics and algorithms with asymptotic vanishing viscosity. Math. Prog. (Ser. B) 168, 123–175 (2018)
14. Attouch, H., Chbani, Z., Riahi, H.: Rate of convergence of the Nesterov accelerated gradient method in the subcritical case α ≤ 3. ESAIM: COCV 25 (2019)
15. Attouch, H., Peypouquet, J.: The rate of convergence of Nesterov's accelerated forward-backward method is actually faster than \(1/k^2\). SIAM J. Optim. 26, 1824–1834 (2016)
16. Attouch, H., Peypouquet, J.: Convergence of inertial dynamics and proximal algorithms governed by maximal monotone operators. Math. Prog. 174, 319–432 (2019)
17. Attouch, H., Peypouquet, J., Redont, P.: Fast convex minimization via inertial dynamics with Hessian driven damping. J. Differential Equations 261, 5734–5783 (2016)
18. Aujol, J.-F., Dossal, Ch.: Stability of over-relaxations for the Forward-Backward algorithm, application to FISTA. SIAM J. Optim. 25, 2408–2433 (2015)
19. Baillon, J.-B.: Un exemple concernant le comportement asymptotique de la solution du problème \(\frac{du}{dt} + \partial \phi(u) \ni 0\). J. Functional Anal. 28, 369–376 (1978)
20. Bauschke, H.H., Combettes, P.L.: Convex Analysis and Monotone Operator Theory in Hilbert Spaces. Springer (2011)
21. Beck, A., Teboulle, M.: A fast iterative shrinkage-thresholding algorithm for linear inverse problems. SIAM J. Imaging Sci. 2, 183–202 (2009)
22. Beck, A., Teboulle, M.: Gradient-based algorithms with applications in signal recovery problems. In: Palomar, D., Eldar, Y. (eds.) Convex Optimization in Signal Processing and Communications, pp. 33–88. Cambridge University Press (2010)
23. Bolte, J., Daniilidis, A., Ley, O., Mazet, L.: Characterizations of Lojasiewicz inequalities and applications. Trans. AMS 362, 3319–3363 (2010)
24. Bolte, J., Nguyen, T.P., Peypouquet, J., Suter, B.: From error bounds to the complexity of first-order descent methods for convex functions. Math. Prog. 165, 471–507 (2017)
25. Brézis, H.: Opérateurs maximaux monotones dans les espaces de Hilbert et équations d'évolution. North Holland (1972)
26. Chambolle, A., Dossal, Ch.: On the convergence of the iterates of the Fast Iterative Shrinkage Thresholding Algorithm. J. Optim. Theory Appl. 166, 968–982 (2015)
27. Drori, Y., Teboulle, M.: Performance of first-order methods for smooth convex minimization: a novel approach. Math. Prog. (Ser. A) 145, 451–482 (2014)
28. Güler, O.: On the convergence of the proximal point algorithm for convex optimization. SIAM J. Control Optim. 29, 403–419 (1991)
29. Imbert, C.: Convex analysis techniques for Hopf-Lax formulae in Hamilton-Jacobi equations. J. Nonlinear Convex Anal. 2, 333–343 (2001)
30. Kim, D., Fessler, J.A.: Optimized first-order methods for smooth convex minimization. Math. Prog., to appear. DOI 10.1007/s10107-015-0949-3
31. Liang, J., Fadili, J., Peyré, G.: Local linear convergence of forward-backward under partial smoothness. Advances in Neural Information Processing Systems, 1970–1978 (2014)
32. May, R.: Asymptotic for a second-order evolution equation with convex potential and vanishing damping term. Turkish J. Math. 41, 681–685 (2017)
33. Nesterov, Y.: A method of solving a convex programming problem with convergence rate \(O(1/k^2)\). Soviet Math. Doklady 27, 372–376 (1983)
34. Nesterov, Y.: Introductory Lectures on Convex Optimization: A Basic Course. Applied Optimization, vol. 87. Kluwer Academic Publishers, Boston, MA (2004)
35. Parikh, N., Boyd, S.: Proximal algorithms. Foundations and Trends in Optimization 1, 123–231 (2013)
36. Peypouquet, J.: Convex Optimization in Normed Spaces: Theory, Methods and Examples. Springer (2015)
37. Villa, S., Salzo, S., Baldassarre, L., Verri, A.: Accelerated and inexact forward-backward algorithms. SIAM J. Optim. 23, 1607–1633 (2013)
38. Su, W., Boyd, S., Candès, E.J.: A differential equation for modeling Nesterov's accelerated gradient method: theory and insights. Neural Information Processing Systems 27, 2510–2518 (2014)
Metadata
Title
Convergence Rate of Proximal Inertial Algorithms Associated with Moreau Envelopes of Convex Functions
Authors
Hedy Attouch
Juan Peypouquet
Copyright Year
2019
DOI
https://doi.org/10.1007/978-3-030-25939-6_1