Published in: Numerical Algorithms 1/2024

03.08.2023 | Original Paper

Stochastic incremental mirror descent algorithms with Nesterov smoothing

Authors: Sandy Bitterlich, Sorin-Mihai Grad


Abstract

For minimizing a sum of finitely many proper, convex and lower semicontinuous functions over a nonempty closed convex set in a Euclidean space, we propose a stochastic incremental mirror descent algorithm constructed by means of Nesterov smoothing. We then modify the algorithm to minimize, over a nonempty closed convex set in a Euclidean space, a sum of finitely many proper, convex and lower semicontinuous functions composed with linear operators. Next, a stochastic incremental mirror descent Bregman-proximal scheme with Nesterov smoothing is proposed for minimizing, over a nonempty closed convex set in a Euclidean space, a sum of finitely many proper, convex and lower semicontinuous functions plus a prox-friendly proper, convex and lower semicontinuous function. Unlike previous contributions from the literature on mirror descent methods for minimizing sums of functions, we do not require these functions to be (Lipschitz) continuous or differentiable. Applications in logistics, tomography and machine learning, modelled as optimization problems, illustrate the theoretical achievements.
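As a minimal illustrative sketch (not the scheme analysed in the paper), the snippet below combines the two ingredients named in the abstract on a toy problem: each nonsmooth component f_i(x) = |a_i^T x - b_i| is replaced by its Nesterov-smoothed (Huber-type) surrogate, and a stochastic incremental mirror descent step with the negative-entropy mirror map is taken over the unit simplex. The choice of components, the simplex constraint, the step-size rule and all names are assumptions made for this example only.

import numpy as np

def huber_grad(r, mu):
    # Gradient of the Nesterov-smoothed absolute value
    # f_mu(r) = max_{|u| <= 1} (u*r - mu*u^2/2), i.e. the Huber function.
    return np.clip(r / mu, -1.0, 1.0)

def smoothed_incremental_mirror_descent(A, b, n_epochs=50, mu=0.1, step=0.5, seed=0):
    # Sketch: minimize sum_i |a_i^T x - b_i| over the unit simplex by taking,
    # at each inner step, a mirror descent step (entropy mirror map, hence a
    # multiplicative update) along the gradient of one smoothed component.
    rng = np.random.default_rng(seed)
    m, n = A.shape
    x = np.full(n, 1.0 / n)                 # start at the simplex barycenter
    x_sum, count = np.zeros(n), 0
    for _ in range(n_epochs):
        for i in rng.permutation(m):        # random sweep through the components
            g = huber_grad(A[i] @ x - b[i], mu) * A[i]   # gradient of smoothed f_i
            t = step / np.sqrt(count + 1)   # diminishing step sizes
            x = x * np.exp(-t * g)          # entropy mirror step ...
            x /= x.sum()                    # ... renormalized onto the simplex
            x_sum += x
            count += 1
    return x_sum / count                    # ergodic average of the iterates

# Toy usage with synthetic data
rng = np.random.default_rng(1)
A = rng.standard_normal((200, 20))
x_true = rng.random(20); x_true /= x_true.sum()
b = A @ x_true
x_hat = smoothed_incremental_mirror_descent(A, b)
print("residual l1-norm:", np.abs(A @ x_hat - b).sum())

The entropy mirror map is only one possible distance-generating function; with the squared Euclidean norm the same loop reduces to a smoothed incremental projected subgradient method.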


Metadata
Title
Stochastic incremental mirror descent algorithms with Nesterov smoothing
Authors
Sandy Bitterlich
Sorin-Mihai Grad
Publication date
03.08.2023
Publisher
Springer US
Published in
Numerical Algorithms / Issue 1/2024
Print ISSN: 1017-1398
Electronic ISSN: 1572-9265
DOI
https://doi.org/10.1007/s11075-023-01574-1
