Published in: Programming and Computer Software 6/2023

01.12.2023

Adaptive Methods for Variational Inequalities with Relatively Smooth and Relatively Strongly Monotone Operators

Authors: S. S. Ablaev, F. S. Stonyakin, M. S. Alkousa, D. A. Pasechnyk



Abstract

This paper is devoted to some adaptive methods for variational inequalities with relatively smooth and relatively strongly monotone operators. Based on the recently proposed proximal version of the extragradient method for this class of problems, we study in detail the method with adaptively selected parameter values. The rate of convergence of this method is estimated. The result is generalized to the class of variational inequalities with relatively strongly monotone δ-generalized smooth operators. For the ridge regression problem and variational inequality associated with box-simplex games, numerical experiments are carried out to demonstrate the effectiveness of the proposed technique for adaptive parameter selection during the execution of the algorithm.
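The paper's method operates in a Bregman proximal setting with relative smoothness, which is not reproduced here. Purely as an illustration of the underlying idea of adaptive parameter selection in an extragradient-type scheme, the following Euclidean sketch backtracks the step size against a local Lipschitz estimate of the operator. The function name, the backtracking rule, and the affine test problem are all illustrative assumptions, not taken from the paper:

```python
import numpy as np

def adaptive_extragradient(F, z0, gamma0=1.0, iters=200, tol=1e-10):
    """Euclidean extragradient with a simple backtracking step size.

    Illustrative sketch only: the paper's algorithm uses Bregman
    proximal steps and relative smoothness/strong monotonicity,
    which this Euclidean version does not capture.
    """
    z = z0.astype(float)
    gamma = gamma0
    for _ in range(iters):
        Fz = F(z)
        while True:
            z_half = z - gamma * Fz          # extrapolation step
            F_half = F(z_half)
            # accept gamma if it satisfies a local Lipschitz-type test:
            # gamma * ||F(z_half) - F(z)|| <= ||z_half - z||
            if gamma * np.linalg.norm(F_half - Fz) <= np.linalg.norm(z_half - z) + 1e-18:
                break
            gamma *= 0.5                     # backtrack: halve the step
        z_new = z - gamma * F_half           # main (corrector) step
        if np.linalg.norm(z_new - z) < tol:
            return z_new
        z = z_new
    return z

# Strongly monotone affine operator F(z) = A z + b
# (symmetric part of A is 2*I, so F is 2-strongly monotone).
A = np.array([[2.0, 1.0], [-1.0, 2.0]])
b = np.array([1.0, -1.0])
F = lambda z: A @ z + b

z = adaptive_extragradient(F, np.array([5.0, -3.0]))
# The solution of F(z*) = 0 is z* = -A^{-1} b.
```

The backtracking loop only ever shrinks the step, so at most a logarithmic number of extra operator evaluations is spent per iteration; more refined adaptive rules (as studied in the paper) also allow the parameter to grow again between iterations.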


Metadata
Title
Adaptive Methods for Variational Inequalities with Relatively Smooth and Relatively Strongly Monotone Operators
Authors
S. S. Ablaev
F. S. Stonyakin
M. S. Alkousa
D. A. Pasechnyk
Publication date
01.12.2023
Publisher
Pleiades Publishing
Published in
Programming and Computer Software / Issue 6/2023
Print ISSN: 0361-7688
Electronic ISSN: 1608-3261
DOI
https://doi.org/10.1134/S0361768823060026
