Published in: Journal of Scientific Computing 2/2023

01-11-2023

A Unified Primal-Dual Algorithm Framework for Inequality Constrained Problems

Authors: Zhenyuan Zhu, Fan Chen, Junyu Zhang, Zaiwen Wen


Abstract

In this paper, we propose a unified primal-dual algorithm framework based on the augmented Lagrangian function for composite convex problems with conic inequality constraints. The new framework is highly versatile. First, it not only covers many existing algorithms such as PDHG, Chambolle–Pock, GDA, OGDA, and linearized ALM, but also guides us in designing a new efficient algorithm called Semi-OGDA (SOGDA). Second, it enables us to study the role of the augmented penalty term in the convergence analysis. Interestingly, a properly selected penalty not only improves the numerical performance of the above methods, but also theoretically enables the convergence of algorithms like PDHG and SOGDA. Under properly designed step sizes and penalty terms, our unified framework preserves the \({\mathcal {O}}(1/N)\) ergodic convergence rate while not requiring any prior knowledge about the magnitude of the optimal Lagrange multiplier. A linear convergence rate for affine equality constrained problems is also obtained under appropriate conditions. Finally, numerical experiments on linear programming, \(\ell _1\) minimization, and multi-block basis pursuit problems demonstrate the efficiency of our methods.
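For orientation, consider the affine equality constrained special case highlighted in the abstract. A standard augmented Lagrangian saddle-point formulation (the notation \(f\), \(g\), \(A\), \(b\), \(\rho \), \(\tau \), \(\sigma \) is chosen here only for illustration and need not match the paper's) is

\[ \min _{x}\ f(x) + g(x) \quad \text {s.t.}\quad Ax = b, \qquad {\mathcal {L}}_{\rho }(x,y) = f(x) + g(x) + \langle y,\, Ax-b\rangle + \frac{\rho }{2}\Vert Ax-b\Vert ^{2}. \]

Schematically, a primal-dual method of the kind covered by such a framework alternates a (possibly linearized) proximal minimization of \({\mathcal {L}}_{\rho }\) in \(x\) with a gradient ascent step in \(y\),

\[ x^{k+1} \in \mathop {\mathrm {arg\,min}}_{x}\ {\mathcal {L}}_{\rho }(x, y^{k}) + \frac{1}{2\tau }\Vert x - x^{k}\Vert ^{2}, \qquad y^{k+1} = y^{k} + \sigma \bigl (Ax^{k+1} - b\bigr ), \]

where the precise linearizations, extrapolation terms, and step-size/penalty conditions that recover PDHG, GDA, OGDA, linearized ALM, and SOGDA are specified in the paper itself. As a concrete, minimal sketch of this primal-dual structure (a textbook Chambolle–Pock/PDHG iteration, not the authors' SOGDA scheme), the following Python snippet solves an \(\ell _1\)/basis pursuit instance of the kind used in the experiments, \(\min _{x} \Vert x\Vert _1\) subject to \(Ax=b\), under the usual step-size condition \(\tau \sigma \Vert A\Vert ^2 < 1\):

import numpy as np

def chambolle_pock_basis_pursuit(A, b, iters=5000, theta=1.0):
    """Standard Chambolle-Pock / PDHG sketch for min ||x||_1 s.t. Ax = b.

    Illustrative only: step sizes satisfy tau * sigma * ||A||_2^2 < 1,
    the classical convergence condition for this method.
    """
    m, n = A.shape
    opnorm = np.linalg.norm(A, 2)            # spectral norm of A
    tau = sigma = 0.95 / opnorm              # tau * sigma * ||A||^2 = 0.9025 < 1
    x = np.zeros(n)
    x_bar = x.copy()
    y = np.zeros(m)
    for _ in range(iters):
        # dual ascent on the multiplier of the constraint Ax = b
        y = y + sigma * (A @ x_bar - b)
        # primal proximal step: soft-thresholding, i.e. prox of tau * ||.||_1
        z = x - tau * (A.T @ y)
        x_new = np.sign(z) * np.maximum(np.abs(z) - tau, 0.0)
        # extrapolation; theta = 1 gives the classical Chambolle-Pock update
        x_bar = x_new + theta * (x_new - x)
        x = x_new
    return x

# Small usage example on a random sparse-recovery instance.
rng = np.random.default_rng(0)
A = rng.standard_normal((40, 100))
x_true = np.zeros(100)
x_true[rng.choice(100, size=5, replace=False)] = rng.standard_normal(5)
b = A @ x_true
x_hat = chambolle_pock_basis_pursuit(A, b)
print("feasibility residual:", np.linalg.norm(A @ x_hat - b))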


Metadata
Title
A Unified Primal-Dual Algorithm Framework for Inequality Constrained Problems
Authors
Zhenyuan Zhu
Fan Chen
Junyu Zhang
Zaiwen Wen
Publication date
01-11-2023
Publisher
Springer US
Published in
Journal of Scientific Computing / Issue 2/2023
Print ISSN: 0885-7474
Electronic ISSN: 1573-7691
DOI
https://doi.org/10.1007/s10915-023-02346-8
