Abstract.
We propose a new subgradient-type method for minimizing extremely large-scale nonsmooth convex functions over "simple" domains. The characteristic features of the method are (a) the possibility of adjusting the scheme to the geometry of the feasible set, thus allowing one to obtain (nearly) dimension-independent (and nearly optimal in the large-scale case) rate-of-convergence results for the minimization of a convex Lipschitz continuous function over a Euclidean ball, a standard simplex, or a spectahedron (the set of positive semidefinite symmetric matrices of a given size with unit trace); and (b) flexible handling of accumulated information, allowing for a tradeoff between how fully this information is utilized and the per-iteration complexity. We present extensions of the scheme to the minimization of non-Lipschitzian convex objectives, the computation of saddle points of convex-concave functions, and the solution of variational inequalities with monotone operators. Finally, we report encouraging numerical results on test problems of dimensions up to 66,000.
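Feature (a), adjusting the scheme to the geometry of the feasible set, can be illustrated by the classical entropy-based mirror descent step over the standard simplex, whose efficiency estimate depends on the dimension only through a logarithmic factor. This is a simplified sketch of that idea, not the restricted-memory level method of the paper; the function name `mirror_descent_simplex` and the concrete step-size rule are illustrative assumptions:

```python
import numpy as np

def mirror_descent_simplex(grad, x0, steps, L=1.0):
    """Entropic (mirror descent) subgradient scheme over the standard simplex.

    grad  : callable returning a subgradient of the objective at x
    x0    : starting point in the simplex
    steps : number of iterations
    L     : Lipschitz constant of the objective w.r.t. the l1-norm
    """
    n = len(x0)
    x = x0.copy()
    x_avg = np.zeros(n)
    for t in range(1, steps + 1):
        # Step size yields an O(sqrt(log n / t)) accuracy bound:
        # the dimension enters only through log n.
        step = np.sqrt(2.0 * np.log(n)) / (L * np.sqrt(t))
        g = grad(x)
        # Multiplicative update = prox step w.r.t. the entropy
        # distance-generating function, followed by normalization
        # back onto the simplex.
        w = x * np.exp(-step * g)
        x = w / w.sum()
        x_avg += x
    return x_avg / steps  # averaged iterate
```

For a linear objective c·x over the simplex, the averaged iterate concentrates on the coordinate with the smallest c_i, as the theory predicts.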
This research was supported by the Technion Fund for Promotion of Research.
Ben-Tal, A., Nemirovski, A. Non-euclidean restricted memory level method for large-scale convex optimization. Math. Program. 102, 407–456 (2005). https://doi.org/10.1007/s10107-004-0553-4