2000 | Original Paper | Book Chapter
Adaptive Scaling and Convergence Rates of a Separable Augmented Lagrangian Algorithm
Authors: Philippe Mahey, Jean-Pierre Dussault, Abdelhamid Benchakroun, Abdelouahed Hamdi
Published in: Optimization
Publisher: Springer Berlin Heidelberg
Included in: Professional Book Archive
We analyze the numerical behaviour of a separable Augmented Lagrangian algorithm. This algorithm, which is equivalent to the Proximal Decomposition algorithm in the convex case, uses a dual sequence whose convergence is governed by the inverse of the primal penalty parameter. As a consequence, an optimal value for that parameter is expected — it thus acts more as a scaling parameter than as a penalty parameter — confirming earlier theoretical results in the strongly convex case. We propose an implementable algorithm in which the scaling parameter is adjusted at each iteration so as to keep the same rate of convergence for both the primal and the dual sequences. The correcting effect of this parameter update is illustrated on small quadratic problems. The autoscaled decomposition algorithm is then tested on larger block-angular problems with convex or nonconvex separable cost functions.
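To make the idea concrete, the following is a minimal sketch of a splitting method with an adaptively scaled penalty parameter on a toy separable problem. It uses an ADMM-style residual-balancing heuristic — increase the scaling parameter when the primal residual dominates, decrease it when the dual residual dominates — which captures the spirit of keeping primal and dual convergence rates matched, but is not the authors' exact autoscaling rule; the problem, function names, and update constants are illustrative assumptions.

```python
# Sketch (not the paper's algorithm): splitting with residual-balanced scaling
# for  min 0.5*a*x^2 + 0.5*b*(z - 1)^2  subject to  x = z.
def autoscaled_split(a=2.0, b=1.0, rho=1.0, iters=100, tol=1e-10, mu=10.0, tau=2.0):
    x = z = u = 0.0  # u is the scaled dual variable
    for _ in range(iters):
        # x-step: argmin_x 0.5*a*x^2 + (rho/2)*(x - z + u)^2
        x = rho * (z - u) / (a + rho)
        z_old = z
        # z-step: argmin_z 0.5*b*(z-1)^2 + (rho/2)*(x - z + u)^2
        z = (b + rho * (x + u)) / (b + rho)
        u += x - z  # dual ascent on the coupling constraint x = z
        r = abs(x - z)            # primal residual
        s = rho * abs(z - z_old)  # dual residual
        if max(r, s) < tol:
            break
        # Residual balancing: adjust rho so both residuals shrink at similar rates.
        if r > mu * s:
            rho *= tau
            u /= tau   # rescale the scaled dual variable consistently
        elif s > mu * r:
            rho /= tau
            u *= tau
    return x, z, rho

# The optimum of 0.5*2*x^2 + 0.5*(x-1)^2 is x* = 1/3.
x, z, rho = autoscaled_split()
```

The dual variable must be rescaled whenever rho changes, since u stores the multiplier divided by rho; forgetting this rescaling is a common implementation bug.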