
2000 | Original Paper | Book chapter

Adaptive Scaling and Convergence Rates of a Separable Augmented Lagrangian Algorithm

Authors: Philippe Mahey, Jean-Pierre Dussault, Abdelhamid Benchakroun, Abdelouahed Hamdi

Published in: Optimization

Publisher: Springer Berlin Heidelberg


We analyze the numerical behaviour of a separable Augmented Lagrangian algorithm. This algorithm, which is equivalent to the Proximal Decomposition algorithm in the convex case, uses a dual sequence whose convergence rate is tied to the inverse of the primal penalty parameter. As a consequence, an optimal value of that parameter, which thus acts more like a scaling parameter than a penalty one, is expected, confirming earlier theoretical results in the strongly convex case. We propose an implementable algorithm in which the scaling parameter is adjusted at each iteration so as to keep the same rate of convergence for both the primal and the dual sequences. The correcting effect of this parameter update is illustrated on small quadratic problems. The autoscaled decomposition algorithm is then tested on larger block-angular problems with convex or nonconvex separable cost functions.
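The abstract does not give the exact update rule, but the idea of adjusting the scaling parameter to equalize primal and dual progress can be sketched with a residual-balancing scheme on a toy splitting problem. The following Python sketch runs a scaled ADMM iteration (a method equivalent to proximal decomposition in the convex case) on two scalar quadratics; the test problem, the balancing threshold `mu`, and the rescaling factor `tau` are illustrative assumptions, not the authors' actual algorithm.

```python
# Hedged sketch: scaled splitting iteration with an adaptive scaling
# parameter rho, rescaled whenever primal and dual residuals drift apart.
# Problem (assumed for illustration): minimize f1(x) + f2(z) s.t. x = z,
# with f_i(x) = a_i/2 * (x - b_i)^2.

def prox_quadratic(a, b, rho, v):
    """Proximal point of f(x) = a/2*(x-b)^2: argmin f(x) + rho/2*(x-v)^2."""
    return (rho * v + a * b) / (rho + a)

def autoscaled_split(a1, b1, a2, b2, rho=1.0, iters=200, mu=10.0, tau=2.0):
    x = z = u = 0.0  # primal variables and scaled dual variable
    for _ in range(iters):
        x = prox_quadratic(a1, b1, rho, z - u)
        z_old = z
        z = prox_quadratic(a2, b2, rho, x + u)
        u += x - z
        r = abs(x - z)             # primal residual
        s = rho * abs(z - z_old)   # dual residual
        # Keep primal and dual convergence balanced by rescaling rho;
        # the scaled dual variable must be rescaled accordingly.
        if r > mu * s:
            rho *= tau
            u /= tau
        elif s > mu * r:
            rho /= tau
            u *= tau
    return z, rho

x_star, _ = autoscaled_split(1.0, 0.0, 3.0, 4.0)
# The minimizer of 0.5*x**2 + 1.5*(x-4)**2 is x = 3.
```

The rescaling keeps the effective step sizes of the primal and dual sequences comparable, which is the role the paper ascribes to its adaptive scaling parameter.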

Metadata
Title
Adaptive Scaling and Convergence Rates of a Separable Augmented Lagrangian Algorithm
Authors
Philippe Mahey
Jean-Pierre Dussault
Abdelhamid Benchakroun
Abdelouahed Hamdi
Copyright year
2000
Publisher
Springer Berlin Heidelberg
DOI
https://doi.org/10.1007/978-3-642-57014-8_19