
Bounded perturbation resilience of projected scaled gradient methods

Published in: Computational Optimization and Applications

Abstract

We investigate projected scaled gradient (PSG) methods for convex minimization problems. These methods perform a descent step along a diagonally scaled gradient direction, followed by a feasibility-regaining step via orthogonal projection onto the constraint set. This generalized algorithmic structure encompasses as special cases the gradient projection method, the projected Newton method, projected Landweber-type methods, and generalized expectation-maximization (EM)-type methods. We prove convergence of PSG methods in the presence of bounded perturbations. This resilience to bounded perturbations makes it possible to apply the recently developed superiorization methodology to PSG methods, and in particular to the EM algorithm.
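For orientation, the display below sketches the generic form of a PSG step and of its bounded perturbation, in notation commonly used in the superiorization literature. This is an illustrative sketch only: the symbols $f$, $C$, $P_C$, $D_k$, $\gamma_k$, $\beta_k$, $v^k$ and the exact conditions on the perturbations are assumptions of the sketch and need not coincide with those made in the paper.

\[
x^{k+1} \;=\; P_C\!\bigl( x^k - \gamma_k D_k \nabla f(x^k) \bigr),
\]

where $f$ is the convex objective, $C$ is the closed convex constraint set with orthogonal projection $P_C$, $D_k$ is a positive diagonal scaling matrix, and $\gamma_k > 0$ is a step size. Bounded perturbation resilience then asks that convergence be retained when each iterate is first shifted by bounded, summably weighted perturbations:

\[
x^{k+1} \;=\; P_C\!\Bigl( \bigl(x^k + \beta_k v^k\bigr) - \gamma_k D_k \nabla f\bigl(x^k + \beta_k v^k\bigr) \Bigr),
\qquad \|v^k\| \le M, \quad \beta_k \ge 0, \quad \sum_{k=0}^{\infty} \beta_k < \infty.
\]

In the superiorization methodology such perturbation vectors $v^k$ are chosen to steer the iterates toward reduced values of a secondary criterion (e.g., total variation) without destroying the convergence guaranteed by the resilience result.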


Notes

  1. We use the term “algorithm” for the iterative processes discussed here, even for those that do not include any termination criterion. This does not create any ambiguity because whether we consider an infinite iterative process or an algorithm with a termination rule is always clear from the context.


Acknowledgments

We greatly appreciate the constructive comments of two anonymous reviewers and the Coordinating Editor, which helped us improve the paper. This work was supported in part by the National Basic Research Program of China (973 Program) (2011CB809105), the National Science Foundation of China (61421062), and the United States-Israel Binational Science Foundation (BSF) Grant number 2013003.

Author information


Corresponding author

Correspondence to Yair Censor.


Cite this article

Jin, W., Censor, Y. & Jiang, M. Bounded perturbation resilience of projected scaled gradient methods. Comput Optim Appl 63, 365–392 (2016). https://doi.org/10.1007/s10589-015-9777-x
