Hybrid Approximate Proximal Method with Auxiliary Variational Inequality for Vector Optimization

Abstract

This paper studies a general vector optimization problem of finding weakly efficient points for mappings from Hilbert spaces to arbitrary Banach spaces, where the latter are partially ordered by closed, convex, and pointed cones with nonempty interiors. To find solutions of this vector optimization problem, we introduce an auxiliary variational inequality problem for a monotone and Lipschitz continuous mapping. The approximate proximal method in vector optimization is extended to a hybrid approximate proximal method for the general vector optimization problem under consideration by combining an extragradient method for solving the variational inequality problem with an approximate proximal point method for finding a root of a maximal monotone operator. In this hybrid approximate proximal method, the subproblems consist of finding approximate solutions to the variational inequality problem for a monotone and Lipschitz continuous mapping, and then finding weakly efficient points for a suitable regularization of the original mapping. We present both absolute and relative versions of our hybrid algorithm, in which the subproblems are solved only approximately. The weak convergence of the generated sequence to a weakly efficient point is established under quite mild assumptions. In addition, we develop extensions of our hybrid algorithms for vector optimization by using Bregman-type functions.
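
For orientation, the two kinds of subproblems mentioned in the abstract can be sketched in standard form; the notation below (the mapping A, the feasible set C, the stepsize λ, the parameter α_k, and the direction e in the interior of the ordering cone K) is illustrative and not taken from the paper itself. The classical extragradient step for a monotone, Lipschitz continuous mapping A on a closed convex set C in a Hilbert space H reads

    y^k = P_C\big(x^k - \lambda A(x^k)\big), \qquad x^{k+1} = P_C\big(x^k - \lambda A(y^k)\big),

where P_C denotes the metric projection onto C, while a proximal-type subproblem in vector optimization asks, given the current iterate x^k and a fixed e \in \operatorname{int} K, for a weakly efficient point of the regularized mapping

    x \mapsto F(x) + \tfrac{\alpha_k}{2}\,\|x - x^k\|^2\, e.

In the hybrid method both kinds of steps are carried out only approximately, subject to absolute or relative error criteria.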

Author information

Corresponding author

Correspondence to J. C. Yao.

Additional information

Communicated by F. Giannessi.

L.C. Ceng's research was partially supported by the Leading Academic Discipline Project of Shanghai Normal University (DZL707), the Innovation Program of Shanghai Municipal Education Commission (09ZZ133), the National Science Foundation of China (10771141), the PhD Program Foundation of the Ministry of Education of China (20070270004), the Science and Technology Commission of Shanghai Municipality (075105118), and the Shanghai Leading Academic Discipline Project (S30405).

B.S. Mordukhovich's research was partially supported by the US National Science Foundation under Grant DMS-0603896.

J.C. Yao's research was partially supported by Grant NSC 98-2923-E-110-003-MY3.

About this article

Cite this article

Ceng, L.C., Mordukhovich, B.S. & Yao, J.C. Hybrid Approximate Proximal Method with Auxiliary Variational Inequality for Vector Optimization. J Optim Theory Appl 146, 267–303 (2010). https://doi.org/10.1007/s10957-010-9667-4
