Nonlinear Analysis: Theory, Methods & Applications
Hybrid methods for a class of monotone variational inequalities
Introduction
Let H be a real Hilbert space, C a nonempty closed convex subset of H, and F : C → H a nonlinear mapping. A variational inequality problem, denoted VI(F, C), is to find a point x* ∈ C with the property

⟨Fx*, x − x*⟩ ≥ 0 for all x ∈ C. (1.1)

We say that VI(F, C) is monotone if the mapping F is a monotone operator. In this paper we are concerned with a special class of variational inequalities in which the mapping F is the complement of a nonexpansive mapping (that is, F = I − S with S nonexpansive) and the constraint set is the set of fixed points of another nonexpansive mapping T. Namely, we consider the following type of monotone variational inequality problem: find x* ∈ Fix(T) such that

⟨(I − S)x*, x − x*⟩ ≥ 0 for all x ∈ Fix(T), (1.2)

where S, T : H → H are nonexpansive mappings. It is always assumed that the set of fixed points of T, Fix(T), is nonempty.
It is well known that the VIP (1.1) is equivalent to the fixed point equation x* = P_C(x* − μFx*), where μ > 0 is an arbitrary constant and P_C is the metric projection of H onto C.
It is also well known that if F is Lipschitzian and strongly monotone, then for small enough μ > 0 the mapping P_C(I − μF) is a contraction on C, and so the sequence {x_n} of Picard iterates, given by x_{n+1} = P_C(x_n − μFx_n) (n ≥ 0), converges strongly to the unique solution of the VIP (1.1).
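As a quick numerical illustration of this projected Picard iteration, the sketch below uses an assumed affine mapping F(x) = Ax − b with A symmetric positive definite (so F is Lipschitzian and strongly monotone) and C the nonnegative orthant, for which P_C is a componentwise clamp; all concrete data here are illustrative choices, not taken from the paper.

```python
def F(x):
    # F(x) = A x - b with A symmetric positive definite, so F is
    # Lipschitzian and strongly monotone (illustrative data)
    A = [[3.0, 1.0], [1.0, 2.0]]
    b = [4.0, 1.0]
    return [A[i][0] * x[0] + A[i][1] * x[1] - b[i] for i in range(2)]

def proj_C(x):
    # metric projection P_C onto C = nonnegative orthant: componentwise clamp
    return [max(0.0, xi) for xi in x]

def picard(x, mu=0.1, n_iters=400):
    # Picard iterates x_{n+1} = P_C(x_n - mu*F(x_n));
    # P_C(I - mu*F) is a contraction since mu < 2*eta/kappa^2 here
    for _ in range(n_iters):
        Fx = F(x)
        x = proj_C([x[i] - mu * Fx[i] for i in range(2)])
    return x

x_star = picard([5.0, 5.0])
print(x_star)  # converges to [4/3, 0], the unique solution of VI(F, C)
```

For this data the unconstrained solution of Ax = b has a negative second component, so the unique solution of VI(F, C) sits on the boundary of C at x* = (4/3, 0), which the iterates reach geometrically.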
Hybrid methods for solving the variational inequality (1.1) were studied by Yamada [1], who assumed that F is Lipschitzian and strongly monotone. However, his methods do not apply to the variational inequality (1.2), since the mapping I − S fails, in general, to be strongly monotone, though it is Lipschitzian. Therefore, other hybrid methods must be sought.
Recently, Mainge and Moudafi [2] introduced a hybrid iterative method for solving the variational inequality (1.2), which generates a sequence {x_n} from an arbitrary initial guess x_0 ∈ H, where f is a contraction and the two parameter sequences lie in (0, 1) and satisfy certain conditions.
It is the purpose of this paper to further investigate other hybrid iterative methods for solving the variational inequality (1.2). More precisely, assuming (1.2) is consistent and noticing that if, for each t ∈ (0, 1), z_t is a fixed point of a suitably regularized nonexpansive mapping, then every weak accumulation point of (z_t) as t → 0 is a solution of (1.2), we are able, building upon the idea of regularization, to introduce a new hybrid iterative method in which {α_n} and {β_n} are sequences in (0, 1) and f is a contraction. Our idea is to regularize the nonexpansive mapping S, instead of the nonexpansive mapping T as done by Moudafi and Mainge [3]. Since Moudafi and Mainge's regularization depends on T whereas ours does not, we obtain our convergence result under dramatically less restrictive conditions; as a matter of fact, the conditions (A1) and (A3) of Moudafi and Mainge [3] are completely removed.
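To make the regularization idea concrete — average the nonexpansive mapping S with a contraction f before applying T — the following sketch implements one assumed form of such a scheme, x_{n+1} = T(α_n f(x_n) + (1 − α_n) S x_n) with α_n → 0; the specific combination and all data below are illustrative assumptions, not the paper's exact algorithm.

```python
def T(x):
    # nonexpansive: metric projection onto [0, 2], so Fix(T) = [0, 2]
    return min(2.0, max(0.0, x))

def S(x):
    # nonexpansive, with fixed point 3 lying outside Fix(T)
    return (x + 3.0) / 2.0

def f(x):
    # a 1/2-contraction used as the regularizer
    return x / 2.0

def hybrid(x, n_iters=100):
    # assumed illustrative scheme: x_{n+1} = T(a_n*f(x_n) + (1-a_n)*S(x_n)),
    # with a_n = 1/(n+1) -> 0, so the regularization vanishes in the limit
    for n in range(n_iters):
        a = 1.0 / (n + 1)
        x = T(a * f(x) + (1.0 - a) * S(x))
    return x

print(hybrid(0.0))  # -> 2.0
```

Here the variational inequality (1.2) over Fix(T) = [0, 2] with F = I − S is solved by x* = 2 (since (I − S)(2) < 0, the inequality holds at the right endpoint), and the iterates indeed settle at 2.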
We also apply both our implicit and explicit schemes to the solution of a hierarchical minimization problem in a Hilbert space.
Preliminaries
Let C be a nonempty closed convex subset of a real Hilbert space H. Recall the following concepts for mappings.
- (i) A mapping f : C → C is a ρ-contraction if ρ ∈ [0, 1) and the following property is satisfied: ‖f(x) − f(y)‖ ≤ ρ‖x − y‖ for all x, y ∈ C.
- (ii) A mapping T : C → C is nonexpansive provided ‖Tx − Ty‖ ≤ ‖x − y‖ for all x, y ∈ C.
- (iii) A mapping F : C → H is
  - (a) monotone if ⟨Fx − Fy, x − y⟩ ≥ 0 for all x, y ∈ C;
  - (b) strictly monotone if ⟨Fx − Fy, x − y⟩ > 0 for all x, y ∈ C with x ≠ y;
  - (c) η-strongly monotone if there exists a constant η > 0 such that ⟨Fx − Fy, x − y⟩ ≥ η‖x − y‖² for all x, y ∈ C.
- (a) The metric (or nearest point) projection P_C from H onto C assigns to each x ∈ H the unique point P_C x ∈ C satisfying ‖x − P_C x‖ = min{‖x − y‖ : y ∈ C}.
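The metric projection P_C is itself nonexpansive, a fact used throughout. The sketch below checks this numerically for an illustrative choice of C, the closed unit ball, where the projection has the simple closed form P_C x = x if ‖x‖ ≤ 1 and P_C x = x/‖x‖ otherwise.

```python
import math
import random

def proj_ball(x, r=1.0):
    # metric projection onto the closed ball C = {x : ||x|| <= r}:
    # identity inside the ball, radial scaling outside
    nrm = math.sqrt(sum(xi * xi for xi in x))
    if nrm <= r:
        return list(x)
    return [r * xi / nrm for xi in x]

def dist(x, y):
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(x, y)))

# sample random pairs and verify ||P_C x - P_C y|| <= ||x - y||
random.seed(0)
for _ in range(1000):
    x = [random.uniform(-3, 3) for _ in range(3)]
    y = [random.uniform(-3, 3) for _ in range(3)]
    assert dist(proj_ball(x), proj_ball(y)) <= dist(x, y) + 1e-12
```

The same nonexpansiveness holds for the projection onto any nonempty closed convex set, which is why P_C(I − μF) inherits the contraction property of I − μF.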
Implicit hybrid method
Suppose F is a κ-Lipschitzian and η-strongly monotone operator with constants κ > 0 and η > 0. Suppose T is nonexpansive with Fix(T) ≠ ∅. Consider the variational inequality: find x* ∈ Fix(T) such that

⟨Fx*, x − x*⟩ ≥ 0 for all x ∈ Fix(T). (3.1)
Yamada [1] introduced the following hybrid iterative method for solving the variational inequality (3.1), which generates a sequence {x_n} via the iterative algorithm

x_{n+1} = Tx_n − λ_{n+1} μ F(Tx_n), n ≥ 0,

where the initial guess x_0 ∈ H is arbitrary, μ ∈ (0, 2η/κ²), and the sequence {λ_n} in (0, 1) satisfies the conditions:
(i) λ_n → 0;
(ii) Σ λ_n = ∞;
(iii) either Σ |λ_{n+1} − λ_n| < ∞ or lim λ_n/λ_{n+1} = 1.
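A minimal sketch of Yamada's hybrid steepest descent iteration x_{n+1} = Tx_n − λ_{n+1} μ F(Tx_n) under illustrative one-dimensional choices of T and F (these data are assumptions for demonstration, not taken from the paper):

```python
def T(x):
    # nonexpansive: projection onto [0, 2], so Fix(T) = [0, 2]
    return min(2.0, max(0.0, x))

def F(x):
    # 1-Lipschitzian and 1-strongly monotone (illustrative choice)
    return x - 3.0

def hybrid_steepest_descent(x, mu=1.0, n_iters=10000):
    # x_{n+1} = T(x_n) - lam_{n+1} * mu * F(T(x_n)),
    # with lam_n = 1/n, so lam_n -> 0 and sum(lam_n) = infinity
    for n in range(1, n_iters + 1):
        tx = T(x)
        x = tx - (1.0 / n) * mu * F(tx)
    return x

print(hybrid_steepest_descent(0.0))  # close to 2, the solution over Fix(T)
```

With λ_n = 1/n all three conditions above hold, μ = 1 lies in (0, 2η/κ²) = (0, 2), and the iterates behave like 2 + 1/n, converging to the solution x* = 2 of the variational inequality over Fix(T) = [0, 2].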
Explicit hybrid method
Our variational inequality (3.3) involves two nonexpansive mappings S and T. Our explicit hybrid method is motivated by the implicit hybrid method investigated in the last section and by recent investigations of iterative methods for nonexpansive mappings (see [4], [6], [7], [9], [10], [11], [12], [13], [14], [15], [16], [17], [18], [19] for more details).
Our explicit iterative scheme generates a sequence {x_n} from an arbitrary initial guess x_0 ∈ H via a recursive formula that combines T, S and the contraction f through the parameter sequences {α_n} and {β_n}.
Application in hierarchical minimization
Let H be a Hilbert space and let φ0, φ1 : H → R ∪ {+∞} be proper lower semicontinuous convex functions. Consider the following hierarchical minimization problem:

minimize φ1(x) over x ∈ S0, where S0 = argmin_{x ∈ H} φ0(x).

(Here we always assume that S0 is nonempty.) Let S1 denote the solution set of this hierarchical problem and assume S1 ≠ ∅. Assume φ0 and φ1 are differentiable and their gradients are Lipschitz continuous: ∇φ0 is L0-Lipschitzian and ∇φ1 is L1-Lipschitzian. Let T = I − γ0∇φ0 and S = I − γ1∇φ1, where 0 < γ0 < 2/L0 and 0 < γ1 < 2/L1.

It is easily seen that Fix(T) = S0.
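As an illustration of this construction (all functions, step sizes, and the iteration form below are assumed, illustrative choices): take φ0(x) = (x1 − x2)²/2, whose minimizers form the line {x1 = x2}, and φ1(x) = ‖x‖²/2, whose minimizer over that line is the origin. The gradient maps T = I − γ0∇φ0 and S = I − γ1∇φ1 are then nonexpansive, Fix(T) = S0, and a simple viscosity-type iteration x_{n+1} = T(a_n f(x_n) + (1 − a_n) S x_n) with the trivial contraction f = 0 and a_n → 0 drives the iterates to the hierarchical minimizer.

```python
def grad_phi0(x):
    # phi0(x) = 0.5*(x1 - x2)^2: argmin phi0 is the line {x1 = x2},
    # and grad(phi0) is Lipschitz with constant L0 = 2
    return [x[0] - x[1], x[1] - x[0]]

def grad_phi1(x):
    # phi1(x) = 0.5*||x||^2: grad(phi1) = x is Lipschitz with L1 = 1
    return list(x)

def T(x, gamma0=0.5):
    # T = I - gamma0*grad(phi0), nonexpansive since 0 < gamma0 < 2/L0 = 1
    g = grad_phi0(x)
    return [x[i] - gamma0 * g[i] for i in range(2)]

def S(x, gamma1=0.5):
    # S = I - gamma1*grad(phi1), nonexpansive since 0 < gamma1 < 2/L1 = 2
    g = grad_phi1(x)
    return [x[i] - gamma1 * g[i] for i in range(2)]

def hierarchical(x, n_iters=200):
    # assumed illustrative scheme: x_{n+1} = T(a_n*f(x_n) + (1-a_n)*S(x_n))
    # with the trivial contraction f = 0 and a_n = 1/(n+2) -> 0
    for n in range(n_iters):
        a = 1.0 / (n + 2)
        sx = S(x)
        x = T([(1.0 - a) * sx[i] for i in range(2)])
    return x

x_star = hierarchical([4.0, -2.0])
print(x_star)  # approaches [0.0, 0.0], the hierarchical minimizer
```

Note that T here is exactly the averaging toward the line {x1 = x2} (so Fix(T) = S0, as claimed above), while S pulls toward the minimizer of φ1; their interplay produces the minimizer of φ1 over S0.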
Acknowledgement
The second author was supported in part by NSC 97-2628-M-110-003-MY3 (Taiwan).
References (22)
- The hybrid steepest descent method for the variational inequality problem over the intersection of fixed point sets of nonexpansive mappings.
- Viscosity approximation methods for nonexpansive mappings, J. Math. Anal. Appl. (2004).
- Viscosity approximation methods for fixed-points problems, J. Math. Anal. Appl. (2000).
- et al., A general iterative method for nonexpansive mappings in Hilbert spaces, J. Math. Anal. Appl. (2006).
- Strong convergence theorems for resolvents of accretive operators in Banach spaces, J. Math. Anal. Appl. (1980).
- et al., Strong convergence of an iterative method for hierarchical fixed-points problems, Pacific J. Optim. (2007).
- A. Moudafi, P.-E. Mainge, Towards viscosity approximations of hierarchical fixed-points problems, Fixed Point Theory...
- et al., Iterative algorithms for nonlinear operators, J. London Math. Soc. (2002).
- H.K. Xu, Viscosity method for hierarchical fixed point approach to variational inequalities, Taiwanese J. Math. 13...
- Convergence of approximants to fixed points of nonexpansive nonlinear mappings in Hilbert spaces, Arch. Ration. Mech. Anal.