
2015 | Book

Vector Optimization and Monotone Operators via Convex Duality

Recent Advances

About this Book

This book investigates several duality approaches for vector optimization problems and compares them. Special attention is paid to duality for linear vector optimization problems, for which a vector dual that avoids the shortcomings of the classical ones is proposed. Moreover, the book addresses different efficiency concepts for vector optimization problems. Among the problems that arise when the framework is generalized to set-valued functions, those involving monotone operators attract increasing interest, especially now that new methods for approaching them by means of convex analysis have been developed. Following this path, the book provides several results on different properties of sums of monotone operators.

Table of Contents

Frontmatter
Chapter 1. Introduction and Preliminaries
Abstract
In this book we present some recent advances on duality for vector optimization problems and on monotone operators, obtained by means of conjugate duality and other methods of convex analysis. This is not intended to be a systematic presentation of the state of the art in these fields, but an extended collection of the results obtained by the author in these research areas by making use of tools like conjugate functions, duality and convexity.
Sorin-Mihai Grad
Chapter 2. Duality for Scalar Optimization Problems
Abstract
Assigning a dual problem to a given minimization problem provides, due to weak duality, a lower bound for the objective values of the latter. If strong duality can additionally be proven, the optimal objective values of the two problems coincide, and they can often be determined more easily, since the dual problem usually has a simpler structure than the primal one. Moreover, necessary and sufficient optimality conditions for the primal-dual pair of problems under discussion can be derived, and these can be employed for determining the optimal solutions of the primal problem once the ones of the dual, guaranteed by the strong duality statement, have been identified.
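As a minimal sketch of these relations, for a generic primal problem (P) and an associated dual problem (D) (the notation below is illustrative and not taken from the book), weak and strong duality read
\[
  v(P) = \inf_{x \in \mathcal{A}} f(x), \qquad v(D) = \sup_{y \in \mathcal{B}} h(y),
\]
\[
  \text{weak duality: } v(D) \le v(P), \qquad \text{strong duality: } v(D) = v(P) \text{ and } (D) \text{ has an optimal solution,}
\]
where $f$ and $h$ denote the primal and dual objective functions and $\mathcal{A}$ and $\mathcal{B}$ the corresponding feasible sets.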
Sorin-Mihai Grad
Chapter 3. Minimality Concepts for Sets
Abstract
Solving a scalar optimization problem usually means determining the points where the objective function attains its minimum (respectively maximum) over the feasible set, but one can also look for solutions satisfying stronger conditions, like weak sharp minima or strong minima. A similar situation can be found in vector optimization, too, where, due to the increased complexity of the problems, the solution concepts are even more diversified.
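For orientation, a commonly used formulation of the weak sharp minimum notion (a sketch in standard notation, not necessarily the exact definition employed in the book) is
\[
  \bar{x} \in \bar{S} \text{ is a weak sharp minimum of } f \text{ over } S \text{ if } \exists\, \tau > 0:\ f(x) \ge f(\bar{x}) + \tau\, d(x, \bar{S}) \quad \forall x \in S,
\]
where $\bar{S}$ denotes the set of minimizers of $f$ over the feasible set $S$ and $d(\cdot, \bar{S})$ the distance to $\bar{S}$.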
Sorin-Mihai Grad
Chapter 4. Vector Duality via Scalarization for Vector Optimization Problems
Abstract
As seen in Chap. 3, one can consider different minimality notions for sets, and these can be employed in different situations to serve various purposes, among which one finds the theory of vector optimization. Solving a vector optimization problem amounts to determining its feasible elements where the value of its objective function satisfies the desired minimality property within the image of the feasible set through the objective function, the so-called image set of the vector optimization problem. These elements are usually called efficient solutions.
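In a standard formulation (assuming the image space is partially ordered by a pointed convex cone $K$; the book's own notation may differ), an efficient solution can be characterized via
\[
  \bar{x} \in \mathcal{A} \text{ is efficient} \iff \big(F(\bar{x}) - K \setminus \{0\}\big) \cap F(\mathcal{A}) = \emptyset,
\]
where $F$ is the vector objective function and $\mathcal{A}$ the feasible set, i.e. no feasible point has an objective value that dominates $F(\bar{x})$ with respect to $K$.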
Sorin-Mihai Grad
Chapter 5. General Wolfe and Mond-Weir Duality
Abstract
Once a solid duality theory for linear optimization problems was available, the next step was to extend it to more general problems. Following Dorn’s successful generalization of duality to quadratic problems, attention turned to convex optimization problems. In [215], Wolfe proposed a dual problem for a scalar convex optimization problem in which the functions involved were also assumed to be differentiable.
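For reference, the classical Wolfe dual attached to a differentiable convex problem, written here in a standard textbook form whose notation may differ from the one used in the book: to the primal problem
\[
  \inf_{x}\ f(x) \quad \text{subject to } g_i(x) \le 0,\ i = 1, \dots, m,
\]
with $f$ and $g_1, \dots, g_m$ convex and differentiable, one assigns the Wolfe dual
\[
  \sup_{y,\, u}\ f(y) + \sum_{i=1}^{m} u_i g_i(y) \quad \text{subject to } \nabla f(y) + \sum_{i=1}^{m} u_i \nabla g_i(y) = 0,\ u_i \ge 0,\ i = 1, \dots, m.
\]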
Sorin-Mihai Grad
Chapter 6. Vector Duality for Linear and Semidefinite Vector Optimization Problems
Abstract
While scalar linear optimization problems have been intensively studied, including via duality, and the questions regarding them are by now settled, the investigations on their vector counterparts are far from complete. The first papers on linear vector duality were due to Gale, Kuhn and Tucker (cf. [96]), Kornbluth (cf. [150]), Schönefeld (cf. [187]) and Rödder (cf. [181]), while Isermann was the one who introduced in [132, 133] the classical vector dual problem to a primal linear vector optimization problem in finite-dimensional spaces. Moreover, he compared his results to the previously mentioned ones, pointing out which of them could be recovered as special cases of his approach.
Sorin-Mihai Grad
Chapter 7. Monotone Operators Approached via Convex Analysis
Abstract
Monotone operators started being intensively investigated during the 1960s by authors like Browder, Brézis or Minty, and it did not take long until their connections with convex analysis were noticed by Rockafellar, Gossez and others. The fact that the (convex) subdifferential of a proper, convex and lower semicontinuous function is a maximally monotone operator was one of the reasons for connecting these at first sight seemingly unrelated research fields.
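The mentioned connection can be stated as follows, in standard notation (assuming $X$ is a Banach space with topological dual $X^*$): an operator $T : X \rightrightarrows X^*$ is monotone if
\[
  \langle x^* - y^*, x - y \rangle \ge 0 \quad \forall (x, x^*), (y, y^*) \in \operatorname{gr} T,
\]
and maximally monotone if its graph is not properly contained in the graph of another monotone operator; Rockafellar's theorem then says that for every proper, convex and lower semicontinuous function $f : X \to \overline{\mathbb{R}}$ the subdifferential $\partial f$ is maximally monotone.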
Sorin-Mihai Grad
Backmatter
Metadata
Title
Vector Optimization and Monotone Operators via Convex Duality
written by
Sorin-Mihai Grad
Copyright Year
2015
Electronic ISBN
978-3-319-08900-3
Print ISBN
978-3-319-08899-0
DOI
https://doi.org/10.1007/978-3-319-08900-3