
1990 | Book

Global Optimization

Deterministic Approaches

Authors: Professor Dr. Reiner Horst, Professor Dr. Hoang Tuy

Publisher: Springer Berlin Heidelberg


About this book

The enormous practical need for solving global optimization problems coupled with a rapidly advancing computer technology has allowed one to consider problems which a few years ago would have been considered computationally intractable. As a consequence, we are seeing the creation of a large and increasing number of diverse algorithms for solving a wide variety of multiextremal global optimization problems. The goal of this book is to systematically clarify and unify these diverse approaches in order to provide insight into the underlying concepts and their properties. Aside from a coherent view of the field much new material is presented. By definition, a multiextremal global optimization problem seeks at least one global minimizer of a real-valued objective function that possesses different local minimizers. The feasible set of points in ℝn is usually determined by a system of inequalities. It is well known that in practically all disciplines where mathematical models are used there are many real-world problems which can be formulated as multiextremal global optimization problems.
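As a small univariate illustration (an example chosen here, not one taken from the book), the problem
$$ \min_{x \in [-2,\,2]} \; x^4 - 4x^2 + x $$
has two local minimizers with different objective values, so a purely local descent method started near the wrong one will never find the global minimizer.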

Table of Contents

Frontmatter

Introduction and Basic Techniques

Frontmatter
Chapter I. Some Important Classes of Global Optimization Problems
Abstract
In Chapter I, we introduce the main classes of global optimization problems that we study: concave minimization, reverse convex constraints, d.c. programming, and Lipschitz optimization. Some basic properties of these problems and various applications are discussed. It is also shown that very general systems of equalities and/or inequalities can be formulated as global optimization problems.
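One standard reformulation behind the last statement (sketched here for illustration; the book may proceed differently) is the following: a system g_i(x) ≤ 0 (i = 1,…,m), h_j(x) = 0 (j = 1,…,p) of continuous constraints on a compact set D ⊂ ℝn has a solution if and only if the global optimal value of
$$ \min_{x \in D} \; \sum_{i=1}^{m} \max\{0,\, g_i(x)\} \; + \; \sum_{j=1}^{p} |h_j(x)| $$
is zero, so solving the system amounts to globally minimizing a (generally multiextremal) penalty function.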
Reiner Horst, Hoang Tuy
Chapter II. Outer Approximation
Abstract
Outer approximation of the feasible set by a sequence of simpler relaxed sets is a basic method in many fields of optimization. In this technique, the current approximating set is improved by a suitable additional constraint (a cut).
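A rough sketch of this loop in code (purely illustrative, not one of the book's algorithms; the test problem, the tolerance, and the use of scipy.optimize.linprog are assumptions made here): the convex feasible set C = {x : g(x) ≤ 0} is relaxed to a box, and every infeasible relaxed minimizer is cut off by a supporting hyperplane of C.

import numpy as np
from scipy.optimize import linprog

# Illustrative problem: minimize x1 + x2 over the unit disk g(x) = ||x||^2 - 1 <= 0.
c = np.array([1.0, 1.0])
g = lambda x: x @ x - 1.0            # convex constraint function (assumed example)
grad_g = lambda x: 2.0 * x           # its gradient

A, b = [], []                        # accumulated cuts: A x <= b
bounds = [(-2.0, 2.0), (-2.0, 2.0)]  # initial relaxation: a box containing C

for k in range(100):
    res = linprog(c, A_ub=np.array(A) if A else None,
                  b_ub=np.array(b) if b else None, bounds=bounds)
    x = res.x
    if g(x) <= 1e-6:                 # relaxed minimizer is (nearly) feasible: stop
        break
    # Cut g(x) + grad_g(x)^T (y - x) <= 0: violated by x, satisfied by every point of C.
    A.append(grad_g(x))
    b.append(grad_g(x) @ x - g(x))

print("approximate minimizer:", x)   # should approach (-1/sqrt(2), -1/sqrt(2))

Each added inequality is a relaxation-improving cut in the sense described above: it never excludes a feasible point, but it does exclude the current infeasible iterate.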
Reiner Horst, Hoang Tuy
Chapter III. Concavity Cuts
Abstract
In Chapter II we discussed the general concept of a cut and the use of cuts in the basic technique of outer approximation. There, we were mainly concerned with using cuts in a “conjunctive” manner: typically, cuts were generated in such a way that no feasible point of the problem is excluded and the intersection of all the cuts contains the whole feasible region. This technique is most successful when the feasible region is a convex set, so that supporting hyperplanes can easily be constructed to separate this convex set from a point lying outside.
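For a convex differentiable function g with C = {x : g(x) ≤ 0} (notation introduced here for illustration), such a separating hyperplane is available in closed form: if g(z) > 0, then convexity gives g(x) ≥ g(z) + ∇g(z)ᵀ(x − z) for all x, so the inequality
$$ g(z) + \nabla g(z)^T (x - z) \;\le\; 0 $$
holds for every x ∈ C but is violated by z itself; adding it removes z without removing any feasible point.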
Reiner Horst, Hoang Tuy
Chapter IV. Branch and Bound
Abstract
A widely used method to solve various kinds of difficult optimization problems is called branch and bound. In this technique, the feasible set is relaxed and subsequently split into parts (branching) over which lower (and often also upper) bounds of the objective function value can be determined (bounding).
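A minimal sketch of such a scheme on an interval (illustrative only; the test function, the Lipschitz constant used for the bounds, and the tolerance are assumptions, and the book develops far more refined branching and bounding rules):

import heapq

f = lambda x: x**4 - 4 * x**2 + x    # assumed multiextremal test function on [-2, 2]
L = 49.0                             # |f'(x)| <= 49 on [-2, 2], so L = 49 is a valid Lipschitz constant
tol = 1e-4                           # pruning tolerance

def lower_bound(a, b):
    # On [a, b], f(x) >= f(midpoint) - L * (b - a) / 2 by the Lipschitz property.
    return f(0.5 * (a + b)) - L * (b - a) / 2.0

best_x, best_val = -2.0, f(-2.0)                 # incumbent solution (upper bound)
heap = [(lower_bound(-2.0, 2.0), -2.0, 2.0)]     # subintervals keyed by their lower bound

while heap:
    lb, a, b = heapq.heappop(heap)
    if lb >= best_val - tol:         # bounding: this part cannot contain a much better point
        continue
    m = 0.5 * (a + b)
    if f(m) < best_val:              # improve the incumbent
        best_x, best_val = m, f(m)
    for lo, hi in ((a, m), (m, b)):  # branching: split the interval in half
        child_lb = lower_bound(lo, hi)
        if child_lb < best_val - tol:
            heapq.heappush(heap, (child_lb, lo, hi))

print("approximate global minimum %.4f at x = %.4f" % (best_val, best_x))

The ingredients named in the abstract are all visible here: relaxation of the feasible set to intervals, branching by subdivision, a lower bound per part, and pruning of parts whose bound cannot beat the incumbent.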
Reiner Horst, Hoang Tuy

Concave Minimization

Frontmatter
Chapter V. Cutting Methods
Abstract
In this chapter we discuss some basic cutting plane methods for concave minimization. These include concavity cuts and related cuts, facial cuts, cut and split procedures and a discussion of how to generate deep cuts. The important special case of concave quadratic objective functions is treated in some detail.
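The basic concavity cut itself can be sketched as follows (notation introduced here for illustration; the book treats the construction in much more generality): let x⁰ be a nondegenerate vertex of the feasible polytope with edge directions u¹, …, uⁿ, let γ be the best objective value found so far with f(x⁰) > γ, and put
$$ \theta_i \;=\; \sup\{\theta \ge 0 : f(x^0 + \theta u^i) \ge \gamma\}, \qquad i = 1,\dots,n. $$
If all θ_i are finite, the hyperplane through the points x⁰ + θ_i u^i gives the cut
$$ e^T U^{-1}(x - x^0) \;\ge\; 1, \qquad U = (\theta_1 u^1, \dots, \theta_n u^n), \quad e = (1,\dots,1)^T, $$
which excludes x⁰ but, by the concavity of f, no feasible point with objective value below γ.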
Reiner Horst, Hoang Tuy
Chapter VI. Successive Approximation Methods
Abstract
In the cutting plane methods discussed in the previous chapter, the feasible domain is reduced at each step by cutting off a feasible portion that is known to contain no better solution than the current best solution.
Reiner Horst, Hoang Tuy
Chapter VII. Successive Partition Methods
Abstract
This chapter is devoted to a class of methods for concave minimization which investigate the feasible domain by dividing it into smaller pieces and refining the partition as needed (successive partition methods, branch and bound).
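Part of the reason such schemes work well for concave objectives is a very cheap bound (recalled here as an illustration of the bounding step): if f is concave and S = conv{v⁰, …, vⁿ} is a simplex of the partition, then every x ∈ S is a convex combination x = Σ_i λ_i v^i, and
$$ f(x) \;\ge\; \sum_{i=0}^{n} \lambda_i f(v^i) \;\ge\; \min_{0 \le i \le n} f(v^i), $$
so min_i f(v^i) is a valid lower bound for f over S (and over the feasible part of S) that requires only the n + 1 vertex values.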
Reiner Horst, Hoang Tuy
Chapter VIII. Decomposition of Large Scale Problems
Abstract
In many problems of large size encountered in applications, the constraints are linear, while the objective function is a sum of two parts: a linear part involving most of the variables of the problem, and a concave part involving only a relatively small number of variables. More precisely, these problems have the form
$$ \text{minimize } f(x) + dy \quad \text{subject to } (x,y) \in \Omega \subset \mathbb{R}^n \times \mathbb{R}^h \tag{P} $$
where f: ℝn → ℝ is a concave function, Ω is a polyhedron, d and y are vectors in ℝh, and n is generally much smaller than h.
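The decomposition rests on minimizing out the many y-variables first (sketched here in a standard way; the book's algorithms are considerably more elaborate): for fixed x the inner problem is a linear program, and
$$ \min_{(x,y) \in \Omega} \; f(x) + dy \;=\; \min_{x \in X} \; \big[\, f(x) + \varphi(x) \,\big], \qquad \varphi(x) = \min\{\, dy : (x,y) \in \Omega \,\}, $$
where X is the projection of Ω onto the x-space. Since φ is convex on X (it is the partial minimum of a linear function over the polyhedron Ω), the nonconvexity is confined to the relatively few x-variables.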
Reiner Horst, Hoang Tuy
Chapter IX. Special Problems of Concave Minimization
Abstract
Many nonconvex optimization problems can be reduced to concave minimization problems of a special form and can be solved by specialized concave minimization methods. In this chapter we shall study some of the most important examples of these problems. They include bilinear programming, complementarity problems and certain parametric concave minimization problems. An important subclass of parametric concave minimization which we will study is linear programming subject to an additional reverse convex constraint.
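The link to concave minimization is easiest to see in the bilinear case (notation introduced here for illustration): for polytopes X and Y,
$$ \min_{x \in X,\; y \in Y} \; c^T x + x^T Q y + d^T y \;=\; \min_{x \in X} \; \Big[\, c^T x + \min_{y \in Y} (Q^T x + d)^T y \,\Big], $$
and the inner minimum of a function linear in y over the polytope Y is a concave, piecewise linear function of x, so the bilinear program is equivalent to minimizing a concave function over X.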
Reiner Horst, Hoang Tuy

General Nonlinear Problems

Frontmatter
Chapter X. D.C. Programming
Abstract
In Chapter X, we continue the discussion of d.c. programming problems. First, a duality theory is developed between the objective and the constraints of a very general class of optimization problems. This theory allows one to derive several outer approximation methods for solving canonical d.c. problems and even certain d.c. problems that involve functions whose d.c. representations are not known. Then we present branch and bound methods for the general d.c. program and a combination of outer approximations and branch and bound. Finally, the design centering problem and biconvex programming are discussed in some detail.
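For orientation (stated here only as a sketch; the precise canonical form and the reduction to it are developed in the chapter): a d.c. function is one that can be written as a difference of two convex functions, f = p − q, and a canonical d.c. problem couples a linear objective with one convex and one reverse convex constraint,
$$ \text{minimize } c^T x \quad \text{subject to } h(x) \le 0, \; g(x) \ge 0, \qquad h,\, g \text{ convex}. $$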
Reiner Horst, Hoang Tuy
Chapter XI. Lipschitz and Continuous Optimization
Abstract
In this chapter, we discuss global optimization problems where the functions involved are Lipschitz continuous on certain subsets M ⊂ ℝn. Section 1 presents a brief introduction to the most frequently treated univariate case. Section 2 is devoted to branch and bound methods. First it is shown that the well-known univariate approaches can be interpreted as branch and bound methods. Then several extensions of univariate methods to n-dimensional problems with rectangular feasible sets are discussed. Finally, it is recalled from Chapter IV that very general Lipschitz optimization problems, as well as very general systems of equations and/or inequalities, can be solved by means of branch and bound techniques. As an example of Lipschitz optimization, the problem of minimizing a concave function subject to separable indefinite quadratic constraints is discussed in some detail.
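The univariate approaches of Section 1 rest on a simple bound (recalled here as an illustration): if |f(x) − f(y)| ≤ L|x − y| on [a, b] and f has been evaluated at points x_1, …, x_k, then
$$ f(x) \;\ge\; F_k(x) \;=\; \max_{1 \le i \le k} \big( f(x_i) - L\,|x - x_i| \big) \qquad \text{for all } x \in [a, b], $$
so the minimum of the piecewise linear function F_k over [a, b] is a computable lower bound for the global minimum of f, and new evaluation points can be placed where F_k is smallest.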
Reiner Horst, Hoang Tuy
Backmatter
Metadata
Title
Global Optimization
Authors
Professor Dr. Reiner Horst
Professor Dr. Hoang Tuy
Copyright year
1990
Publisher
Springer Berlin Heidelberg
Electronic ISBN
978-3-662-02598-7
Print ISBN
978-3-662-02600-7
DOI
https://doi.org/10.1007/978-3-662-02598-7