
2006 | Book

Global Optimization

From Theory to Implementation


About this book

Most books about global optimization describe the theory of the algorithms, yet the quality of a given implementation never depends exclusively on the theoretical soundness of the algorithms being implemented. The literature rarely discusses the tuning of algorithmic parameters, implementation tricks, software architectures, and the embedding of local solvers within global solvers. And yet there are many good software implementations "out there" from which the entire community could learn something.

The scope of this book is to move a few steps toward systematizing the path that leads from the invention of a global optimization algorithm to its implementation and testing. Some of the contributors to the book are famous and some are less well known, but all are experts in the discipline of actually getting global optimization to work. The papers in this book therefore address the following topics:

- Descriptions of new implementations of general-purpose or problem-specific global optimization algorithms

- New algorithms in global optimization (some with numerical results and a discussion of the implementation)

- Surveys discussing existing software packages.

Table of Contents

Frontmatter

Methods

Frontmatter
Optimization under Composite Monotonic Constraints and Constrained Optimization over the Efficient Set
Summary
We present a unified approach to a class of nonconvex global optimization problems with composite monotonic constraints. (By a composite monotonic function is meant a function which is the composition of a monotonic function on ℝᵐ with a mapping from ℝⁿ → ℝᵐ, where m ≤ n.) This class includes problems with constraints involving products of linear functions, sums of ratio functions, etc., and also problems of constrained optimization over efficient/weakly efficient points. The approach is based on transforming the problem into a monotonic optimization problem in the space ℝᵐ, which can then be efficiently solved by recently developed techniques. Nontrivial numerical examples are presented to illustrate the practicability of the approach.
Hoang Tuy, N. T. Hoai-Phuong
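
To make the notion of a composite monotonic constraint concrete, here is a minimal illustration (not taken from the chapter and not its notation): a bound on a product of positive affine functions is the composition of an increasing function with an affine map into ℝᵐ.

```latex
% Illustrative example of a composite monotonic constraint:
% a product of positive affine functions bounded above.
\[
  \prod_{j=1}^{m} \bigl( c_j^{\top} x + d_j \bigr) \;\le\; 1,
  \qquad c_j^{\top} x + d_j > 0 \ \text{on the feasible set},
\]
\[
  \text{i.e. } \varphi(Cx + d) \le 1
  \ \text{with } \varphi(t) = \prod_{j=1}^{m} t_j
  \ \text{increasing on } \mathbb{R}^{m}_{+}
  \ \text{and } x \mapsto Cx + d \colon \mathbb{R}^{n} \to \mathbb{R}^{m}.
\]
```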
On a Local Search for Reverse Convex Problems
Summary
In this paper we propose two variants of a local search method for reverse convex problems. The first is based on a well-known theorem of H. Tuy as well as on the Linearization Principle. The second variant is due to an idea of J. Rosen. We demonstrate the practical effectiveness of the proposed methods computationally.
Alexander Strekalovsky
Some Transformation Techniques in Global Optimization
Summary
In this chapter some transformation techniques, useful in deterministic global optimization, are discussed. With the given techniques, a general class of nonconvex MINLP (mixed integer non-linear programming) problems can be solved to global optimality. The transformations can be applied to signomial functions and the feasible region of the original problem can be convexified and overestimated by the transformations. The global optimal solution of the original nonconvex problem can be found by solving a sequence of convexified MINLP sub-problems. In each such iteration a part of the infeasible region is cut off and the algorithm terminates when a solution point is sufficiently close to or within the feasible region of the original problem. The principles behind the algorithm are given in this chapter and numerical examples are used to illustrate how the global optimal solution is obtained with the algorithm.
Tapio Westerlund
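
As a minimal illustration of the kind of single-variable transformation involved (given here for exposition; not necessarily the chapter's exact scheme), a positive signomial term becomes convex under an exponential change of variables:

```latex
% Positive signomial term convexified by the substitution x_i = e^{y_i} (x_i > 0):
\[
  c \, x_1^{p_1} x_2^{p_2} \cdots x_n^{p_n}
  \;\longrightarrow\;
  c \, e^{p_1 y_1 + p_2 y_2 + \cdots + p_n y_n}, \qquad c > 0,
\]
% which is convex in y, since t \mapsto e^{a^{\top} t} is convex.
```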
Solving Nonlinear Mixed Integer Stochastic Problems: a Global Perspective
Summary
In this paper, we present a novel approach for solving nonlinear mixed integer stochastic programming problems. In particular, we consider a two-stage stochastic problem with nonlinearities in both the objective function and the constraints, pure integer first stage variables and mixed-integer second stage variables. We formulate the problem by a scenario based representation, adding linear nonanticipativity constraints that come from splitting the first stage decision variables. In the separation phase we fully exploit the partially decomposable structure of SMINLPs. This allows us to deal with a separable nondifferentiable problem, which can be solved by a Lagrangian dual based procedure. In particular, we propose a specialization of the Randomized Incremental Subgradient Method proposed by Bertsekas (2001), which dynamically takes into account the information relative to the scenarios. The coordination phase is aimed at enforcing coordination among the solutions of the scenario subproblems. More specifically, we use a branch and bound scheme in order to enforce the feasibility of the relaxed nonanticipativity constraints. To make the overall method more efficient, we embed the Lagrangian iteration in the branch and bound scheme, avoiding the exact solution of the dual problem, and we propose an early branching rule and a warm start procedure for use within the branch and bound tree. Although SMINLPs arise in many application contexts, this class of problems has not been adequately treated in the literature. We propose a stochastic formulation of the Trim Loss Problem, which is new in the literature. A formal mathematical formulation is provided in the framework of two-stage stochastic programming, explicitly taking into account the uncertainty in the demand. Preliminary computational results illustrate the ability of the proposed method to determine the global optimum while significantly decreasing the solution time. Furthermore, the proposed approach is able to solve instances of the problem that are intractable with conventional approaches.
Maria Elena Bruni
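
The scenario-splitting and dual-update structure described above can be sketched as follows; this is a toy illustration with made-up data and an enumerated scenario subproblem, not the paper's algorithm or code.

```python
"""Toy sketch of scenario splitting with relaxed nonanticipativity constraints
(x_s = z) and a randomized incremental subgradient update of the multipliers.
Illustrative only: made-up data, scenario subproblems solved by enumeration."""
import random

SCENARIOS = [(0.3, 2.0), (0.5, 5.0), (0.2, 9.0)]  # (probability, demand) -- hypothetical
X_RANGE = range(0, 11)                             # integer first-stage decisions

def scenario_cost(x, demand):
    """Purchase cost plus recourse penalty for unmet demand (toy model)."""
    return 1.5 * x + 4.0 * max(demand - x, 0.0)

def solve_scenario_copy(prob, mult, demand):
    """Minimise one scenario's piece of the Lagrangian over its copy x_s."""
    return min(X_RANGE, key=lambda x: prob * scenario_cost(x, demand) + mult * x)

random.seed(0)
mults = [0.0] * len(SCENARIOS)        # multipliers of the relaxed x_s = z constraints
for it in range(1, 201):
    s = random.randrange(len(SCENARIOS))            # incremental: one scenario per step
    prob, demand = SCENARIOS[s]
    x_s = solve_scenario_copy(prob, mults[s], demand)
    z = min(X_RANGE, key=lambda v: -sum(mults) * v) # z-subproblem of the relaxation
    mults[s] += (1.0 / it) * (x_s - z)              # subgradient step on the dualised gap

copies = [solve_scenario_copy(p, m, d) for (p, d), m in zip(SCENARIOS, mults)]
print("scenario copies of the first-stage decision:", copies)
print("multipliers:", [round(m, 2) for m in mults])
```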
Application of Quasi Monte Carlo Methods in Global Optimization
Summary
It has been recognized through theory and practice that uniformly distributed deterministic sequences provide more accurate results than purely random sequences. A quasi Monte Carlo (QMC) variant of the multi level single linkage (MLSL) algorithm for global optimization is compared with the original stochastic MLSL algorithm on a number of test problems of various complexities, with an emphasis on high dimensional problems. Two different low-discrepancy sequences (LDS) are used and their efficiency is analysed. It is shown that the application of LDS can significantly increase the efficiency of MLSL. The dependence of the sample size required for locating global minima on the number of variables is examined. It is found that higher confidence in the obtained solution, and possibly a reduction in computational time, can be achieved by increasing the total sample size N; N should also be increased as the dimensionality of the problem grows. For high dimensional problems clustering methods become inefficient; for such problems a multistart method can be more computationally expedient.
Sergei Kucherenko
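
A minimal sketch of the underlying idea, comparing Sobol' low-discrepancy starting points with pseudorandom ones for a plain multistart local search, is given below. It assumes SciPy >= 1.7 (scipy.stats.qmc) and does not reproduce the MLSL clustering machinery discussed in the chapter.

```python
"""Sketch: multistart local search from Sobol' low-discrepancy points versus
pseudorandom points.  Illustrative only; not the chapter's MLSL implementation."""
import numpy as np
from scipy.optimize import minimize
from scipy.stats import qmc

DIM, N, LO, HI = 5, 64, -5.12, 5.12

def rastrigin(x):
    """A standard multimodal test function."""
    x = np.asarray(x)
    return 10.0 * x.size + np.sum(x**2 - 10.0 * np.cos(2.0 * np.pi * x))

def best_of_multistart(points):
    """Run a bounded local search from each starting point, keep the best value."""
    bounds = [(LO, HI)] * DIM
    return min(minimize(rastrigin, p, method="L-BFGS-B", bounds=bounds).fun
               for p in points)

sobol = qmc.Sobol(d=DIM, scramble=True, seed=0).random(N)
sobol_pts = qmc.scale(sobol, [LO] * DIM, [HI] * DIM)
rand_pts = np.random.default_rng(0).uniform(LO, HI, size=(N, DIM))

print("best value, Sobol' starts :", best_of_multistart(sobol_pts))
print("best value, random starts :", best_of_multistart(rand_pts))
```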

Implementations

Frontmatter
GLOB — A new VNS-based Software for Global Optimization
Summary
We describe an application of Variable Neighbourhood Search (VNS) methodology to continuous global optimization problems with box constraints. A general VNS algorithm is implemented within the software package GLOB. The tests are performed on some standard test functions and on a class of NP-hard global optimization problems arising in practice. The computational results show the potential of the new software.
M. Drazić, V. Kovacevic-Vujcić, M. Cangalović, N. Mladenović
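
A basic VNS loop for box-constrained minimisation might look like the sketch below (illustrative only; this is not the GLOB code): shaking draws a point in a neighbourhood whose size grows with k, a bounded local search is run from it, and the search returns to the smallest neighbourhood after every improvement.

```python
"""Sketch of a basic VNS loop for box-constrained continuous minimisation
(illustrative only; not the GLOB software)."""
import numpy as np
from scipy.optimize import minimize

def shake(x, k, k_max, lo, hi, rng):
    """Draw a random point in a box neighbourhood whose size grows with k."""
    radius = (k / k_max) * (hi - lo)
    return np.clip(x + rng.uniform(-radius, radius, size=x.size), lo, hi)

def vns(f, dim, lo, hi, k_max=5, outer_iters=50, seed=0):
    rng = np.random.default_rng(seed)
    bounds = [(lo, hi)] * dim
    x = rng.uniform(lo, hi, size=dim)
    fx = f(x)
    for _ in range(outer_iters):
        k = 1
        while k <= k_max:
            y = shake(x, k, k_max, lo, hi, rng)
            res = minimize(f, y, method="L-BFGS-B", bounds=bounds)
            if res.fun < fx - 1e-12:   # improvement: move and restart the neighbourhoods
                x, fx, k = res.x, res.fun, 1
            else:                      # otherwise widen the neighbourhood
                k += 1
    return x, fx

rastrigin = lambda x: 10.0 * x.size + np.sum(x**2 - 10.0 * np.cos(2.0 * np.pi * x))
best_x, best_f = vns(rastrigin, dim=5, lo=-5.12, hi=5.12)
print("best value found:", best_f)
```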
Disciplined Convex Programming
Summary
A new methodology for constructing convex optimization models called disciplined convex programming is introduced. The methodology enforces a set of conventions upon the models constructed, in turn allowing much of the work required to analyze and solve the models to be automated.
Michael Grant, Stephen Boyd, Yinyu Ye
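
The flavour of a disciplined convex model can be seen in the short sketch below, written with CVXPY, a later open-source tool in the same tradition (an assumption for illustration; it is not the system described in the chapter). Because every expression is assembled from atoms of known curvature, the tool can certify convexity before calling a solver.

```python
"""A small disciplined-convex model in CVXPY (a later tool in the same tradition;
not the system described in the chapter)."""
import cvxpy as cp
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((20, 5))
b = rng.standard_normal(20)

x = cp.Variable(5)
objective = cp.Minimize(cp.sum_squares(A @ x - b))  # convex atom, affine argument
constraints = [cp.norm1(x) <= 1]                    # convex function <= constant
problem = cp.Problem(objective, constraints)
problem.solve()

print("is DCP-compliant:", problem.is_dcp())
print("optimal value   :", problem.value)
```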
Writing Global Optimization Software
Summary
Global optimization software packages for solving Mixed-Integer Non-linear Optimization Problems are usually complex pieces of code. Some of the difficulties involved in writing good GO software are: embedding third-party local optimization codes within the main global optimization algorithm; providing efficient memory representations of the optimization problem; and making sure that every part of the code is fully reentrant. Finding good software engineering solutions for these difficulties is not enough to guarantee that the outcome will be a GO package that works well. However, starting from a sound software design makes it easier to concentrate on improving the efficiency of the global optimization algorithm implementation. In this paper we discuss the main issues that arise when writing a global optimization software package, namely software architecture and design, symbolic manipulation of mathematical expressions, the choice of local solvers, and the implementation of global solvers.
Leo Liberti
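
One of the design points raised above, keeping the global solver independent of the embedded local solver, can be sketched with a small interface. The classes below are hypothetical and only illustrate the idea; they are not the architecture proposed in the paper.

```python
"""Sketch: hiding the local solver behind a small interface so that third-party
codes can be swapped in without touching the global algorithm (illustrative)."""
from abc import ABC, abstractmethod
import numpy as np
from scipy.optimize import minimize

class LocalSolver(ABC):
    @abstractmethod
    def solve(self, f, x0, bounds):
        """Return (x, f(x)) for a local minimum near x0."""

class ScipyLocalSolver(LocalSolver):
    def solve(self, f, x0, bounds):
        res = minimize(f, x0, method="L-BFGS-B", bounds=bounds)
        return res.x, res.fun

class MultistartGlobalSolver:
    """A deliberately simple global strategy: the point is the decoupling."""
    def __init__(self, local_solver: LocalSolver, n_starts=20, seed=0):
        self.local = local_solver
        self.n_starts = n_starts
        self.rng = np.random.default_rng(seed)

    def solve(self, f, bounds):
        lo = np.array([b[0] for b in bounds])
        hi = np.array([b[1] for b in bounds])
        best_x, best_f = None, np.inf
        for _ in range(self.n_starts):
            x, fx = self.local.solve(f, self.rng.uniform(lo, hi), bounds)
            if fx < best_f:
                best_x, best_f = x, fx
        return best_x, best_f

solver = MultistartGlobalSolver(ScipyLocalSolver())
rastrigin = lambda x: 10.0 * x.size + np.sum(x**2 - 10.0 * np.cos(2.0 * np.pi * x))
print(solver.solve(rastrigin, bounds=[(-5.12, 5.12)] * 4))
```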
MathOptimizer Professional: Key Features and Illustrative Applications
Summary
Integrated scientific-technical computing (ISTC) environments play an increasing role in advanced systems modeling and optimization. MathOptimizer Professional (MOP) has been recently developed to solve nonlinear optimization problems formulated in the ISTC system Mathematica. We introduce this software package, and review its key functionality and options. MOP is then used to solve illustrative circle packing problems, including both well-frequented models and a new (more difficult) model-class.
János D. Pintér, Frank J. Kampas
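
For reference, a standard equal-circle packing model of the kind mentioned (maximise the common radius of m circles placed in the unit square) reads as follows; this is the generic textbook formulation, not necessarily the chapter's exact model.

```latex
% Equal-circle packing in the unit square (illustrative formulation):
\begin{align*}
  \max_{r,\; c_1,\dots,c_m} \quad & r \\
  \text{s.t.} \quad & \lVert c_i - c_j \rVert \ge 2r, && 1 \le i < j \le m,\\
                    & r \le c_{ik} \le 1 - r,         && i = 1,\dots,m,\ k = 1,2.
\end{align*}
```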
Variable Neighborhood Search for Extremal Graphs 14: The AutoGraphiX 2 System
Summary
The AutoGraphiX (AGX) system for computer assisted or, for some of its functions, fully automated graph theory has been developed at GERAD, Montreal, since 1997. We report here on a new version (AGX 2) of that system. It contains many enhancements, as well as a new function for the automated proof of simple propositions. Among other results, AGX 2 has led to several hundred new conjectures, ranging from easy ones, proved automatically, to others requiring longer unassisted or partially assisted proofs, to open ones. Many examples are given, illustrating AGX 2's functions and the results obtained.
M. Aouchiche, J. M. Bonnefoy, A. Fidahoussen, G. Caporossi, P. Hansen, L. Hiesse, J. Lacheré, A. Monhait
From Theory to Implementation: Applying Metaheuristics.
Summary
Metaheuristics are strategies for designing heuristic procedures that find high quality solutions to an optimization problem. This chapter focuses on the implementation aspects of heuristic algorithms based on metaheuristics, using an object oriented approach. This programming paradigm takes advantage of the parts shared by codes that implement different metaheuristic procedures. We give a class hierarchy for metaheuristics that makes it possible to quickly generate algorithms from existing metaheuristic codes for specific problems by extending a few classes and adding the problem functionality. It also allows the development of new metaheuristic algorithms without programming the basis of the procedure from scratch: one selects an appropriate class with the closest functionality and extends it to add the core of the algorithm. The purpose of this hierarchy is thus to provide an extensible model for the quick implementation of metaheuristics and the problem structures associated with them.
I. J. García del Amo, F. García López, M. García Torres, B. Melián Batista, J. A. Moreno Pérez, J. M. Moreno Vega
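
The sketch below shows, in miniature, the kind of reuse such a hierarchy enables: a problem interface, a metaheuristic base class, and a concrete method obtained by extending one class. These are hypothetical classes for illustration, not the chapter's hierarchy.

```python
"""Toy class hierarchy for metaheuristics (illustrative; not the chapter's design)."""
from abc import ABC, abstractmethod
import random

class Problem(ABC):
    @abstractmethod
    def random_solution(self): ...
    @abstractmethod
    def neighbour(self, s): ...
    @abstractmethod
    def cost(self, s): ...

class Metaheuristic(ABC):
    def __init__(self, problem: Problem):
        self.problem = problem
    @abstractmethod
    def run(self, iterations: int): ...

class HillClimbing(Metaheuristic):
    """Only the move-acceptance logic is specific to this subclass."""
    def run(self, iterations=1000):
        s = self.problem.random_solution()
        for _ in range(iterations):
            t = self.problem.neighbour(s)
            if self.problem.cost(t) < self.problem.cost(s):
                s = t
        return s, self.problem.cost(s)

class OneMax(Problem):
    """Toy problem: minimise the number of zeros in a bit string."""
    def __init__(self, n=30):
        self.n = n
    def random_solution(self):
        return [random.randint(0, 1) for _ in range(self.n)]
    def neighbour(self, s):
        t = s[:]; t[random.randrange(self.n)] ^= 1; return t
    def cost(self, s):
        return self.n - sum(s)

print(HillClimbing(OneMax()).run())
```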
ooMILP — A C++ Callable Object-oriented Library and the Implementation of its Parallel Version using CORBA
Summary
Process systems engineering is one of the many areas in which mixed integer optimisation formulations have been successfully applied. The nature of the problems requires specialised solution strategies and computer packages or callable libraries that can be extended and modified in order to accommodate new solution techniques. Object-oriented programming languages have been identified as offering these features. Process system applications are usually of large scale, and require modelling and solution techniques with a high level of customisation. ooMILP is a library of C++ callable procedures for the definition, manipulation and solution of large, sparse mixed integer linear programming (MILP) problems without the disadvantages of many existing modelling languages. We first present a general approach to the packaging of numerical solvers as software components, derived from material developed for the CAPE-OPEN project. The presentation is in the context of the construction and solution of Mixed Integer Linear Programming (MILP) problems. We then demonstrate how this package, based on the use of CORBA interfaces for synchronous execution within a single process, can be adapted with a minimum of problem-specific changes to provide a distributed solution.
Panagiotis Tsiakis, Benjamin Keeping
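
To give a flavour of a callable, object-oriented interface for assembling a sparse MILP, here is a toy sketch with hypothetical classes; it is emphatically not ooMILP's API, which consists of C++ procedures and CORBA components.

```python
"""Toy sketch of an object-oriented interface for building a sparse MILP
(hypothetical classes; NOT ooMILP's API, only an illustration of the style)."""
from dataclasses import dataclass, field

@dataclass
class Variable:
    name: str
    lb: float = 0.0
    ub: float = float("inf")
    integer: bool = False

@dataclass
class Constraint:
    coeffs: dict   # sparse row: {variable name: coefficient}
    sense: str     # "<=", ">=", "=="
    rhs: float

@dataclass
class MILP:
    variables: dict = field(default_factory=dict)
    constraints: list = field(default_factory=list)
    objective: dict = field(default_factory=dict)   # sparse cost vector

    def add_variable(self, name, **kw):
        self.variables[name] = Variable(name, **kw)
        return name
    def add_constraint(self, coeffs, sense, rhs):
        self.constraints.append(Constraint(coeffs, sense, rhs))
    def set_objective(self, coeffs):
        self.objective = coeffs

m = MILP()
x = m.add_variable("x", ub=10, integer=True)
y = m.add_variable("y", ub=10)
m.add_constraint({x: 2.0, y: 1.0}, "<=", 14.0)
m.set_objective({x: -3.0, y: -2.0})
print(len(m.variables), "variables,", len(m.constraints), "constraint(s)")
```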

Applications

Frontmatter
Global Order-Value Optimization by means of a Multistart Harmonic Oscillator Tunneling Strategy
Summary
The OVO (Order-Value Optimization) problem consists in the minimization of the Order-Value function f(x), defined by \( f(x) = f_{i_p(x)}(x) \), where \( f_{i_1(x)}(x) \le \dots \le f_{i_m(x)}(x) \). The functions f_1, ..., f_m are defined on Ω ⊂ ℝⁿ and p is an integer between 1 and m. When x is a vector of portfolio positions and f_i(x) is the predicted loss under the scenario i, the Order-Value function is the discrete Value-at-Risk (VaR) function, which is largely used in risk evaluations. The OVO problem is continuous but nonsmooth and, usually, has many local minimizers. A local method with guaranteed convergence to points that satisfy an optimality condition was recently introduced by Andreani, Dunder and Martinez. The local method must be complemented with a global minimization strategy in order to be effective when m is large. A global optimization method is defined where local minimizations are improved by a tunneling strategy based on the harmonic oscillator initial value problem. It will be proved that the solution of this initial value problem is a smooth and dense trajectory if Ω is a box. An application of OVO to the problem of finding hidden patterns in data sets that contain many errors is described. Challenging numerical experiments are presented.
R. Andreani, J. M. Martinez, M. Salvatierra, F. Yano
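
Evaluating the order-value function of the summary is straightforward: the sketch below (illustrative, with made-up scenario data) computes the p-th smallest of f_1(x), ..., f_m(x), which for scenario losses is the discrete VaR at level p/m. The chapter's tunneling strategy is not reproduced here.

```python
"""Sketch: evaluating the order-value function f(x) = p-th smallest of
f_1(x), ..., f_m(x).  Illustrative only, with made-up scenario data."""
import numpy as np

def order_value(fs, x, p):
    """Return the p-th smallest value among f_1(x), ..., f_m(x), 1 <= p <= m."""
    values = np.sort([f(x) for f in fs])
    return values[p - 1]

# Toy scenario losses for a 2-asset position x (hypothetical data).
rng = np.random.default_rng(0)
scenarios = rng.standard_normal((100, 2))
fs = [lambda x, s=s: float(s @ x) for s in scenarios]   # loss under scenario s

x = np.array([0.6, 0.4])
print("order value (discrete VaR-like) at p = 95:", order_value(fs, x, p=95))
```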
On generating Instances for the Molecular Distance Geometry Problem
Summary
The molecular distance geometry problem can be stated as the determination of the three-dimensional structure of a molecule using a set of distances between pairs of atoms. It can be formulated as a global minimization problem, where the main difficulty is the exponential increase in the number of local minimizers with the size of the molecule. The aim of this study is to generate new instances of the molecular distance geometry problem that can be used to test algorithms designed to solve it.
Carlile Lavor
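
A standard least-squares formulation of the problem as a global minimization (given for illustration; the chapter's instances may use a different error measure) is:

```latex
% Given distances d_{ij} for pairs (i,j) in S, find atom positions
% x_1, ..., x_N in R^3 by global minimisation of
\[
  \min_{x_1,\dots,x_N \in \mathbb{R}^3}
  \sum_{(i,j) \in S} \bigl( \lVert x_i - x_j \rVert^2 - d_{ij}^2 \bigr)^2 ,
\]
% whose global minimum is zero exactly when all given distances are realised.
```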
Backmatter
Metadata
Title
Global Optimization
edited by
Leo Liberti
Nelson Maculan
Copyright year
2006
Publisher
Springer US
Electronic ISBN
978-0-387-30528-8
Print ISBN
978-0-387-28260-2
DOI
https://doi.org/10.1007/0-387-30528-9