
1997 | Book

Developments in Global Optimization

Edited by: Immanuel M. Bomze, Tibor Csendes, Reiner Horst, Panos M. Pardalos

Publisher: Springer US

Book series: Nonconvex Optimization and Its Applications


About this book

In recent years global optimization has found applications in many interesting areas of science and technology including molecular biology, chemical equilibrium problems, medical imaging and networks. The collection of papers in this book indicates the diverse applicability of global optimization. Furthermore, various algorithmic, theoretical developments and computational studies are presented.
Audience: All researchers and students working in mathematical programming.

Table of Contents

Frontmatter
NOP — A Compact Input Format for Nonlinear Optimization Problems
Abstract
This paper defines a compact format for specifying general constrained nonlinear optimization problems. The proposed format is a nonlinear analogue of an explicit representation of sparse matrices by means of index lists and values of the corresponding matrix entries. Thus the format abstracts from the meaning of the problem and hence does not allow names for variables or parameters, but it explicitly displays the internal structure of the problem. This is a very useful feature for global or large scale local optimization.
A. Neumaier
GLOPT — A Program for Constrained Global Optimization
Abstract
GLOPT is a Fortran77 program for global minimization of a block-separable objective function subject to bound constraints and block-separable constraints. It finds a nearly globally optimal point that is near a true local minimizer. Unless there are several local minimizers that are nearly global, we thus find a good approximation to the global minimizer.
GLOPT uses a branch and bound technique to split the problem recursively into subproblems that are either eliminated or reduced in their size. This is done by an extensive use of the block separable structure of the optimization problem.
In this paper we discuss a new reduction technique for boxes and new ways for generating feasible points of constrained nonlinear programs. These are implemented as the first stage of our GLOPT project. The current implementation of GLOPT uses neither derivatives nor simultaneous information about several constraints. Numerical results are already encouraging. Work on an extension using curvature information and quadratic programming techniques is in progress.
S. Dallwig, A. Neumaier, H. Schichl
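The eliminate-or-split loop at the heart of such branch-and-bound methods can be sketched in a few lines. This is a generic one-dimensional illustration with a hand-written range enclosure, not the GLOPT code itself (which is Fortran77 and exploits block-separable structure); the objective and enclosure are illustrative choices:

```python
# Illustrative 1-D branch-and-bound with interval-style bounds.
# Boxes whose lower bound exceeds the incumbent upper bound are
# eliminated; the rest are split until they are sufficiently small.

def minimize_bb(enclosure, lo, hi, tol=1e-6):
    """Return an upper bound that converges to the global minimum."""
    best_ub = float("inf")
    boxes = [(lo, hi)]
    while boxes:
        a, b = boxes.pop()
        f_lo, f_hi = enclosure(a, b)
        best_ub = min(best_ub, f_hi)       # every box yields an upper bound
        if f_lo > best_ub or b - a < tol:  # eliminate, or stop refining
            continue
        m = 0.5 * (a + b)                  # otherwise split the box
        boxes.extend([(a, m), (m, b)])
    return best_ub

def sq_enclosure(a, b):
    """Exact range enclosure of f(x) = (x - 1)^2 on [a, b]."""
    g_lo, g_hi = a - 1.0, b - 1.0
    hi = max(g_lo * g_lo, g_hi * g_hi)
    lo = 0.0 if g_lo <= 0.0 <= g_hi else min(g_lo * g_lo, g_hi * g_hi)
    return lo, hi
```

The tighter the enclosure, the earlier boxes are eliminated, which is why reduction techniques such as those proposed in the paper pay off.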
Global Optimization for Imprecise Problems
Abstract
A new method for the computation of the global minimum of a continuously differentiable real-valued function f of n variables is presented. This method, which is composed of two parts, is based on the combinatorial topology concept of the degree of a mapping associated with an oriented polyhedron. In the first part, interval arithmetic is implemented for a “rough” isolation of all the stationary points of f. In the second part, the isolated stationary points are characterized as minima, maxima or saddle points and the global minimum is determined among the minima. The described algorithm can be successfully applied to problems with imprecise function and gradient values. The algorithm has been implemented and tested. It is primarily useful for small dimensions (n ≤ 10).
M. N. Vrahatis, D. G. Sotiropoulos, E. C. Triantafyllou
New Results on Gap-Treating Techniques in Extended Interval Newton Gauss-Seidel Steps for Global Optimization
Abstract
Interval-branch-and-bound methods for global optimization very often incorporate interval Newton Gauss-Seidel steps to reduce the widths of the boxes resulting from the basic branch-and-bound method. These steps try to determine the roots of the gradient of the objective function, whereas various other techniques eliminate the regions containing roots which do not correspond to global optimizers.
The interval Newton Gauss-Seidel step uses so-called extended interval arithmetic which allows the division by intervals containing zero. The latter may produce gaps in the resulting coordinate intervals, which can be used to split the resulting box of the interval Gauss-Seidel step.
We investigate the impact of gap-treating and box-splitting techniques which make use of branching rules, i.e. rules for selecting the subdivision direction in the underlying branch-and-bound method. Supplementing earlier studies ([3], [12]), the investigated model algorithm (similar to that in [5]) now uses the enclosure of the Hessian matrix to incorporate a second-order branching rule. We propose a strategy, a sorted interval Gauss-Seidel step, which improves the overall efficiency of the interval Newton Gauss-Seidel step and therefore of our global optimization method. We present results of computational experiments with standard global optimization problems.
Dietmar Ratz
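The gap produced by extended interval division, and the resulting box split, can be illustrated directly. The representation of unbounded rays as (lo, hi) pairs below is an assumed simplification for the sketch:

```python
# Sketch of gap-producing extended interval division. Dividing an
# interval [c] (with 0 not in [c]) by an interval [d] that contains 0 in
# its interior yields the complement of an open interval; intersecting
# the quotient with the current box can split the box in two.

def extended_div(c_lo, c_hi, d_lo, d_hi):
    """Quotient [c]/[d] as a list of (lo, hi) pieces; the gap between
    the two pieces is the region proved to contain no solution."""
    inf = float("inf")
    if c_lo > 0:
        return [(-inf, c_lo / d_lo), (c_lo / d_hi, inf)]
    return [(-inf, c_hi / d_hi), (c_hi / d_lo, inf)]

def intersect_box(pieces, x_lo, x_hi):
    """Intersect quotient pieces with one coordinate of the box."""
    out = []
    for lo, hi in pieces:
        a, b = max(lo, x_lo), min(hi, x_hi)
        if a <= b:
            out.append((a, b))
    return out
```

For example, [1, 2] / [-1, 2] leaves the gap (-1, 0.5), so a box coordinate [-3, 3] splits into [-3, -1] and [0.5, 3]; the branching rules studied in the paper decide how such sub-boxes are processed.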
Quadratic Programming with Box Constraints
Abstract
The global minimization of quadratic problems with box constraints naturally arises in many applications and as a subproblem of more complex optimization problems. In this paper we briefly describe the main results on global optimality conditions. Moreover, some of the most interesting computational approaches for the problem will be summarized.
Pasquale L. De Angelis, Panos M. Pardalos, Gerardo Toraldo
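As a point of contrast with the global problem, a stationary (KKT) point of a box-constrained quadratic program can be found with a simple projected-gradient sketch; for indefinite Q this yields only a local candidate, which is exactly why global optimality conditions matter. The instance, step size, and iteration count below are illustrative choices:

```python
# Projected gradient for  min 0.5 x^T Q x + c^T x  s.t.  lo <= x <= hi.
# Each step moves along the negative gradient and clips back into the box.

def projected_gradient_qp(Q, c, lo, hi, steps=2000, lr=0.1):
    n = len(c)
    x = [0.5 * (lo[i] + hi[i]) for i in range(n)]   # start at box center
    for _ in range(steps):
        g = [sum(Q[i][j] * x[j] for j in range(n)) + c[i] for i in range(n)]
        x = [min(hi[i], max(lo[i], x[i] - lr * g[i])) for i in range(n)]
    return x
```

On the convex instance Q = 2I, c = (-2, -10), box [0, 2]^2, the unconstrained minimizer (1, 5) is clipped to (1, 2), and the iteration converges there.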
Evolutionary Approach to the Maximum Clique Problem: Empirical Evidence on a Larger Scale
Abstract
An algorithm for finding a maximum clique in a graph is presented which uses the Comtet regularization of the Motzkin/Straus continuous problem formulation: maximize an indefinite quadratic form over the standard simplex. We briefly review some surprising connections of the problem with dynamic principles of evolutionary game theory, and give a detailed report on our numerical experiences with the proposed method.
Immanuel Bomze, Marcello Pelillo, Robert Giacomini
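The evolutionary-game connection can be made concrete with replicator dynamics: with the Comtet regularization A + I/2, local maximizers of the quadratic form over the simplex correspond to maximal cliques, and the objective value encodes the clique size via val = 1 - 1/(2ω). The small graph and iteration count below are illustrative choices, not data from the paper:

```python
# Replicator dynamics for the regularized Motzkin/Straus program:
# maximize x^T (A + I/2) x over the standard simplex.

def replicator_clique(adj, steps=1000):
    n = len(adj)
    x = [1.0 / n] * n                                   # barycenter start
    for _ in range(steps):
        bx = [sum(adj[i][j] * x[j] for j in range(n)) + 0.5 * x[i]
              for i in range(n)]
        val = sum(x[i] * bx[i] for i in range(n))
        x = [x[i] * bx[i] / val for i in range(n)]      # replicator update
    bx = [sum(adj[i][j] * x[j] for j in range(n)) + 0.5 * x[i]
          for i in range(n)]
    return x, sum(x[i] * bx[i] for i in range(n))

# triangle {0, 1, 2} with a pendant vertex 3 attached to vertex 0
adj = [[0, 1, 1, 1],
       [1, 0, 1, 0],
       [1, 1, 0, 0],
       [1, 0, 0, 0]]
x, val = replicator_clique(adj)
omega = round(1.0 / (2.0 * (1.0 - val)))   # val = 1 - 1/(2*omega) at a clique
```

The iteration drives the pendant vertex's weight to zero and settles on the characteristic vector of the triangle, recovering ω = 3.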
Interval and Bounding Hessians
Abstract
Bounding Hessians are defined and an O(n²) method of obtaining optimal bounding Hessians from interval Hessians is presented. They are applied to a framework of an adaptive second derivative covering method for global optimization, which is presented here. Also, bounding Hessians can be used for reformulation of very general global optimization problems into several useful forms for which there are existing algorithms.
Chris Stephens
On Global Search for Non-Convex Optimal Control Problems
Abstract
In this paper we consider non-convex optimal control problems sharing a common goal: to maximize a convex function of the terminal state. A global search algorithm is given, and results of first numerical tests are reported.
Alexander Strekalovsky, Igor Vasiliev
A Multistart Linkage Algorithm Using First Derivatives
Abstract
A multistart clustering algorithm is presented. The algorithm uses a clustering technique similar to Multi-Level Single Linkage (MLSL) but uses linking criteria requiring gradient information at the sample points. The objective function is modelled by a stochastic process along each link, and this model gives an estimate of the reliability of each link. A second feature of the algorithm is that it generates new sample points by perturbing existing sample points. This allows the algorithm to explore some parts of the feasible region more thoroughly than others. Numerical results show that the algorithm is effective, and is better at resolving close maximisers than MLSL.
C. J. Price
Convergence Speed of an Integral Method for Computing the Essential Supremum
Abstract
We give an equivalence between the tasks of computing the essential supremum of a summable function and finding a certain zero of a one-dimensional convex function. Interpreting the integral method as a Newton-type method, we show that the algorithm can work very slowly for objective functions whose essential supremum is not spread. For this reason we propose a method of accelerating the algorithm which is in some respects similar to the Aitken/Steffensen method.
Jens Hichert, Armin Hoffmann, Hoàng Xuân Phú
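The Aitken/Steffensen idea invoked here is, in its generic delta-squared form, easy to state; the cosine fixed-point iteration below is an illustrative stand-in for a slowly (linearly) converging sequence, not the integral method itself:

```python
# Aitken delta-squared acceleration of a linearly convergent sequence.
import math

def aitken(seq):
    """Transform x_k into x_k - (dx_k)^2 / d2x_k, term by term."""
    out = []
    for x0, x1, x2 in zip(seq, seq[1:], seq[2:]):
        d = x2 - 2.0 * x1 + x0
        out.append(x2 - (x2 - x1) ** 2 / d if d != 0 else x2)
    return out

# slowly converging fixed-point iteration x_{k+1} = cos(x_k)
seq = [1.0]
for _ in range(20):
    seq.append(math.cos(seq[-1]))
acc = aitken(seq)
```

The accelerated sequence approaches the fixed point of cos (about 0.739085) far faster than the raw iteration, the same qualitative effect sought for the integral method.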
Complexity Analysis Integrating Pure Adaptive Search (PAS) and Pure Random Search (PRS)
Abstract
Pure adaptive search (PAS) is a random search algorithm for global optimization that has promising complexity results. The complexity of pure adaptive search has been analyzed for both continuous and discrete, finite global optimization. Unfortunately, it is not possible at this time to implement pure adaptive search and achieve the ideal computational performance. To model practical random search algorithms more closely, we extend the complexity analysis to integrate pure adaptive search with pure random search. Many practical algorithms have some probability of sampling in the improving region, which is analogous to sampling according to PAS, and a probability of sampling outside the improving region, which is analogous to sampling according to PRS. Simulated annealing also has a probability of accepting a non-improving point. A Markov chain analysis is used to determine the expected number of iterations required to find the global optimum and to provide bounds for the expected number of iterations needed for a combination of PAS and PRS with acceptance probability. The analysis shows that one needs only a small probability of sampling in the improving region in order to dramatically improve performance.
Z. B. Zabinsky, B. P. Kristinsdottir
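The PAS/PRS mixture can be mimicked in a small simulation (this is an illustrative experiment, not the paper's Markov-chain analysis; the domain size and probability values are arbitrary choices):

```python
# With probability p, sample uniformly from the strictly improving
# region (PAS-type move); otherwise sample uniformly from the whole
# finite domain (PRS-type move). Count iterations to hit the optimum.
import random

def iterations_to_optimum(values, p_improve, rng, max_iter=100000):
    target = min(values)
    best = float("inf")
    for it in range(1, max_iter + 1):
        improving = [v for v in values if v < best]
        if improving and rng.random() < p_improve:
            best = rng.choice(improving)          # PAS-type sample
        else:
            best = min(best, rng.choice(values))  # PRS-type sample
        if best == target:
            return it
    return max_iter

rng = random.Random(0)
values = list(range(500))                         # 500 objective levels
avg_prs = sum(iterations_to_optimum(values, 0.0, rng) for _ in range(30)) / 30
avg_mix = sum(iterations_to_optimum(values, 0.2, rng) for _ in range(30)) / 30
```

Pure random search needs on the order of the domain size to hit the optimum, while even a 20% chance of an improving sample cuts the count to roughly the logarithm of the domain size divided by p, in line with the qualitative conclusion of the analysis.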
LGO — A Program System for Continuous and Lipschitz Global Optimization
Abstract
The program system LGO serves to solve global optimization problems under very mild (continuity or Lipschitz-continuity) structural assumptions. LGO is embedded in a menu-driven user interface which effectively assists the application development process. Implementation details and several application areas are also highlighted.
János D. Pintér
A Method Using Local Tuning for Minimizing Functions with Lipschitz Derivatives
Abstract
This paper describes a new one-dimensional algorithm for solving global optimization problems for objective functions having Lipschitz first derivatives. The method does not require a priori knowledge of the exact Lipschitz constant, but adaptively estimates the local Lipschitz constants in different sectors of the search region in the course of optimization. Convergence conditions of the method are investigated. Results of a numerical comparison of the new method with some other one-dimensional global optimization algorithms are also reported.
Ya. D. Sergeyev
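A simplified relative of such methods is the classical Piyavskii/Shubert scheme with a crude adaptive Lipschitz estimate. Unlike the paper's method, the estimate below is a single global one rather than locally tuned per sector, and the test function is an arbitrary choice:

```python
# Shubert-style 1-D global minimization: keep a piecewise-linear lower
# bound built from an adaptively estimated Lipschitz constant, and
# always sample where the lower bound is smallest.

def shubert_minimize(f, a, b, iters=200):
    xs, ys = [a, b], [f(a), f(b)]
    for _ in range(iters):
        pts = sorted(zip(xs, ys))
        # crude adaptive overestimate of the Lipschitz constant
        L = 1.5 * max(abs(y2 - y1) / (x2 - x1)
                      for (x1, y1), (x2, y2) in zip(pts, pts[1:])) + 1e-9
        best_lb, x_new = float("inf"), None
        for (x1, y1), (x2, y2) in zip(pts, pts[1:]):
            lb = 0.5 * (y1 + y2) - 0.5 * L * (x2 - x1)  # interval bound
            if lb < best_lb:
                best_lb = lb
                x_new = 0.5 * (x1 + x2) + 0.5 * (y1 - y2) / L
        xs.append(x_new)
        ys.append(f(x_new))
    return min(ys)
```

Local tuning replaces the single L by per-subinterval estimates, which avoids wasting evaluations in sectors where the function is flat.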
Molecular Structure Prediction by Global Optimization
Abstract
The CGU (convex global underestimator) global optimization method is used to predict the minimum energy structures, i.e. folded states, of small protein sequences. Computational results obtained from the CGU method applied to actual protein sequences using a detailed polypeptide model and a differentiable form of the Sun/Thomas/Dill potential energy function are presented. This potential function accounts for steric repulsion, hydrophobic attraction, and φ/ψ pair restrictions imposed by the so-called Ramachandran maps. Furthermore, it is easily augmented to accommodate additional known data such as the existence of disulphide bridges and any other a priori distance data. The Ramachandran data is modeled by a continuous penalty term in the potential function, thereby permitting the use of continuous minimization techniques.
K. A. Dill, A. T. Phillips, J. B. Rosen
Optimal Renewal Policy for Slowly Degrading Systems
Abstract
In the paper we address the problem of determining the optimal time to renew a system when it experiences so-called soft failures because of aging. It is assumed that crash failures, which reduce the service rate to 0 immediately, do not occur. However, the service rate of the examined system gradually decreases with time and settles at a very low value. Since performance in this degraded state is unacceptable, it is necessary to “renew” the system to its peak performance level. We analyze this model for two different queueing policies under Poisson arrivals and decreasing service rate.
András Pfening, Miklós Telek
Numerical Prediction of Crystal Structures by Simulated Annealing
Abstract
The prediction of crystal structures is formulated as a global optimization problem in which an appropriate potential energy function is minimized. In the literature several potential energy functions can be found that model the atomic forces for various types of interactions between atoms of a crystal. These functions either require a detailed knowledge of the crystal structure or the employed mathematical terms have unfavorable convergence properties.
We develop a potential energy function that is suitable for determining correct structures in a variety of ionic crystals by using only empirical rules from electrostatics and crystal chemistry. The numerical technique for obtaining global and local minima proceeds in two steps. First, we use a Simulated Annealing method whose parameters are adapted to the topology of crystal structure prediction. Then, a deterministic optimization method is employed as a final step to refine the atomic positional parameters. Several simulated crystal structures are discussed that demonstrate the quality of the prediction with respect to different terms in the objective function.
Wilfried Bollweg, Helmut Maurer, Herbert Kroll
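The stochastic first stage can be sketched with a generic simulated-annealing skeleton (geometric cooling, Metropolis acceptance); the toy two-well "energy" below stands in for the crystal-chemical potential, and all parameter values are illustrative assumptions:

```python
# Generic simulated annealing with geometric cooling: always accept
# downhill moves, accept uphill moves with probability exp(-dE/T).
import math
import random

def simulated_annealing(energy, x0, rng, steps=30000, t0=2.0, cool=0.9995):
    x, e = x0, energy(x0)
    best_x, best_e = x, e
    t = t0
    for _ in range(steps):
        x_new = x + rng.gauss(0.0, 0.5)          # random perturbation
        e_new = energy(x_new)
        if e_new <= e or rng.random() < math.exp((e - e_new) / t):
            x, e = x_new, e_new
            if e < best_e:
                best_x, best_e = x, e
        t *= cool
    return best_x, best_e

# toy two-well "energy": global minimum near x = -1, local one near x = +1
energy = lambda x: (x * x - 1.0) ** 2 + 0.3 * x
best_x, best_e = simulated_annealing(energy, 2.0, random.Random(1))
```

Starting in the basin of the local minimum, the uphill acceptance at high temperature lets the walk cross the barrier and settle near the global well at x ≈ -1; the deterministic refinement stage of the paper would then polish this point.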
Multidimensional Optimization in Image Reconstruction from Projections
Abstract
A parallel implementation of a global optimization algorithm is described. The algorithm is based on a probabilistic random search method. Computational results are illustrated through application of the algorithm to a time-consuming problem, in a multidimensional space, which arises from the field of image reconstruction from projections.
I. García, P. M. Ortigosa, L. G. Casado, G. T. Herman, S. Matej
Greedy Randomized Adaptive Search for a Location Problem with Economies of Scale
Abstract
We consider a heuristic approach for the solution of a location problem with economies of scale. The method chosen has a strong intuitive appeal, a prominent empirical track record, and is trivial to implement efficiently on parallel processors. We define the various components comprising this GRASP approach and perform a step-by-step development of such a heuristic for the location problem with concave costs. Computational results for problems of dimensions up to 100 × 1000 are reported.
K. Holmqvist, A. Migdalas, P. M. Pardalos
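The two GRASP phases, randomized greedy construction over a restricted candidate list followed by local search, can be sketched on a toy uncapacitated facility location instance. The instance, the alpha parameter, and the flip neighborhood are illustrative assumptions; the concave-cost structure of the paper's problem is not modeled here:

```python
# Minimal GRASP skeleton on a toy facility location problem:
# choose a subset of facilities minimizing opening + assignment costs.
import random

def total_cost(open_set, fcost, c):
    if not open_set:
        return float("inf")
    return (sum(fcost[i] for i in open_set)
            + sum(min(c[i][j] for i in open_set) for j in range(len(c[0]))))

def grasp_uflp(fcost, c, rng, iters=20, alpha=0.5):
    n = len(fcost)
    best, best_cost = None, float("inf")
    for _ in range(iters):
        # construction: add a random member of the restricted candidate list
        sol = set()
        while True:
            cur = total_cost(sol, fcost, c)
            gains = {i: total_cost(sol | {i}, fcost, c)
                     for i in range(n) if i not in sol}
            improving = {i: v for i, v in gains.items() if v < cur}
            if not improving:
                break
            lo, hi = min(improving.values()), max(improving.values())
            rcl = [i for i, v in improving.items() if v <= lo + alpha * (hi - lo)]
            sol.add(rng.choice(rcl))
        # local search: flip one facility open/closed while it helps
        improved = True
        while improved:
            improved = False
            for i in range(n):
                cand = sol ^ {i}
                if total_cost(cand, fcost, c) < total_cost(sol, fcost, c):
                    sol, improved = cand, True
        cost = total_cost(sol, fcost, c)
        if cost < best_cost:
            best, best_cost = sol, cost
    return best, best_cost

fcost = [4, 4, 4]
c = [[1, 9, 9], [9, 1, 9], [9, 9, 1]]     # c[facility][client]
best, best_cost = grasp_uflp(fcost, c, random.Random(2))
```

The restarts are independent, which is what makes GRASP trivial to parallelize: each processor simply runs its own construction-plus-local-search iterations with a private random stream.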
An Algorithm for Improving the Bounding Procedure in Solving Process Network Synthesis by a Branch-and-Bound Method
Abstract
Process network synthesis (PNS) has enormous practical impact; however, its mixed integer programming model is tedious to solve because it usually involves a large number of binary variables. A structural model of PNS can be given on the basis of a combinatorial approach. An optimal solution of this model can conveniently be generated by a branch-and-bound method if an effective bounding procedure is available. In the present work, a sharp bounding function is given for solving PNS by a branch-and-bound method.
B. Imreh, F. Friedler, L. T. Fan
Metadata
Title
Developments in Global Optimization
Edited by
Immanuel M. Bomze
Tibor Csendes
Reiner Horst
Panos M. Pardalos
Copyright year
1997
Publisher
Springer US
Electronic ISBN
978-1-4757-2600-8
Print ISBN
978-1-4419-4768-0
DOI
https://doi.org/10.1007/978-1-4757-2600-8