
About this Book

This book presents efficient metaheuristic algorithms for the optimal design of structures. Many of these algorithms were developed by the author and his colleagues, including Democratic Particle Swarm Optimization, Charged System Search, Magnetic Charged System Search, Field of Forces Optimization, Dolphin Echolocation Optimization, Colliding Bodies Optimization, and Ray Optimization. These are presented together with algorithms developed by other authors that have been successfully applied to various optimization problems, namely Particle Swarm Optimization, the Big Bang-Big Crunch Algorithm, Cuckoo Search Optimization, the Imperialist Competitive Algorithm, and Chaos Embedded Metaheuristic Algorithms. Finally, a multi-objective optimization method based on the Charged System Search algorithm is presented for solving large-scale structural problems.

The concepts and algorithms presented in this book are not only applicable to the optimization of skeletal structures and finite element models but can equally be utilized for the optimal design of other systems, such as hydraulic and electrical networks.

The second edition adds seven new chapters covering recent developments in the field of optimization. These chapters present the Enhanced Colliding Bodies Optimization, Global Sensitivity Analysis-Based, Tug of War Optimization, Water Evaporation Optimization, Vibrating Particles System, and Cyclical Parthenogenesis Optimization algorithms. A further chapter is devoted to the optimal design of large-scale structures.

Table of Contents

Frontmatter

Chapter 1. Introduction

Abstract
In today’s extremely competitive world, human beings attempt to extract the maximum output or profit from a limited amount of available resources. In engineering design, for example, the goal is to choose design variables that fulfill all design requirements at the lowest possible cost; that is, the main objective is not only to comply with basic standards but also to achieve good economic results. Optimization offers a technique for solving this type of problem.
A. Kaveh

Chapter 2. Particle Swarm Optimization

Abstract
Particle swarm optimization (PSO) algorithms are nature-inspired, population-based metaheuristic algorithms originally attributed to Eberhart, Kennedy, and Shi [1, 2]. The algorithms mimic the social behavior of flocking birds and schooling fish. Starting from a randomly distributed set of particles (potential solutions), the algorithms try to improve the solutions according to a quality measure (fitness function). The improvement is performed by moving the particles around the search space by means of a set of simple mathematical expressions which model some inter-particle communications. In their simplest and most basic form, these expressions move each particle toward its own best experienced position and the swarm’s best position so far, along with some random perturbations. There is, however, an abundance of variants using different updating rules.
A. Kaveh
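The velocity and position update described above can be sketched in a few lines of Python. This is a minimal illustration, not the book's formulation; the parameter values (inertia w and acceleration coefficients c1, c2) are illustrative assumptions.

```python
import random

def pso(f, bounds, n_particles=30, iters=200, w=0.7, c1=1.5, c2=1.5):
    """Minimize f over a box via a basic PSO update (illustrative parameters)."""
    dim = len(bounds)
    pos = [[random.uniform(lo, hi) for lo, hi in bounds] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]                  # each particle's best position so far
    pbest_val = [f(p) for p in pos]
    g = min(range(n_particles), key=lambda i: pbest_val[i])
    gbest, gbest_val = pbest[g][:], pbest_val[g]  # swarm's best position so far
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                r1, r2 = random.random(), random.random()
                # inertia + pull toward personal best + pull toward swarm best
                vel[i][d] = (w * vel[i][d]
                             + c1 * r1 * (pbest[i][d] - pos[i][d])
                             + c2 * r2 * (gbest[d] - pos[i][d]))
                pos[i][d] = min(max(pos[i][d] + vel[i][d], bounds[d][0]), bounds[d][1])
            val = f(pos[i])
            if val < pbest_val[i]:
                pbest[i], pbest_val[i] = pos[i][:], val
                if val < gbest_val:
                    gbest, gbest_val = pos[i][:], val
    return gbest, gbest_val
```

Run on a simple sphere function, the swarm contracts onto the minimum at the origin.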

Chapter 3. Charged System Search Algorithm

Abstract
This chapter consists of two parts. In the first part, an optimization algorithm based on principles from physics and mechanics, known as the charged system search (CSS) [1], is presented. This algorithm employs the governing Coulomb law from electrostatics and the Newtonian laws of mechanics. CSS is a multi-agent approach in which each agent is a charged particle (CP). CPs affect each other based on their fitness values and their separation distances. The magnitude of the resultant force is determined using the laws of electrostatics, and the quality of the movement is determined using Newtonian mechanics. CSS can be utilized in all optimization fields; it is especially suitable for non-smooth or non-convex domains. CSS requires neither gradient information nor continuity of the search space.
A. Kaveh
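The interplay of fitness-derived charges and Coulomb-like forces can be sketched as follows. This is a deliberately simplified version: the charge definition, the velocity damping, and the field radius a are assumptions for illustration, not the exact CSS rules of the chapter.

```python
import math
import random

def css(f, bounds, n_cp=20, iters=150, a=1.0):
    """Simplified charged system search sketch: fitter CPs carry larger charges
    and attract the others; motion follows a crude damped Newtonian update."""
    dim = len(bounds)
    pos = [[random.uniform(lo, hi) for lo, hi in bounds] for _ in range(n_cp)]
    vel = [[0.0] * dim for _ in range(n_cp)]
    for _ in range(iters):
        fit = [f(p) for p in pos]
        best, worst = min(fit), max(fit)
        spread = (worst - best) or 1.0
        q = [(worst - fi) / spread for fi in fit]   # better fitness -> larger charge
        new_pos = []
        for i in range(n_cp):
            force = [0.0] * dim
            for j in range(n_cp):
                if i == j:
                    continue
                r = math.dist(pos[i], pos[j]) + 1e-12
                # inside radius a the field grows with r; outside it decays as 1/r^2
                mag = q[j] * r / a**3 if r < a else q[j] / r**2
                for d in range(dim):
                    force[d] += mag * (pos[j][d] - pos[i][d]) / r
            p = []
            for d in range(dim):
                vel[i][d] = 0.5 * vel[i][d] + 0.5 * force[d]   # damped Newtonian step
                p.append(min(max(pos[i][d] + vel[i][d], bounds[d][0]), bounds[d][1]))
            new_pos.append(p)
        pos = new_pos
    fit = [f(p) for p in pos]
    i = min(range(n_cp), key=lambda k: fit[k])
    return pos[i], fit[i]
```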

Chapter 4. Magnetic Charged System Search

Abstract
This chapter consists of two parts. In the first part, the standard magnetic charged system search (MCSS) is presented and applied to different numerical examples to examine the efficiency of this algorithm. The results are compared to those of the original charged system search method [1].
A. Kaveh

Chapter 5. Field of Forces Optimization

Abstract
Although different metaheuristic algorithms differ in how they approach the optimum solution, their general performance is approximately the same. They start the optimization with random solutions, and subsequent solutions are based on randomization and some other rules. As the optimization process progresses, the influence of the rules increases while that of randomization decreases. It seems that these rules can be modeled by a familiar concept from physics known as fields of forces (FOF), a concept utilized in physics to explain how the universe operates. The virtual FOF model is approximately simulated using the concepts of real-world fields, such as gravitational, magnetic, or electric fields (Kaveh and Talatahari [1]).
A. Kaveh

Chapter 6. Dolphin Echolocation Optimization

Abstract
Nature has provided inspiration for most man-made technologies. Scientists believe that dolphins are second only to humans in intelligence. Echolocation is the biological sonar used by dolphins and several other kinds of animals for navigation and hunting in various environments. This ability of dolphins is mimicked in this chapter to develop a new optimization method. There are many metaheuristic optimization methods, but in most of them parameter tuning consumes a considerable amount of the user’s time, prompting scientists to develop ideas for improving these methods. Studies have shown that metaheuristic algorithms have certain governing rules, and knowing these rules helps to obtain better results. Dolphin echolocation (DE) takes advantage of these rules and outperforms many existing optimization methods, while having few parameters to set. The new approach leads to excellent results with low computational effort [1].
A. Kaveh

Chapter 7. Colliding Bodies Optimization

Abstract
This chapter presents a novel and efficient metaheuristic optimization algorithm called colliding bodies optimization (CBO). The algorithm is based on one-dimensional collisions between bodies, with each agent solution considered as a massed object or body. After a collision of two moving bodies with specified masses and velocities, the bodies separate with new velocities. This collision causes the agents to move toward better positions in the search space. CBO utilizes a simple formulation to find the minimum or maximum of functions, and it is essentially parameter-independent [1].
A. Kaveh
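The one-dimensional collision idea can be sketched as below: bodies are sorted by fitness, the better half stands still, each worse body collides with its better counterpart, and the post-collision velocities follow the standard 1D collision formulas with a decaying coefficient of restitution. The mass definition here (a rank-shifted inverse of fitness) is a simplification for illustration, not the book's exact one; n is assumed even.

```python
import random

def cbo(f, bounds, n=20, iters=200):
    """Compact colliding bodies optimization sketch (n must be even)."""
    dim = len(bounds)
    clip = lambda v, d: min(max(v, bounds[d][0]), bounds[d][1])
    X = [[random.uniform(lo, hi) for lo, hi in bounds] for _ in range(n)]
    for it in range(iters):
        X.sort(key=f)                                # best bodies first
        fit = [f(x) for x in X]
        m = [1.0 / (fi - fit[0] + 1.0) for fi in fit]  # better body -> larger mass
        eps = 1.0 - it / iters                       # restitution decays over time
        half, newX = n // 2, []
        for k in range(half):
            j, i = k, k + half                       # stationary body j, moving body i
            v = [X[j][d] - X[i][d] for d in range(dim)]  # moving body heads for its pair
            ci = (m[i] - eps * m[j]) / (m[i] + m[j])  # post-collision factor, moving body
            cj = (1.0 + eps) * m[i] / (m[i] + m[j])   # post-collision factor, stationary body
            newX.append([clip(X[j][d] + random.random() * cj * v[d], d) for d in range(dim)])
            newX.append([clip(X[j][d] + random.random() * ci * v[d], d) for d in range(dim)])
        X = newX
    best = min(X, key=f)
    return best, f(best)
```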

Chapter 8. Ray Optimization Algorithm

Abstract
In this chapter a newly developed metaheuristic method called ray optimization is presented. Like other multi-agent methods, ray optimization employs a number of particles consisting of the variables of the problem. These agents are considered as rays of light. According to Snell’s law of refraction, when light travels from a lighter medium to a darker one, it refracts and its direction changes. This behavior helps the agents explore the search space in the early stages of the optimization process and makes them converge in the final stages. This law is the main tool of the ray optimization algorithm. This chapter consists of three parts.
A. Kaveh
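The vector form of Snell's law, on which such a refraction-based move rule can be built, is sketched below. The function name and interface are illustrative; this computes plain optical refraction, not the chapter's full agent update.

```python
import math

def refract(d, n, ratio):
    """Refract unit direction d at a surface with unit normal n.
    ratio is n1/n2 (incident over transmitted refractive index).
    Returns the refracted unit direction, or None on total internal reflection."""
    cos_i = -sum(a * b for a, b in zip(d, n))        # cosine of incidence angle
    sin2_t = ratio * ratio * (1.0 - cos_i * cos_i)   # Snell: sin(t) = ratio * sin(i)
    if sin2_t > 1.0:
        return None                                  # total internal reflection
    cos_t = math.sqrt(1.0 - sin2_t)
    return [ratio * di + (ratio * cos_i - cos_t) * ni for di, ni in zip(d, n)]
```

A ray entering a denser medium bends toward the normal; past the critical angle in the opposite direction, no refracted ray exists.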

Chapter 9. Modified Big Bang–Big Crunch Algorithm

Abstract
The Big Bang–Big Crunch (BB–BC) method developed by Erol and Eksin [1] consists of two phases: a Big Bang phase and a Big Crunch phase. In the Big Bang phase, candidate solutions are randomly distributed over the search space. As in other evolutionary algorithms, initial solutions are spread over the whole search space in a uniform manner in the first Big Bang. Erol and Eksin [1] associated the random nature of the Big Bang with energy dissipation, or the transformation from an ordered state (a convergent solution) to a disordered or chaotic state (a new set of solution candidates).
A. Kaveh

Chapter 10. Cuckoo Search Optimization

Abstract
In this chapter, a metaheuristic method called the cuckoo search (CS) algorithm is utilized to determine the optimum design of structures for both discrete and continuous variables. This algorithm was recently developed by Yang [1] and Yang and Deb [2, 3], and it is based on the obligate brood-parasitic behavior of some cuckoo species together with the Lévy flight behavior of some birds and fruit flies. CS is a population-based optimization algorithm which, like many other metaheuristic algorithms, starts with a random initial population taken as host nests or eggs. The CS algorithm essentially works with three components: selection of the best by keeping the best nests or solutions; replacement of host eggs according to the quality of the new solutions, or cuckoo eggs, produced by randomization via global Lévy flights (exploration); and discovery of some cuckoo eggs by the host birds, with replacement according to the quality of local random walks (exploitation) [2].
A. Kaveh
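The three components can be sketched in a compact loop: heavy-tailed Lévy steps for global exploration (generated here by the Mantegna scheme), greedy replacement of randomly chosen nests, and a discovery phase that abandons a fraction pa of nests via local random walks. The step scale alpha and other parameter values are illustrative assumptions, not the book's settings.

```python
import math
import random

def levy_step(beta=1.5):
    """One Mantegna-style Levy-flight step length (beta is the stability index)."""
    num = math.gamma(1 + beta) * math.sin(math.pi * beta / 2)
    den = math.gamma((1 + beta) / 2) * beta * 2 ** ((beta - 1) / 2)
    sigma = (num / den) ** (1 / beta)
    u, v = random.gauss(0.0, sigma), random.gauss(0.0, 1.0)
    return u / abs(v) ** (1 / beta)

def cuckoo_search(f, bounds, n=15, iters=300, pa=0.25, alpha=0.01):
    """Minimal cuckoo search sketch: Levy exploration + discovery random walks."""
    dim = len(bounds)
    clip = lambda v, d: min(max(v, bounds[d][0]), bounds[d][1])
    nests = [[random.uniform(lo, hi) for lo, hi in bounds] for _ in range(n)]
    fit = [f(x) for x in nests]
    b = min(range(n), key=lambda i: fit[i])
    best, best_val = nests[b][:], fit[b]             # keep the best nest (selection)
    for _ in range(iters):
        b = min(range(n), key=lambda i: fit[i])
        for i in range(n):
            # global exploration: Levy flight scaled by distance to the best nest
            cand = [clip(nests[i][d] + alpha * levy_step() * (nests[i][d] - nests[b][d]), d)
                    for d in range(dim)]
            fc = f(cand)
            j = random.randrange(n)                   # replace a random host egg if better
            if fc < fit[j]:
                nests[j], fit[j] = cand, fc
        for i in range(n):
            # discovery: a fraction pa of nests is abandoned via a local random walk
            if i != b and random.random() < pa:
                r1, r2 = random.sample(range(n), 2)
                nests[i] = [clip(nests[i][d] + random.random() * (nests[r1][d] - nests[r2][d]), d)
                            for d in range(dim)]
                fit[i] = f(nests[i])
        i = min(range(n), key=lambda k: fit[k])
        if fit[i] < best_val:
            best, best_val = nests[i][:], fit[i]
    return best, best_val
```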

Chapter 11. Imperialist Competitive Algorithm

Abstract
In this chapter an optimization method based on a sociopolitically motivated strategy, called the imperialist competitive algorithm (ICA), is presented. ICA is a multi-agent algorithm in which each agent is a country, either a colony or an imperialist, and these countries form empires in the search space. Movement of the colonies toward their imperialist, together with imperialistic competition among the empires, forms the basis of the ICA. During these movements the powerful imperialists are reinforced, while the weak ones are weakened and gradually collapse, directing the algorithm toward optimum points. Here, ICA is utilized to optimize skeletal structures, based on [1, 2].
A. Kaveh
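The core move, a colony's assimilation toward its imperialist, can be sketched in one line of arithmetic. The function name and the value of beta are illustrative assumptions; beta > 1 lets a colony overshoot its imperialist and explore beyond it.

```python
import random

def assimilate(colony, imperialist, beta=2.0):
    """One ICA assimilation step: the colony moves a random fraction
    (up to beta) of the way toward its imperialist, per coordinate."""
    return [c + beta * random.random() * (i - c) for c, i in zip(colony, imperialist)]
```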

Chapter 12. Chaos Embedded Metaheuristic Algorithms

Abstract
In nature, complex biological phenomena such as the collective behavior of birds, the foraging activity of bees, or the cooperative behavior of ants may result from relatively simple rules which nevertheless exhibit nonlinear behavior that is sensitive to initial conditions. Such systems are generally known as “deterministic nonlinear systems,” and the corresponding theory as “chaos theory.” Thus, real-world systems that may seem stochastic or random may in fact exhibit nonlinear deterministic and chaotic behavior. Although chaos and random signals share the property of long-term unpredictable irregular behavior, and many random generators in programming software, like chaotic maps, are deterministic, chaos can help order arise from disorder. Similarly, many metaheuristic optimization algorithms are inspired by biological systems in which order arises from disorder. In these cases disorder often indicates both non-organized patterns and irregular behavior, whereas order is the result of self-organization and evolution and often arises from a disordered condition or from the presence of dissymmetries. Self-organization and evolution are two key factors of many metaheuristic optimization techniques. Owing to these common properties of chaos and optimization algorithms, using the two concepts together can improve the performance of optimization algorithms [1]. The benefits of such a combination appear to be generic to other stochastic optimization methods, and experimental studies have confirmed this, although it has not yet been proven mathematically [2].
A. Kaveh
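A chaotic map can stand in for the stochastic generator inside any of the algorithms above. A minimal example is the logistic map, whose orbit for r = 4 is chaotic on (0, 1); the starting value 0.7 is an arbitrary illustrative seed.

```python
def logistic_map(x0=0.7, r=4.0):
    """Generator of chaotic values from the logistic map x <- r*x*(1-x).
    With r = 4 the orbit is chaotic in (0, 1) and its deterministic values
    can replace uniform random draws inside a metaheuristic."""
    x = x0
    while True:
        x = r * x * (1.0 - x)
        yield x
```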

Chapter 13. Enhanced Colliding Bodies Optimization

Abstract
Colliding bodies optimization (CBO) was employed for the size optimization of skeletal structures in Chap. 7. In this chapter, the enhanced colliding bodies optimization (ECBO) is presented, which utilizes memory to save some of the historically best solutions and uses a random procedure to escape local optima; it is likewise applied to skeletal structures [1, 2]. The capabilities of CBO and ECBO are compared using three trusses and two frame structures. The design constraints of steel frames are imposed according to the provisions of LRFD–AISC.
A. Kaveh

Chapter 14. Global Sensitivity Analysis-Based Optimization Algorithm

Abstract
In this chapter a single-solution metaheuristic optimizer, namely the global sensitivity analysis-based (GSAB) algorithm [1], is presented, which uses a basic set of mathematical techniques known as global sensitivity analysis. Sensitivity analysis (SA) studies the sensitivity of a model output with respect to its input parameters (Rahman [2]). This analysis is generally categorized into local SA and global SA techniques. While local SA studies the sensitivity of the model output to variations around a specific point, global SA considers variations of the inputs over their entire feasible space (Pianosi and Wagener [3], Zhai et al. [4]). One important feature of GSA is factor prioritization (FP), which aims at ranking the inputs in terms of their relative contribution to output variability. GSAB consists of a single-solution optimization strategy and a GSA-driven procedure, in which the solution is guided by ranking the decision variables using the GSA approach, resulting in an efficient and rapid search. The proposed algorithm can be studied within the family of search algorithms such as random search (RS) by Rastrigin [5], pattern search (PS) by Hooke and Jeeves [6], and vortex search (VS) by Doğan and Ölmez [7]. As in these algorithms, the search process takes place within specified boundaries. However, in contrast to these algorithms, which use different functions for shrinking the search space, the present method employs the well-known GSA approach to reduce the search boundaries. The minimization of an objective function is then performed by moving these search spaces around the best global sample.
A. Kaveh
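The factor-prioritization idea can be illustrated with a crude sketch: vary one variable at a time over its full range (with the others fixed at the current best sample) and rank the variables by the resulting output variance. This one-at-a-time proxy is an assumption for illustration only; it is not the chapter's variance-based GSA procedure.

```python
import random

def rank_variables(f, best, bounds, samples=30):
    """Rank decision variables by their estimated contribution to output
    variability (most influential first). One-at-a-time proxy, not true GSA."""
    scores = []
    for d, (lo, hi) in enumerate(bounds):
        outs = []
        for _ in range(samples):
            x = best[:]
            x[d] = random.uniform(lo, hi)    # perturb only variable d
            outs.append(f(x))
        mean = sum(outs) / samples
        scores.append(sum((o - mean) ** 2 for o in outs) / samples)
    return sorted(range(len(bounds)), key=lambda d: -scores[d])
```

An optimizer can then shrink the bounds of the highest-ranked variables first, concentrating the search around the best sample.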

Chapter 15. Tug of War Optimization

Abstract
In this chapter, tug of war optimization (TWO) is presented as a newly developed, nature-inspired, population-based metaheuristic algorithm. Utilizing a sport metaphor, the algorithm considers each candidate solution as a team participating in a series of rope-pulling competitions. The teams exert pulling forces on each other based on the quality of the solutions they represent, and the competing teams move to their new positions according to Newtonian laws of mechanics. Unlike many other metaheuristic methods, the algorithm is formulated so as to consider the qualities of both interacting teams. TWO is applicable to the global optimization of discontinuous, multimodal, non-smooth, and non-convex functions.
A. Kaveh
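One round of the rope-pulling metaphor can be sketched as follows. This is a simplified illustration, not the book's exact formulation: weights are scaled fitness ranks, the friction-like coefficient mu and the 1/n step normalization are assumptions, and in each pairing the lighter (worse) team is dragged toward the heavier one.

```python
import random

def two_round(X, fit, bounds, mu=0.5):
    """One simplified tug-of-war round over all team pairings."""
    n, dim = len(X), len(X[0])
    worst, best = max(fit), min(fit)
    spread = (worst - best) or 1.0
    W = [1.0 + (worst - fi) / spread for fi in fit]    # team weights in [1, 2]
    newX = [x[:] for x in X]
    for i in range(n):
        for j in range(n):
            if W[j] > W[i]:                            # team j out-pulls team i
                force = max(W[j] - W[i] * mu, 0.0)     # net pull after friction resistance
                for d in range(dim):
                    step = force / (W[i] * n) * (X[j][d] - X[i][d]) * random.random()
                    newX[i][d] = min(max(newX[i][d] + step, bounds[d][0]), bounds[d][1])
    return newX
```

Iterating the round drags the lighter teams toward the heavier (better) ones, so the population contracts around good solutions.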

Chapter 16. Water Evaporation Optimization Algorithm

Abstract
Efficient metaheuristic optimization algorithms have been developed to overcome the drawbacks of some traditional methods in highly nonlinear engineering optimization problems with high complexity, high dimension, and multimodal design spaces, and they are gaining increasing popularity [1]. Performance assessment of a metaheuristic algorithm may be based on solution quality, computational effort, and robustness [2], which are directly affected by its two contradictory criteria: exploration of the search space (diversification) and exploitation of the best solutions found (intensification).
A. Kaveh

Chapter 17. Vibrating Particles System Algorithm

Abstract
In recent years, many metaheuristics with different philosophies and characteristics have been introduced and applied to a wide range of fields. The aim of these optimization methods is to efficiently explore the search space in order to find global or near-global solutions. Since they are not problem-specific and do not require derivatives of the objective function, they have received increasing attention from both academia and industry.
A. Kaveh

Chapter 18. Cyclical Parthenogenesis Optimization Algorithm

Abstract
Over the last few decades, metaheuristic algorithms have been successfully used for solving complex global optimization problems in science and engineering. These methods, which are usually inspired by natural phenomena, do not require any gradient information of the involved functions and are generally independent of the quality of the starting points. As a result, metaheuristic optimizers are favorable choices when dealing with discontinuous, multimodal, non-smooth, and non-convex functions, especially when near-global optimum solutions are sought, and the intended computational effort is limited.
A. Kaveh

Chapter 19. Optimal Design of Large-Scale Frame Structures

Abstract
Discrete or continuous size optimization of large-scale, high-rise, or complex structures leads to problems with a large number of design variables, a large search space, and a great number of design constraints to control. Separate design decisions are allowed for each variable. Thus, an optimizer invoked to process such a sizing problem has the opportunity to truly optimize the objective function by detecting the optimum solution within a vast number of possible design options. However, this huge number of available design options typically confuses an optimizer and radically decreases the potential for an effective search for a high-quality solution. This chapter is based on recent developments in the design of large-scale frame structures (Kaveh and Bolandgerami [1]).
A. Kaveh

Chapter 20. Multi-Objective Optimization of Truss Structures

Abstract
In this chapter a multi-objective optimization (MOP) method is presented that uses the main concepts of the charged system search algorithm (Kaveh and Massoudi [1]).
A. Kaveh