
About this Book

This book includes extended and revised versions of a set of selected papers from the 2012 International Conference on Simulation and Modeling Methodologies, Technologies and Applications (SIMULTECH 2012), which was sponsored by the Institute for Systems and Technologies of Information, Control and Communication (INSTICC) and held in Rome, Italy. SIMULTECH 2012 was technically co-sponsored by the Society for Modeling & Simulation International (SCS), GDR I3, Lionphant Simulation, Simulation Team and IFIP, and was held in cooperation with the AIS Special Interest Group of Modeling and Simulation (AIS SIGMAS) and the Movimento Italiano Modellazione e Simulazione (MIMOS).

Table of Contents

Frontmatter

Invited Papers

Frontmatter

The Richness of Modeling and Simulation and an Index of Its Body of Knowledge

The richness of modeling and simulation (M&S) and its increasing importance are emphasized. The three aspects of professionalism as well as the stakeholders of M&S are documented. The author's ongoing work on an M&S body of knowledge (BoK) is outlined, and several other BoK and M&S BoK studies are referenced. The conclusions emphasize that the widespread application and ever-increasing importance of modeling and simulation necessitate an agreed-upon body of knowledge index and its elaboration, as well as the preservation of the integrity of the M&S discipline.
Tuncer Ören

Modelling for Managing the Complex Issue of Catchment-Scale Surface and Groundwater Allocation

The management of surface water and groundwater can be regarded as presenting resource dilemmas: situations where multiple users share a common resource pool and make contested claims about their rights to access the resource and about its best use and distribution among competing needs. Compounded by uncertainties caused by limited data and gaps in scientific knowledge, resource dilemmas are challenging to manage and often lead to controversies and disputes about policy issues and outcomes. In the case of surface and groundwater management, the design of collective policies needs to be informed by a holistic understanding of different water uses and outcomes under different water availability and sharing scenarios. In this paper, we present an integrated modelling framework for assessing the combined impacts of changes in climate conditions and water allocation policies on surface and groundwater-dependent economic and ecological systems. We are implementing the framework in the Namoi catchment, Australia; however, it can be transferred and adapted for uses, including water planning, in other agricultural catchments.
Anthony Jakeman, Rebecca Kelly (nee Letcher), Jenifer Ticehurst, Rachel Blakers, Barry Croke, Allan Curtis, Baihua Fu, Sondoss El Sawah, Alex Gardner, Joseph Guillaume, Madeleine Hartley, Cameron Holley, Patrick Hutchings, David Pannell, Andrew Ross, Emily Sharp, Darren Sinclair, Alison Wilson

Papers

Frontmatter

Kinetic Analysis of the Coke Calcination Processes in Rotary Kilns

A kinetic analysis of the green petroleum coke (GPC) calcining process, using the simulation program HYSYS and actual industrial data, is presented. The rates of the physical and chemical phenomena of interest, such as moisture removal, volatile matter release and combustion, and coke dust and sulphur combustion, were all represented by their kinetic models. This paper gives a detailed description of the simulation of these processes using the HYSYS "kinetic reactor" module. The results were compared with actual industrial rotary kiln data in order to validate the simulation, and reasonable agreement was found for the two different GPCs considered. The kinetics-based simulation methodology described in this study may be used to predict the performance of coke calcining kilns regardless of the green coke composition.
E. M. Elkanzi, F. S. Marhoon, M. J. Jasim
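
The rate expressions referred to in the abstract are typically of Arrhenius form; the following minimal sketch only illustrates that generic form, and the pre-exponential factor and activation energy used below are placeholders, not values from the paper.

```python
import math

R = 8.314  # universal gas constant, J/(mol*K)

def arrhenius_rate(temperature_K, pre_exponential, activation_energy):
    """Generic Arrhenius rate constant k = A * exp(-Ea / (R*T))."""
    return pre_exponential * math.exp(-activation_energy / (R * temperature_K))

# Hypothetical rate constant for volatile-matter combustion at 1200 K
# (A and Ea are illustrative placeholders only):
k = arrhenius_rate(1200.0, pre_exponential=1.0e7, activation_energy=1.2e5)
```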

Behavior of Elastomeric Seismic Isolators Varying Rubber Material and Pad Thickness: A Numerical Insight

A numerical approach for determining (a) the shear behavior under large displacements and (b) the compression elastic modulus of common parallelepiped elastomeric isolators is presented. Particular attention is devoted to the role played by the material used for the rubber pads and by their thickness. The rubber is modeled, within a finite element discretization of the isolator, by fitting experimental data with both a nine-constant Mooney-Rivlin law and a five-constant exponential law. Starting from the few experimental stretch-stress data points available for each rubber compound in uniaxial tension, a cubic Bezier spline approach is first used to numerically generate a large set of metadata containing the original experimental points. The nine Mooney-Rivlin and five exponential-law constitutive parameters are then estimated through a least-squares approach. Once the models are assessed, a full-scale rectangular seismic isolator subjected to horizontal actions and normal compression is analyzed in order to estimate the initial stiffness and the overall behavior of the isolator undergoing large deformations, using both models and all the compounds considered. It is found that the global behavior may depend significantly on the material model assumed for the rubber and on the pad thickness.
Gabriele Milani, Federico Milani

Numerical Simulation of Coastal Flows in Open Multiply-Connected Irregular Domains

We develop a numerical method for the simulation of coastal flows in multiply-connected domains with irregular boundaries that may contain both closed and open segments. The governing equations are those of the shallow-water model. Our method involves splitting the original nonlinear operator by physical processes and by coordinates. Specially constructed finite-difference approximations provide second-order, unconditionally stable schemes that conserve the mass and the total energy of the discrete inviscid unforced shallow-water system, while the potential enstrophy remains bounded, oscillating in time within a narrow range. This allows numerical simulation of coastal flows that is adequate from both the mathematical and the physical standpoints. Several numerical experiments, including those with complex boundaries, demonstrate the skill of the method.
Yuri N. Skiba, Denis M. Filatov
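
As a schematic illustration of the splitting idea described in the abstract, the sketch below shows a generic symmetric (Strang-style) composition of two sub-steps; the paper's specific conservative finite-difference sub-solvers for the shallow-water system are not reproduced here, and the callable names are placeholders.

```python
def strang_split_step(state, dt, advance_A, advance_B):
    """Advance the solution by one time step dt with symmetric splitting.

    advance_A / advance_B are assumed to integrate the sub-problems obtained
    by splitting the full operator (e.g. by physical process or by coordinate
    direction); if each sub-solver is second-order accurate and stable, the
    A(dt/2) B(dt) A(dt/2) composition retains second-order accuracy.
    """
    state = advance_A(state, dt / 2)
    state = advance_B(state, dt)
    state = advance_A(state, dt / 2)
    return state
```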

System Dynamics and Agent-Based Simulation for Prospective Health Technology Assessments

Healthcare innovations open new treatment possibilities for patients and offer the potential to increase their quality of life. However, a new product may also have a negative influence on patients' quality of life if it has not been assessed beforehand. To prevent such cases, three established methods can be used to assess healthcare technologies and to inform regulatory agencies. These tools share a common problem, however: they can only be applied once a product has already been developed and high costs have already been incurred. This work describes Prospective Health Technology Assessment, an approach that uses hybrid simulation techniques to learn about the impacts of an innovation before a product has been developed. System Dynamics is used for high-level simulation, and Agent-Based Simulation allows the individual behavior of persons to be modeled.
Anatoli Djanatliev, Peter Kolominsky-Rabas, Bernd M. Hofmann, Axel Aisenbrey, Reinhard German

Simple and Efficient Algorithms to Get a Finer Resolution in a Stochastic Discrete Time Agent-Based Simulation

A conceptually simple approach to refining the time step is proposed, together with an efficient two-level sampling algorithm for it. Our approach enables modelers to divide each original time step into any integral number of equally spaced sub-steps, and the original model and the refined one can be formally shown to be equivalent under some basic assumptions.
The main idea behind the two-level sampling algorithm is to make a "big decision" first and subsequent "small decisions" only when necessary, i.e., to first decide whether an event occurs on the original time scale and then refine the occurrence time to the finer scale if it does.
The approach is applied to a stochastic model of epidemic spread, and we show that the refined model produces the expected results. The computational resources needed for the refined model increase only marginally when the two-level sampling algorithm is used together with some implementation techniques, which are also highlighted in the paper.
The approach proposed in this paper can also be adapted to merge consecutive time steps into a super step to save simulation time.
Chia-Tung Kuo, Da-Wei Wang, Tsan-sheng Hsu
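
A minimal sketch of the two-level idea described in the abstract, assuming a per-step event probability p and k equally spaced sub-steps; the function and parameter names are illustrative, not taken from the paper.

```python
import random

def refine_event_time(p, k, rng=random):
    """Two-level sampling sketch: decide first whether the event occurs
    within the original step (probability p), and only then pick the
    sub-step (0..k-1) on which it occurs.

    If each of k equally spaced sub-steps has probability q with
    (1 - q)**k == 1 - p, then, conditioned on occurrence, the sub-step
    index follows a truncated geometric distribution.
    """
    if rng.random() >= p:                 # "big decision": no event in this step
        return None
    q = 1.0 - (1.0 - p) ** (1.0 / k)      # equivalent per-sub-step probability
    u = rng.random()                      # "small decision" via inverse transform
    cumulative = 0.0
    for j in range(k):
        cumulative += q * (1.0 - q) ** j / p   # P(sub-step == j | event occurs)
        if u <= cumulative:
            return j
    return k - 1                          # guard against floating-point round-off
```

For example, `refine_event_time(0.3, 10)` returns `None` in roughly 70% of calls and otherwise a sub-step index between 0 and 9.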

Numerical Study of Turbulent Boundary-Layer Flow Induced by a Sphere Above a Flat Plate

The flow past a three-dimensional obstacle on a flat plate is one of the key problems in boundary-layer flows and is of significant value in industrial applications. A direct numerical simulation of the flow past a sphere above a flat plate is performed. The immersed boundary (IB) method with a multiple-direct-forcing scheme is used to couple the solid sphere with the fluid. Detailed information on the flow field and vortex structure is obtained. The velocity and pressure distributions are examined, and a recirculation region whose length is twice the sphere diameter is observed downstream of the sphere. The effects of the sphere on the boundary layer are also explored, including the velocity defect, the turbulence intensity and the Reynolds stresses.
Hui Zhao, Anyang Wei, Kun Luo, Jianren Fan

Airflow and Particle Deposition in a Dry Powder Inhaler: An Integrated CFD Approach

An integrated computational model of a commercial Dry Powder Inhaler (DPI) device (i.e., the Turbuhaler) is developed. The steady-state flow in the DPI is determined by solving the Navier-Stokes equations using FLUENT (v6.3) with different flow models, e.g., laminar, k-ε, and k-ω SST. Particle motion and deposition are described using an Eulerian-fluid/Lagrangian-particle approach. Particle/wall collisions are taken to result in deposition when the normal collision velocity is less than a size-dependent critical value. The flow rate and particle deposition are determined for a range of pressure drops (i.e., 800-8800 Pa) as well as particle sizes corresponding to single particles and aggregates (i.e., 0.5-20 μm). Overall, the simulation results are found to agree well with available experimental data for the volumetric outflow rate as well as the local and total particle deposition.
Jovana Milenkovic, Alexandros H. Alexopoulos, Costas Kiparissides
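
A hedged sketch of the deposition rule stated in the abstract: a particle is taken to deposit when its wall-normal impact velocity falls below a size-dependent critical value. The critical-velocity function used below is a purely illustrative placeholder, not the correlation from the paper.

```python
def deposits_on_wall(v_normal, diameter, critical_velocity):
    """Return True if a particle/wall collision results in deposition,
    i.e. if the wall-normal impact speed is below the size-dependent
    critical value.

    v_normal          -- magnitude of the wall-normal impact velocity (m/s)
    diameter          -- particle diameter (m)
    critical_velocity -- callable mapping diameter -> critical speed (m/s);
                         its functional form is model-specific (placeholder here)
    """
    return v_normal < critical_velocity(diameter)

# Purely illustrative scaling: smaller particles stick at higher impact speeds.
v_crit = lambda d: 1.0e-6 / d
print(deposits_on_wall(0.05, 5e-6, v_crit))   # True: 0.05 m/s < 0.2 m/s
```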

Solar Soldier: Virtual Reality Simulations and Guidelines for the Integration of Photovoltaic Technology on the Modern Infantry Soldier

Following recent advances in the field of thin and flexible materials, the use of product-integrated photovoltaics (PIPV) for light harvesting and electric power generation has received increased attention. PIPV is one of the most promising portable renewable energy technologies today, especially for the defense industry and the modern infantry soldier. Nevertheless, there is limited work on light harvesting analysis and power generation assessment for its use in various military scenarios, including how best to integrate the technology on the infantry soldier. This study aims to fill this gap by accurately analyzing light harvesting through virtual reality simulations. Following the virtual light analysis, an assessment of the power generation potential per scenario and an investigation of the optimum integration areas of flexible PV devices on the infantryman are presented. Finally, the key results are discussed, providing the reader with a set of guidelines for the positioning and integration of such renewable energy technology on the modern infantry soldier.
Ioannis Paraskevopoulos, Emmanuel Tsekleves

Simulation and Realistic Workloads to Support the Meta-scheduling of Scientific Workflows

When heterogeneous computing resources are integrated to create more powerful execution environments, new scheduling strategies are necessary to allocate work units to the available resources. In this paper we apply simulation results to schedule the execution of scientific workflows in a resource integration platform. A simulator built upon Alea and GridSim has been implemented to simulate the behaviour of the grid and cluster computing resources integrated in the platform. Simulations are generated using realistic workloads and then analysed by a meta-scheduler to decide the most suitable resource for each workflow task execution. To improve the simulation results, synthetic workloads are dynamically created, taking into account the current state of the resources and a set of log-recorded historical executions. The paper also reports the impact of the proposed techniques when experimentally applied to the execution of the Inspiral analysis workflow.
Sergio Hernández, Javier Fabra, Pedro Álvarez, Joaquín Ezpeleta
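
A minimal sketch, not the authors' implementation, of the selection step described in the abstract: each workflow task is assigned to the resource whose simulated execution yields the best predicted completion time. The simulator interface assumed here is hypothetical.

```python
def pick_resource(task, resources, simulate):
    """Return the resource with the smallest simulated completion time.

    simulate(task, resource) is assumed to replay the resource's current
    state together with a realistic (or synthetic) workload -- e.g. via an
    Alea/GridSim-based simulator -- and return a predicted completion time.
    """
    return min(resources, key=lambda r: simulate(task, r))
```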

Dynamic Simulation of the Effect of Tamper Resistance on Opioid Misuse Outcomes

The objective of the study was to develop a system dynamics model of the medical use of pharmaceutical opioids and the associated diversion and nonmedical use of these drugs. The model was used to test the impact of a tamper-resistance intervention in this complex system. The study relied on secondary data obtained from the literature and from other public sources for the period 1995 to 2008. In addition, an expert panel provided recommendations regarding model parameters and model structure. The behavior of the resulting systems-level model compared favorably with reference behavior data. After the base model was tested, logic to simulate the replacement of all opioids with tamper-resistant formulations was added and the impact on overdose deaths was evaluated over the period 2008-2015. The principal finding was that the introduction of tamper-resistant formulations unexpectedly increased total overdose deaths, because increased prescribing counteracted the drop in the death rate. We conclude that it is important to choose metrics carefully, and that the system dynamics modelling approach can help to evaluate interventions intended to ameliorate the adverse outcomes in the complex system associated with treating pain with opioids.
Alexandra Nielsen, Wayne Wakeland

A Multi-GPU Approach to Fast Wildfire Hazard Mapping

Burn probability maps (BPMs) are among the most effective tools to support strategic wildfire and fuels management. In such maps, an estimate of the probability of being burned by a wildfire is assigned to each point of a raster landscape. A typical approach to building BPMs is based on the explicit propagation of thousands of fires using accurate simulation models. However, given the high number of required simulations, for a large area such processing usually requires high-performance computing. In this paper, we propose a multi-GPU approach for accelerating the process of building BPMs. The paper illustrates some alternative implementation strategies and discusses the speedups achieved on a real landscape.
Donato D’Ambrosio, Salvatore Di Gregorio, Giuseppe Filippone, Rocco Rongo, William Spataro, Giuseppe A. Trunfio
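
A sketch of how a burn probability map is typically assembled from repeated fire-spread simulations, as described in the abstract; `simulate_fire` stands in for the propagation model and is not part of the paper.

```python
import numpy as np

def burn_probability_map(landscape_shape, n_simulations, simulate_fire):
    """Estimate, for each raster cell, the probability of being burned.

    simulate_fire() is assumed to run one fire-spread simulation (random
    ignition point, weather scenario, etc.) and return a boolean array of
    shape `landscape_shape` marking the cells it burned.
    """
    burn_counts = np.zeros(landscape_shape, dtype=np.int64)
    for _ in range(n_simulations):
        burn_counts += simulate_fire()
    return burn_counts / n_simulations
```

Since the thousands of simulations are mutually independent, they are the natural unit of work to distribute across multiple GPUs.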

Controlling Turtles through State Machines: An Application to Pedestrian Simulation

Agent-based modelling and simulation (ABMS) has been widely recognised as a promising technique for studying complex phenomena. Due to the attention it has attracted, a significant number of platforms have been proposed, the majority of which target reactive agents, i.e. agents with relatively simple behaviours. Thus, little has been done toward the introduction of richer agent-oriented programming constructs that would enhance the platforms' modelling capabilities and could potentially lead to the implementation of more sophisticated models. This paper discusses TSTATES, a domain-specific language, together with an execution layer that runs on top of a widely accepted agent simulation environment, and presents its application to pedestrian simulation in an underground station scenario.
Ilias Sakellariou

Stability Analysis of Climate System Using Fuzzy Cognitive Maps

In the present work we developed a soft computing model for the qualitative analysis of the Earth's climate system dynamics through the implementation of fuzzy cognitive maps. For this purpose, we identified the subsystems in terms of which the dynamics of the whole system can be described. With these concepts we then built a cognitive map based on the documented relations among them. Once the map was built, we used the state-vector technique and the adjacency matrix to find the hidden patterns, i.e., the feedback processes among the system's nodes. Later on, we explored the sensitivity of the model to changes in the weights of the edges and to changes in the input data values. Finally, we used fuzzy edges to analyze the causal flow among concepts and to explore possible solutions applied to specific edges.
Carlos Gay García, Iván Paz Ortiz
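
A minimal sketch of the state-vector iteration commonly used in fuzzy cognitive map analysis, as referenced in the abstract; the squashing function, convergence test, and names are generic choices, not necessarily those of the paper.

```python
import numpy as np

def fcm_iterate(adjacency, state, squash=np.tanh, max_iter=100, tol=1e-6):
    """Iterate s_{t+1} = squash(s_t @ W) until a fixed point or the
    iteration limit is reached.

    adjacency -- concept-to-concept weight matrix W (edge i -> j in W[i, j])
    state     -- initial activation vector of the concepts
    The resulting fixed point (or limit cycle) reveals the feedback
    patterns hidden in the map.
    """
    state = np.asarray(state, dtype=float)
    for _ in range(max_iter):
        new_state = squash(state @ adjacency)
        if np.max(np.abs(new_state - state)) < tol:
            return new_state
        state = new_state
    return state
```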

Fuzzy Models: Easier to Understand and an Easier Way to Handle Uncertainties in Climate Change Research

When greenhouse gas emission scenarios (through 2100) developed by the Intergovernmental Panel on Climate Change are converted to concentrations and atmospheric temperatures through the use of climate models, the result is a wide range of concentrations and temperatures with a rather simple interpretation: the higher the emissions, the higher the concentrations and temperatures. Therefore the uncertainty in the projected temperature due to the uncertainty in the emissions is large. Linguistic rules are obtained through the use of linear emission scenarios and the Magicc model. These rules describe the relations between the concentrations (input) and the temperature increase for the year 2100 (output) and are used to build a fuzzy model. Another model is presented that includes the climate sensitivity as a second source of uncertainty in the input, in order to explore its effects on the temperature. The models are attractive because of their simplicity and their capability to integrate the uncertainties of the input and the output.
Carlos Gay García, Oscar Sánchez Meneses, Benjamín Martínez-López, Àngela Nebot, Francisco Estrada

Small-Particle Pollution Modeling Using Fuzzy Approaches

Air pollution caused by small particles is a major public health problem in many cities of the world, and Mexico City is one of the most contaminated. Its location in a volcanic crater surrounded by mountains favours thermal inversions, which trap a thick layer of smog that floats over the city and aggravate the pollution problem. Modeling air pollution is an important political and administrative issue, since the prediction of critical events should guide decision making. The need for countermeasures against such episodes requires predicting relevant indicators of air pollution, such as particles smaller than 2.5 microns (PM2.5), accurately and in advance. In this work two different fuzzy approaches for modeling PM2.5 concentrations in the Mexico City metropolitan area are compared against the simple persistence method.
Àngela Nebot, Francisco Mugica
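
For reference, the simple persistence method mentioned in the abstract forecasts that the next value equals the last observed one; a one-line sketch (names are illustrative):

```python
def persistence_forecast(series, horizon=1):
    """Persistence baseline: predict the last observed value (e.g. the
    latest hourly PM2.5 concentration) for every step of the horizon."""
    return [series[-1]] * horizon
```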

Stochastic Resonance and Anti-cyclonic Rings in the Gulf of Mexico

In this work, we used a nonlinear, reduced-gravity model of the Gulf of Mexico to study the effect of a seasonal variation of the reduced-gravity parameter on ring-shedding behaviour. When small amplitudes of the seasonal variation are used, the distributions of ring-shedding periods are bi-modal. When the amplitude of the seasonal variation is large enough, the ring-shedding events shift to a regime with a constant, yearly period. If the seasonal amplitude of the reduced-gravity parameter is small but a noise term is included, a yearly regime is again obtained, suggesting that stochastic resonance could play a role in the ring-shedding process taking place in the Gulf of Mexico.
Benjamín Martínez-López, Jorge Zavala-Hidalgo, Carlos Gay García

On Low-Fidelity Model Selection for Antenna Design Using Variable-Resolution EM Simulations

One of the most important tools in antenna design is electromagnetic (EM) simulation. High-fidelity simulations offer accurate evaluation of antenna performance; however, they are computationally expensive. As a result, employing EM solvers in automated antenna design using numerical optimization techniques is quite challenging. A possible workaround is offered by surrogate-based optimization (SBO) methods. In the case of antennas, the generic way to construct the surrogate is through coarse-discretization EM simulations that are faster but, at the same time, less accurate. For most SBO algorithms, the quality of such low-fidelity models may be critical for performance. In this work, we investigate the trade-offs between the speed and the accuracy of low-fidelity antenna models, as well as the impact of the model selection on both the quality of the design produced by the SBO algorithm and the computational cost of the optimization process. Our considerations are illustrated using examples.
Slawomir Koziel, Stanislav Ogurtsov, Leifur Leifsson

An X-FEM Based Approach for Topology Optimization of Continuum Structures

In this study, the extended finite element method (X-FEM) is used to represent the topology optimization of continuum structures on a fixed-grid design domain. An evolutionary optimization algorithm is used to gradually remove inefficient material from the design space during the optimization process. In the case of 2D problems, the evolution of the design boundary, which is superimposed on the fixed-grid finite element framework, is captured using isolines of structural performance. The proposed method does not need any remeshing, as the X-FEM scheme can approximate the contribution of boundary elements within the finite element framework of the problem. Therefore the converged solutions come with clear and smooth boundaries that need no further interpretation. The approach is then extended to 3D by using a 3D X-FEM scheme implemented on isosurface topology design.
Meisam Abdi, Ian Ashcroft, Ricky Wildman

Collaborative Optimization Based Design Process for Process Engineering

Traditionally, paper mills have been designed mostly using engineering experience and rules of thumb. The main target of the design has been the structure of the plant, whereas finding the optimal operation of the plant has been left to the operators. Bi-level multi-objective optimization (BLMOO) offers a method for optimizing both the structure and the operation during the design phase. In order to use BLMOO in design projects, the business process has to be re-engineered. This research defines a process for applying BLMOO to process design in multi-organizational projects. The process is then evaluated by interviewing experts.
Mika Strömman, Ilkka Seilonen, Kari Koskinen

Hydrodynamic Shape Optimization of Fishing Gear Trawl-Doors

Rising fuel prices and inefficient fishing gear are hampering the fishing industry. Any improvements to the equipment that lead to reduced operating costs of the fishing vessels are highly desirable. This chapter describes an efficient optimization algorithm for the design of trawl-door shapes using accurate high-fidelity computational fluid dynamics models. Usage of the algorithm is demonstrated on the re-design of typical trawl-doors at high and low angles of attack.
Leifur Leifsson, Slawomir Koziel, Eirikur Jonsson

Wing Aerodynamic Shape Optimization by Space Mapping

This chapter describes an efficient aerodynamic design optimization methodology for wings in transonic flow. The approach replaces a computationally expensive high-fidelity computational fluid dynamics (CFD) model in an iterative optimization process with a corrected polynomial approximation model constructed from a cheap low-fidelity CFD model. The output space mapping technique is used to correct the approximation model so that it yields an accurate prediction of the high-fidelity one. The algorithm is applied to two transonic wing design problems.
Leifur Leifsson, Slawomir Koziel, Eirikur Jonsson
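
A minimal sketch of an additive output space mapping correction of the kind described in the abstract: at each iteration the cheap model is shifted so that it matches the high-fidelity response at the current design. The function names and the scalar response are illustrative assumptions, not the authors' implementation.

```python
def osm_surrogate(low_fidelity, high_fidelity, x_current):
    """Additive output space mapping: shift the cheap model so that it
    matches the expensive one at the current design (zero-order
    consistency), then optimize the corrected model instead.
    """
    d = high_fidelity(x_current) - low_fidelity(x_current)
    return lambda x: low_fidelity(x) + d

# One surrogate-based iteration (sketch): optimize the corrected model,
# then rebuild the correction at the new design and repeat:
#   s_i = osm_surrogate(f_low, f_high, x_i)
#   x_{i+1} = argmin over x of objective(s_i(x))
```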

Efficient Design Optimization of Microwave Structures Using Adjoint Sensitivity

An important step of the microwave design process is the adjustment of the geometry and material parameters of the structure under consideration so that it meets given performance requirements. Nowadays, this is typically conducted using full-wave electromagnetic (EM) simulations. Because accurate high-fidelity simulations are computationally expensive, automation of this process is quite challenging. In particular, the use of conventional numerical optimization algorithms may be prohibitive, as these methods normally require a large number of objective function evaluations (and, consequently, EM simulations) to converge. The adjoint sensitivity technique that has recently become available in commercial EM simulation software packages can be utilized to speed up the EM-driven design optimization process, either by exploiting the sensitivity information in conventional gradient-based algorithms or by combining it with surrogate-based approaches. Here, several recent methods and algorithms for microwave design optimization using adjoint sensitivity are reviewed. We discuss the advantages and disadvantages of these techniques and illustrate them through numerical examples.
Slawomir Koziel, Leifur Leifsson, Stanislav Ogurtsov
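
A hedged sketch of the first use mentioned in the abstract, plugging adjoint-computed gradients into a conventional gradient-based loop; the solver interface returning both the objective and its adjoint gradient is an assumption made for illustration.

```python
import numpy as np

def optimize_with_adjoint(x0, em_solve, step=0.1, max_iter=50, tol=1e-4):
    """Plain gradient descent driven by adjoint sensitivities.

    em_solve(x) is assumed to run one full-wave EM simulation and return
    (objective_value, gradient), where the gradient comes from the adjoint
    solve at roughly the cost of one extra simulation, independent of the
    number of design parameters.
    """
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        f, g = em_solve(x)
        if np.linalg.norm(g) < tol:
            break
        x = x - step * g   # real implementations add a line search or trust region
    return x
```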

Backmatter
