
2021 | Book

Proceedings of the 5th International Symposium on Uncertainty Quantification and Stochastic Modelling

Uncertainties 2020


About this Book

This proceedings book discusses state-of-the-art research on uncertainty quantification in mechanical engineering, which takes statistical data concerning the inputs and parameters of a system into account in order to produce statistical data on the outputs of the system. It is based on papers presented at Uncertainties 2020, a workshop organized on behalf of the Scientific Committee on Uncertainty in Mechanics (Mécanique et Incertain) of the AFM (French Society of Mechanical Sciences), the Scientific Committee on Stochastic Modeling and Uncertainty Quantification of the ABCM (Brazilian Society of Mechanical Sciences), and the SBMAC (Brazilian Society of Applied Mathematics).

Table of Contents

Frontmatter

Modeling and Tools

Stick-Slip Oscillations in a Stochastic Multiphysics System

This work analyzes the stochastic response of a multiphysics system with stick-slip oscillations. The system is composed of two interacting subsystems: a mechanical subsystem with Coulomb friction and an electromagnetic one (a DC motor). An imposed source voltage in the DC motor stochastically excites the system. This excitation, combined with the dry friction, induces stochastic stick-slip oscillations in the mechanical subsystem. The resulting motion of the mechanical subsystem can be characterized by a random sequence of two qualitatively different and alternating modes, the stick- and slip-modes, with a non-smooth transition between them. The onset and the duration of each stick-mode are uncertain and depend on electromagnetic and mechanical parameters and variables, especially the position of the mechanical subsystem during the stick-mode. Duration and position are dependent random variables and must be analyzed jointly. The objective of this paper is to characterize and quantify this dependence, a novelty in the literature. The large amount of data required to perform the analysis and to construct joint histograms puts the problem into the class of big data problems.
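The joint analysis of two dependent random variables via a joint histogram can be sketched as follows. This is a minimal illustration with a fabricated dependence (duration grows with the magnitude of position), not the authors' multiphysics model.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-in for the paper's setting: the stick-mode duration and the
# position during stick are dependent random variables. Here we fabricate
# the dependence by letting duration grow with |position| plus noise
# (purely illustrative, not the authors' electromechanical system).
n = 100_000
position = rng.normal(0.0, 1.0, n)
duration = 0.5 + np.abs(position) + rng.exponential(0.2, n)

# Joint histogram: the object the paper constructs from big data.
H, xedges, yedges = np.histogram2d(position, duration, bins=50, density=True)

# A simple scalar measure of the dependence: here the Pearson correlation
# between duration and |position| is strongly positive by construction.
corr = np.corrcoef(np.abs(position), duration)[0, 1]
```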

Roberta Lima, Rubens Sampaio
Some Tools to Study Random Fractional Differential Equations and Applications

Random fractional differential equations are useful mathematical tools to model problems involving memory effects and uncertainties. In this contribution, we present some results, which extend their deterministic counterparts, for fractional differential equations whose initial conditions and coefficients are random variables and/or stochastic processes. The probabilistic analysis uses the random mean square calculus. For the sake of completeness, we study both autonomous and non-autonomous initial value problems. The analysis includes the computation of analytical and numerical solutions, as well as their main probabilistic information, such as the mean, the variance and the first probability density function. Several examples illustrating the theoretical results are shown.

Clara Burgos, Juan-Carlos Cortés, María-Dolores Roselló, Rafael-J. Villanueva
Global Sensitivity Analysis of Offshore Wind Turbine Jacket

Nowadays, offshore wind turbine (OWT) energy is considered one of the most promising renewable energies. Much research has been devoted to offshore wind turbine foundations such as jackets and monopiles, and sensitivity analyses of these foundations can be found in several studies. However, these studies have some limitations: for example, the sea current is not considered, the directions of wind and wave are assumed to be aligned, or only the behavior of one specific element or part of the jacket is studied. In this paper, the global sensitivity of the maximum stress and displacement with respect to material, local geometry and environmental uncertain parameters is investigated for the offshore wind turbine jacket used in the Code Comparison Collaboration Continuation (OC4) project. Compared to previous studies, the wind parameters are replaced by the load actions simulated in the aerodynamic software FAST, developed by the National Renewable Energy Laboratory (NREL). The sea current is considered, and the directions of the wind, sea wave and current are assumed to be independent. Also, the maximum stresses of the static analysis in the different parts of the jacket are investigated. The Morris screening and Fourier amplitude sensitivity test methods are applied to perform the global sensitivity analysis. The results show that the maximum stresses of different parts of the jacket are affected by different parameters. The zones of the jacket above the mean sea level (MSL) are more sensitive to the geometry parameters and redefined wind loads, whereas the zones below the MSL are strongly affected by the wave parameters. In addition, the results of the two global sensitivity analyses are mostly in good agreement.
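The Morris screening idea (one-at-a-time elementary effects) can be sketched in a few lines. This is a minimal generic implementation on the unit hypercube with a made-up linear test function, not the jacket model or the exact trajectory design used in the paper.

```python
import numpy as np

def morris_screening(f, k, r=50, delta=0.25, seed=0):
    """One-at-a-time Morris elementary effects on the unit hypercube.

    Returns mu_star (mean |EE|, overall influence) and sigma (std of EE,
    a marker of nonlinearity/interaction) for each of the k input factors.
    """
    rng = np.random.default_rng(seed)
    ee = np.empty((r, k))
    for j in range(r):
        x = rng.uniform(0.0, 1.0 - delta, size=k)  # leave room for +delta
        fx = f(x)
        for i in range(k):
            xp = x.copy()
            xp[i] += delta                          # perturb one factor
            ee[j, i] = (f(xp) - fx) / delta         # elementary effect
    return np.abs(ee).mean(axis=0), ee.std(axis=0)

# Made-up test function: x0 strongly influential, x1 weakly, x2 inert.
f = lambda x: 10.0 * x[0] + 1.0 * x[1] + 0.0 * x[2]
mu_star, sigma = morris_screening(f, k=3)
```

For a linear function the elementary effects are constant, so mu_star recovers the coefficients and sigma is zero; on a real jacket model each evaluation of `f` would be a structural analysis.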

Chao Ren, Younes Aoues, Didier Lemosse, Eduardo Souza De Cursi
Uncertainty Quantification and Stochastic Modeling for the Determination of a Phase Change Boundary

We consider the determination of solid/liquid interfaces by the solution of the Stefan problem, involving two heat equations in unknown domains (two-phase problem). We establish a regularized formulation of the Stefan problem, which is used to characterize approximated values of the temperature field as means of convenient stochastic processes, using Feynman-Kac representations. The results of the stochastic method are compared to finite element approximations and are shown to be comparable to T2 finite element approximations. We present an example of the variability of the domains occupied by the phases. In future work, methods for the uncertainty quantification of infinite-dimensional objects will be applied to characterize the variability of the regions.
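The core Feynman-Kac idea, representing the solution of a heat equation as an expectation over Brownian paths, can be sketched for the simplest case (whole line, no moving boundary; the regularized two-phase Stefan problem of the paper is far richer).

```python
import numpy as np

def heat_feynman_kac(g, x, t, n_paths=200_000, seed=0):
    """Monte Carlo (Feynman-Kac) solution of u_t = 0.5 * u_xx, u(., 0) = g:
    u(x, t) = E[g(x + W_t)] with W_t ~ N(0, t)."""
    rng = np.random.default_rng(seed)
    w = rng.normal(0.0, np.sqrt(t), n_paths)
    return g(x + w).mean()

# For g(x) = x**2 the exact solution is u(x, t) = x**2 + t,
# which lets us check the stochastic representation directly.
u_mc = heat_feynman_kac(lambda y: y**2, x=1.0, t=0.5)
u_exact = 1.0**2 + 0.5
```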

Juan Manuel Rodriguez Sarita, Renata Troian, Beatriz Costa Bernardes, Eduardo Souza de Cursi
Multiscale Method: A Powerful Tool to Reduce the Computational Cost of Big Data Problems Involving Stick-Slip Oscillations

Nonlinear initial value problems (IVPs) are often used to model the dynamics of many different physical phenomena, for example, systems with dry friction. Usually, these nonlinear IVPs do not have a known analytical solution, so a possible approach to study them is to use approximation methods. The literature dealing with different types of approximation techniques is extensive. Usually, the methods are classified as numerical or analytical. Both can be accurate and provide approximations with any desired precision. However, their efficiencies in terms of computational cost can be very different when they are applied to problems involving big data, for example, stochastic simulations. With analytical methods, it is possible to obtain an analytical expression as an approximation to the solution of the IVP, which may be very useful. For example, these analytical expressions can be applied to speed up Monte Carlo simulations. The Monte Carlo method is an important tool, which permits the construction of statistical models for random object transformations. To build an accurate statistical model (often histograms and sample statistics), several realizations of the transformation output are usually required, a big data problem. If each realization is obtained by a numerical integration, the Monte Carlo simulations can become a task with high computational and temporal costs. This paper shows that an option to reduce those costs is to use analytical approximations instead of numerical approximations. By doing this, instead of performing a numerical integration for each realization, which is a time-consuming task, a simple substitution of values into the analytical expressions can be done.
This article compares the computational costs of the construction of statistical models by Monte Carlo simulations using numerical and analytical approximations. The objective is to show the gain in terms of CPU time when analytical approximations are used instead of numerical ones. To exemplify the gain, an interesting big data problem involving the stick-slip phenomenon is developed. The system analyzed is a mass moving over a belt with random dry friction.
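The numerical-versus-analytical trade-off can be sketched with a toy system that has a closed-form solution. The undamped oscillator below is only a stand-in: the paper's stick-slip system is much richer, but the structure of the comparison (one integration loop per realization versus one formula evaluation) is the same.

```python
import numpy as np

# Undamped oscillator x'' + w**2 x = 0, x(0) = x0, x'(0) = 0: a stand-in
# for a system whose response must be evaluated many times in a Monte
# Carlo loop over random parameters.
def analytical(w, x0, t):
    return x0 * np.cos(w * t)            # one cheap formula evaluation

def rk4(w, x0, t_end, n_steps=2000):     # classical 4th-order Runge-Kutta
    h = t_end / n_steps
    y = np.array([x0, 0.0])
    f = lambda y: np.array([y[1], -w**2 * y[0]])
    for _ in range(n_steps):
        k1 = f(y); k2 = f(y + 0.5*h*k1); k3 = f(y + 0.5*h*k2); k4 = f(y + h*k3)
        y = y + (h / 6.0) * (k1 + 2*k2 + 2*k3 + k4)
    return y[0]

# In a Monte Carlo study over random w, the analytical route replaces a
# full time integration per realization by a single evaluation.
w, x0, t_end = 2.0, 1.0, 5.0
x_num = rk4(w, x0, t_end)
x_ana = analytical(w, x0, t_end)
```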

Mariana Gomes, Roberta Lima, Rubens Sampaio
Coupled Lateral-Torsional Drill-String with Uncertainties

This paper treats the nonlinear problem of a coupled lateral-torsional drill string. A simplified 3-DOF system is used to model the system. The nonlinearity has two main sources: (1) the nonlinear bit-rock interaction, and (2) the impact between the column and the borehole. We classify different dynamic regimes (stick-slip, forward whirl, backward whirl, etc.) and construct a map varying the rotational speed imposed at the top and the weight-on-bit. The Maximum Entropy Principle is employed to construct a probabilistic model for the borehole wall friction, and the stochastic response is approximated using the Monte Carlo method. A probabilistic map is built with the stochastic model, showing the probability that the system reaches a given dynamic regime.
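The Maximum Entropy Principle step can be sketched as follows: given only a bounded support for the friction coefficient, MaxEnt yields the uniform distribution; given a positive support and a known mean, it yields the exponential distribution. The numbers below are illustrative, not the paper's calibrated values.

```python
import numpy as np

rng = np.random.default_rng(0)

# Maximum Entropy Principle (illustrative friction values):
#  - only a bounded support [mu_min, mu_max] known -> uniform distribution
#  - positive support and known mean              -> exponential distribution
mu_min, mu_max = 0.2, 0.5
mu_uniform = rng.uniform(mu_min, mu_max, 100_000)

mean_known = 0.3
mu_exponential = rng.exponential(mean_known, 100_000)

# Monte Carlo propagation: probability that the friction coefficient
# exceeds a threshold (exact value (0.5 - 0.4) / (0.5 - 0.2) = 1/3).
p_stick = (mu_uniform > 0.4).mean()
```

In the paper, such samples would feed the 3-DOF model, and the regime (stick-slip, whirl, ...) of each realization would be classified to build the probabilistic map.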

Lucas P. Volpi, Daniel M. Lobo, Thiago G. Ritto
A Stochastic Surrogate Modelling of a NonLinear Time-Delay Mechanical System

Nonlinear time-delay dynamics are present in a wide range of engineering problems, due to the modernization of structures related to the need to use lighter, more resistant and flexible materials. In mechanical systems, nonlinearities may have physical or geometric characteristics. Most of these systems are described by complex equations that demand significant computer processing time to solve. In addition, these systems may be subject to uncertainties, such as material properties, random forces, dimensional tolerances and others. The complexity and the time required to solve the equations increase when uncertainties are added to the inputs of the dynamic system model. In this case, a surrogate model of the dynamic system based on Karhunen-Loève decomposition or polynomial chaos is a viable choice to reduce the complexity and the computational time of the problem, as well as to obtain the statistical responses of the model. Surrogate modeling (also known as metamodeling) is employed to replace the original model of high complexity by a simpler model whose computational cost is reduced. In the field of uncertainty quantification, the statistical moments of a complex model can be easily obtained once a surrogate model is created. Methods such as the KLE (Karhunen-Loève expansion), which relies on the covariance function of the system and decomposes the model into a set of eigenvalues and eigenvectors representing the surrogate model, or the PCE (polynomial chaos expansion), which uses a set of multivariate orthogonal polynomials to build the surrogate model, are applied to represent the system output. The purpose of this paper is to build a surrogate model of a nonlinear mechanical system with time delay using PCE and KLE. The response of the original model is compared against that of the surrogate model.
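A regression-based PCE can be sketched for a scalar toy model. Here the "expensive model" is simply Y = exp(Z) with Z standard normal, expanded in probabilists' Hermite polynomials and fitted by least squares; the time-delay system and the Karhunen-Loève route of the paper are not reproduced.

```python
import numpy as np

rng = np.random.default_rng(0)

# Degree-6 polynomial chaos surrogate of Y = exp(Z), Z ~ N(0, 1), in
# probabilists' Hermite polynomials He_k, fitted by least squares.
deg, n_train = 6, 2000
z = rng.normal(0.0, 1.0, n_train)
y = np.exp(z)

V = np.polynomial.hermite_e.hermevander(z, deg)   # columns He_0 .. He_6
coeffs, *_ = np.linalg.lstsq(V, y, rcond=None)

# Because E[He_k(Z)] = 0 for k >= 1, the surrogate's mean is coeffs[0];
# the exact mean of exp(Z) is e**0.5, so the moments come almost for free.
pce_mean = coeffs[0]

# Evaluating the surrogate is a cheap polynomial evaluation:
z_new = rng.normal(0.0, 1.0, 10)
y_surrogate = np.polynomial.hermite_e.hermeval(z_new, coeffs)
```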

Emanuel Cruvinel, Marcos Rabelo, Marcos L. Henrique, Romes Antonio Borges
A Stochastic Approach for a Cosserat Rod Drill-String Model with Stick-Slip Motion

Drill-strings employed in the oil extraction process present complex dynamics. Due to their slenderness (with length surpassing 1000 m), geometrical aspects of oil-wells, and contact forces, drill-strings exhibit a strong non-linear response. Many unwanted phenomena, like stick-slip oscillations, may occur under certain conditions. Moreover, there are several intrinsic uncertainties, e.g. in modelling the soil, which may lead to unwanted operation regimes and affect performance, i.e. degraded rate-of-penetration (ROP), and stick-slip. In this work, a stochastic approach to study drill-string dynamics is employed. The physical model is based on the Cosserat rod theory. Non-linearities of the drill-string problem may occur due to geometrical factors, such as finite displacements and rotations in deviated wells, as well as to contact and friction at the borehole wall and bit. In this particular paper, uncertainties are considered within the friction parameters. The random response is analysed through the trajectories of the drill-string axis and other quantities, such as the evolution of the angular velocities.

Hector Eduardo Goicoechea, Roberta Lima, Rubens Sampaio, Marta B. Rosales, F. S. Buezas
Stochastic Aspects in Dynamics of Curved Electromechanic Metastructures

In this work we perform uncertainty quantification of the dynamic response of curved electromechanical metastructures made up of piezoelectric elements in bimorph configuration. The study is performed through a new reduced 1D finite element model derived from the theory of linear elasticity and general piezoelectricity, obtained through Hamilton's Principle and Gauss's Law, which serves as a deterministic reference. Parametric uncertainty is taken into account through different constructive and constitutive parameters. The probabilistic model is constructed on the basis of the deterministic model, and both are calculated within finite element approaches. The Monte Carlo method is employed to calculate random realizations. A number of scenarios are evaluated in order to identify the parameter sensitivity. The reduced model is also contrasted with different models for validation.

Lucas E. Di Giorgio, Marcelo T. Piovan
Local Interval Fields for Spatial Inhomogeneous Uncertainty Modelling

In an engineering context, design optimization is usually performed virtually using numerical models to approximate the underlying partial differential equations. However, valid criticism exists concerning such an approach, as more often than not, only partial or uninformative data are available to estimate the corresponding model parameters. As a result, the results obtained by such numerical approximations can diverge significantly from the real structural behaviour of the design. Under such scarce data, interval analysis in particular has been proven to provide robust bounds on the structure's performance, often at a small-to-moderate cost. Furthermore, to model spatial dependence throughout the model domain, interval fields were recently introduced by the authors as an interval counterpart to the established random fields framework. However, currently available interval field methods cannot model local inhomogeneous uncertainty. This paper presents a local interval field approach to model local inhomogeneous uncertainty under scarce data. The method is based on the use of explicit interval fields [1] and the recently used inverse distance weighting function [2]. The approach is presented for one dimension of spatial uncertainty; nonetheless, it can be extended to an n-dimensional context. In the first part of the paper, a detailed theoretical background of interval fields is covered, and then the local interval fields approach is introduced. Furthermore, an academic case study is performed to compare the local interval field approach with inverse distance weighting.
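The combination of interval bounds with inverse distance weighting can be sketched in 1D. This is only a sketch in the spirit of explicit interval fields, with made-up base points and radii, not the authors' formulation.

```python
import numpy as np

def idw_weights(x, x_base, p=2.0, eps=1e-12):
    """Inverse distance weights of the base points at location x."""
    d = np.abs(x - x_base) + eps        # eps avoids division by zero
    w = d**(-p)
    return w / w.sum()

def interval_field(x, x_base, mid_base, rad_base):
    """1D local interval field sketch: the interval midpoint and radius at
    x are interpolated from base-point intervals by inverse distance
    weighting, so the uncertainty can vary locally over the domain."""
    w = idw_weights(x, x_base)
    mid = w @ mid_base
    rad = w @ rad_base
    return mid - rad, mid + rad          # interval bounds at x

# Three base points; the middle one carries a larger local uncertainty.
x_base = np.array([0.0, 0.5, 1.0])
mid_base = np.array([1.0, 1.0, 1.0])
rad_base = np.array([0.05, 0.30, 0.05])

lo_mid, hi_mid = interval_field(0.5, x_base, mid_base, rad_base)     # wide
lo_edge, hi_edge = interval_field(0.02, x_base, mid_base, rad_base)  # narrow
```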

Robin Callens, Matthias Faes, David Moens

Processes

Uncertainty Quantification in Subsea Lifting Operations

Subsea lifting operations are dangerous and expensive. One typical problem is the amplification of dynamic forces on the lifting cable at deep water due to resonance of the cable-equipment system. It is therefore necessary to guarantee that the cable is always tensioned, to prevent slack conditions that lead to snap loads, while at the same time the cable must remain below its structural limit. Several models have been presented to analyze this phenomenon, but they did not consider uncertainties in the determination of the hydrodynamic coefficients ( $$C_a$$ and $$C_d$$ ), which considerably affect the response of the system. Therefore, the objective of this study is to evaluate the influence of the variability of these coefficients via a statistical description of the problem, using Markov Chain Monte Carlo, the accept-reject method, maximum likelihood and Monte Carlo simulation in an integrated way. The variability of the structural resistance of the cable is also considered, and a reliability study is presented. The stochastic analysis is compared with the deterministic one, and it is concluded that there is a probability of failure and slacking that would be neglected if the deterministic approach were used, which makes the stochastic analysis a more realistic diagnosis of the problem.
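The accept-reject step can be sketched generically: samples from a proposal are kept with probability proportional to the target density. The semicircle-shaped density below is purely illustrative, not the calibrated distribution of the hydrodynamic coefficients.

```python
import numpy as np

def accept_reject(pdf, bound, a, b, n, seed=0):
    """Sample from pdf supported on [a, b] with a uniform proposal;
    `bound` must satisfy pdf(x) <= bound on [a, b]."""
    rng = np.random.default_rng(seed)
    out = []
    while len(out) < n:
        x = rng.uniform(a, b, n)            # candidate draws
        u = rng.uniform(0.0, bound, n)      # vertical coordinate
        out.extend(x[u < pdf(x)])           # keep points under the curve
    return np.array(out[:n])

# Illustrative target on [-1, 1] (mean 0, variance 0.2), standing in for
# the distribution of a centred hydrodynamic-coefficient perturbation:
pdf = lambda x: 0.75 * (1.0 - x**2)
samples = accept_reject(pdf, bound=0.75, a=-1.0, b=1.0, n=50_000)
```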

Luiz Henrique Marra da Silva Ribeiro, Leonardo de Padua Agripa Sales, Rodrigo Batista Tommasini
Product/Process Tolerancing Modelling and Simulation of Flexible Assemblies - Application to a Screwed Assembly with Location Tolerances

In industry, the modelling of product/process assemblies is based on the theory of Geometrical Product Specification (GPS) and tolerancing analysis. This industrial approach follows several international standards to specify the parts and build stack-up models of the tolerances of an assembly. The main hypothesis of these standards is the rigid workpiece principle. However, for large thin parts and assemblies, for example under the effects of gravity and of the forces and/or displacements imposed by active tools, this rigid-body assumption is not acceptable, and “classic rigid stack-ups” can lead to non-representative results on functional requirements. Thus, this paper proposes an approach to take into account the flexibility of the parts and assemblies in 3D tolerancing stack-ups. Coupling tolerancing theory, structural reliability approaches and FEM simulation, an original approach based on the stochastic polynomial chaos development method, Sobol's indices and the FEM is developed to build 3D flexible stack-ups and to estimate the main tolerance results. The chaos meta-model is chosen to be close enough, in philosophy and form, to the linear model of the standard NF E04-008:2013.

Tanguy Moro
Methodological Developments for Multi-objective Optimization of Industrial Mechanical Problems Subject to Uncertain Parameters

In this paper, we propose a non-intrusive methodology to obtain statistics on multi-objective optimization problems subject to uncertain parameters when using an industrial software design tool. The proposed methodology builds Pareto front samples at low computational cost and proposes a convenient posterior parameterization of the solution set, to enable the statistical analysis and, in perspective, the transformation of small sets of data into large samples, thanks to a Hilbertian approach. The statistics of objects, the Hausdorff distance in particular, are applied to Pareto fronts to perform a statistical analysis. This strategy is first demonstrated on a simple test case and then applied to a practical engineering problem.
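The Hausdorff distance between two sampled Pareto fronts can be computed directly from the pairwise distances. The two bi-objective fronts below are made up for illustration (the second is the first shifted by 0.1 in one objective).

```python
import numpy as np

def hausdorff(A, B):
    """Symmetric Hausdorff distance between two point sets (n_i x d arrays),
    used here as a scalar distance between sampled Pareto fronts."""
    d = np.linalg.norm(A[:, None, :] - B[None, :, :], axis=-1)  # pairwise
    return max(d.min(axis=1).max(), d.min(axis=0).max())

# Two sampled bi-objective fronts: straight trade-off lines, the second
# shifted by 0.1 in the second objective.
t = np.linspace(0.0, 1.0, 101)
A = np.column_stack([t, 1.0 - t])
B = np.column_stack([t, 1.1 - t])
h = hausdorff(A, B)   # driven by the endpoints, where the offset is 0.1
```

With many sampled fronts, such pairwise distances support the statistics of objects mentioned in the abstract (e.g. a "median" front minimizing the sum of distances to the others).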

Artem Bilyk, Emmanuel Pagnacco, Eduardo J. Souza de Cursi
Uncertainties in Life Cycle Inventories: Monte Carlo and Fuzzy Sets Treatments

Life Cycle Assessment (LCA) is an impact research methodology that focuses on the life cycle of a product (by extension, services) and is standardized by the ISO 14000 series. This methodology has been applied in many areas related to sustainable development, in order to evaluate the environmental, economic and social aspects of the processes of production and distribution of products and service goods. Despite this wide range of applications, the technique still presents weaknesses, especially regarding the evaluation and expression of the uncertainties present in the various phases of the studies and inherent to the stochastic or subjective variations of the data sources and the generation of models, sometimes reducing the consistency and accuracy of the proposed results. In the present study, we evaluate a methodology to better express such uncertainties in LCA studies, focusing on the Life Cycle Inventory (LCI) phase. The hypothesis explored is that the application of Monte Carlo simulation and fuzzy set theory to the estimation and analysis of stochastic uncertainties in LCA allows a better expression of the level of uncertainty in terms of the Guide to the Expression of Uncertainty in Measurement [11], in situations where the original life cycle inventory does not specify the initial uncertainties. The transport of iron ore by an off-road truck (OHT), with a load capacity of 220 tons and a power of 1700 kW, acting on the route between the mine and the primary crushing of a mining company in the city of Congonhas (MG), was selected as the process unit. The Monte Carlo simulations and fuzzy set theory applications were performed using spreadsheets (MS Excel). The LCA study was conducted in the OpenLCA 1.6 (open source) software, from data inventories of the ELCD database 3.2, also freely accessible.
The results obtained were statistically compared using hypothesis tests and analysis of variance to identify the effect of the techniques on the results of the Life Cycle Impact Assessment (LCIA), and a sensitivity analysis was performed to test the effect of the treatment and of the probability distribution chosen to express the parameters associated with the items of the original life cycle inventory. The research indicates that inventories with treated data may have their uncertainty expressed to a lesser degree than that expressed in the original inventory, with no change in the final values of the LCIA. The treatment of life cycle inventory data through Monte Carlo simulation and fuzzy set theory made it possible to express the LCI results with a degree of uncertainty lower than that used to express the uncertainty under the standards. Data treatment through Monte Carlo simulation with a normal probability distribution showed the lowest values of uncertainty expression, with a significant difference in relation to the original inventory at a significance level of 1%.
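The two uncertainty treatments can be contrasted on a single inventory datum. The triangular numbers below are made up for illustration, not values from the ELCD inventory: the fuzzy route propagates alpha-cut intervals, while the Monte Carlo route samples the same datum as a random variable.

```python
import numpy as np

def alpha_cut(a, m, b, alpha):
    """Alpha-cut interval of a triangular fuzzy number (a, m, b):
    at alpha = 0 the full support, at alpha = 1 just the modal value."""
    return a + alpha * (m - a), b - alpha * (b - m)

# Illustrative inventory datum (lower / modal / upper), e.g. fuel use
# per ton hauled in arbitrary units:
a, m, b = 0.8, 1.0, 1.3

# Fuzzy treatment: interval narrowing with increasing membership level.
lo_05, hi_05 = alpha_cut(a, m, b, 0.5)

# Monte Carlo treatment of the same datum as a triangular random variable.
rng = np.random.default_rng(0)
mc = rng.triangular(a, m, b, 200_000)
mc_mean = mc.mean()   # exact mean of a triangular law is (a + m + b) / 3
```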

Marco Antônio Sabará
Manufacturing Variability of 3D Printed Broadband Multi-frequency Metastructure

Additive manufacturing has been used to propose several designs of phononic crystals and metamaterials due to the low cost of producing complex geometrical features. However, like any other manufacturing process, it can introduce material and geometrical variability into the nominal design and therefore can affect the structural dynamic performance. Locally resonant metamaterials are typically designed such that the distributed resonators have the same natural frequency or, in the case of rainbow metastructures, a well-defined spatial profile. In this work, manufacturing tolerances of beam samples produced by a Selective Laser Sintering process are assessed, and the measured variability levels are used to investigate the vibration suppression performance of broadband multi-frequency metastructures. Evenly spaced non-symmetric resonators are attached to a beam with a U-shaped cross-section and partitioned by parallel baffled plates. An analytical model based on a transfer matrix approach is used to calculate the transfer receptance due to a point time-harmonic force. Moreover, a random field model is assumed, based on previous experimental results, and the effects of the correlation length, a measure of the spatial fluctuation, are also investigated. The obtained results are expected to be useful for further robust design in mass-produced industrial applications.
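The role of the correlation length can be sketched by sampling a Gaussian random field along the beam. The exponential covariance and the 5% standard deviation below are assumptions for illustration, not the paper's fitted field model.

```python
import numpy as np

def sample_field(x, std, corr_length, n_samples, seed=0):
    """Zero-mean Gaussian random field samples on the points x with an
    exponential covariance C(d) = std**2 * exp(-d / corr_length),
    generated by a Cholesky factorization of the covariance matrix."""
    rng = np.random.default_rng(seed)
    d = np.abs(x[:, None] - x[None, :])
    C = std**2 * np.exp(-d / corr_length) + 1e-10 * np.eye(len(x))
    L = np.linalg.cholesky(C)
    return (L @ rng.normal(size=(len(x), n_samples))).T

# Spatial perturbation of resonator properties along the beam (5% std,
# illustrative): a long correlation length gives smooth fluctuations,
# a short one gives rough, nearly independent fluctuations.
x = np.linspace(0.0, 1.0, 50)
rough = sample_field(x, std=0.05, corr_length=0.02, n_samples=200)
smooth = sample_field(x, std=0.05, corr_length=5.0, n_samples=200, seed=1)

# Mean squared increment between neighbouring points quantifies roughness.
msi_rough = np.mean(np.diff(rough, axis=1)**2)
msi_smooth = np.mean(np.diff(smooth, axis=1)**2)
```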

Adriano T. Fabro, Han Meng, Dimitrios Chronopoulos

Data Analysis and Identification

UAV Autonomous Navigation by Image Processing with Uncertainty Trajectory Estimation

Unmanned Aerial Vehicles (UAVs) are a technology under strong development, with applications in several fields. For UAV autonomous navigation, a standard scheme is to use the signal from an onboard Global Navigation Satellite System (GNSS) receiver. However, such a signal can suffer natural or human interference. Our approach applies an image processing procedure for UAV positioning: image edge extraction and correlation between the drone image and a georeferenced satellite image. Data fusion is also applied to combine the inertial sensor data and the positioning by image. The data fusion is performed using a neural network, whose output is the correction for the UAV trajectory. Here, the variance of the trajectory error is also predicted, to quantify the uncertainty.

Gerson da Penha Neto, Haroldo Fraga de Campos Velho, Elcio Hideiti Shiguemori
Uncertainty Quantification in Data Fitting Neural and Hilbert Networks

We analyze the uncertainties in Neural and Hilbert networks. The first source of uncertainty is the variability of the partition of the data into subsets for training, testing and validation. The analysis is made on the variability of the performance and of the weights of the networks. The effects of additive and multiplicative noises in the data are studied. The results of Neural and Hilbert networks are compared. It appears that Hilbert networks are more robust but imply a higher computational cost. The distributions of the outputs are studied, and it appears that their means and modes may be used to improve the estimates furnished by the nets. The extension of Hilbert networks to element-based networks is considered.

Leila Khalij, Eduardo Souza de Cursi
Climate Precipitation Prediction with Uncertainty Quantification by Self-configuring Neural Network

Artificial neural networks have been employed in many applications. Good results have been obtained by using neural networks for climate precipitation prediction for Brazil. The inputs are meteorological variables, such as wind components at several levels, air temperature, and former precipitation. The neural network is automatically configured by solving an optimization problem with the Multi-Particle Collision Algorithm (MPCA) metaheuristic. However, it is necessary to address, beyond the prediction itself, the uncertainty associated with the prediction. The focus of this paper is therefore twofold: first, to produce a monthly prediction for precipitation by neural network; second, to design the neural network output to also estimate the uncertainty related to the neural prediction.

Juliana A. Anochi, Reynier Hernández Torres, Haroldo F. Campos Velho
Uncertainty Quantification in Risk Modeling: The Case of Customs Supply Chain

In this paper, Uncertainty Quantification (UQ) is used in risk modeling to describe the time series describing the risks in the customs supply chain. We start by introducing the Hilbertian approach related to the representation of random variables and addressing these approximations and their applications in UQ. We then present an extension in which these models are applied to the time series in order to handle the seasonal components of risks in the customs supply chain. The models are fitted to time series describing the seized quantities of smuggled drugs at two sites, using the Moment Matching Method. The results furnish a good description of important properties of the data, namely, the polynomial expansion and the Cumulative Distribution Function (CDF).

Lamia Hammadi, Eduardo Souza de Cursi, Vlad Stefan Barbu

Uncertainty Analysis and Estimation

PLS Application to Optimize the Formulation of an Eco-Geo-Material Based on a Multivariate Response

The ecological and environmental issues of today encourage the use of eco-geo-materials that consume less «grey energy», such as raw earth building material. This raw earth material is inexpensive and is generally available on the construction site. The compressive strength and ductility play a major role in its mechanical behavior. However, few studies are effectively dedicated to estimating both the compressive strength and the ductility of raw earth materials. Treated with a very low percentage of binders, the raw earth is transformed into an eco-material suitable for use in construction. However, the experimental study of the mechanical properties of raw earth concrete presents uncertainties related to the bio-sourced nature of its components, such as plant fibers and raw earth. In this study, raw earth samples of 25 different formulations were cast and tested at 90 days of curing time. The samples were tested for unconfined compressive strength, from which a ductility index was inferred. In order to optimize the formulation of this raw earth material, a «Design of Experiments» was conducted to study the effect of the various components on these two mechanical properties. A multivariate statistical regression technique, PLS (Partial Least Squares), was performed to evaluate the design. This PLS technique was selected because of the complicated experimental design data, along with the different constraints on the model based on the two responses. The obtained results show that this technique can be a helpful tool to improve and optimize a raw earth concrete formulation.

Saber Imanzadeh, Armelle Jarno, Said Taibi
Noise, Channel and Message Identification on MIMO Channels with General Noise

This work presents an application of Uncertainty Quantification (UQ) approaches to a Multiple-Input Multiple-Output (MIMO) system. The use of UQ techniques in this field is new and is one of the originalities of the paper. A second originality of this work is to furnish an efficient method to deal with non-Gaussian noise with non-zero mean. A third originality is that the proposed method allows the estimation of the noise, the estimation of the channel and the decoding of messages; the approach covers the whole chain: determination of the distribution of the noise, identification of the channel matrix and detection of the transmitted symbols. The first step is made by using UQ techniques, which furnish the distribution of the noise, without assuming a Gaussian distribution or a mean equal to zero. A new application of UQ techniques furnishes the channel matrix. Then, the detection of symbols is performed by a method based on the determination of a selection matrix formed by lines of the identity matrix. Numerical examples show that the proposed approach is practical and efficient.
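The effect of non-zero-mean noise on channel identification can be sketched with a toy real-valued MIMO model. The UQ-based identification of the paper is replaced here by plain least squares on pilot symbols, only to show that a constant noise mean can be identified together with the channel matrix by augmenting the pilots with a row of ones; all dimensions and distributions are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy model Y = H X + N with non-Gaussian, NON-zero-mean noise
# (uniform on [0.2, 0.6], so each receive branch sees a 0.4 bias).
nt, nr, n_pilots = 2, 3, 500
H_true = rng.normal(size=(nr, nt))
X = rng.choice([-1.0, 1.0], size=(nt, n_pilots))    # known pilot symbols
N = rng.uniform(0.2, 0.6, size=(nr, n_pilots))
Y = H_true @ X + N

# Augmenting the pilots with a constant row lets least squares estimate
# the noise mean as an extra column of the unknown matrix.
X_aug = np.vstack([X, np.ones(n_pilots)])
Theta, *_ = np.linalg.lstsq(X_aug.T, Y.T, rcond=None)
H_est = Theta[:nt].T          # estimated channel matrix (nr x nt)
noise_mean_est = Theta[nt]    # estimated noise mean per receive branch
```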

Lucas Nogueira Ribeiro, João César Moura Mota, Didier Le Ruyet, Eduardo Souza de Cursi
Estimation of the Extreme Response Probability Distribution of Offshore Structures Due to Current and Turbulence

For a model of an offshore structure standing in a fluid, we derive the cumulative distribution function of the extreme base shear and the extreme bending moment, that is, the extreme value produced by the weighted sum of a spatially continuous non-Gaussian stochastic horizontal force field produced by current and turbulence, in a given time interval in which it is assumed to be stationary. In the proposed methodology, the theory of translated processes is used to solve the problem of finding the extreme of non-Gaussian stochastic processes. The application is carried out on a thick column, where the stochastic field of turbulence is supposed to be Gaussian but not necessarily narrow-band. The cumulative distribution functions obtained are in good agreement with statistical results derived from simulations in the time domain.
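The translated-process idea, mapping a Gaussian variable through its CDF onto a prescribed non-Gaussian marginal, can be sketched for a single time instant. The exponential target marginal below is an arbitrary example, not the force-field marginal of the paper.

```python
import math
import numpy as np

def translate_to_exponential(g, rate):
    """Translation mapping of standard normal samples g to an exponential
    marginal: Y = F_exp^{-1}(Phi(g)), the one-point version of a
    translated (non-Gaussian) process."""
    phi = 0.5 * (1.0 + np.vectorize(math.erf)(g / math.sqrt(2.0)))  # Phi(g)
    phi = np.clip(phi, 1e-15, 1.0 - 1e-15)
    return -np.log1p(-phi) / rate        # exponential quantile function

rng = np.random.default_rng(0)
g = rng.normal(size=100_000)
y = translate_to_exponential(g, rate=2.0)   # mean 1/2, support [0, inf)
```

Applied pointwise to a correlated Gaussian process, the same map yields a non-Gaussian process whose extremes can then be studied through the underlying Gaussian one.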

Oscar Sanchez Jimenez, Emmanuel Pagnacco, Eduardo Souza de Cursi, Rubens Sampaio
Statistical Analysis of Biological Models with Uncertainty

In this contribution, relevant biological models based on random differential equations are studied. For the sake of generality, we assume that the initial condition and the biological model parameters are dependent random variables with arbitrary probability distributions. We present a general methodology that enables us to provide a full probabilistic description of the solution stochastic process for each stochastic model. The statistical analysis is performed through the calculation of the first probability density function by applying the random variable transformation technique. From the first probability density function, we can calculate any one-dimensional moment of the solution, including the mean and the variance as important particular cases. Our theoretical findings are applied to describe the probabilistic dynamics of Spirulina sp. biomass production in a particular medium using real data.
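The random variable transformation (RVT) technique can be sketched on the simplest growth model. Here X(t) = x0 * exp(K t) with a uniform growth rate K; the exact first probability density function of X(t) follows from the change of variables x -> k(x), and a Monte Carlo histogram confirms it. The numbers are illustrative, not the Spirulina sp. data.

```python
import numpy as np

# Exponential growth with random rate K ~ U(k1, k2), deterministic x0.
x0, k1, k2, t = 1.0, 0.1, 0.3, 5.0

def pdf_X(x):
    """First PDF of X(t) by RVT: f_X(x) = f_K(k(x)) * |dk/dx| with
    k(x) = log(x / x0) / t and f_K uniform on [k1, k2]."""
    k = np.log(x / x0) / t
    inside = (k >= k1) & (k <= k2)
    return np.where(inside, 1.0 / ((k2 - k1) * x * t), 0.0)

# Monte Carlo check of the RVT density.
rng = np.random.default_rng(0)
K = rng.uniform(k1, k2, 400_000)
X = x0 * np.exp(K * t)

hist, edges = np.histogram(X, bins=60, density=True)
centers = 0.5 * (edges[:-1] + edges[1:])
max_err = np.max(np.abs(hist - pdf_X(centers)))
```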

Vicent Bevia, Juan-Carlos Cortés, Ana Navarro-Quiles, Jose-Vicente Romero
Uncertainty Propagation in Wind Turbine Blade Loads

Minimizing the cost and enhancing the lifespan of wind turbines entails optimizing the material distribution of wind turbine components (blades, tower, etc.) without compromising their structural safety. Wind turbines are often designed using the IEC 61400-1 standard, which provides an appropriate level of protection against damage from all hazards during the planned lifetime. Typically, aero-elastic simulation codes are used to determine the time histories of loads and displacements in the wind turbine. To predict the fatigue damage limit of the wind turbine blade, it is important to quantify and model all relevant uncertainties; this requires a considerable amount of simulation time, and a surrogate model can replace the simulation to reduce this time-consuming part of the problem. In this study, Monte Carlo simulation and the FAST code are used to simulate different wind conditions. Here, 10 min of effective simulation generates a time history of all forces and moments acting at 10 selected gages of the blade. Subsequently, we quantify the uncertainty of their maximum values using a Gaussian process (Kriging) model and a Deep Neural Network (DNN), fitting these maximum output values to their corresponding input values. For both Kriging and the DNN, a good fit was found for almost all output variables.

Wilson J. Veloz, Hongbo Zhang, Hao Bai, Younes Aoues, Didier Lemosse
Influence of Temperature Randomness on Vibration and Buckling of Slender Beams

Previous studies have demonstrated the influence of thermal stresses on the static and dynamic behavior of structures. In most cases of practical interest, temperature variations are governed by complex combinations of heat transfer mechanisms. As a result, the temperature values at different points of a structure can be considered as random variables. The present paper addresses the stochastic modeling of the influence of space-dependent temperature variations on the natural frequencies of beams. For this purpose, based on the hypotheses of the classical Euler-Bernoulli beam theory, a finite element model is constructed for the bending vibrations of beams, accounting for thermal influences. A particular scenario is considered in which the beam is subjected to random linearly-varying temperature fields, parameterized by two random variables. A probabilistic model is derived, which provides the PDF of the thermally-induced axial force from the PDFs of the random variables. Numerical simulations are performed for a clamped aluminum beam. Sampling-based statistics for the thermal axial load and the first six natural frequencies of the beam are presented. In addition, since thermal stresses can induce buckling, the probability of failure by this mechanism is also computed. The results lead to the conclusion that temperature uncertainty can have a significant effect on the vibration and buckling behavior of beams, which justifies its consideration in structural analyses.
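A minimal sampling-based sketch of the last two steps, using a fully constrained beam with assumed illustrative properties and independent uniform end temperatures (the paper's beam data and the exact probabilistic model are not reproduced here):

```python
import math
import random

random.seed(42)

# Hypothetical clamped aluminum beam (illustrative values, not the paper's)
E, A, I, L = 70e9, 1e-4, 1e-9, 1.0       # Pa, m^2, m^4, m
alpha = 23e-6                             # 1/K, thermal expansion coefficient
P_cr = 4.0 * math.pi**2 * E * I / L**2    # Euler buckling load, clamped-clamped

def thermal_axial_force(t1, t2):
    """Compressive axial force induced in a fully constrained beam by a
    linearly varying temperature rise between end values t1 and t2 (K);
    the mean temperature of the linear field drives the force."""
    return E * A * alpha * 0.5 * (t1 + t2)

# Sampling-based statistics: end temperatures as independent uniform r.v.s
samples = [thermal_axial_force(random.uniform(0.0, 40.0), random.uniform(0.0, 40.0))
           for _ in range(20000)]
mean_N = sum(samples) / len(samples)
p_buckling = sum(n > P_cr for n in samples) / len(samples)
```

The same samples, propagated through the finite element model, would yield the statistics of the natural frequencies as well.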

Everton Spuldaro, Luiz Fabiano Damy, Domingos A. Rade
Investigating the Influence of Mechanical Property Variability on Dispersion Diagrams Using Bayesian Inference

Phononic crystals are periodic structures that exhibit frequency ranges in which elastic waves cannot propagate, known as Bragg scattering band gaps. This study simulates the dynamic response of a periodic plane frame structure using Bayesian statistics to understand the band gap behavior with respect to the variability in the Young's modulus and mass density. The spectral element method is used to compute frequency response functions (FRFs). The spatial distribution of simulated FRFs at each frequency is used as a virtual measurement, from which the wavenumbers are obtained via a Prony-type method. A Markov chain Monte Carlo (MCMC) algorithm that considers the difference between the simulated and virtually observed wavenumbers is used to simulate the posterior distribution of the mechanical properties. Stochastic dispersion diagrams are then simulated via Monte Carlo from these posterior distributions. By using the additional information related to the wavenumber, the MCMC algorithm yielded a more precise estimation of the wavenumber dispersion. These investigations are relevant due to the expected variability of the mechanical properties of periodic frame structures made by additive manufacturing.
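The posterior simulation step can be sketched with a plain random-walk Metropolis sampler on a toy scalar problem: a hypothetical wavenumber model proportional to E^(-1/2), synthetic data, and an assumed flat prior. None of these modeling choices come from the paper; they only illustrate the MCMC mechanics:

```python
import math
import random

random.seed(1)

def log_posterior(e_mod, observed, predict, sigma=0.01):
    """Hypothetical log-posterior: Gaussian likelihood of observed wavenumbers
    around model predictions, flat prior on an assumed plausible range (GPa)."""
    if not (50.0 <= e_mod <= 90.0):
        return -math.inf
    return -sum((o - p) ** 2 for o, p in zip(observed, predict(e_mod))) / (2.0 * sigma ** 2)

def metropolis(logp, x0, step, n):
    """Random-walk Metropolis sampler for a scalar parameter."""
    x, lp = x0, logp(x0)
    chain = []
    for _ in range(n):
        cand = x + random.gauss(0.0, step)
        lp_cand = logp(cand)
        if random.random() < math.exp(min(0.0, lp_cand - lp)):  # accept/reject
            x, lp = cand, lp_cand
        chain.append(x)
    return chain

# Toy forward model: wavenumber proportional to E^(-1/2) at three frequencies
predict = lambda e: [f / math.sqrt(e) for f in (1.0, 2.0, 3.0)]
observed = predict(70.0)                 # synthetic data generated at E = 70 GPa
chain = metropolis(lambda e: log_posterior(e, observed, predict), 60.0, 2.0, 20000)
posterior_mean = sum(chain[5000:]) / len(chain[5000:])   # discard burn-in
```

Sampling dispersion diagrams from such a posterior chain is then a matter of evaluating the forward model at each retained draw.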

Luiz Henrique Marra Silva Ribeiro, Vinícius Fonseca Dal Poggetto, Danilo Beli, Adriano T. Fabro, José Roberto F. Arruda
A Computational Procedure to Capture the Data Uncertainty in a Model Calibration: The Case of the Estimation of the Effectiveness of the Influenza Vaccine

In this paper we propose a technique to estimate the effectiveness of the influenza vaccine. The effectiveness of the vaccine is estimated every year, once the influenza season has finished, by analyzing samples from a large number of patients in hospital emergency departments, which makes the estimation very expensive. Our proposal consists of a difference-equation model that simulates the transmission dynamics of influenza, in which the vaccine effectiveness is included as a parameter to be determined. The proposed technique calibrates the model parameters taking into account the uncertainty of the data provided from reported cases of influenza. The calibration returns an estimation of the vaccine effectiveness with a 95% confidence interval.
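A skeletal discrete-time transmission model of this kind, with vaccine effectiveness as the parameter of interest, might look as follows. The compartments, parameter values and the vaccination mechanism are simplifying assumptions for illustration, not the paper's calibrated model:

```python
def simulate_flu(beta, gamma, coverage, effectiveness, weeks, n=100000.0):
    """Hypothetical discrete-time SIR-type sketch: a fraction `coverage` of the
    population is vaccinated, and the vaccine removes a fraction `effectiveness`
    of them from the susceptible pool. Returns weekly new-case counts."""
    s = n * (1.0 - coverage * effectiveness)   # effective susceptibles
    i, r = 10.0, 0.0                           # initial infectious seed
    new_cases = []
    for _ in range(weeks):
        infections = beta * s * i / n          # mass-action incidence
        s, i, r = s - infections, i + infections - gamma * i, r + gamma * i
        new_cases.append(infections)
    return new_cases

# A higher effectiveness should yield a smaller epidemic (all values assumed)
total_low  = sum(simulate_flu(0.9, 0.5, 0.4, 0.2, 30))
total_high = sum(simulate_flu(0.9, 0.5, 0.4, 0.8, 30))
```

Calibration would then search for the effectiveness (and other parameters) whose weekly case curve best matches the reported data, repeated over resampled data to obtain the confidence interval.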

David Martínez-Rodríguez, Ana Navarro-Quiles, Raul San-Julián-Garcés, Rafael-J. Villanueva

Optimization

Frontmatter
Uncertainty Quantification of Pareto Fronts

Uncertainty quantification of Pareto fronts introduces new challenges connected to probabilities in infinite-dimensional spaces. Indeed, Pareto fronts are, in general, manifolds belonging to infinite-dimensional spaces: for instance, a curve in bi-objective optimization or a surface in three-objective optimization. This article examines methods for the determination of means, standard deviations and confidence intervals of Pareto fronts. We show that a punctual mean is not well suited and that the use of chaos expansions may lead to difficulties. We then propose an approach based on a variational characterization of the mean and show that it is effective for generating statistics of Pareto fronts. Finally, we examine the use of expansions to generate large samples and to evaluate probabilities connected to Pareto fronts.

Mohamed Bassi, Emmanuel Pagnacco, Roberta Lima, Eduardo Souza de Cursi
Robust Optimization for Multiple Response Using Stochastic Model

Because of the many uncertainties in the robust optimization process, especially in multiple-response problems, random factors can cast doubt on the results. The aim of this paper is to propose a robust optimization method for multiple responses that accounts for these random factors in the design. We investigate the multi-response robust optimization of an anti-rolling torsion bar using a stochastic model. First, the quality loss function of the anti-rolling torsion bar is chosen as the optimization objective, and the diameters of the bar as the design variables. Second, the multi-response robust optimization model, considering random factors such as the loads, is established using the stochastic model. Finally, the Monte Carlo sampling method combined with the non-dominated sorting genetic algorithm II (NSGA-II) is adopted to solve this robust optimization problem and obtain the robust optimal solution. The results indicate that the weight of the anti-rolling torsion bar decreases while its stiffness and fatigue strength increase; furthermore, its quality performance improves and its resistance to disturbances becomes stronger.
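The core filtering step behind NSGA-II's non-dominated sorting can be sketched in a few lines; the objective pairs below are illustrative placeholders (e.g. mass vs. quality loss), not results from the paper:

```python
def non_dominated(points):
    """Return the non-dominated subset (Pareto front) of a list of objective
    vectors, assuming all objectives are minimized - the basic filtering step
    behind non-dominated sorting in NSGA-II."""
    def dominates(a, b):
        # a dominates b if it is no worse in all objectives and better in one
        return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))
    return [p for p in points if not any(dominates(q, p) for q in points)]

# Toy bi-objective trade-off (illustrative values)
pts = [(1.0, 5.0), (2.0, 3.0), (3.0, 4.0), (4.0, 1.0), (2.5, 2.9)]
front = non_dominated(pts)
```

In the full algorithm this filter is applied repeatedly to rank the population into successive fronts before selection.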

Shaodi Dong, Xiaosong Yang, Zhao Tang, Jianjun Zhang
Robust Design of Stochastic Dynamic Systems Based on Fatigue Damage

The design of engineering structures goes through several phases before arriving at a final concept or prototype. Comfort, safety and reliability are desirable characteristics, but the quest for lighter structures can aggravate the effects of mechanical vibration, such as fatigue in metals. Material properties related to fatigue are obtained only experimentally, using standardized specimens in controlled environments. The results of these tests have a statistical character and carry uncertainties and measurement errors that may significantly affect the final fatigue failure condition. In this context, the inclusion of uncertainties in a computational model becomes essential. This work presents a methodology for fatigue failure prediction that applies Sines' fatigue criterion, allowing fatigue analysis to be performed numerically during the design phase. Uncertainties are included in the model using the stochastic finite element method, with random fields discretized by the Karhunen-Loève expansion. Since stochastic analysis demands multiple function evaluations, the computational cost involved is high, and a model condensation procedure is therefore applied. After presenting the theory, a robust multiobjective optimization is performed to enhance the fatigue life of a thin plate subjected to cyclic loads, an objective directly in conflict with reducing its mass. This procedure seeks not a single point in the search space but a whole set of solutions, all treated as optimal and less susceptible to parameter fluctuations. Numerical results are presented in terms of FRFs, stress responses in the frequency domain and the Sines fatigue index for each finite element composing the plate.

Ulisses Lima Rosa, Lauren Karoline S. Gonçalves, Antonio M. G. de Lima
Stochastic Gradient Descent for Risk Optimization

This paper presents an approach for using stochastic gradient descent methods to solve risk optimization problems. The first challenge is to avoid the costly evaluation of the failure probability and its gradient at each iteration of the optimization process. We propose to accomplish this by employing a stochastic gradient descent algorithm to minimize the Chernoff bound of the limit state function associated with the probabilistic constraint. The employed algorithm, Adam, is a robust method widely used in machine learning training. A numerical example illustrates the advantages and potential drawbacks of the proposed approach.
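For reference, the Adam update itself is short. The sketch below applies it to a simple deterministic quadratic rather than a Chernoff-bound objective; the test function and hyperparameters are illustrative only:

```python
import math

def adam(grad, x0, lr=0.05, beta1=0.9, beta2=0.999, eps=1e-8, iters=3000):
    """Plain Adam optimizer for a vector parameter; `grad` may return a noisy
    (stochastic) estimate of the true gradient."""
    x = list(x0)
    m = [0.0] * len(x)   # first-moment (mean) estimate
    v = [0.0] * len(x)   # second-moment (uncentered variance) estimate
    for t in range(1, iters + 1):
        g = grad(x)
        for i in range(len(x)):
            m[i] = beta1 * m[i] + (1.0 - beta1) * g[i]
            v[i] = beta2 * v[i] + (1.0 - beta2) * g[i] ** 2
            m_hat = m[i] / (1.0 - beta1 ** t)   # bias correction
            v_hat = v[i] / (1.0 - beta2 ** t)
            x[i] -= lr * m_hat / (math.sqrt(v_hat) + eps)
    return x

# Deterministic sketch: minimize f(x, y) = (x - 1)^2 + 2 * (y + 2)^2
sol = adam(lambda p: [2.0 * (p[0] - 1.0), 4.0 * (p[1] + 2.0)], [0.0, 0.0])
```

In the risk optimization setting, the gradient oracle would instead return a sampled gradient of the Chernoff-bound objective, which is exactly the regime Adam's moment estimates are designed for.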

André Gustavo Carlon, André Jacomel Torii, Rafael Holdorf Lopez, José Eduardo Souza de Cursi
Time-Variant Reliability-Based Optimization with Double-Loop Kriging Surrogates

Time-variant reliability (TvR) analysis captures the time-dependence of the probability of failure and the uncertainty of the deterioration process. The assessment of the TvR of existing structures subjected to degradation is an important task for making decisions on inspection, maintenance and repair actions. The time-variant reliability-based design optimization (TvRBDO) approach searches for the optimal design that minimizes the structural cost while ensuring a target reliability level during the operational life. However, for engineering problems, TvRBDO may become computationally prohibitive when complex simulation models are involved (e.g., the finite element method). This work proposes a surrogate-assisted double-loop approach for TvRBDO, in which the outer loop optimizes the objective function and the inner loop evaluates the time-dependent reliability constraints. The time-dependent reliability of the inner loop is calculated by Monte Carlo simulation at discretized time intervals. To reduce the number of function evaluations, an inner Kriging model is used to predict the response of the limit state functions; the single-loop Kriging surrogate method (SILK) developed by Hu and Mahadevan is adopted to calculate the time-variant reliability. The reliability results of the inner loop are then used to train an outer-loop Kriging model, and the expected feasibility function (EFF) is used to improve its accuracy. Once the outer-loop Kriging model is trained, TvRBDO is conducted based on it, and the program stops when the difference between two successive optima is smaller than an allowable error. Two examples are used to validate the method.
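Stripped of the Kriging surrogates, the inner-loop computation reduces to a crude Monte Carlo estimate of the time-variant failure probability over discretized instants. The limit state, distributions and horizons below are hypothetical, chosen only to show the structure of the calculation:

```python
import random

random.seed(0)

def g(r0, s, t):
    """Hypothetical limit state: resistance degrades linearly in time while
    the load s is constant per realization; failure when g <= 0."""
    return r0 * (1.0 - 0.01 * t) - s

def time_variant_pf(horizon, n_steps, n_mc=20000):
    """Crude Monte Carlo estimate of the time-variant failure probability
    P_f(0, T) = P(min over discretized instants of g(X, t) <= 0)."""
    times = [horizon * k / n_steps for k in range(n_steps + 1)]
    fails = 0
    for _ in range(n_mc):
        r0 = random.gauss(5.0, 0.5)   # initial resistance (assumed)
        s = random.gauss(3.0, 0.4)    # sustained load (assumed)
        if min(g(r0, s, t) for t in times) <= 0.0:
            fails += 1
    return fails / n_mc

pf_10 = time_variant_pf(10.0, 20)   # e.g. 10-year horizon
pf_30 = time_variant_pf(30.0, 60)   # e.g. 30-year horizon
```

The surrogate-assisted scheme replaces the direct calls to `g` with an inner Kriging predictor, and the mapping from design variables to `pf` with an outer one, which is where the computational savings come from.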

Hongbo Zhang, Younes Aoues, Didier Lemosse, Hao Bai, Eduardo Souza De Cursi
A Variational Approach for the Determination of Continuous Pareto Frontier for Multi-objective Problems

In this paper, a novel approach is proposed to generate a set of Pareto points representing the optimal solutions along the Pareto frontier. This approach, which introduces a new definition of dominance, can be interpreted as representing the solution of the multi-objective optimization problem as the solution of a problem in the calculus of variations. The method handles both convex and non-convex problems. To validate the method, multi-objective numerical optimization test problems are considered.

Hafid Zidani, Rachid Ellaia, Eduardo Souza De Cursi
Backmatter
Metadata
Title
Proceedings of the 5th International Symposium on Uncertainty Quantification and Stochastic Modelling
Edited by
José Eduardo Souza De Cursi
Copyright Year
2021
Electronic ISBN
978-3-030-53669-5
Print ISBN
978-3-030-53668-8
DOI
https://doi.org/10.1007/978-3-030-53669-5