
2017 | Book

14th International Probabilistic Workshop


About this Book

This book presents the proceedings of the 14th International Probabilistic Workshop, held in Ghent, Belgium, in December 2016. Probabilistic methods are currently of crucial importance for research and development in the field of engineering, which faces challenges presented by new materials and technologies and by rapidly changing societal needs and values. Contemporary needs related to, for example, performance-based design, service-life design, life-cycle analysis, product optimization, assessment of existing structures and structural robustness give rise to new developments, as well as to accurate and practically applicable probabilistic and statistical engineering methods to support them. These proceedings are a valuable resource for anyone interested in contemporary developments in the field of probabilistic engineering applications.

Table of Contents

Keynotes

Optimizing Adaptable Systems for Future Uncertainty

Demands on structures and infrastructures change over their service life and cannot be predicted with certainty. Adaptable (or flexible) infrastructure designs are thus potentially beneficial, enabling easier adjustment of the systems at a later stage. However, systematic quantitative investigations and corresponding recommendations are missing. In Špačková and Straub (Bayesian models for long-term adaptation decisions. Working paper, ERA Group, TU München, Germany) (2016), we present a framework for such an analysis based on sequential decision processes. In this contribution, we summarize the approach and focus on the interpretation of flexibility. We show that the framework enables quantification of the value of flexibility, answering the question: what is the maximum amount that should be spent additionally to ensure system flexibility? Two case studies illustrate that this value depends strongly on a number of factors, in particular on the types of uncertainty present and on the amount of information collected in the future.

Freak Events, Black Swans, and Unknowable Unknowns: Impact on Risk-Based Design

To design means making informed decisions about suitable alternatives in the face of uncertainties. As a result, structural design criteria and inspection plans reflect the objective of satisfactory performance under well-selected extreme conditions. The extent to which the extreme boundary is “pushed” depends on the design approach (e.g. component vs. system design), the nature and consequences of the hazards, and risk acceptance, all of which fit neatly into the traditional framework of decision theory. This basic framework is also broad enough to include wider socio-economic and environmental objectives, so that provisions with respect to robustness, resilience, sustainability, and risk-mitigation measures in general can be effectively accounted for. Various civil engineering fields suffer from a perception that we don’t dig deep enough, that we fail to consider “beyond extreme” scenarios. Every major accident, exceptional natural disaster, or surprising combination of circumstances triggers a new call for re-examination of the design rationale: if a freak event can be explained, then surely it should be (have been) accounted for. This paper looks at what really lies beyond our “design frontier”. We distinguish between three broad classes of events: far-out extremes for heavy-tailed hazards, scenarios marked by very unlikely combinations of events (perfect storms), and so-called unknowable unknowns. We identify, from a decision-making point of view, which objectives, tools, and risk measures can be used, and which lessons can be learned.

Structural Reliability Methods and Statistical Approaches

Extrapolation, Invariance, Geometry and Subset Sampling

In recent years the subset sampling method has often been used in reliability problems as a tool for calculating very small probabilities. The method extrapolates from an initial Monte Carlo estimate of the probability content of a failure domain defined by a suitably higher level of the original limit state function. Conditional probabilities are then estimated iteratively for values of the limit state function decreasing to zero. However, there are implicit assumptions about the structure of the failure domains that must be fulfilled for the method to work properly. It is shown by examples that, at least in some cases where these assumptions are not fulfilled, erroneous results may be obtained. For the further development of the subset sampling concept, it would be desirable to find approaches that make it possible to ascertain that these implicit assumptions are not violated, or to avoid, at the cost of increased computational effort, misleading influences from the structure of the limit state functions.
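
The following is a minimal, hedged sketch of the subset simulation idea the abstract discusses (intermediate levels, conditional resampling by Markov chains), not the authors' implementation. The limit state function, sample size and level fraction p0 are illustrative assumptions.

```python
import numpy as np

def g(x):
    # hypothetical limit state in standard normal space: failure when g <= 0
    return 5.0 - x.sum(axis=-1)

def subset_simulation(g, dim, n=1000, p0=0.1, seed=0):
    rng = np.random.default_rng(seed)
    x = rng.standard_normal((n, dim))
    y = g(x)
    p_f = 1.0
    for _ in range(20):                        # safety cap on the number of levels
        level = np.sort(y)[int(p0 * n)]        # intermediate threshold
        if level <= 0.0:                       # failure domain reached
            return p_f * np.mean(y <= 0.0)
        p_f *= p0
        seeds = x[y <= level]                  # conditional samples as chain seeds
        chains, per_chain = [], n // len(seeds) + 1
        for s in seeds:                        # random-walk Metropolis per seed
            cur = s.copy()
            for _ in range(per_chain):
                cand = cur + 0.8 * rng.standard_normal(dim)
                accept = rng.random() < min(1.0, np.exp(0.5 * (cur @ cur - cand @ cand)))
                if accept and g(cand) <= level:  # stay in the conditional domain
                    cur = cand
                chains.append(cur.copy())
        x = np.array(chains[:n])
        y = g(x)
    return p_f

# exact value here is P(X1 + X2 >= 5), roughly 2e-4
print(subset_simulation(g, dim=2))
```

The sketch also hints at the abstract's warning: the chains only explore the part of the failure domain reachable from the seeds, so disconnected or oddly shaped failure domains can violate the method's implicit assumptions.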

Performance of Various Sampling Schemes in Asymptotic Sampling

This article deals with the possibility of using Asymptotic Sampling (AS) for the estimation of failure probability. The AS algorithm requires samples of multidimensional Gaussian random vectors. There are many alternative means of obtaining such samples, and the selected sampling strategy influences the performance of the AS method. Several reliability problems (test functions) have been selected in order to test AS with various sampling schemes. First, the functions are analysed using AS in combination with (i) Monte Carlo designs, (ii) LHS designs optimized using the Periodic Audze-Eglājs (PAE) criterion, and (iii) designs prepared using Sobol sequences. Afterwards, the same set of problems is solved without the AS procedure, via direct estimation of the failure probability. All results are also compared with the exact failure probability value.
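
As a companion to the abstract, here is a hedged sketch of the AS extrapolation step itself (the sampling scheme feeding it, MC/LHS/Sobol, is exactly what the paper varies): the standard deviation is inflated to 1/f, the safety index is estimated for several f < 1, and the model beta(f) = A*f + B/f is fitted and evaluated at f = 1. The limit state, sample size and support scales are illustrative.

```python
import numpy as np
from scipy.stats import norm

def g(x):
    # hypothetical linear limit state with exact reliability index beta = 4
    return 4.0 - x.sum(axis=1) / np.sqrt(x.shape[1])

rng = np.random.default_rng(1)
dim, n = 2, 10_000
fs, betas = [], []
for f in (0.4, 0.5, 0.6, 0.7):
    x = rng.standard_normal((n, dim)) / f      # inflated spread: sigma = 1/f
    pf = np.mean(g(x) <= 0.0)
    if 0.0 < pf < 1.0:                         # keep only estimable levels
        fs.append(f)
        betas.append(-norm.ppf(pf))
fs, betas = np.array(fs), np.array(betas)
M = np.column_stack([fs, 1.0 / fs])            # fit beta(f) = A*f + B/f
A, B = np.linalg.lstsq(M, betas, rcond=None)[0]
print("extrapolated beta(1) =", A + B)         # exact value is 4.0
```

Replacing `rng.standard_normal` with an optimized LHS design or a Sobol sequence (mapped through the normal quantile function) is precisely the kind of variation the paper studies.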

Moving Least Squares Metamodels—Hyperparameter, Variable Reduction and Model Selection

The objective of metamodel applications is to obtain a large amount of system information from a small data set. Areas of application within computer-aided engineering include, for example, optimization problems, robust design engineering and sensitivity analysis. This paper deals with the metamodel techniques of Least Squares (LS) regression and Moving Least Squares (MLS), as well as their application in the case of multivariate and nonlinear system behavior. In this context, LS regression is a widely used method, but its application is limited by the fixed polynomial order and the resulting relationship between the available support points and the necessary polynomial coefficients. A more flexible metamodel technique for describing nonlinearities is the MLS approach, in which the support points are weighted to build a local polynomial. The multivariate MLS application is implemented with an anisotropic distance measure and a variable reduction. The selection of the most appropriate metamodel is tested on a deterministic framework of mathematical test functions with respect to polynomial order, variable reduction and metamodel technique.
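
A minimal one-dimensional MLS sketch may clarify the mechanism the abstract describes: at each prediction point a low-order polynomial is refitted, with support points weighted by their distance. The Gaussian weight, kernel width D and test function are illustrative assumptions; the paper's multivariate version additionally uses an anisotropic distance measure.

```python
import numpy as np

def mls_predict(x_new, X, y, D=0.3):
    w = np.exp(-((X - x_new) ** 2) / D ** 2)     # distance weights of support points
    P = np.column_stack([np.ones_like(X), X])    # local linear basis [1, x]
    W = np.diag(w)
    coef = np.linalg.solve(P.T @ W @ P, P.T @ W @ y)
    return coef[0] + coef[1] * x_new             # evaluate local polynomial at x_new

X = np.linspace(0.0, 1.0, 15)                    # support points
y = np.sin(2 * np.pi * X)                        # nonlinear response to approximate
print(mls_predict(0.37, X, y), np.sin(2 * np.pi * 0.37))
```

With a fixed global basis (drop the weights), the same code reduces to ordinary LS regression, which illustrates the trade-off the paper investigates.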

Comparing Three Methodologies for System Identification and Prediction

Most civil infrastructure in service today was built during the second half of the 20th century and is now reaching the end of its design life. Replacing all aging civil infrastructure would be a drain on national and global economies. Design models for civil infrastructure are justifiably conservative. Decision making related to asset-management activities such as repair, improvement and extension of existing infrastructure can therefore be enhanced through structural identification and capacity prediction. Recent advances in sensing and computing technologies enable the use of model-based data-interpretation methods, such as residual minimization, Bayesian model updating and error-domain model falsification (EDMF), for structural identification. In the traditional Bayesian model updating approach for parameter identification, the uncertainty is assumed to be defined by uncorrelated Gaussian distributions. However, in civil infrastructure, the uncertainty associated with the system is rarely Gaussian and often systematic, with high yet unknown correlations. In this paper, a modified Bayesian model updating methodology with an L-norm-based likelihood function is compared with EDMF and the traditional Bayesian methodology. The traditional Bayesian model updating methodology may provide biased predictions when correlations are unknown; the results obtained using the modified Bayesian approach are similar to those obtained using EDMF. The three methodologies are compared with respect to their ease of integration of domain knowledge and their adaptability to changing information. Compared with traditional Bayesian model updating, EDMF and the modified Bayesian methodology provide robust, albeit less precise, predictions of structural response at unmeasured locations in civil-engineering infrastructure. Finally, EDMF has advantages over Bayesian methodologies for practical engineering use.

Global Sensitivity Analysis of Reinforced Concrete Walls Subjected to Standard Fire—A Comparison of Methods

A global sensitivity analysis is a powerful method for determining the governing stochastic input variables of a resistance model. Variance-based methods in particular, such as Sobol indices, allow probabilistic input variables to be ranked by their importance for the considered resistance model. For Monte Carlo simulation, the “matrix method” proposed by Saltelli is well established for estimating the Sobol indices S_i. But for complex, nonlinear limit state functions, the numerous evaluations of the examined model can be cumbersome to compute. Hence the question arises of simpler methods that also allow a ranking of probabilistic input variables. In this paper, a global sensitivity analysis of reinforced concrete walls subjected to a standard fire is performed. The Monte Carlo estimates of the first-order Sobol indices obtained with Saltelli’s “matrix method” are compared with Spearman’s rank-order correlation coefficients and with a conceptual implementation for the calculation of the Sobol indices S_i. The results indicate that the conceptual implementation and the “matrix method” agree closely. Spearman’s rank-order correlation coefficients are suitable for ranking the input variables, especially for linear models.
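
For readers unfamiliar with the two estimators being compared, here is a hedged sketch of Saltelli's matrix method for first-order Sobol indices next to Spearman's rank correlation, on a hypothetical additive test model (the paper's model is a reinforced concrete wall under fire, not this function).

```python
import numpy as np
from scipy.stats import spearmanr

def model(x):
    # hypothetical response: exact S_1 = 2/3, S_2 = 1/3
    return x[:, 0] + 0.5 * x[:, 1] ** 2

rng = np.random.default_rng(2)
n, k = 100_000, 2
A = rng.standard_normal((n, k))                 # the two independent sample matrices
B = rng.standard_normal((n, k))
yA, yB = model(A), model(B)
var = np.var(yA)
for i in range(k):
    ABi = A.copy()
    ABi[:, i] = B[:, i]                         # replace column i of A by that of B
    S_i = np.mean(yB * (model(ABi) - yA)) / var  # Saltelli-type first-order estimator
    rho = spearmanr(A[:, i], yA)[0]             # rank correlation as cheap screening
    print(f"S_{i+1} = {S_i:.3f}, Spearman rho = {rho:.3f}")
```

The rank correlation needs only the single matrix A, which is why it is attractive when model evaluations are expensive, at the price of capturing monotone effects only.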

Probability and Statistics

Comparison of Computed and Observed Probabilities of Failure and Core Damage Frequencies

The Fukushima event has again called the safety of nuclear power plants (NPP) into question. Statisticians and parts of the public claim that there is a significant difference between the Core Damage Frequency (CDF) based on Probabilistic Safety Analysis (PSA) and that based on historical accident data, including the Fukushima event. This paper compares the results of both approaches, including a short discussion of the goal values, the different methodologies, special properties of the different technical systems, and their history. Furthermore, the paper extends the comparison to the safety of structures, which is often expressed as a Probability of Failure (PF). Results of both comparisons are given in the paper.

Probability of a Large Fire in a Road Tunnel: Bayesian Inference

Article 13 of the EU Directive on minimum safety requirements for tunnels in the Trans-European Road Network states that a “risk analysis, where necessary, shall be carried out”. In the Netherlands, the risk of death for road users in a tunnel (internal risk) is calculated with a model for quantitative risk analysis called “QRA-tunnels”, for which the probability of fire is an input parameter. In 2013 the probability of a large fire was calculated with a Bayesian approach, based on one large fire in 1978. In 2014 a second large fire occurred in a Dutch tunnel. In this paper, Bayesian inference is used to calculate the probability of a large fire in a road tunnel based on two large fires. The calculations result in a slight but not significant increase in the probability of a large fire in a road tunnel.
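
The conjugate form of such an update can be sketched compactly, assuming a Poisson occurrence model with a Gamma prior on the yearly rate of large fires. All numbers below (prior parameters, exposure, counts) are illustrative placeholders, not the values used in the paper.

```python
from scipy.stats import gamma

a0, b0 = 0.5, 1.0            # vague Gamma(shape, rate) prior on the fire rate
exposure = 35.0              # hypothetical exposure (tunnel-years observed)
for n_fires in (1, 2):       # one large fire (2013 study), then a second (2014)
    a, b = a0 + n_fires, b0 + exposure   # conjugate Gamma-Poisson update
    lo, hi = gamma.ppf([0.05, 0.95], a, scale=1.0 / b)
    print(f"{n_fires} fire(s): posterior mean = {a / b:.4f}/yr, "
          f"90% interval = ({lo:.4f}, {hi:.4f})")
```

With one extra observed fire, the posterior mean rate increases while the wide, strongly overlapping credible intervals show the change is not statistically significant, mirroring the paper's conclusion.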

Statistical Extrapolation for Extreme Traffic Load Effect Estimation on Bridges

This paper presents a comparison of two methods of statistical extrapolation applied to the estimation of extreme traffic load effects on road bridges: the method of block maxima with a fit of the generalised extreme value distribution, and the level-crossing counting method with a fit of Rice’s formula. For the purpose of this investigation, both real traffic streams from recorded traffic data and synthetic traffic streams, generated by traffic simulation based on probabilistic descriptors of the essential traffic parameters, are analysed. These traffic streams are then applied to a selected bridge structure in a structural analysis to obtain load-effect time histories of a given structural response parameter under this traffic loading. These time histories form the basis for the subsequent extreme load effect estimation using the different methods of statistical extrapolation. The methods, their main assumptions and their limitations are briefly reviewed. Numerical results from their application are presented in detail and compared with each other. Sets of different traffic stream lengths and extrapolation ranges (desired return periods of extreme load effects) are analysed to investigate their effect on uncertainty and convergence behaviour. Based on this comparison, the appropriateness of the methods is discussed.
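
The block-maxima branch of the comparison can be sketched in a few lines: block (here annual) maxima are extracted from a load-effect history and fitted with a GEV distribution, which is then inverted at the desired return period. The synthetic daily history, record length and return period are illustrative assumptions.

```python
import numpy as np
from scipy.stats import genextreme

rng = np.random.default_rng(3)
daily = rng.gumbel(loc=100.0, scale=10.0, size=(50, 365))   # 50 years of daily maxima
annual_max = daily.max(axis=1)                              # block maxima (blocks = years)
shape, loc, scale = genextreme.fit(annual_max)              # GEV fit
T = 1000.0                                                  # target return period in years
x_T = genextreme.ppf(1.0 - 1.0 / T, shape, loc=loc, scale=scale)
print(f"{T:.0f}-year load effect estimate: {x_T:.1f}")
```

The competing level-crossing approach instead counts upcrossings of the full time history and fits Rice's formula to the crossing-rate tail; its sketch is omitted here.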

Uncertainty Quantification

Uncertainty Quantification for Force Identification and Response Estimation in Structural Dynamics

Three sources of uncertainty are present when applying system-inversion techniques for force identification and response estimation in structural dynamics: measurement noise, modeling errors, and unmodeled excitation acting on the structure besides the forces that are identified. The latter may, for example, consist of wind loads or other sources of ambient excitation. This paper presents a novel approach for quantifying the estimation uncertainty introduced by measurement noise and unmodeled excitation. The proposed methodology is applied to a state-of-the-art joint input-state estimation algorithm, but can easily be extended to other force identification and response estimation algorithms. The uncertainty on the estimated quantities can be used to design a sensor network and to determine the optimal noise statistics applied in joint input-state estimation. A validation is performed using data obtained from full-scale experiments on a footbridge.

Uncertainty Quantification of Creep in Concrete by Taylor Series Expansion

When deterministic creep prediction models are compared with actual measurement data, significant differences can often be observed. These inconsistencies are associated with various uncertainties. First, the physical mechanism causing creep of concrete is not yet fully understood, so uncertainty in prediction models can be attributed to an insufficient description of this mechanism. It is very likely that certain influences on creep of concrete are not fully considered in current prediction models, resulting in systematic model errors. Because this error is due to a lack of understanding of the underlying physical mechanisms, it can only be quantified by comparing prediction results with experimental data. Secondly, the stochastic character of the input parameters forms an additional source of uncertainty, which can be quantified by the variance of the model response. The coefficient of variation as a function of load duration is a useful measure of the level of uncertainty due to the stochastic nature of the input parameters. In the literature, statistical analyses by means of numerical simulation are often used for this purpose. However, even with specialized sampling techniques, a large number of samples is necessary to cover the relevant ranges of the various input parameters. The aim of the present study is to provide an approximate uncertainty quantification based on a Taylor series approach. Such a method has the advantage that it requires neither sampling nor knowledge of the probability density functions of the input parameters. This approximate method for quantifying the uncertainty due to the input parameters is evaluated and compared with statistical analyses for several creep prediction models available in the literature and in design codes.
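
The Taylor series idea reduces, at first order, to propagating input variances through the gradient of the model at the mean (often called FOSM). The sketch below uses a hypothetical creep-type expression and made-up input statistics purely to show the mechanics; the paper applies the idea to established creep prediction models.

```python
import numpy as np

def phi(theta, t=10_000.0):
    a, b = theta                          # hypothetical creep model parameters
    return a * (t / (t + b)) ** 0.6       # creep-type time function (illustrative)

mean = np.array([2.5, 500.0])             # assumed input means
sigma = np.array([0.4, 150.0])            # assumed input standard deviations
grad = np.empty(2)
for i in range(2):                        # central finite-difference gradient
    h = 1e-4 * max(abs(mean[i]), 1.0)
    up, dn = mean.copy(), mean.copy()
    up[i] += h
    dn[i] -= h
    grad[i] = (phi(up) - phi(dn)) / (2.0 * h)
var = np.sum((grad * sigma) ** 2)         # first-order variance, independent inputs
print("mean response:", phi(mean), " c.o.v.:", np.sqrt(var) / phi(mean))
```

Evaluating this at a sequence of durations t gives the coefficient of variation as a function of time mentioned in the abstract, with a handful of model evaluations instead of thousands of samples.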

Uncertainty Quantification of Extrapolation Techniques for Bonded Anchors

The multi-decade deformation prediction of bonded anchors under sustained load is paramount to the safe and economical design of civil structures, especially for adhesive anchors. Yet there is a considerable lack of guidelines on how to actually predict the life-time performance of such systems, given the various installation and in-service conditions, the aging visco-elastic nature of both involved materials—concrete and polymer-based mortar—and observation durations of typically only weeks. In this contribution the authors critically review the current practice of product qualification by means of 1000-h creep tests on entire anchor systems, extrapolated to 50 years and compared to a deterministic limit. Based on available data sets for two typical products on the market, the current practice as specified in most normative documents and recommendations is investigated from both a scientific and a practical point of view. As a result, uncertainties owing to different sources are elaborated, the suitability of the creep prediction model in combination with various extrapolation techniques is evaluated, and safety factors based on these are discussed.

Uncertainty Quantification Applied to a Fire-Exposed Glued-Laminated Timber Beam

Because timber is a natural material, the response of timber structures under normal conditions and in fire is subject to wide variability. Deterministic models therefore struggle to reflect the real response of timber, since small variations in input influence the output significantly. However, it is relatively straightforward to quantify uncertainties in model inputs in order to determine the uncertainties in the model response by employing uncertainty quantification (UQ) techniques. UQ of structural response to fire traditionally employs Monte Carlo techniques (Eamon and Jensen 2013), which are computationally expensive for a large number of variables. Deterministic Sampling (DS) (Hessling 2013) is a relatively new, efficient alternative method for UQ. DS assumes that a continuous probability density function can be replaced by an ensemble of discrete deterministic samples if the two representations have the same statistical moments. DS has been demonstrated in applications such as CFD simulations (Anderson et al. 2016). This paper applies DS techniques to study glued-laminated (glulam) timber in fire. Results are compared with random sampling techniques to show the validity of the method in this application.

Uncertainty Modelling

Generation of Spatially Embedded Random Networks to Model Complex Transportation Networks

Random networks are increasingly used to analyse complex transportation networks, such as airline routes, roads and rail networks. So far, this research has focused on describing the properties of the networks with the help of random networks, often without considering their spatial properties. In this article, a methodology is proposed to create random networks that conserve these spatial properties. The produced random networks are not intended to be an accurate model of the real-world network being investigated, but to provide insight into the functioning of the network while taking its spatial properties into consideration, which is potentially useful in many types of analysis, e.g. estimating network-related risk. The proposed methodology combines a non-homogeneous spatial point process for vertex creation, which accounts for the spatial distribution of vertices and the clustering effects of the network, with a hybrid connection model for edge creation. To illustrate the methodology's ability to provide insight into a real-world network, it is used to estimate standard structural statistics for part of the Swiss road network, which are then compared with the known values.

Effect of Climate Change on Snow Load on Ground: Bayesian Approach for Snow Map Refinement

Alteration of ground snow loads due to climate change may significantly impact the reliability of existing structures, as well as design codes for new ones. In this paper, a novel technique for snow load map refinement is proposed, in which ground snow loads derived from gridded climate data provided by climate models are combined with observed point measurements of snow load and suitably updated. First, an a priori random field of characteristic ground snow loads at sea level is deduced from the analysis of the gridded climate data. This prior random field is discretized by a truncated Karhunen-Loève expansion to separate the spatial and stochastic domains and to reduce the dimension of the problem. The distributions of the resulting standard normal random variables are then updated by incorporating point measurements of ground snow loads collected in the past, using the Markov Chain Monte Carlo method to sample the posterior. The Bayesian approach results in a more trustworthy, refined snow load map and, furthermore, opens the prospect of a dynamic, sequential model-updating procedure as new observations become available.
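
A compact sketch of the discrete truncated Karhunen-Loève step (eigendecomposition of the covariance matrix on the grid) may help; the grid, exponential covariance, correlation length and truncation order are illustrative assumptions, and the paper's Bayesian step would then update the standard normal coefficients xi via MCMC.

```python
import numpy as np

x = np.linspace(0.0, 100.0, 200)                  # 1-D spatial grid (illustrative)
ell, sig = 25.0, 1.0                              # correlation length, field std
C = sig ** 2 * np.exp(-np.abs(x[:, None] - x[None, :]) / ell)
lam, phi = np.linalg.eigh(C)                      # discrete KL: eigenpairs of C
order = np.argsort(lam)[::-1][:10]                # keep the 10 dominant modes
lam, phi = lam[order], phi[:, order]
rng = np.random.default_rng(4)
xi = rng.standard_normal(10)                      # standard normal KL variables
field = phi @ (np.sqrt(lam) * xi)                 # one realization of the prior field
print("explained variance:", lam.sum() / np.trace(C))
```

The truncation is what makes posterior sampling tractable: the MCMC operates on ten coefficients instead of a 200-dimensional field.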

Imposed Correlation Between Random Field and Discrete Particle Placement

In recent times it has been recognised that capturing the spatial variability of the mechanical properties of heterogeneous materials is essential for a realistic estimation of uncertainties in probabilistic structural analyses. Several authors have successfully proposed random field concepts in combination with standard finite elements. Although some discrete element models, such as the lattice discrete particle model (LDPM), already mimic microstructural effects associated with the heterogeneity of concrete very well compared to the continuum framework, there are still known reasons for introducing higher-order spatial variability. One key issue concerns the ability to capture the well-known statistical size effect associated with the physical meaning of an auto-correlation length in material property fields. This paper addresses the issue of imposing correlation between a particular structural discretisation (i.e. the radius and placement of particles in a discrete framework) and the realization of a random field for spatially variable material properties. The authors present a novel method for imposing correlation between the discrete particle placement and a governing random field, and illustrate its important implications for enhanced realism in the reliability-based assessment of concrete fastening systems.

A Bayesian Network for the Definition of Probability Models for Masonry Mechanical Parameters

A methodology is presented for defining probabilistic models for material strength through a visual analysis of masonry structures. A Bayesian Network whose nodes represent the masonry class and masonry features is developed based on the so-called Masonry Quality Index method. The network is improved by taking into account further quantitative information derived from previously tested masonry structures similar to the one under assessment, supplemented by engineering judgment on the masonry features. Masonry mechanical properties can thus be inferred using the network, given the results of a qualitative investigation. The probability model so established can be used for preliminary reliability analysis or as a prior distribution for further updating.

A Bayesian Network for the Definition of Probability Models for Compressive Strength of Concrete Homogeneous Population

A methodology is presented for defining probabilistic models for concrete compressive strength from the outcomes of non-destructive investigations. Results of standard compressive tests on concrete are collected and, in order to identify homogeneous concrete populations corresponding to individual concrete classes, an innovative approach is suggested, based on fitting the raw histogram of all available data with a mixture model. The results lead to the definition of a Bayesian Network whose nodes represent the concrete class and the concrete compressive strength. The network is improved with a further variable representing the strength estimated through non-destructive tests. The concrete compressive strength can thus be inferred using the network, given the estimated resistance.

Probabilistic Tsunami Hazard Assessment Through Large Scale Simulations

Recent occurrences of catastrophic events combining ground motions and tsunami waves have raised concerns about the need for comprehensive tsunami mitigation planning. In this field, the quantification of tsunami waves, which can be obtained from tsunami hazard assessment, is an important input. The results of tsunami hazard assessments show the intensity of the impact on a coastal area versus its return period. This study performs a probabilistic tsunami hazard assessment (PTHA) for the Lhoknga coastline, Aceh, Indonesia. The PTHA employs information about tsunami-generating earthquake faults and their return periods. The generation, propagation, and inundation of tsunami waves at the coastline are simulated using available tsunami numerical simulation tools. In addition, the smoothed particle hydrodynamics (SPH) method is used to model the spread of the tsunami flood over dry land. This opens new opportunities for assessing tsunami flooding hazard within a probabilistic framework.

Applied Structural Reliability Analysis

Probabilistic Slope Stability Analysis Using Approximative FORM

In the Netherlands, all primary flood defences are periodically tested against statutory safety standards. The new safety assessment framework WTI 2017 (defined in terms of an allowable probability of flooding) allows for probabilistic as well as semi-probabilistic assessments, the latter based on a partial factor approach. To ensure consistency between probabilistic and semi-probabilistic assessments, the semi-probabilistic rules have to be (re)calibrated based on probabilistic analyses. This requires a robust and reliable algorithm for the probabilistic analysis of the inner slope stability of levees and river embankments. In this contribution we present a probabilistic calculation method based on the First Order Reliability Method (FORM) and fragility curves. The method involves reliability evaluations at various water levels (fragility curves) and, subsequently, a FORM combination over these water levels to obtain an overall reliability index and FORM sensitivity coefficients. The method balances computational time with accuracy and has proved very efficient and accurate. It is based on an approach that has been used in the past in the calibration of semi-probabilistic safety factors and in the probabilistic risk analysis of flood defences in the Netherlands; however, its accuracy had never been investigated. The approach is therefore evaluated within a probabilistic benchmark using different test functions from the literature and compared with standard reliability methods. Additionally, it is applied in a case study on the probabilistic analysis of a river embankment.
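
The FORM building block of the procedure can be sketched with the classical HLRF fixed-point iteration in standard normal space; the limit state below is a hypothetical stand-in for a slope stability model at one water level, not the WTI 2017 model.

```python
import numpy as np

def g(u):
    return 3.0 + 0.5 * u[0] ** 2 - u[1]       # hypothetical limit state, beta = 3

def grad_g(u, h=1e-6):
    return np.array([(g(u + h * e) - g(u - h * e)) / (2 * h)
                     for e in np.eye(len(u))])

u = np.zeros(2)
for _ in range(50):                            # HLRF fixed-point iteration
    grd = grad_g(u)
    u_new = (grd @ u - g(u)) * grd / (grd @ grd)
    if np.linalg.norm(u_new - u) < 1e-8:
        break
    u = u_new
beta = np.linalg.norm(u)                       # reliability index
alpha = -u / beta                              # FORM sensitivity coefficients
print("beta =", beta, "alpha =", alpha)
```

In the fragility-curve method, such a FORM run is repeated for a set of fixed water levels, and a final FORM combination over the water-level distribution yields the overall reliability index.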

Bayesian Updating of Slope Reliability in Undrained Clay with Vane Shear Test Data

In-situ test data, monitoring data and other site-specific information are a common basis for assessing geotechnical performance. This information enables one to learn probabilistic models of uncertain geotechnical properties and to update the reliability estimate of geotechnical structures. The learning process is facilitated by Bayesian analysis, which makes optimal use of site-specific information. The objective of this study is to investigate the application of Bayesian analysis to updating the probabilistic description of spatially varying soil properties and the reliability of slope stability with in-situ test data. For proper characterization of the prior information on the undrained shear strength s_u, a non-stationary random field model is proposed to account for the depth-dependent nature of s_u. Bayesian updating for learning the distribution of s_u and updating the slope reliability is performed with the adaptive BUS approach with subset simulation. The approach is applied to a saturated clay slope in spatially variable soil. The spatial distribution of s_u is updated with vane shear test data. In addition, the effect of the borehole location on the updated slope reliability is investigated, to inform optimal future test programs.

Structural Reliability in Design and Analysis of Tensile Structures

In order to achieve a semi-probabilistic verification format such as that used for the analysis of conventional structures in the Eurocodes, research into structural reliability calculations for tensile surface structures is needed, and appropriate partial factors have to be proposed and evaluated. To gain insight, an academic example with three cable segments is analysed. In order to take into account the uncertainties associated with the pre-tensioned system, Latin Hypercube Sampling is applied to sample six main influencing variables. The cable system is designed according to the ultimate limit state under loading, considering a partial factor of 1.35 for the pretension. The structural reliability of both designs is evaluated: increasing the partial factor for pretension from 1 to 1.35 raises the reliability index from 2.27 to 5.42.
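
Latin Hypercube Sampling of a handful of variables is easy to reproduce; the sketch below draws a stratified design for six variables with scipy's QMC module and maps it through illustrative marginal distributions (the actual six variables and their distributions in the paper differ).

```python
import numpy as np
from scipy.stats import qmc, norm, lognorm

sampler = qmc.LatinHypercube(d=6, seed=5)
u = sampler.random(n=1000)                  # stratified uniforms in [0, 1)^6
X = np.empty_like(u)
X[:, :3] = norm.ppf(u[:, :3], loc=1.0, scale=0.1)       # e.g. load/model factors
X[:, 3:] = lognorm.ppf(u[:, 3:], s=0.2, scale=200.0)    # e.g. pretension forces (kN)
print(X.mean(axis=0))                       # sample means close to targets by design
```

Each of the 1000 rows would then be run through the structural model of the cable system, and the reliability index estimated from the resulting response sample.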

Probabilistic Assessment of Wind-Loaded Façade Elements

Façade elements are the most vulnerable parts of buildings with respect to wind loading. This paper assesses the extent to which wind-loaded façade elements fulfil the minimum reliability requirements set in the Eurocodes. The reliability analysis needed for this assessment should account for the uncertainties in each of the parameters relevant for design, both on the structural resistance side and on the wind-loading side. Additionally, the analysis needs to account for the uncertainties in the physical models themselves. Until now, no generally applicable assessment procedure has existed for this purpose. By combining the knowledge in the existing literature, this paper provides a generally applicable assessment procedure that determines the structural reliability of wind-loaded façade elements in terms of the failure probability. The procedure uses in-situ wind speed measurements obtained from meteorological stations and location-specific pressure coefficient measurements obtained from boundary-layer wind tunnels. Wind-directionality effects are taken into account explicitly. To show its potential, the assessment procedure is applied to a case study. The results show that, due to wind-directionality effects, the failure probability of the façade elements is highly dependent on the orientation of the building, and that the failure probability is almost fully determined by the governing incident wind directions.

Shear Resistance of Prestressed Girders: Probabilistic Design

The paper describes a comprehensive approach to the probabilistic design of precast structural members made from advanced cementitious composites. First, a series of material and small-scale component tests was conducted in collaboration between two laboratories. Based on these tests, the fracture-mechanical parameters (and their statistics) of two concrete mixtures used in the production of precast structural members were identified. Subsequently, studies were performed on (a) full-scale prestressed concrete roof elements and (b) ten scaled, laboratory-tested elements. These experiments served as the basis for deterministic nonlinear modelling of the precast members and a subsequent probabilistic evaluation of the variability of the structural response. The final results may be used as loading thresholds for the produced structural elements, and they aim to show that probabilistic design is less conservative than classic partial-safety-factor design and the alternative ECOV method.

Reliability Assessment of Buried Pipelines for Through-Wall Bending Stress

This paper presents a computational framework for estimating the reliability of a buried pipeline under the influence of bending stress due to ovality. In-service corrosion reduces the ability of the buried pipe wall to resist externally applied pressure. In this study, the bending stress due to ovality of the pipe wall is examined considering the adverse effect of corrosion. If the imposed stress exceeds the ultimate strength of the pipe material, failure becomes likely and the general performance of the pipeline network is affected; in most cases, the impact can lead to different failure conditions. Accounting for the amount of corrosion damage to a buried pipe is often difficult because of the unavailability of information about the parameters and the associated uncertainties. There is therefore a need for a robust and computationally efficient approach to assessing the reliability of a buried pipe. For this reason, the reliability with respect to the stress mentioned above is modelled using a combination of Line Sampling (LS) and Importance Sampling (IS) for a time-dependent assessment. The combined method helps to reduce the computational cost and improve the efficiency of the simulation approach. The study also shows that changes in the underground water table can have a significant effect on the likelihood of failure of buried pipes.
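
Line sampling, one half of the combination used in the paper, can be sketched for a time-independent toy problem: random lines parallel to an assumed important direction are drawn, and each line contributes the exact one-dimensional Gaussian probability beyond its intersection with the limit state. The limit state and important direction below are illustrative.

```python
import numpy as np
from scipy.stats import norm
from scipy.optimize import brentq

def g(u):
    return 3.5 - u[0] + 0.1 * u[1] ** 2        # hypothetical limit state (failure: g <= 0)

alpha = np.array([1.0, 0.0])                    # assumed important direction (unit vector)
rng = np.random.default_rng(6)
pf_lines = []
for _ in range(200):
    u_perp = rng.standard_normal(2)
    u_perp -= (u_perp @ alpha) * alpha          # project out the alpha component
    # distance along alpha to the limit state on this line
    c = brentq(lambda t: g(u_perp + t * alpha), -1.0, 10.0)
    pf_lines.append(norm.cdf(-c))               # exact 1-D tail probability per line
print("pf estimate:", np.mean(pf_lines))
```

The paper's time-dependent assessment and the coupling with importance sampling add machinery on top of this basic estimator.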

Sensitivity Studies Within a Reliability Analysis of Cross Sections with Carbon Concrete

The utilization of fibre-reinforced polymer (FRP) reinforcement and textile-reinforced concrete (TRC) made of carbon fibres is a promising approach to replacing the steel reinforcement commonly used in concrete structures, opening up new structural design opportunities for civil engineers and architects. Due to the differences between the tensile stress-strain relations of these reinforcement materials and the well-known stress-strain relations of steel, new design guidelines are inevitable. In the semi-probabilistic safety concept, the unavoidable scatter of properties that influence the resistance of a structure is accounted for by partial safety factors. Owing to the lack of experience with these materials, the factors are calibrated following a well-proven method: reliability theory. As preliminary work, the present paper deals with quantifying the sensitivities of a resistance-stress model (R-S model) for a textile-reinforced concrete cross section under bending load. The probability of failure is the result quantity, and the influence of variations in the numerous scattering input parameters on this value is determined. The investigation should lead to a better understanding of the most important factors within the presented engineering model, and of the consequences for defining safety margins, enabling a reliable and simultaneously economic calibration of a partial safety factor for the material strength of carbon.

Risk Analysis and Optimization

Risk Analysis of Bridge Falsework Structures

Bridge falsework systems are traditionally used to support the formwork during the construction of concrete bridges. These structures have a significant impact on the cost, construction rate and safety of the supported structures. In recent years a high number of accidents involving bridge falsework systems have been reported, particularly in the developing world. In order to increase the safety and efficiency of these systems, a risk-informed structural design methodology was developed and applied to a selected proprietary system, using the results of experimental tests on different types of joints, the results of advanced numerical analyses, and newly developed structural robustness and structural fragility indices. Illustrative examples detail the steps and calculations needed, including consideration of model and statistical uncertainties, and the results obtained are discussed. Furthermore, based on the findings from the analysis of different risk-reduction strategies, two scenarios are studied in detail: a reference (baseline) scenario and a selected improved (alternative) scenario. For the cases analysed, it is concluded that if the cost of the permanent structure considerably exceeds the cost of the temporary structure, which is often the case, the extent of the improvements in structural and economic risks may justify the extra costs incurred by implementing simple Quality Management procedures.

Reliability-Based Methodology for the Optimal Design of Viscous Dampers

Energy dissipation devices are widely utilized to improve the response of structures subjected to dynamic loading (e.g. earthquakes, wind). In particular, viscous dampers are hydraulic devices, widely employed in structural engineering, that dissipate kinetic energy by producing a damping force against the motion. Despite the uncertainty present in the loads (seismic input) and in the structural models, simplified approaches to the design of damper properties often neglect the response dispersion due to these uncertainties, or treat them in a simplified way by focusing on the mean response only. In this study, a novel reliability-based methodology for the optimal design of nonlinear viscous dampers is proposed. The methodology involves a reliability analysis nested in an outer optimization loop, which minimizes an objective function related to the damper cost subject to a reliability constraint on the structural performance. Subset simulation is used in the inner loop, while the optimization problem is solved via the COBYLA algorithm. The application of subset simulation and of the proposed design approach is illustrated through a realistic case study of a three-storey building equipped with nonlinear viscous dampers, for different levels of damper nonlinearity.
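
The structure of the outer loop can be illustrated with scipy's COBYLA; here a hypothetical closed-form failure probability stands in for the subset simulation run that the paper nests inside the loop, and the cost function and target level 1e-3 are placeholders.

```python
import numpy as np
from scipy.stats import norm
from scipy.optimize import minimize

def cost(c):
    return c[0]                                  # damper cost proxy, grows with damping

def pf(c):
    # hypothetical closed-form stand-in for the subset-simulation estimate
    return norm.cdf(-(1.0 + 2.0 * np.log1p(c[0])))

res = minimize(cost, x0=[1.0], method="COBYLA",
               constraints=[{"type": "ineq",      # feasible when pf <= 1e-3
                             "fun": lambda c: 1e-3 - pf(c)}])
print("optimal damping parameter:", res.x, " pf at optimum:", pf(res.x))
```

COBYLA is derivative-free, which matters here because a noisy, simulation-based constraint (subset simulation in the paper) does not provide reliable gradients.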

Optimization of a Landing Gear System Including Uncertainties

The optimization of structural behaviour in the presence of uncertainty is very demanding, especially for complex structures. Robust design optimization (RDO) and reliability-based design optimization (RBDO) are two techniques commonly adopted to optimize the performance of systems under uncertainty. For multi-objective problems with complex performance criteria, traditional RDO and RBDO are not always suitable because of two main problems: prohibitive computational cost and the neglect of higher-order moments. In this paper, a novel optimization strategy based upon evolutionary algorithms and inverse reasoning is presented, conceived to deal with problems that are difficult to solve with RBDO or RDO. To this end, reduced-order models are built using surrogate models together with a singular value/higher-order singular value decomposition. The proposed algorithm is used to minimize the probability of failure, assuring a reliable design, providing an understanding of the acceptable range of uncertainties, and maintaining robustness. A representative nonlinear landing-gear design problem demonstrates the approach, showing how an optimized structure can be found that avoids the “shimmy” phenomenon.

Probabilistic Assessment of New and Existing Structures

Probabilistic Analysis of Combination Rules in Eurocodes

Alternative procedures for combinations of actions, together with the reliability elements recommended in the National Annexes to the Eurocodes in some CEN countries, are investigated using probability-based reliability methods. The presented reliability analyses of structural members indicate that the reliability level of some members might in some cases be lower than the recommended level and should be further calibrated.

Floor Live Loads of Building Structures

Floor live loads are fundamental to the structural design of buildings, and a correct understanding of the intensity of loading is necessary for the economic and safe design of structures. In practice, the design load values defined in design codes are not always the same. According to EN 1991-1-1, for example, the characteristic value of the live load for office buildings is 3 kN/m²; in DIN EN 1991-1-1/NA (2010), however, the live load for offices is defined with a characteristic value of 2 kN/m². Furthermore, the occupancy type of a building may change after a period of time, in which case the structure should be checked for load-carrying capacity under the new load. More accurate and reliable information on the load is therefore very useful for the reliability evaluation of the studied structure. To check the accuracy of live load values for design, floor live loads were modelled and numerically simulated for different usage situations, using the statistical results of existing load surveys as input parameters. Characteristic values of the live load for different load effects, and possible live load reductions, were determined.

Methodology for Evaluating the Safety Level of Current Accepted Design Solutions for Limiting Fire Spread Between Buildings

External fire spread between buildings is internationally considered a major concern for buildings in dense urban environments. While design guidelines differ between countries, the fundamental methods currently used for limiting the risk of fire spread between buildings are generally limited to specifying the minimum required separation distance for a given unprotected façade area or, conversely, limiting the maximum allowable unprotected façade area for a given separation distance. The safety level associated with the current design guidelines is, however, unknown, making the implementation of innovative, safer and more cost-effective design solutions difficult. In order to assess the safety target implicitly incorporated in currently accepted design solutions, a methodology is developed for evaluating the annual probability of reaching unacceptable radiation intensities at the opposite façade. As a case study, the methodology is applied to a design that complies with the current UK requirements specified in BR 187. This case study exposes inconsistencies in the current design guidelines, indicating the need to develop explicit safety targets.

Robustness Assessment—A New Perspective to Achieve a Performance Indicator

Robustness has been recognized as an interesting research topic due to several collapses that have occurred in recent years. Indeed, the subject is related to global failure or collapse. However, there is no consensus on its definition, since several definitions have been proposed in the literature. This short paper presents a framework for assessing bridge robustness as a probabilistic performance indicator. In this study, a non-linear model of a clamped beam with two point loads was developed using the DIANA software to validate the presented framework. By means of a probabilistic approach, the load-carrying capacity and structural safety were evaluated. In this regard, special focus is placed on an adaptive Monte Carlo simulation procedure to obtain a proper meta-model.

Probabilistic Concepts of Upcoming European Document on Assessment of Existing Structures

The new European project team CEN TC250/WG2.T1 Assessment and Retrofitting of Existing Structures became active in November 2015. The team is responsible for converting the relevant parts of Part III of the JRC Scientific and Policy Report into a CEN Technical Specification. The upcoming CEN document is to be related to the probabilistic concepts and fundamental requirements of the EN Eurocodes. The Technical Specification should cover all types of buildings, bridges and construction works, including geotechnical structures, exposed to all kinds of actions. The project team has already developed the first draft of the Technical Specification and submitted it to the technical committee CEN TC250 in April 2016. It contains requirements, a general framework for assessment, data updating, structural analysis (linear, nonlinear, dynamic), verifications (partial factors, probabilistic methods, risk assessment), past performance, interventions, and annexes (flowchart, time-dependent reliability, assessment of heritage structures). The detailed contents and additional sections of the Technical Specification are to be completed within a year.

Present and Future Probabilistic Challenges for Maintenance of Reinforced Concrete Structures

In the coming decades, most developed countries face the task of maintaining their infrastructure. Probabilistic decision-making tools can support this task under cost-effectiveness and safety requirements. Even though the fib MC2010 (2010) for concrete structures provides a basis for decision making, an overall probabilistic concept for dealing with existing structures is missing. In the framework of the development of fib MC2010 (2010), more emphasis is put on the assessment of existing structures, and hence this paper identifies challenges with respect to a probabilistic maintenance concept for existing concrete structures.

Metadata
Title
14th International Probabilistic Workshop
Copyright Year
2017
Electronic ISBN
978-3-319-47886-9
Print ISBN
978-3-319-47885-2
DOI
https://doi.org/10.1007/978-3-319-47886-9