
## About this Book

This volume presents the proceedings of the 18th International Probabilistic Workshop (IPW), held in Guimarães, Portugal, in May 2021. Probabilistic methods are of crucial importance for research and development in engineering, a field facing challenges posed by new materials and technologies and by rapidly changing societal needs and values. Contemporary needs related to, for example, performance-based design, service-life design, life-cycle analysis, product optimization, assessment of existing structures and structural robustness call for new developments as well as accurate and practically applicable probabilistic and statistical engineering methods to support them. These proceedings are a valuable resource for anyone interested in contemporary developments in the field of probabilistic engineering applications.

## Table of Contents

### Decision Analysis Applied to Natural Hazards

Formal methods for decision-making under uncertainty, originally created for business management, lend themselves to applications in many other areas in which uncertainties play a major role. The authors and their co-workers have applied decision analysis to landslides since the 1980s, as, in principle, have many other approaches to landslide assessment and management. The keynote lecture itself illustrates the application of decision analysis with many examples. For this reason, this paper concentrates on the principles of decision-making under uncertainty and on the use of these principles in hazard and risk analysis of natural threats. We would also like to note that what we present here is a summary of our past work. The paper starts with an introduction to the decision-making process and its application to natural threats. Risk management of natural threats is then demonstrated in detail with decision trees and Bayesian networks. This leads to sensitivity analyses to determine which risk management action is most effective.

Herbert H. Einstein, Rita L. Sousa
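
The expected-cost comparison at the heart of such a decision tree can be sketched in a few lines. All probabilities, costs and action names below are illustrative assumptions, not values from the paper.

```python
# Hypothetical sketch: choosing a landslide risk-mitigation action by
# minimum expected cost, the basic computation behind a decision tree.

P_SLIDE = 0.05            # assumed annual probability of a landslide
CONSEQUENCE = 2_000_000   # assumed loss if the slide causes damage

# action -> (upfront cost, probability that the slide still causes the loss)
ACTIONS = {
    "do nothing":     (0,       1.0),
    "drainage":       (50_000,  0.4),
    "retaining wall": (400_000, 0.05),
}

def expected_cost(upfront, residual):
    """Upfront cost plus expected residual loss."""
    return upfront + P_SLIDE * residual * CONSEQUENCE

costs = {name: expected_cost(c, r) for name, (c, r) in ACTIONS.items()}
best = min(costs, key=costs.get)   # action with minimum expected cost
```

A sensitivity analysis, as mentioned in the abstract, would repeat this comparison while varying `P_SLIDE` or the consequence values.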

### Probabilistic Seismic Risk Assessment of School Buildings

The inadequate behavior of existing school buildings observed during past earthquakes in Italy has underlined the need to accurately understand their seismic performance. To do so, different metrics can be adopted to characterize their seismic response, focused either on structural aspects or on economic variables. This paper assesses the seismic risk level of three case-study school buildings, representing the main typologies found within the Italian school building stock, and comments on the possible need for retrofitting. A performance-based earthquake engineering (PBEE) assessment is carried out using detailed numerical models, analyzed under ground motion records of increasing intensity, to quantify risk-based decision variables such as expected annual loss and mean annual frequency of collapse. As an alternative to the detailed PBEE framework, a simplified seismic risk classification framework, recently applied in Italy, was also implemented. Different uncertainty parameters are included in the risk estimation frameworks, with a view also to future large-scale implementation of cost-benefit analyses. Lastly, one of the school buildings is further analyzed to understand the impact of structural modelling uncertainty on the risk estimates and the consequent need for its proper consideration. The results show how the simplified risk classification framework is, as expected, conservative with respect to the detailed component-based approach, and confirm the need for retrofitting of some of the building structural systems.

Ricardo Monteiro
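
One of the risk-based decision variables named in the abstract, expected annual loss (EAL), is essentially an integral of a loss curve over the increments of the hazard curve. A minimal sketch, with purely illustrative hazard and loss numbers:

```python
# intensity measure levels (e.g. spectral acceleration, in g) - assumed values
ims  = [0.1, 0.2, 0.4, 0.6, 0.8]
# mean annual frequency of exceeding each IM level (hazard curve)
mafe = [0.05, 0.02, 0.005, 0.001, 0.0002]
# expected loss ratio (repair cost / replacement cost) given each IM level
loss = [0.01, 0.05, 0.20, 0.50, 0.80]

def expected_annual_loss(ims, mafe, loss):
    """EAL as a sum of band loss times hazard-frequency increment."""
    eal = 0.0
    for i in range(len(ims) - 1):
        d_lambda = mafe[i] - mafe[i + 1]            # annual frequency of this IM band
        mean_loss = 0.5 * (loss[i] + loss[i + 1])   # average loss ratio in the band
        eal += mean_loss * d_lambda
    return eal

eal = expected_annual_loss(ims, mafe, loss)
```

The real assessment works with full loss distributions per intensity level; this rectangle-rule version only shows the structure of the calculation.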

### Towards Climate Change Adaptation of Existing and New Deteriorating Infrastructure

Infrastructure assets are essential components of the economic development of modern societies. They are designed to ensure target levels of serviceability and safety on the basis of past experience and current knowledge of design, construction and maintenance practices. However, changes in climate could modify the lifetime performance of infrastructure by increasing or decreasing failure risks. Therefore, a rational and scientific approach is necessary to deal with the adaptation of existing and new deteriorating infrastructure in a comprehensive way. This keynote paper provides an overview of recent work in this area, including: (1) assessment of climate change effects, (2) adaptation to new environmental conditions under future climate change scenarios and (3) decision-making under a changing climate. Several examples for different kinds of deteriorating infrastructure assets are also presented and discussed.

Emilio Bastidas-Arteaga
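
As a toy example of how changed environmental conditions can shift lifetime performance, the sketch below uses the classical Fick's-law chloride ingress model, C(x,t) = Cs(1 - erf(x / (2 sqrt(D t)))), and assumes a higher diffusion coefficient under a warming scenario. All parameter values are illustrative assumptions, not the paper's.

```python
import math

def chloride_content(x_m, t_s, D, Cs):
    """Chloride concentration at depth x after time t (Fick's second law)."""
    return Cs * (1.0 - math.erf(x_m / (2.0 * math.sqrt(D * t_s))))

YEAR = 365.25 * 24 * 3600.0
cover = 0.05      # rebar cover depth [m] (assumed)
Cs = 0.6          # surface chloride content [% binder mass] (assumed)
Ccrit = 0.05      # critical content for corrosion initiation (assumed)

def initiation_time_years(D):
    """First year in which chloride at the rebar exceeds the threshold."""
    t = 1
    while chloride_content(cover, t * YEAR, D, Cs) < Ccrit and t < 500:
        t += 1
    return t

D_now = 5e-13     # diffusion coefficient, current climate [m^2/s] (assumed)
D_warm = 8e-13    # assumed higher value under a warming scenario

t_now = initiation_time_years(D_now)
t_warm = initiation_time_years(D_warm)
```

The earlier initiation time under the warming scenario illustrates the kind of lifetime shift that motivates the adaptation measures discussed in the paper.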

### A DC Optimal Power Flow Approach to Quantify Operational Resilience in Power Grids

The primary objective of resilience engineering is to analyse and mitigate the risk of a system once a vulnerability has been triggered by an attack. Resilience is a multidimensional concept in engineering and incorporates restoration in terms of both performance and time. Nodal restoration is a key factor in the analysis of system resilience, and the properties of the nodes can be analysed to assess the states of the system. The power grid model proposed here is used to simulate the probability of contingencies on the system and applies sequential Monte Carlo simulation to estimate the energy supplied. Additionally, a weather model incorporating the effects of both severe winds and lightning storms is applied as a trigger for contingencies. Once a component has failed, it cannot be repaired until the network's performance reaches zero. Given failure of all components, the network immediately starts its restoration phase; using the same algorithm for optimal power flow calculations, a DC power flow approach is implemented to assess the energy supplied to the whole network in a transient model until the network's loads fully meet the demand criteria.

Zarif Ahmet Zaman, Edoardo Patelli
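
The DC power flow step can be illustrated on a toy 3-bus network: solve the linear system B theta = P for the bus voltage angles, then recover line flows from angle differences. The network data below are illustrative assumptions, not the authors' test system.

```python
# lines: (from_bus, to_bus, reactance x in p.u.); bus 0 is the slack bus
lines = [(0, 1, 0.1), (0, 2, 0.2), (1, 2, 0.25)]
P = {1: -0.6, 2: -0.4}   # net injections at load buses 1 and 2 [p.u.]

# susceptance of each line and the reduced B matrix for buses 1 and 2
b = {(f, t): 1.0 / x for f, t, x in lines}
B11 = b[(0, 1)] + b[(1, 2)]   # self term, bus 1
B22 = b[(0, 2)] + b[(1, 2)]   # self term, bus 2
B12 = -b[(1, 2)]              # coupling term

# solve the 2x2 system B * theta = P by hand (theta_0 = 0 at the slack)
det = B11 * B22 - B12 * B12
th1 = (P[1] * B22 - B12 * P[2]) / det
th2 = (B11 * P[2] - B12 * P[1]) / det

# line flows from angle differences: f_ij = (theta_i - theta_j) / x_ij
theta = [0.0, th1, th2]
flows = {(f, t): (theta[f] - theta[t]) / x for f, t, x in lines}
supplied = -(P[1] + P[2])     # slack injection balancing the loads
```

A contingency is simulated by removing a line from `lines` and re-solving; the energy not supplied then follows from the loads that can no longer be balanced.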

### A Novel Analytical Method Set for Damage Control and Care-Process Management by the Cathedral of Milan

Since the introduction of expert systems for the preservation of cultural heritage, several research projects have developed codified procedures for the condition assessment of the materials composing historical buildings. The identification of the decay typologies observed on the materials was considered a first step towards supporting future interventions on the historical surfaces of the buildings. This kind of formal tool, providing data collected over time by periodic survey campaigns, also showed other potential uses, such as damage prediction and risk assessment. In recent years, the Veneranda Fabbrica del Duomo (VFD), the board managing the Cathedral of Milan, has developed new approaches for preventing the risks connected to its rich apparatus of stone decorations. Building on the wide experience matured in a long-lasting calibration of good practices for preserving the architectural features of the temple, the VFD recently adopted a new analytical procedure, established on behalf of the association of Italian cathedrals (AFI), for detecting risk conditions and evaluating the evolution of decay and its potential consequences. The proposed method was studied by the authors, within the convention between the Veneranda Fabbrica and Politecnico, in order to verify its reliability through several simulations of different scenarios. Moreover, this study pointed out the difficulties of an objective evaluation of the parameters on which the analytical procedure is based, and therefore the need to define criteria for effective and reliable data gathering and processing to support decision-making. The expected results should provide alarms in case of dangerous scenarios and recommendations concerning the planning of preservation actions: the updating of the inspection interval, the necessity of further diagnostic investigations and the urgency of repair interventions.

Francesco Canali, Lorenzo Cantini, Anthoula Konsta, Stefano Della Torre

### A Quick Criterion for Calculating Waiting Phenomena at Intersections

Calculations of queue lengths and waiting times at intersections are essential to evaluate the quality of circulation at road junctions (Level of Service, LOS). These calculations are carried out with the theory of waiting phenomena (probabilistic and/or deterministic queueing theory), and different models are adopted depending on whether the operating traffic conditions are stationary or not. In technical practice, time-dependent queue formulations obtained with the so-called coordinate transformation criterion have long been used for both under-saturated and over-saturated intersection arms. Depending on the degree of saturation of an entry arm (traffic intensity), this criterion allows the transition from probabilistic solutions to deterministic ones. In the paper, after a brief review of time-dependent solutions, a quick criterion is provided for calculating queue lengths and waiting times in the event of peak traffic, as well as the duration of its effects, obtained under specific characteristics of the arrival processes at the intersection; it is demonstrated that this criterion leads to solutions conforming to the deterministic type; and estimates of the errors that arise when the criterion developed in this paper replaces a time-dependent formulation are provided in terms of confidence intervals for varying degrees of saturation.

Raffaele Mauro, Marco Guerrieri, Andrea Pompigna
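
The two regimes the abstract distinguishes can be sketched with the standard steady-state M/M/1 result below saturation and the deterministic fluid queue above it. These textbook formulas are stand-ins for the paper's time-dependent solutions, and the traffic numbers are illustrative.

```python
def mm1_queue_length(rho):
    """Expected number waiting in an M/M/1 queue (valid only for rho < 1)."""
    if rho >= 1:
        raise ValueError("a steady state exists only below saturation")
    return rho * rho / (1.0 - rho)

def oversaturated_queue(q0, arrivals, capacity, t):
    """Deterministic queue growth when demand exceeds capacity."""
    return q0 + (arrivals - capacity) * t

q_low = mm1_queue_length(0.8)                     # degree of saturation 0.8
q_peak = oversaturated_queue(0, 1100, 900, 0.5)   # 30-min peak, veh/h rates
```

Time-dependent (coordinate transformation) formulas interpolate between these two limits so that the estimate stays finite as the degree of saturation crosses 1.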

### A Reliability Based Crack Propagation Model for Reinforced Concrete Bridge Piers Subject to Vehicle Impact

Suman Roy, Andrew Sorensen

### Accounting for Joined Probabilities in Nation-Wide Flood Risk Profiles

A risk profile provides information about the probabilities of event impacts of varying magnitudes. In this study, a probabilistic framework is developed to derive a national-scale flood risk profile, which can be used for disaster risk management and financial planning. These applications typically require risk profiles over a wide range of return periods. For most countries, the historical record of flood impacts is limited to a few decades, insufficient to cover the longest return periods. To overcome this limitation, we developed a stochastic model that can generate arbitrarily long synthetic time series of flood events with the same statistical characteristics as the historical time series. This includes the joint occurrence probabilities of flood events at different locations across the country: the probability of each pair of locations being flooded in the same event should be the same for the synthetic series as for the historical series. To this end, a novel approach based on 'simulated annealing' was implemented. Results show an almost exact reproduction of the statistical properties of the historical time series.

Ferdinand Diermanse, Joost V. L. Beckers, Cathy Ansell, Antoine Bavandi
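
The simulated-annealing idea of rearranging synthetic events while preserving marginal frequencies can be sketched as follows. The event series, the target joint count and the cooling schedule are illustrative assumptions, not the authors' implementation.

```python
import random

random.seed(1)
N = 200   # years of synthetic record
events_a = [1 if random.random() < 0.3 else 0 for _ in range(N)]  # floods, site A
events_b = [1 if random.random() < 0.2 else 0 for _ in range(N)]  # floods, site B
target_joint = 20        # desired number of years with floods at both sites
b_total = sum(events_b)  # marginal frequency at B, to be preserved

def joint_count(a, b):
    """Number of years in which both sites flood."""
    return sum(x & y for x, y in zip(a, b))

temp = 5.0
cost = abs(joint_count(events_a, events_b) - target_joint)
for step in range(20000):
    i, j = random.randrange(N), random.randrange(N)
    events_b[i], events_b[j] = events_b[j], events_b[i]      # swap keeps marginals
    new_cost = abs(joint_count(events_a, events_b) - target_joint)
    # accept improvements always, worse states with a vanishing probability
    if new_cost <= cost or random.random() < temp / (temp + step):
        cost = new_cost
    else:
        events_b[i], events_b[j] = events_b[j], events_b[i]  # undo the swap
```

The real model matches the joint probabilities of *all* location pairs simultaneously, so its cost function sums the deviations over every pair rather than a single target.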

### An Adaptive Subset Simulation Algorithm for System Reliability Analysis with Discontinuous Limit States

Efficient computational methods for system reliability assessment are of importance in many contexts, where crude Monte Carlo simulation is inefficient or infeasible. These methods include a variety of importance sampling techniques as well as subset simulation. Most of these methods function in an adaptive manner, whereby the sampling density gradually approaches the failure domain. The adaptation can work well when the limit state function describing system performance is continuous. However, many system reliability problems involve limit state functions that are non-continuous over the input sample space. Such situations occur in both connectivity- and flow-based problems, due to the binary or multi-state random variables entering the definition of the system performance or the discontinuous nature of the performance function. When solving this kind of problem, the standard subset simulation algorithm with fixed intermediate conditional probability and fixed number of samples per level can lead to significant errors, since the discontinuity of the output can result in an ambiguous definition of the sought percentile of the samples and, hence, of the intermediate domains. In this paper, we propose an adaptive subset simulation algorithm to determine the reliability of systems with discontinuous limit state functions. The proposed algorithm chooses the number of samples and the conditional probability adaptively. Numerical examples are provided to demonstrate the accuracy and efficiency of the proposed algorithm.

Jianpeng Chan, Iason Papaioannou, Daniel Straub
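
For contrast with the adaptive algorithm proposed in the paper, below is a minimal subset-simulation sketch with the standard fixed intermediate probability and fixed sample size, shown on a smooth one-dimensional limit state (failure when x >= 3 for x ~ N(0,1), exact probability about 1.35e-3) where the fixed settings behave well.

```python
import math, random

random.seed(42)
N, P0 = 2000, 0.1        # samples per level, fixed conditional probability
THRESHOLD = 3.0          # failure domain: x >= 3

def metropolis_step(x, level):
    """One Metropolis move targeting N(0,1) truncated to x >= level."""
    cand = x + random.gauss(0.0, 1.0)
    # accept with the standard-normal density ratio, then enforce the domain
    if random.random() < math.exp(min(0.0, 0.5 * (x * x - cand * cand))):
        if cand >= level:
            return cand
    return x

def chain(seed, level, length):
    """Short Markov chain started from a seed sample."""
    x, states = seed, []
    for _ in range(length):
        x = metropolis_step(x, level)
        states.append(x)
    return states

samples = [random.gauss(0.0, 1.0) for _ in range(N)]
pf = 1.0
for _ in range(10):                        # at most 10 levels
    samples.sort(reverse=True)
    n_seed = int(P0 * N)
    level = samples[n_seed - 1]            # intermediate threshold: P0-percentile
    if level >= THRESHOLD:                 # failure domain reached
        pf *= sum(1 for s in samples if s >= THRESHOLD) / N
        break
    pf *= P0                               # one more intermediate level
    samples = [x for s in samples[:n_seed]
               for x in chain(s, level, N // n_seed)]
```

With a discontinuous limit state, the sorting step above can place the intermediate threshold ambiguously, which is exactly the failure mode the paper's adaptive choice of sample size and conditional probability addresses.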

### An Efficient Solution for Reliability Analysis Considering Random Fields—Application to an Earth Dam

Performing a reliability analysis using Monte Carlo Simulation (MCS) is usually time-consuming for cases with expensive-to-evaluate deterministic models or small failure probabilities. The computational burden of such an analysis can be significantly alleviated by replacing the deterministic model with a meta-model. However, meta-modeling techniques suffer from the curse of dimensionality and are thus less efficient for geotechnical reliability analyses involving random fields (RF), since the considered problems are often high-dimensional due to the RF discretization. This paper introduces a new procedure based on Sparse Polynomial Chaos Expansions (SPCE) which addresses these issues. It deals with high-dimensional stochastic problems in two stages: the first stage reduces the input dimension by Sliced Inverse Regression (SIR), while the second stage constructs an SPCE with respect to the reduced dimension and then performs an MCS. Additionally, an adaptive experimental design technique is proposed for the construction of the SPCE model. The modified algorithm (termed A-SPCE/SIR) is applied to an earth dam problem in which the cohesion and friction angle are modelled by lognormal RFs. The effects of the vertical autocorrelation distance and the input cross-correlation on the dam reliability are investigated. The efficiency and accuracy of A-SPCE/SIR are highlighted by comparison with direct MCS and a previous study.

Xiangfeng Guo, Daniel Dias, Qiujing Pan

### An Overview of Performance Predictive Models for Railway Track Assets in Europe

A railway system degrades over time due to several factors such as aging, traffic conditions, usage, environmental conditions, and natural and man-made hazards. Moreover, lacking or inadequate maintenance and restoration work may also contribute to the degradation process. In this respect, it is important to understand the performance of transportation infrastructures, the variables influencing their degradation, and the actions necessary to minimize the degradation process over time, improve user safety, and minimize the environmental impact as well as the associated costs. Thus, it is crucial to follow structured maintenance plans during the life cycle of the infrastructure, supported by forecasts of degradation over time. This paper presents a brief description of the variables influencing the degradation of a railway system and of the ways the performance of the railway track can be measured within a probabilistic environment. Work developed for other transportation infrastructures, such as roadways, is briefly presented for comparison and benchmarking purposes. The paper also presents an overview of the predictive models used in railway systems, from mechanistic to data-driven models, the latter including statistical and artificial intelligence models.

Maria José Morais, Hélder S. Sousa, José C. Matos

### Application of Fragility Analysis to Timber-Framed Structures for Seismic and Robustness Assessments

In the past few years, the construction of multi-storey timber buildings has increased significantly in locations where high-intensity ground motions are likely to occur. At the same time, the fast development of engineered wood products, such as glued-laminated timber (GLT) and cross-laminated timber (CLT), has been challenging researchers to provide adequate guidelines for the design and assessment of structures built in seismic regions. Some guidelines and analysis methods considered in seismic design can improve robustness, commonly described as the ability of structures to sustain limited damage without disproportionate effects. This paper proposes a probabilistic methodology for the seismic and robustness assessment of timber-framed structures. The seismic performance and the progressive-collapse potential of a three-storey building are exemplified through the proposed methodology, which accounts for uncertainties in the mechanical properties of members and connections, as well as in external loads. Latin Hypercube Sampling (LHS) was used in each assessment to generate a set of 1000 structural models, each corresponding to a realization of the random variables defining the structural model. Incremental dynamic analyses were performed to develop seismic fragility curves for different damage levels. The fragility functions for robustness assessment were developed for distinct damage scenarios, exploiting the results of an alternate load path analysis (ALPA) that involved nonlinear static (pushdown) analyses. The methodology is suitable for risk-based assessments that consider the occurrence of different exposures, such as earthquakes, impacts, and explosions, while accounting for the direct and indirect consequences of failures. However, it involves time-consuming analyses with distinct load scenarios, which can constitute a burdensome task within a typical building design phase.

Leonardo G. Rodrigues, Jorge M. Branco, Luís A. C. Neves, André R. Barbosa
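
The Latin Hypercube Sampling step can be sketched in a few lines: each variable's range is split into n equal-probability strata, each stratum is sampled exactly once, and the columns are shuffled independently. This is a generic sketch, not the authors' code.

```python
import random

def latin_hypercube(n_samples, n_dims, seed=0):
    """Return n_samples points in [0,1)^n_dims with one point per stratum."""
    rng = random.Random(seed)
    columns = []
    for _ in range(n_dims):
        # one uniform draw inside each of the n equal-width strata
        col = [(k + rng.random()) / n_samples for k in range(n_samples)]
        rng.shuffle(col)            # decouple the dimensions
        columns.append(col)
    return [tuple(col[i] for col in columns) for i in range(n_samples)]

design = latin_hypercube(10, 3)
```

Each uniform coordinate would then be mapped through the inverse CDF of the corresponding random variable (member strength, connection stiffness, load) to obtain one structural-model realization.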

### Assessment of Design Concepts for Post-installed Punching Shear Retrofitting

Punching shear is a brittle form of failure observed in reinforced concrete slab structures and occurs without any visible warning signs. The phenomenon typically arises around slab-column connections, where transverse forces are highly concentrated, and can cause the column to punch through the slab. The unpredictability of its occurrence makes it a particularly critical and dangerous phenomenon. Several methods have been developed for retrofitting and strengthening existing flat slabs against punching shear failure using different reinforcement types, such as shear bolts, screw anchors or bonded anchors; these are referred to as post-installed shear reinforcement for existing flat slab systems. This study assesses, by probabilistic analysis, the safety and economic performance of the Eurocode 2 (EC2) method for the design of post-installed reinforcement in an existing flat slab structure endangered by punching shear. The probabilistic analysis was conducted with the Monte Carlo simulation technique, implemented in a MATLAB code developed in this study. The reliability indices obtained for the EC2 design procedure were found to be close to the EN 1990 target reliability level.

Oladimeji B. Olalusi, Puneh Mowlavi, Nikolaos Mellios, Panagiotis Spyridis
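
The Monte Carlo reliability computation described in the abstract reduces to sampling resistance and load effect, counting limit-state violations, and converting the failure probability into a reliability index via the inverse standard normal CDF. The distributions and parameters below are illustrative assumptions; the study itself uses MATLAB and the EC2 resistance model.

```python
import random, statistics

random.seed(7)
N = 100_000
fails = 0
for _ in range(N):
    R = random.lognormvariate(6.0, 0.10)   # punching resistance (assumed lognormal)
    E = random.gauss(250.0, 40.0)          # load effect (assumed normal)
    if R <= E:                             # limit-state violation: g = R - E <= 0
        fails += 1

pf = fails / N
beta = -statistics.NormalDist().inv_cdf(pf)   # reliability index
```

The resulting `beta` would then be compared with the EN 1990 target reliability index for the relevant reference period and consequence class.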

### At Issue: The Gaussian Autocorrelation Function

This paper focuses on the use of Gaussian autocorrelation functions (ACF) in civil engineering applications involving random processes and random fields. It aims at debunking misgivings, verifying facts and figures, and formulating practical conclusions. A large majority of civil engineers active in random field modelling and reliability analysis is quite content to point out that the routine use of Gaussian autocorrelation functions is part of standard practice and perfectly harmless. A common approach in 2D random field problems, for instance, is to estimate an appropriate correlation length on some physical or empirical basis, and then plug it into a multivariate ACF that is both isotropic, and separable into a product of univariate ACFs: if both of these objectives are to be met, the Gaussian ACF naturally stands out as it is in fact the only real function to possess both of these properties. But as early as the nineteen-sixties, a substantive piece of electrical engineering literature pointed to “issues” and “red flags”. The claim was that the Gaussian ACF produces unrealistic results, violates certain principles concerning both the modelling and the estimation of random properties, and runs into results that possibly defy common sense. Similarly, geostatisticians have been issuing warnings of hyper-predictability, super-smoothness, wildly underestimated estimation errors, and artificial results in applications such as spatial kriging using Gaussian ACFs, leading to the recommendation that the Gaussian model should never be used in practice. This paper revisits the use of the Gaussian ACF and presents a sober but principled look at the entire issue. Importantly, it also considers the pros and cons of replacement ACF models and adjusted ACF models. The paper includes examples and measurable outcomes with the aim of providing a fair assessment and justifiable recommendations.

Marc A. Maes, Karl Breitung, Markus R. Dann
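
The "super-smoothness" at issue can be seen numerically: at short lags, 1 - rho decays quadratically for the Gaussian ACF but only linearly for the exponential ACF, so Gaussian fields are far more predictable near any observation. A small sketch with an assumed correlation length:

```python
import math

def gaussian_acf(tau, theta):
    """Gaussian (squared-exponential) autocorrelation function."""
    return math.exp(-(tau / theta) ** 2)

def exponential_acf(tau, theta):
    """Exponential autocorrelation function."""
    return math.exp(-abs(tau) / theta)

theta = 10.0      # assumed correlation length
short_lag = 1.0
g = gaussian_acf(short_lag, theta)
e = exponential_acf(short_lag, theta)
# 1 - g is of order (tau/theta)^2 = 0.01, while 1 - e is of order
# tau/theta = 0.1: the Gaussian model is an order of magnitude "smoother"
```

This near-perfect short-range correlation is also what makes covariance matrices built from Gaussian ACFs notoriously ill-conditioned in kriging and random-field discretization.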

### Bridge Case Studies on the Assignment of Partial Safety Factors for the Assessment of Existing Structures

Aging bridges in combination with an ever-growing traffic volume are a matter of concern all over the world. Consequently, the reassessment of existing bridges is gaining importance rapidly. This paper presents two bridge case studies considered within the IABSE Task Group 1.3 “Calibration of Partial Safety Factors for the Assessment of Existing Bridges”. The so-called design value method (DVM) and adjusted partial factor method (APFM), introduced in fib Bulletin 80 and both relying on a partial factor format, are considered in this paper. The objectives are (i) to illustrate how DVM and APFM can be used when specifying partial safety factors for assessment of existing bridges, and (ii) to discuss some of the assumptions that are implied by these methods. Two case studies are considered for illustration in this paper: a single span reinforced concrete slab and a 3-span continuous reinforced concrete slab.

André Orcesi, Vazul Boros, Marija Kušter Marić, Ana Mandić Ivanković, Miroslav Sýkora, Robby Caspeele, Jochen Köhler, Alan O’Connor, Franziska Schmidt, Salvatore Di Bernardo, Nisrine Makhoul

Marcel Nowak, Franziska Schmidt, Oliver Fischer

### Construction Risk Management in Portugal—Identification of the Tools/Techniques and Specific Risks in the Design and Construction Phases

The environment of construction projects (CP) is especially important, as is the context in which they are started, developed, and completed. Its effects should be closely monitored, controlled whenever possible, and treated as a high-risk source. Despite the high impact of risks throughout the construction project life cycle, few studies analyse them. The main purpose of this paper is therefore to investigate the most relevant tools and techniques for risk management and to identify the specific risks for designers and contractors that affect the design and construction phases of CP. To this end, a questionnaire was designed and administered to construction professionals (contractors and designers). The results show that planning risk responses is the process for which construction professionals evidence the least knowledge of the available tools. Regarding designer-specific risks in the design phase, "problems of coordinating the various specialities" and a "customer lacking the necessary experience or resources to support the project" have a high impact; in the construction phase, "continuous change in the scope" of the project is the most impactful risk. For contractor-specific risks in the budgeting phase, "work's quality and value are insufficient for the cost" and "substandard budgeting documents" have a high impact; in the construction phase, a "low-skilled workforce" followed by "work's quality and value are insufficient for the cost" are the most impactful risks. Based on the results obtained, the findings contribute to an understanding of the major tools and techniques used by designers and contractors for risk management and to the identification of the specific risks arising from life-cycle phases that can affect a construction project in Portugal.

António J. Marinho, João P. Couto

### Cumulative Failure Probability of Deteriorating Structures: Can It Drop?

The reliability of deteriorating structures at time t is quantified by the probability that failure occurs within the period leading up to time t. This probability is often referred to as cumulative failure probability and is equal to the cumulative distribution function of the time to failure. In structural reliability, an estimate of the cumulative failure probability is obtained based on probabilistic engineering models of the deterioration processes and structural performance. Information on the condition and the loading contained in inspection and monitoring data can be included in the probability estimate through Bayesian updating. Conditioning the probability of failure on the inspection or monitoring outcomes available at time t (e.g. detection or no detection of damage) can lead to a reduction in that probability. Such a drop in the cumulative failure probability might seem counterintuitive since the cumulative failure probability is a non-decreasing function of time. In this paper, we illustrate—with the help of a numerical example—that such a drop is possible because the cumulative probability before and after the updating is not based on the same information, hence not on the same probabilistic model.

Ronald Schneider, Daniel Straub
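
The apparent paradox can be reproduced with a two-model toy example: conditioning on a "no detection" inspection outcome shifts probability mass towards the slow-deterioration model and lowers the estimated cumulative failure probability. All numbers are illustrative, and the inspection outcome is assumed independent of failure given the model.

```python
# Two competing deterioration-rate hypotheses with prior weights.
# pf:          P(failure up to time t | model)
# p_no_detect: P(inspection at time t detects nothing | model)
models = {
    "slow": {"prior": 0.5, "pf": 0.01, "p_no_detect": 0.9},
    "fast": {"prior": 0.5, "pf": 0.20, "p_no_detect": 0.2},
}

# prior estimate: model-averaged cumulative failure probability
prior_pf = sum(m["prior"] * m["pf"] for m in models.values())

# posterior model weights after observing "no detection" (Bayes' rule)
evidence = sum(m["prior"] * m["p_no_detect"] for m in models.values())
posterior_pf = sum(m["prior"] * m["p_no_detect"] / evidence * m["pf"]
                   for m in models.values())
```

The drop from `prior_pf` to `posterior_pf` is not a violation of monotonicity in time: both are cumulative probabilities at the same time t, evaluated under different information states.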

### Development of Culvert Risk Condition Evaluation for Decision-Making Within Road Infrastructure Management

In road infrastructure management systems, culverts need to be assessed in order to avoid failures and road collapses. A framework of periodic inspections and the implementation of condition ratings therefore have an important role in service life estimation and reliability evaluation. In addition, risk can be managed by merging condition ratings with culvert exposure and vulnerabilities, providing information to support decision-making and prioritize interventions. In this paper, a new approach to the decision-making process is presented, based on a global risk index (αG). The proposal includes a set of culvert descriptors, weight attribution and aggregation rules accounting for external factors such as hazards, condition ratings and consequences. Moreover, a case study with 25 different systems is conducted to qualitatively assess the culverts' global risk index and prioritize needed interventions.

Fernando Sousa, Sara Dias, José C. Matos, Aires Camões

### Discussion of the Number of Risk Classes for Risk Based Maintenance

The importance and application of risk-based maintenance planning is growing in many areas, such as oil production and infrastructure management. For example, risk-based maintenance concepts already exist for bridges and tunnels. However, these concepts require criteria for deciding on necessary actions; in a risk-based maintenance concept, such criteria take the form of risk classes. In this paper, proposals for the number and limits of risk classes from different areas are compiled and discussed, and recommendations for practice are given. Experience has shown, however, that not only objective factors but also subjective factors, such as the acceptance of the risk classes by inspectors, must be considered to achieve a successful application of risk-based maintenance.

Dirk Proske, David Tschan

### Dynamic Response Equivalence of a Scaled Bridge Model Due to Vehicular Movement

The design of scaled testing is important for establishing equivalence with a full-scale structure, but it is difficult since both the geometry and the material need to be scaled. For good scaled testing, it is important to demonstrate that the scaled model of the original structure and the designed test specimen behave similarly, so that there is control over the experimentation. Despite existing guidance on this topic, such equivalence is sometimes not checked appropriately, leading to uncertainties and variations in scaled testing that significantly compromise the usefulness of such experiments. This paper addresses this topic for a bridge-vehicle interaction problem and demonstrates how a scaled test can show equivalence with its full-scale counterpart. A Buckingham-Pi approach is taken for scaling, and the assumptions around the models and responses are defined to establish the boundaries of the responses to be replicated. The non-dimensional parameters are defined and guide the design of future experiments. The conversion of a complex cross-sectional profile to an equivalent beam made of a different material is dictated by matching the modelled scaled responses of the original structure against the unscaled responses of the experimental structure. The match indicates that establishing such equivalence is particularly relevant for carrying out future experiments in the laboratory and subsequently linking them to full-scale structures for implementing sensors or carrying out interventions such as repairs. The work also emphasizes how a well-designed scaled test should have a numerical benchmark for future interpretation, and for understanding the assumptions around such interpretations when comparing full-scale experiments with controlled laboratory-based experiments, reducing uncertainty around such comparisons. The presented work is expected to be of interest to both researchers and practicing engineers.

Paul Cahill, Vikram Pakrashi

### Energy Based Model of Vehicle Impacted Reinforced Bridge Piers Accounting for Concrete Contribution to Resilience

Suman Roy, Andrew Sorensen

### Establishment of Suitable General Probabilistic Model for Shear Reliability Analysis

Adequate characterization and quantification of the model uncertainties in shear resistance models is identified as one of the key issues in the reliability analysis of reinforced concrete beams in shear. Previous studies indicate high model uncertainty in shear prediction. Model uncertainties for various shear predictive models are characterised here, based on a recent and well-vetted database of shear failures. The characterisation includes the estimation of the main statistical parameters and, importantly, correlation and regression analyses to assess the consistency of model uncertainties over ranges of design parameters. The aim is to identify models suitable for use as a general probabilistic model (GPM) in future reliability assessments. The Variable Strut Inclination Method (VSIM) displayed high bias, high variability and various correlations with shear input parameters, which make it an unsuitable choice for a GPM. The modified compression field theory (MCFT) showed low bias and variability, with consistent model uncertainties over the ranges of shear design parameters, and is thus suitable as a GPM.

### Estimation of the Global Health Burden of Structural Collapse

In this article, the global health burden of structural collapse is estimated. To this end, mean ratios of structures to inhabitants are determined and applied to the world population, subdivided into industrialized and developing countries. Based on known collapse frequencies of structures, mean annual worldwide numbers of structural collapses are calculated. Furthermore, the average number of victims per collapse is estimated and then used to estimate worldwide victim numbers, considering not only fatalities but also injuries. These victim numbers are converted into "lost life years", a parameter often used as a measure of health risk, and compared with some other causes of victims.

Dirk Proske
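The chain of estimates in the abstract is simple arithmetic: population → number of structures → annual collapses → victims → lost life years. A sketch with purely illustrative numbers (not the paper's values):

```python
def lost_life_years(population, persons_per_structure, annual_collapse_freq,
                    fatalities_per_collapse, years_lost_per_fatality):
    """Chain the mean ratios into an annual lost-life-years estimate."""
    structures = population / persons_per_structure
    collapses = structures * annual_collapse_freq      # expected collapses/year
    fatalities = collapses * fatalities_per_collapse   # expected fatalities/year
    return fatalities * years_lost_per_fatality

# All inputs are invented placeholders, chosen only to show the mechanics:
lly = lost_life_years(population=8e9,
                      persons_per_structure=10,
                      annual_collapse_freq=1e-5,
                      fatalities_per_collapse=0.1,
                      years_lost_per_fatality=35)
```

Injuries would enter the same chain with their own rate and a disability weighting on the years lost.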

### Evaluation of Partial Safety Factors for the Structural Assessment of Existing Masonry Buildings

The assessment of existing structures and infrastructures is a primary task in modern engineering, both for its key economic significance and for the extent and importance of the built environment. Nonetheless, operational rules and standards for existing structures are often missing or insufficient, especially for masonry constructions. Existing masonry buildings, even in limited geographical regions, are characterized by many masonry types, differing in basic material, mortar, block shape, block texture, workmanship, degree of decay and so on. For these reasons, relevant mechanical parameters of masonry are often very uncertain; their rough estimation thus leads to inaccurate conclusions about the reliability of the investigated structure. In this work, a methodology to derive a refined probabilistic description of masonry parameters is first outlined, starting from the analysis of a database of in-situ test results collected by the authors. In particular, material classes, representing low-, medium- and high-quality masonry, are identified for a given masonry typology by means of a Gaussian Mixture Model. The probability density functions so obtained are the fundamental basis for the implementation of probabilistic analysis methods. In particular, the study focuses on the evaluation of masonry classes for the compressive strength of stone masonry, considering a relevant database of semi-destructive, double flat jack, in-situ test results. The statistical properties of the identified masonry classes, which can be used for the direct probabilistic assessment of the structural performance of masonry walls under vertical loads, are finally considered for the evaluation of suitable partial safety factors, γM, to be used in engineering practice.

Pietro Croce, Maria L. Beconcini, Paolo Formichi, Filippo Landi, Benedetta Puccini, Vincenzo Zotti

### FORM/SORM, SS and MCMC: A Mathematical Analysis of Methods for Calculating Failure Probabilities

A basic problem in structural reliability is the calculation of failure probabilities in high-dimensional spaces. FORM/SORM concepts are based on the Laplace method for the pdf of the failure domain at its modes. With increasing dimension the quality of SORM decreases considerably. The straightforward solution would have been to improve the SORM approximations. Instead, a new approach, subset simulation (SS), was championed by many researchers. Its proponents maintain that SS does not suffer from the deficiencies of SORM and can easily solve high-dimensional reliability problems for very small probabilities. However, the author has outlined the shortcomings of SS in numerous examples and finally shown that SS is in fact a disguised Monte Carlo copy of asymptotic SORM: the points computed by SS converge towards the beta points, as seen, for example, in the diagrams of many SS papers. One way to improve FORM/SORM is to run MCMCs, starting near the modes (i.e. the beta points), which move through the failure domain $$F=\{\mathbf {x} ; g(\mathbf {x})< 0\}$$ with $$g(\mathbf {x})$$ the LSF. With MCMC one can calculate integrals over F with the pdf $$\phi (\mathbf {x})$$, but not the normalizing constant P(F). A little artifice helps, however: compare the failure domain with another whose probability content is known; then not P(F) itself but the quotient of the two probabilities has to be estimated. A good choice for this is $$F_L=\{\mathbf {x} ; g_L(\mathbf {x})< 0\}$$ given by the linearized LSF $$g_L(\mathbf {x})$$, so $$P(F_L)= \Phi (-|\mathbf {x}^*|)$$ with $$\mathbf {x}^*$$ the beta point. By running two MCMCs, one on F and one on $$F_L$$, and comparing them, an estimate for the failure probability P(F) can be obtained. Another way is to use a modified line sampling method: for each design point, for a random set of points on the tangential plane, the distance of the plane to the limit state surface on the ray normal to the tangential space is determined, together with the corresponding normal line integral. Improving FORM/SORM by MCMC adds the advantages of analytic methods to the flexibility of the Monte Carlo approach.

Karl Breitung
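The FORM concept at the heart of this discussion — locate the beta point $$\mathbf {x}^*$$ and approximate $$P(F)\approx \Phi (-|\mathbf {x}^*|)$$ — can be sketched with the standard HL-RF iteration. A minimal example (the limit state function is invented for illustration; for a linear LSF the iteration recovers the exact result):

```python
import math

def norm_cdf(x):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def grad(g, u, h=1e-6):
    """Forward finite-difference gradient of g at u."""
    g0 = g(u)
    out = []
    for i in range(len(u)):
        up = list(u)
        up[i] += h
        out.append((g(up) - g0) / h)
    return out

def form_beta(g, ndim, iters=50):
    """HL-RF iteration for the design (beta) point in standard normal space."""
    u = [0.1] * ndim
    for _ in range(iters):
        gv = g(u)
        gr = grad(g, u)
        norm2 = sum(c * c for c in gr)
        dot = sum(c * ui for c, ui in zip(gr, u))
        u = [(dot - gv) / norm2 * c for c in gr]
    beta = math.sqrt(sum(ui * ui for ui in u))
    return beta, u

# Linear limit state g(u) = 3 - (u1 + u2)/sqrt(2): exact beta = 3
g = lambda u: 3.0 - (u[0] + u[1]) / math.sqrt(2.0)
beta, u_star = form_beta(g, 2)
pf_form = norm_cdf(-beta)
```

The ratio estimator sketched in the abstract then compares MCMC samples on F against samples on the half-space defined by the tangent at `u_star`, whose probability content `pf_form` is known exactly.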

### Fractile Based Sampling Procedure for the Effective Analysis of Engineering Structures

The non-linear analysis of the performance of engineering structures generally requires a huge computational effort. Moreover, in some cases a model updating procedure is needed. In this contribution, a model updating procedure has been applied for the simulation of pre-stressed reinforced concrete (RC) beams. The combined ultimate shear and flexure capacity of the beams is affected by many complex phenomena, such as the multi-axial state of stress, the anisotropy induced by diagonal concrete cracking, the interaction between concrete and reinforcement (bond), and the brittleness of the failure mode. The spatial distribution of material properties may be considered by random fields. Furthermore, statistical and energetic size effects may influence the analysis. When incorporating all the mentioned effects within a probabilistic analysis using Monte Carlo simulation, feasibility limits are reached quickly. Therefore, the aim was to improve the sampling technique for the generation of realizations of the basic variables for general, computationally complex analysis tasks. The target was to develop a method similar to a simplified probabilistic method, e.g. the Estimation of Coefficient of Variation (ECoV) method. To this end, the so-called fractile based sampling procedure (FBSP) using Latin Hypercube Sampling (LHS) has been developed. It drastically reduces the computational effort and allows the consideration of correlations between the individual basic variables (BV). A fundamental aspect of the presented procedure, however, is the appropriate selection of a leading basic variable (LBV). The appropriate choice of the LBV among the defined BVs is essential for mapping the correct correlation. Three methods for the determination of the LBV are investigated in this paper.

Alfred Strauss, Beatrice Belletti, Thomas Zimmermann
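Latin Hypercube Sampling, which the procedure above builds on, stratifies each variable's probability range into equal bins and draws exactly one sample per bin. A minimal sketch (the marginal distributions are invented for illustration):

```python
import random
import statistics

def latin_hypercube(n_samples, n_vars, seed=42):
    """LHS on the unit hypercube: one random point per equal-probability
    stratum for each variable, strata paired by random permutation."""
    rng = random.Random(seed)
    cols = []
    for _ in range(n_vars):
        perm = list(range(n_samples))
        rng.shuffle(perm)
        cols.append([(p + rng.random()) / n_samples for p in perm])
    return [[cols[v][i] for v in range(n_vars)] for i in range(n_samples)]

# Map the uniform strata to physical basic variables, e.g. a (hypothetical)
# normal concrete strength and a normal reinforcement yield stress:
fc = statistics.NormalDist(38.0, 4.0)    # MPa, assumed
fy = statistics.NormalDist(550.0, 30.0)  # MPa, assumed
design_points = [(fc.inv_cdf(u1), fy.inv_cdf(u2))
                 for u1, u2 in latin_hypercube(20, 2)]
```

Every one of the 20 strata of each marginal is hit exactly once, which is what gives LHS its variance reduction over crude sampling at the same budget.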

### Fragility Curves for Fire Exposed Structural Elements Through Application of Regression Techniques

The structural fire engineering community has demonstrated a growing interest in probabilistic methods in recent years. The trend towards consideration of probability is, amongst others, driven by an understanding that further advances in detailed numerical models are potentially offset by the basic uncertainty in the input parameters. Consequently, there has been a call for the development of fragility curves for fire-exposed structural elements, to support the application of probabilistic methods both in design as well as in standardization. State-of-the-art structural fire engineering models are, however, commonly very computationally expensive, even for simple cases such as isolated structural elements. This can be attributed to the requirement of coupling thermal and mechanical analyses, and to the large non-linearity in both the heating of structural elements and the resulting mechanical effects of temperature-induced degradation and strains. This severely hinders the development of fragility curves beyond very specific cases, especially when including a stochastic description of the (natural) fire exposure. In the current contribution the application of regression techniques to structural fire engineering modeling is explored, as a stepping stone towards establishing a methodology for the efficient development of fragility curves for fire-exposed structural members. A simplified model with limited computational expense is applied to allow for validation of the proof-of-concept.

Ranjit K. Chaudhary, Ruben Van Coile, Thomas Gernay
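A fragility curve of the kind discussed above is often parameterized as a lognormal CDF, $$P(\text{fail}\mid im)=\Phi (\ln (im/\theta )/\beta )$$, fitted to binary failure observations by maximum likelihood. A minimal sketch using a coarse grid search in place of a proper optimizer (all data invented):

```python
import math

def norm_cdf(x):
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def fit_lognormal_fragility(intensity, failed):
    """Max-likelihood lognormal fragility fit by coarse grid search
    (a sketch, not an efficient or recommended optimizer)."""
    best = (None, None, float("-inf"))
    for theta in [0.2 + 0.01 * i for i in range(200)]:
        for beta in [0.1 + 0.02 * j for j in range(50)]:
            ll = 0.0
            for im, y in zip(intensity, failed):
                p = norm_cdf(math.log(im / theta) / beta)
                p = min(max(p, 1e-12), 1.0 - 1e-12)
                ll += math.log(p) if y else math.log(1.0 - p)
            if ll > best[2]:
                best = (theta, beta, ll)
    return best[0], best[1]

# Illustrative data: fire intensity measure vs. observed element failure (1/0)
im = [0.5, 0.7, 0.9, 1.1, 1.3, 1.5, 1.7, 1.9]
y = [0, 0, 0, 1, 0, 1, 1, 1]
theta, beta = fit_lognormal_fragility(im, y)
```

In the paper's setting the binary outcomes would come from a (regression-accelerated) structural fire model rather than from a hand-written list.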

### Identification of Risk Management Models and Parameters for Critical Infrastructures

The resilience of an area, region, country or society is directly related to the performance of its critical infrastructures (CI), especially when it is affected by extreme events. The increasing number of catastrophic events, such as terrorist attacks or natural disasters (tsunamis, fires, floods), has alerted Europe and other nations worldwide to take measures for preventing or reducing the possible consequences of such situations. CI are commonly defined as facilities, systems and assets essential for the maintenance of vital social functions, whose disruption or destruction may significantly impact the well-being of society. It is mandatory for any nation to identify which infrastructures must be defined as critical, by analyzing the impacts provoked by an extreme event and society's dependence on the infrastructure in question. For this purpose, the European Commission established a procedure for the identification and designation of European CI, ensuring that divergent approaches within the EU are avoided. Three cross-cutting criteria were defined: (a) casualties; (b) economic effect; (c) public effect. This paper aims to introduce different risk management models for CI and the parameters necessary for the quantification of these methodologies. Among the several existing models for risk management, the ones studied and introduced in this paper were applied in different countries and to different types of CI, and vary from deterministic approaches to probabilistic methods. The criticality parameters are expressed in governmental, economic, security and welfare terms, and are important for two main reasons: (1) to keep updated the criticality index and the risk and vulnerability maps that predictive models may use; (2) current tools are essentially based on models weighted by qualitative weights, not allowing the complete analysis of one-off events.

Oscar J. Urbina, Elisabete R. Teixeira, José C. Matos

### Implementation of Reliability Methods in a New Developed Open-Source Software Library

Structural reliability methods aim at the computation of failure probabilities of structural systems with methods of statistical analysis, due to the varied uncertainties occurring during their design, construction or even operation. However, in the field of civil engineering, the use of structural reliability methods unfortunately remains limited to specific cases. Most of the available software still has a limited scope concerning wide parametric studies for analysis with reliability methods in civil engineering. This paper describes a new open-source software library as an effective tool for reliability analysis in civil engineering. The goal is to facilitate the adoption of reliability methods among engineers in practice as well as to provide an open platform for further scientific collaboration. The new library is being developed as a so-called “R package” in the open-source programming language “R”. The package is capable of carrying out systematic parameter studies using different probabilistic reliability methods, such as FORM, SORM and Monte Carlo simulation. On this basis, an overview of the probabilistic reliability methods implemented in the package as well as results of parametric studies is given. The performance of the package is shown with a parametric study on a practical example. The most important results of the parametric study as well as the correctness of the different reliability methods are described in the paper. By describing probabilistic methods using an example from practice, engineers can get a basic understanding of the ideas behind probability theory. Further work will result in large parameter studies, which will support the development of a new guideline for reliability in civil engineering. This guideline describes techniques of code calibration as well as the determination of new partial safety factors (e.g. for non-metallic reinforced concrete, fixing anchors, etc.). Furthermore, advanced reliability methods (e.g. Monte Carlo with subset sampling) will be implemented in the new R package.

Jan Philip Schulze-Ardey, Tânia Feiri, Josef Hegger, Marcus Ricker
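The library itself is written in R; as a language-neutral illustration, the simplest of the implemented methods — crude Monte Carlo estimation of a failure probability and the corresponding reliability index — can be sketched as follows (the limit state and distributions are invented):

```python
import random
import statistics

def monte_carlo_pf(g, sample, n=100_000, seed=1):
    """Crude Monte Carlo estimate of P(g(X) < 0)."""
    rng = random.Random(seed)
    fails = sum(1 for _ in range(n) if g(sample(rng)) < 0.0)
    return fails / n

# Invented limit state g = R - S with normal resistance and load effect:
sample = lambda rng: (rng.gauss(60.0, 6.0), rng.gauss(40.0, 5.0))
g = lambda x: x[0] - x[1]

pf = monte_carlo_pf(g, sample)
beta = statistics.NormalDist().inv_cdf(1.0 - pf)  # reliability index
```

For this linear Gaussian case the exact answer is $$\beta = 20/\sqrt{61} \approx 2.56$$, which the simulation reproduces to sampling accuracy; FORM and SORM trade this brute-force sampling for analytic approximations at the design point.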

### Influence of an In-Situ Inspection on the Reliability Analysis of an Ancient Timber Roof

Despite the durability of timber and its efficient performance in the built heritage, it has become common practice in Portugal to replace ancient timber roof structures with concrete or steel ones. The main reason may be attributed to the difficulty of assessing the real condition of timber structures with respect to their actual level of conservation. In this work, a reliability analysis of an ancient timber roof, from a Portuguese neoclassic building of the eighteenth century, is carried out to evidence the importance of different levels of information, taking into account visual and geometric inspections. The impact of posterior knowledge obtained from non-destructive tests is evaluated by comparing the probability of failure and the reliability index in two distinct scenarios. The first scenario considers only prior information for the mechanical properties of the timber elements and apparent cross-sections for the structural members. The second scenario, on the other hand, considers the results of an in-situ inspection that provides the residual cross-section of the principal members, as well as the updating of the modulus of elasticity and density, based on a Bayesian updating procedure that takes advantage of the results of a database of non-destructive tests. Latin Hypercube Sampling (LHS) was used in this study to generate a set of structural models, in which each model corresponds to a realization of the assumed random variables. Apart from the mechanical properties, the uncertainties related to permanent, snow and wind loads are included according to the provisions of the Joint Committee on Structural Safety (JCSS). The presented results indicate that in-situ inspections have to be a priority in the assessment of ancient timber structures. The absence of a careful assessment of deteriorated elements can lead to incorrect conclusions about structural safety. Additionally, the use of a probabilistic framework allows a better definition of intervention plans by providing the reliability of distinct critical elements.

Leonardo G. Rodrigues, Hélder S. Sousa
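The Bayesian updating step described above — revising the prior model of a material property with non-destructive test results — reduces, in the simplest conjugate normal case, to a precision-weighted combination of prior and data. A minimal sketch (JCSS would typically suggest a lognormal model; a normal one keeps the sketch conjugate, and all numbers are invented):

```python
def update_normal_mean(prior_mean, prior_sd, obs, obs_sd):
    """Conjugate normal update of a mean with known measurement std."""
    n = len(obs)
    precision = 1.0 / prior_sd ** 2 + n / obs_sd ** 2
    post_var = 1.0 / precision
    post_mean = post_var * (prior_mean / prior_sd ** 2 + sum(obs) / obs_sd ** 2)
    return post_mean, post_var ** 0.5

# Hypothetical prior for the timber modulus of elasticity (GPa) and three
# invented non-destructive test readings with assumed measurement scatter:
ndt_obs = [10.8, 11.5, 11.1]
post_mean, post_sd = update_normal_mean(12.0, 1.5, ndt_obs, 1.0)
```

The posterior mean is pulled from the prior 12.0 toward the test average, and the posterior standard deviation shrinks below both the prior and the measurement scatter — exactly the information gain the two scenarios in the paper compare.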

### Inherent Variability of Geotechnical Properties for Finnish Clay Soils

Compared to manufactured materials, soil properties often exhibit significant variability even within a seemingly homogeneous soil layer. The uncertainty related to this variability can be dealt with in a robust manner by means of reliability-based methods. Hence, effort has been made to collect soil statistics in order to provide approximate guidelines for selecting the value of the coefficient of variation (COV) of inherent variability. It has been observed that the COV value for the same physical property tends to vary within a relatively narrow range, meaning that the literature COV ranges could be used with some confidence on sites which lack sufficient soil data. However, it is not certain whether these prior COV values can be used in Finland, since many Finnish clay soils are soft and sensitive due to their unique geological history shaped by the last post-glacial processes. Hence, this paper evaluates the inherent variability of various geotechnical properties (index, strength and consolidation properties) at four clay soil sites located in Southern Finland. Besides prior ranges of COV, this paper provides prior ranges of the mean soil property, applicable to soft post-glacial clays and clayey gyttjas. Furthermore, the shape of the probability distribution is evaluated for various soil properties at one clay site by means of normality tests and visual assessment. It is concluded that the derived COV values are in accordance with literature ranges, but for more reliable estimates, soil statistics derived for Finnish clay soils should be preferred when possible. Nonetheless, no literature range can replace extensive site-specific soil statistics. Finally, it is confirmed that nearly all the soil properties at the studied Finnish clay site can be modelled with normal or lognormal distributions.

Monica S. Löfman, Leena K. Korkiala-Tanttu
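The two quantities the paper tabulates — the COV of a property and whether a normal or lognormal shape fits it better — can be computed directly from site data. A minimal sketch with invented undrained shear strengths; a crude skewness comparison stands in for the paper's formal normality tests:

```python
import math
import statistics

def cov_of(values):
    """Coefficient of variation: sample std divided by sample mean."""
    return statistics.stdev(values) / statistics.mean(values)

def prefers_lognormal(values):
    """Crude shape check: if the log-transformed data are less skewed than
    the raw data, a lognormal model is the more plausible of the two."""
    def skew(xs):
        m, s = statistics.mean(xs), statistics.stdev(xs)
        n = len(xs)
        return sum(((x - m) / s) ** 3 for x in xs) * n / ((n - 1) * (n - 2))
    logs = [math.log(v) for v in values]
    return abs(skew(logs)) < abs(skew(values))

# Illustrative undrained shear strengths (kPa) for a soft clay layer:
su = [12.1, 14.8, 13.5, 11.2, 16.4, 13.0, 12.6, 15.1, 10.9, 14.2]
cov = cov_of(su)
lognormal_better = prefers_lognormal(su)
```

In practice one would use a proper goodness-of-fit test (e.g. Shapiro-Wilk, as normality tests of the kind the paper applies) rather than a bare skewness comparison.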

### Integration of the Analysis of the Error of Geometric Dimensions Modeled with a Probabilistic Approach

Metrology is extensively used in the manufacturing industry to determine whether the dimensions of parts are within their tolerance interval. However, errors cannot be avoided. While metrology experts are well aware of this, and are currently able to identify the different sources that contribute to errors, very little research has been done to develop metrology methods that account for such errors. The probability density function of the error is here assumed to be given as an input. This work deals with a batch of measures and its statistical properties. The first proposed method aims to correct the effects of the errors on the distribution that characterizes the entire batch. A second method then tries to estimate, for each single measure, the dimension that is most likely given that measure, after the error is deducted. It is based on the output knowledge of the first method and integrates it with Bayesian statistics. Only Gaussian distributions are considered in the paper. Their relevance is shown through one example applied to simulated data.

Marc Gille, Pierre Beaurepaire, Fabien Taghon, Antoine Dumas, Nicolas Gayton, Thierry Yalamas
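With everything Gaussian, the per-measure estimate of the second method has a closed form: the posterior mean shrinks the bias-corrected measure toward the batch mean, weighted by the two variances. A minimal sketch (all numbers invented, and the formula is the generic Gaussian-Gaussian posterior, not necessarily the paper's exact formulation):

```python
def most_likely_dimension(measure, batch_mean, batch_sd, err_bias, err_sd):
    """Posterior mean of a part's true dimension given one measure,
    the batch distribution, and a known Gaussian error pdf."""
    # Weight of the (bias-corrected) measurement vs. the batch prior:
    w = batch_sd ** 2 / (batch_sd ** 2 + err_sd ** 2)
    return batch_mean + w * (measure - err_bias - batch_mean)

# Illustrative shaft diameters (mm): batch N(10.00, 0.02), error N(0.01, 0.03)
d_hat = most_likely_dimension(10.07, batch_mean=10.00, batch_sd=0.02,
                              err_bias=0.01, err_sd=0.03)
```

Because the error scatter here exceeds the batch scatter, the estimate moves well back toward the batch mean rather than trusting the raw reading.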

### International Codes in the Prediction of Load-Bearing Capacity of Slender Columns

The bearing-buckling capacity of slender columns is carefully addressed in modern codes by different analytical methods and by non-linear advanced numerical methods for structural analysis. This work summarizes an experimental campaign and the safety formats of international codes available for the prediction of the load-bearing capacity of slender columns. The work is motivated by a round-robin investigation considering numerical simulations based on non-linear material models and second-order effects (one of the methods suggested in the Eurocode), which revealed an expressive overestimation of the load capacity of single slender columns when comparing the numerical results with the experimental ones. Consequently, a high interest in the safety formats of international codes arises, aiming at the comparison of the predictions provided by different methods described in design codes from Canada, China, Japan and the USA with the European one (namely, the nominal curvature method), as well as at the identification of the strengths and weaknesses of such methods.

Alfred Strauss, Neryvaldo Galvão, José C. Matos, Michael Hauser, Benjamin Täubling, Mohamed Soliman, Mohammad Tamini, Xin Ruan, Lingfeng Zhu, Hiroki Ishibashi

### Investigation of Parameter Uncertainties Inherent to the Geotechnical Design of Bank Revetments at Inland Waterways

This paper describes the effects of the uncertainty inherent in the choice of hydraulic load and soil parameters on geotechnical revetment design. With the practitioner in mind, the effect of uncertainties on the required armour layer thickness is studied. Uncertainties inherent in revetment design mainly result from the load and resistance parameters employed in the design. At present, design loads are obtained from empirical equations and worst-case ‘design vessel passages’. Characteristic soil parameters are defined on the basis of a limited number of field and laboratory tests. Thus, uncertainties arise with regard to the choice of characteristic values. In order to investigate the effects of parameter uncertainty on the revetment design, distributions and correlations of loads are assessed using vessel passages observed in the field. The ensuing uncertainty analyses find that the presently available data do not allow approximating loads by means of probability functions, whereas for the soil parameters the results indicate that the minima of the soil parameters govern the design. However, it is also found that when more than one soil parameter is considered as a random variable, a less conservative design can be achieved than with the individual minima. In conclusion, recommendations regarding parameter choice and design procedure are provided.

Julia Sorgatz, Jan Kayser

### Life-Cycle Cost Analysis of a Viaduct Considering Uncertainties on the Interventions Plan

Life-cycle analysis is usually understood as the assessment of a system covering the three pillars of sustainability, i.e. economic, environmental, and social aspects. Many works have been developed presenting and discussing methodologies and frameworks to include the evaluation of these aspects. In general, economic performance is the most addressed, followed by the environmental aspect. A highly generalized formulation is usually used regarding social aspects, since these concern society at large. The present work focuses on the economic aspects of the assessment, implementing a life-cycle cost analysis methodology for an infrastructure system (a viaduct) covering all direct costs for agencies/owners. Furthermore, to indirectly account for climate change and the requirements of the quality control plan (QCP) for operational issues, a semi-probabilistic approach is carried out. The life-cycle cost analysis assumes a uniform probability distribution for the intervention times of maintenance and rehabilitation activities. With this, variations in the degradation processes due to climate change and/or intervention needs to fulfill QCP requirements are considered. For this purpose, the time at which each intervention might occur in the future is treated as a random variable. Monte Carlo simulation is then used to compute the cost of several different scenarios. In addition, the present work provides a second, deterministic approach: it estimates the life-cycle costs deterministically and accounts for the uncertainties associated with the life-cycle process at the end of the analysis in a rough way, assuming a general coefficient of variation of ±20%. The main goal of this work is to understand the impact that uncertainties in the intervention schedule might have on the final life-cycle cost.

Carlos Santos, Mário Coelho, Monica Santamaria, José C. Matos, Mauricio Sanchéz-Silva
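The probabilistic variant described above — uniformly distributed intervention times, discounted and averaged over Monte Carlo samples — can be sketched in a few lines (discount rate, costs and intervention windows are all invented):

```python
import random

def lcc_monte_carlo(interventions, discount_rate=0.03, n=20_000, seed=7):
    """Expected discounted life-cycle cost with uniformly distributed
    intervention times. interventions: list of (cost, t_min, t_max) in years."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n):
        total += sum(cost / (1.0 + discount_rate) ** rng.uniform(t0, t1)
                     for cost, t0, t1 in interventions)
    return total / n

# Illustrative plan: two maintenance actions and one major rehabilitation
plan = [(50_000.0, 8.0, 12.0),
        (50_000.0, 23.0, 27.0),
        (400_000.0, 40.0, 60.0)]
expected_cost = lcc_monte_carlo(plan)
```

Because discounting is convex in time, the Monte Carlo mean sits slightly above the cost evaluated at the mid-window times — one reason the probabilistic and deterministic approaches in the paper need not coincide.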

### Location Dependency on Resilience and Material Intensity of an Office Building Keeping an Eye on Seismic Zone Implications

To describe the urban metabolism of the built environment, bottom-up approaches are often used in the literature. In these approaches, building typologies play an important role, where the material intensity of building types is described by material composition indicators. These indicators are mostly static and refer to the construction method of the respective type. However, the influence of the geological conditions of the location on the material requirement has so far hardly been examined, if at all. Many fast-growing cities, especially in developing countries, which generate a high demand for materials, are located in areas that are regularly hit by earthquakes. This entails different structural requirements for one and the same building type if it is located in different seismic zones. Using FEM, it was investigated for a selected office building what role the location of this building plays with regard to its material consumption, i.e. how the structural requirements from the seismic load in different zones affect the demand for the mass-relevant building materials of the supporting structure. For this purpose, a 3D building model was created and the load-bearing components were dimensioned under the load combinations typical for the seismic zones. The modeling results show a clear dependence of the building material requirement on the seismic zone in which the building type is located. Static building-type-specific approaches with constant material composition indicators, however, do not reflect this effect. In conclusion, seismic zone dependence should be given greater consideration in the modeling of material composition indicators in the future.

Regine Ortlepp, Mahar A. Gul

### Long Term Evaluation of the Structural Reliability of an Existing Concrete Prestressed Bridge

Reliability is an important factor in determining how safe a structure is. The aim of this study is to use the concept of reliability to manage maintenance and to plan the interventions that may become necessary. The first part includes the calibration of the model, verifying the obtained results. The second part provides a 100-sample nonlinear analysis, considering the statistically important random variables. Each sample is generated considering the mean and standard deviation of each random variable, using the Latin Hypercube method to couple them. The output is the load factor probability distribution. Using an overload probabilistic curve, the reliability index is computed according to the Monte Carlo method. The third part illustrates the calculation of the corrosion effect, using the fib Bulletin 34 guidelines. Once the corroded area and the corrosion depth over time are determined, the reliability index is computed for different points in time. The trend of the reliability index over time is obtained in relation to the variation of the standard deviation and the load factor values.

Tommaso Donolato, Neryvaldo Pereira, José C. Matos

### Model Updating with Reduced Experimental Data

Bayesian updating is increasingly used in structural engineering; it is applicable as an inverse method to identify the model of uncertainty which best matches some available experimental data. This paper discusses the application of such methods to models with multiple outputs in case the experimental data is reduced. Standard updating methods involve the covariance matrix, which becomes rank deficient in case the experimental data is too scarce. The use of this rank-deficient matrix with the standard methods is first discussed. A new method is then proposed; it relies on the generation of multiple samples of the prior distribution. These samples are used to “enrich” the missing data and construct a prior distribution of the terms of the covariance matrix that cannot be identified from the data.

Pierre Beaurepaire
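One way to picture the enrichment idea — not the authors' exact scheme — is to blend the rank-deficient covariance estimated from the few experiments with a covariance built from samples of the prior model, so that every term of the matrix is identified. A minimal sketch with invented data:

```python
import random
import statistics

def shrunk_covariance(data_rows, prior_rows, weight):
    """Blend the (possibly rank-deficient) data covariance with a covariance
    estimated from samples of the prior model. 'weight' is the share given
    to the prior-based matrix."""
    def cov(rows):
        d = len(rows[0])
        n = len(rows)
        means = [statistics.mean(r[j] for r in rows) for j in range(d)]
        return [[sum((r[i] - means[i]) * (r[j] - means[j]) for r in rows)
                 / (n - 1) for j in range(d)] for i in range(d)]
    cd, cp = cov(data_rows), cov(prior_rows)
    d = len(cd)
    return [[(1.0 - weight) * cd[i][j] + weight * cp[i][j]
             for j in range(d)] for i in range(d)]

rng = random.Random(0)
data = [[1.0, 2.1, 0.9], [1.2, 1.9, 1.1]]  # only two experiments, three outputs
prior = [[rng.gauss(1.1, 0.2) for _ in range(3)] for _ in range(200)]
c = shrunk_covariance(data, prior, weight=0.5)
```

With two experiments the data covariance has rank one at most; the prior-based term restores full rank so the standard updating machinery can be applied.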

### Numerical Modeling of an Extrusion-Based Concrete Printing Process Considering Spatially and Temporally Varying Material and Process Parameters

During the past few years, additive manufacturing techniques for concrete have gained extensive attention. In particular, extrusion-based 3D concrete printing has exhibited a rapid development. Previous investigations are mostly based on experimental studies or even trial-and-error tests. A more profound understanding of the relationships between the process and material parameters and the manufactured structure can be advanced by numerical modeling and simulation. This enables the study of a wide range of parameters, such that dependencies of the properties of the printed product on different influencing factors can be identified. By taking into account the uncertain nature of the process and material parameters of extrusion-based concrete printing, the process can be reliably controlled and finally optimized. The presented study uses a pseudo-density for a finite element based modeling approach. The pseudo-density determines the properties of the individual finite elements, analogous to the soft-kill method of topology optimization. Layer by layer, the previously created elements are activated. Material parameters are described as temporally and spatially variable to reflect the time-dependent printing process. First results of a reliability estimation are shown for a 2D model of an additively manufactured wall.

Albrecht Schmidt, Meron Mengesha, Luise Göbel, Carsten Könke, Tom Lahmer

### Parameter Uncertainties in Flow Rate and Velocity Analysis of Heavy Rain Events

Flooding due to intensive precipitation poses a major threat to lives and property. Information about flood-prone areas is needed in order to reduce potential risks via mitigation and adaptation measures. The flood probability of a certain point in the landscape depends, firstly, on the projected frequency and characteristics of heavy rainfall events that generate surface runoff and, secondly, on the specific properties of the terrain that determine flow and runoff, e.g. the surface morphology and the hydraulic roughness of diverse surfaces or land uses. Simulation models of surface water flow are standard tools for the assessment of flood dynamics caused by extreme precipitation events. In order to make informed decisions that take modelling uncertainties into account as well as to get an idea of the probability space, it is necessary to quantify the effects of alternative sets of model parameters by drawing on different data sources as well as spatial and temporal resolutions of the input data. For the current study, we evaluated the impact of different parameter sets on the flow rate and velocity as determined for a study area in south-eastern Germany using the hydronumeric computational fluid dynamics model HiPIMS. The considered parameters were rainfall input (time and space invariant, spatially invariant and time varying, space and time varying), hydraulic roughness and spatial resolution of the digital elevation model. We present point-based time series of flow rates and velocities to indicate the bandwidths of probable flooding dynamics. Results show that the modelled flow rates and velocities are strongly dependent on the particular form of rainfall data as well as the spatial resolution of the digital elevation model. The effects of variations in hydraulic roughness are also found to be significant while in all cases the location of data capture points in the catchment area has a strong influence.

Axel Sauer, Regine Ortlepp

### Prediction of Concrete Breakout Strength of Single Anchors in Shear

This study proposes a machine learning algorithm—Gaussian process regression (GPR)—for predicting the concrete breakout capacity of single anchors in shear. To this end, the experimental strengths of 366 tests on single anchors with concrete edge breakout failures were collected from the literature to establish the experimental database used to train and test the model. 70% of the data were used for model training, and the rest were used for model testing. Shear influence factors such as the concrete strength, the anchor diameter, the embedment depth (technically the influence length), and the concrete edge distance were taken as the model input variables. The generated predictive model yielded a determination coefficient $${\text{R}}^{2}$$ = 0.99 for both the training and testing data sets. Predictions from the developed model were compared to those of existing models (Eurocode 2 and ACI 318) to validate its performance. The developed model provided a better prediction of the experimentally observed shear strength than the existing models, yielding low mean absolute error, low bias and low variability when tested.
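GPR, the method named above, predicts by conditioning a Gaussian prior over functions on the training data. A self-contained sketch with an RBF kernel and a tiny linear solver — the training points stand in for normalized anchor parameters and capacities, and nothing here reproduces the paper's database or fitted model:

```python
import math

def rbf(a, b, ls):
    """Squared-exponential (RBF) kernel with length scale ls."""
    return math.exp(-sum((x - y) ** 2 for x, y in zip(a, b)) / (2.0 * ls ** 2))

def solve(A, b):
    """Gaussian elimination with partial pivoting (small systems only)."""
    n = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for k in range(n):
        p = max(range(k, n), key=lambda r: abs(M[r][k]))
        M[k], M[p] = M[p], M[k]
        for r in range(k + 1, n):
            f = M[r][k] / M[k][k]
            for c in range(k, n + 1):
                M[r][c] -= f * M[k][c]
    x = [0.0] * n
    for k in range(n - 1, -1, -1):
        x[k] = (M[k][n] - sum(M[k][c] * x[c] for c in range(k + 1, n))) / M[k][k]
    return x

def gpr_predict(X, y, x_new, ls=1.0, noise=1e-6):
    """Zero-mean GP regression posterior mean at x_new (a minimal sketch)."""
    K = [[rbf(a, b, ls) + (noise if i == j else 0.0)
          for j, b in enumerate(X)] for i, a in enumerate(X)]
    alpha = solve(K, y)
    return sum(a * rbf(x, x_new, ls) for a, x in zip(alpha, X))

# Invented, normalized training data (e.g. edge distance -> capacity):
X = [[0.0], [0.5], [1.0]]
y = [0.0, 0.55, 0.9]
pred = gpr_predict(X, y, [0.5])
```

With near-zero noise the GP interpolates its training points exactly; a production study would instead use an established implementation and optimize the kernel hyperparameters.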

### Probabilistic Characterization of the Axial Load Bearing Capacity of a Concrete Column Exposed to the Standard Fire

To demonstrate adequate structural fire safety for exceptional designs, the uncertainties of the design input parameters must be explicitly considered. In this contribution, the case study included in ISO/CD TR 24679-8:2020 of a concrete column subject to a standardized heating regime is revisited considering improved uncertainty modelling for the input parameters. Monte Carlo simulations are applied to obtain the distribution of the axial load bearing capacity of the column, $${P}_{\max}$$, at 240 min of ISO 834 standard fire exposure. The obtained distribution, however, does not fit any distribution type commonly assumed for the resistance effect. To get more detailed information on the parameters governing the distribution of $${P}_{\max}$$, and to allow for the application of more efficient calculation procedures and the development of design guidance, a detailed analysis of the obtained distribution for the load bearing capacity is conducted. The effect of each input parameter's uncertainty on the column capacity is quantified using three different methods of sensitivity analysis. Furthermore, the distribution type describing the concrete load bearing capacity for the considered standard fire exposure is evaluated in detail. It is concluded that the parameter defining the quantile of the concrete strength retention is the main contributor to the variability of the column capacity at 240 min of standard fire exposure. Furthermore, it is found that the column capacity can be described by a mixed lognormal distribution, considering constituent lognormal distributions for fixed values of the concrete strength retention parameter. Based on these findings, improvements for the probability-of-failure calculations of fire-exposed concrete columns are developed. The analysis provides insight for the reliability-based design of concrete columns exposed to fire, achieving a specified target safety level.

Balša Jovanović, Ruben Van Coile

### Probabilistic FEM-Analysis for the Retaining Wall of a Deep Excavation at SLS

Common practice for the design of retaining walls for deep excavations is to use characteristic values of the geotechnical parameters—as a cautious estimate—for the Serviceability Limit State (SLS), combined with partial factors for the Ultimate Limit State (ULS), as indicated in current design codes such as the Eurocodes. However, more complex probabilistic approaches are increasingly being applied in order to provide a more uniform level of reliability, thus reducing the cost of the investment or the risk, or both. In terms of tools and methods for performing the calculations, the Finite Element Method (FEM) is very popular nowadays due to accessible computing power and user-friendly specialized software, which can provide more realistic models with affordable calculation effort. The present paper presents a case study of a full probabilistic analysis of a retaining wall for a real deep excavation project in Bucharest, Romania, using FEM calculations in the Plaxis 2D software coupled with the Probabilistic Toolkit (PTK) software for the reliability calculations. The limit state function is set on a target value for the displacements of the retaining wall to allow design for the SLS, since this is in many cases the governing state for deep excavations in urban areas. Different probability distributions are used for assessing the statistics of the geotechnical parameters, and the reliability results obtained through these are discussed. A discussion is also made on the necessity of including more specific target reliability values for SLS verification, especially for temporary structures, in the design codes.

Alexandra Ene, Timo Schweckendiek, Horatiu Popa

### Probabilistic Methods for Code Calibration Exemplified for the Punching Shear Resistance Model Without Shear Reinforcement

To prevent punching shear failures, modern structural codes, such as EN 1992-1-1, offer design provisions for punching shear. To verify the performance of new code equations, their values are usually compared to test results. However, values for the safety factors are often determined by calibration against older code provisions. The deterministic nature of such an approach introduces imprecisions that can compromise the safety level of a structural component. Nowadays, modern reliability analysis techniques, combined with increasing computer capabilities, make it possible to evaluate the safety level of structural components efficiently and precisely. This paper contributes to code calibration through different reliability analysis techniques, whereby the safety level of the design provisions for punching shear resistance without shear reinforcement is investigated. This study addresses three reliability techniques: Mean-Value First-Order Second-Moment (MVFOSM), First-Order Second-Moment (FOSM) and Monte Carlo with Importance Sampling (MC-IS). The parameters influencing the punching shear capacity are stochastically modelled and evaluated. Then, the reliability indices β, used as a practical measure of the safety level, are estimated and compared to the β-target of 3.8 given by EN 1990 for a 50-year period. The results seem to confirm that the punching shear provisions for structural elements without shear reinforcement according to EN 1992-1-1 achieve a required safety level in line with the β-target. The study shows that the FOSM and MC-IS techniques seem appropriate to determine the failure probability of the design equations for the punching shear capacity without shear reinforcement, whereas the MVFOSM method may not be suitable to evaluate the absolute safety level of those design equations. Furthermore, the study shows that the design equations are more sensitive to the variable describing the model uncertainty than to any other variable, which stresses the importance of an adequate statistical analysis of the basic variables of the resistance model.
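For a linear limit state $g = R - E$ with known first and second moments, the MVFOSM reliability index has the closed form $\beta = (\mu_R - \mu_E)/\sqrt{\sigma_R^2 + \sigma_E^2}$. A minimal sketch under assumed moments (the actual calibration models many basic variables; this two-variable form only illustrates the index):

```python
import math

# Illustrative, assumed moments (NOT the calibrated punching shear model):
# resistance R and load effect E in kN, limit state g = R - E.
mu_R, sigma_R = 900.0, 135.0   # assumed resistance mean / std
mu_E, sigma_E = 450.0, 90.0    # assumed load-effect mean / std

def beta_mvfosm(mu_r, s_r, mu_e, s_e):
    """MVFOSM reliability index for the linear limit state g = R - E."""
    return (mu_r - mu_e) / math.sqrt(s_r ** 2 + s_e ** 2)

def pf_from_beta(b):
    """Failure probability Pf = Phi(-beta) via the standard normal CDF."""
    return 0.5 * (1.0 + math.erf(-b / math.sqrt(2.0)))

beta = beta_mvfosm(mu_R, sigma_R, mu_E, sigma_E)
print(round(beta, 2), f"{pf_from_beta(beta):.2e}")
```

A computed β below the EN 1990 target of 3.8 would flag the provision for closer scrutiny. For nonlinear limit states MVFOSM is not invariant to the formulation of g, a well-known limitation consistent with the study's finding that it is unsuitable for absolute safety levels.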

Tânia Feiri, Marcus Ricker, Jan Philip Schulze-Ardey, Josef Hegger

### Probabilistic Modeling of Impact of Vehicles on the Road Furniture

Numerous components of transport infrastructure are located along the route near the driving lanes of roadways. Such components have to be secured by restraint systems, and in many cases different road lanes must also be effectively separated from each other. The focus of the study presented herein is a probabilistic approach for the departure of motor vehicles from their intended lane. Presently, assessments of the road infrastructure regarding possible accidents are primarily oriented toward evaluating the resistance side. This paper, in contrast, addresses the impact side by focusing on the likelihood of vehicle impacts on the road furniture. In order to determine the probability of impact, parameters of the traffic composition, the alignment, and the pavement conditions were studied. A novel methodology is presented which, by accounting for these factors, assesses the fragility of the infrastructure sub-system. The assessment joins road engineering physics and expert judgement, and it is incorporated in a spreadsheet tool. The feasibility of the tool is demonstrated, and sensitivities of the evaluation process are discussed and evaluated.

Alfred Strauss, Panagiotis Spyridis, Ivan Zambon, Thomas Moser, Christian Honeger, Dan M. Frangopol

### Probabilistic-Based Consequence Analysis for Transport Networks

The aim of this paper is to propose a methodological framework for the consequence analysis of transportation networks. The probabilistic framework is based on the definition of performance indicators that describe the time-dependent functionality of the asset/system, starting from a pre-existing normal performance state and capturing the timing and evolution of disruption during and after the disruptive event and during the recovery/restoration stage. A case study that will be used to demonstrate the applicability of the framework is described.

### Probability of Flooding Due to Instability of the Outer Slope of a Levee

The Netherlands is protected against major floods by a system of primary flood defenses (levees, dunes and hydraulic structures). These must comply with standards defined in terms of maximum allowable probabilities of flooding. Therefore, a new assessment framework for the main failure mechanisms is based on a probabilistic approach. One of the failure mechanisms which does not yet follow such a probabilistic approach is instability of the outer slope of a levee. This failure mechanism is of importance following a rapid water level drop after a high-water event. In such an event, the pore water pressures in the levee are still high, and if an instability occurs, flooding can follow when (1) there is no time to take emergency measures before a second high-water event arrives and (2) there is insufficient residual strength to prevent flooding during this consecutive high-water event. Levee reinforcement projects in the Netherlands allocate significant resources to resolve the presumed lack of safety of the levee due to the outer slope instability mechanism. Hence, this paper discusses how outer slope stability can be assessed probabilistically. A failure due to outer slope instability depends, besides the characteristics of the levee, on the peak water level, the water level drop velocity, the inter-arrival time between two consecutive high-water events and the time needed to take emergency measures. In this paper, a framework based on event trees is presented, using Intensity-Duration-Frequency (IDF) curves to include the time-dependent statistics of the water level drop. This is a novel approach for outer slope instability in the Netherlands and results in less conservatism in assessments and designs, and therefore fewer resources required to mitigate the mechanism.
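The event-tree logic can be sketched as a product of branch probabilities: flooding requires the instability, an early second high-water event, and insufficient residual strength. All branch probabilities below are hypothetical placeholders, not assessed values.

```python
# Hypothetical event-tree sketch for flooding after outer slope instability.
# All branch probabilities are illustrative placeholders, not assessed values.
p_instability      = 1e-3   # outer slope fails after a rapid water level drop
p_early_second_hw  = 0.2    # second high-water event before emergency measures
p_residual_fails   = 0.5    # insufficient residual strength for that event

def p_flood(p_inst, p_early, p_res):
    """Series event tree: flooding requires all three branches to occur."""
    return p_inst * p_early * p_res

print(p_flood(p_instability, p_early_second_hw, p_residual_fails))
```

The conditional branches are what remove the conservatism of equating slope instability with flooding outright.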

Anton W. van der Meer, Ana Teixeira, Arno P. C. Rozing, Wim Kanning

### Reliability Analysis of Timber Elements Under Different Load Types and Identification of Critical Scenarios for the Evaluation of Existing Structures

Maria Loebjinski, Wolfgang Rug, Hartmut Pasternak

### Reliability Assessment of Oil and Gas Pipeline Systems at Burst Limit State Under Active Corrosion

Civil infrastructures such as oil and gas transportation systems play a vital role in industrial and public energy distribution and consumption. A large number of existing oil and gas transportation pipelines in many cities in the USA are reaching the end of their design life and are at risk. Failure in these systems can potentially cause adverse effects on society, the economy, and the environment. Asset managers often need to prioritize the critical segments based on the risk of failure, the available budget, and resources. In this paper, the fitness for service of oil and gas pipelines and network integrity are evaluated probabilistically using various burst pressure models to prioritize the riskiest segments and support asset management. The current state-of-the-art burst failure models for pressurized metallic pipelines are compared using a physical probabilistic approach. Since metallic pipelines for oil and gas transportation are typically designed for a long lifespan and experience localized corrosion deterioration throughout their lifetime, a steady-state corrosion model was assumed to account for the effect of external corrosion deterioration on the burst pressure of pipelines. A Monte Carlo simulation technique is utilized to generate the fragility curves of pipelines considering corrosion deterioration over time. Uncertainties involved in the various parameters related to burst failure and fragility estimation are modelled based on the knowledge gained from past research. A comparative analysis is presented for various fragility models of pipelines. In addition, system reliability was evaluated using a minimum cut sets approach. The proposed approach is illustrated for a simple hypothetical oil/gas transmission system. Outcomes of the study show a consistent trend of failure for the various models over time. The results of the probabilistic models of burst failure are analyzed, and recommendations are provided to support asset management planning.
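The fragility-curve construction can be sketched with a simplified thin-wall (Barlow-type) burst expression and a steady-state corrosion rate; all parameter values below are illustrative assumptions, and the burst models compared in the paper (e.g. B31G-type) are more refined than this estimate.

```python
import random

random.seed(1)

# Illustrative (assumed) pipeline and uncertainty parameters
D_pipe = 0.6              # pipe diameter [m]
t0 = 0.012                # nominal wall thickness [m]
rate_mu = 0.1e-3          # mean steady-state corrosion rate [m/yr] (assumed)
rate_sd = 0.3 * rate_mu   # 30% COV (assumed)
su_mu = 450e6             # mean ultimate strength [Pa] (assumed)
su_sd = 0.07 * su_mu      # 7% COV (assumed)

def burst_pressure(thickness, su):
    """Thin-wall (Barlow-type) burst estimate p = 2 * su * t / D."""
    return 2.0 * su * thickness / D_pipe

def p_fail(age_years, p_op, n=20000):
    """Monte Carlo point on the fragility curve: P(burst < operating pressure)."""
    fails = 0
    for _ in range(n):
        rate = max(random.gauss(rate_mu, rate_sd), 0.0)
        su = random.gauss(su_mu, su_sd)
        thickness = max(t0 - rate * age_years, 0.0)
        if burst_pressure(thickness, su) < p_op:
            fails += 1
    return fails / n

pf20 = p_fail(20, 10e6)   # failure probability after 20 years of corrosion
pf40 = p_fail(40, 10e6)   # fragility grows with exposure time
print(pf20, pf40)
```

Repeating the estimate over a grid of ages traces the time-dependent fragility curve used to rank segments.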

Ram K. Mazumder, Abdullahi M. Salman, Yue Li

### Risk Assessment of a Railway Bridge Subjected to a Multi-hazard Scenario

Bridges are valuable assets of the rail and road network, providing crossings at critical links such as waterways, valleys, and other types of facilities. However, these structures are exposed to several threats during their life-cycle, such as natural hazards and deterioration, which compromise their performance. To assess the condition state of such infrastructure and define maintenance and mitigation strategies, several performance indicators of a quantitative nature have been proposed during the last decades by several researchers. Among those indicators, risk has received great attention, as it accounts both for the performance of infrastructures subjected to hazard events and for the consequences associated with an inadequate level of service of the infrastructure. Nevertheless, risk is not a stationary indicator, i.e. several parameters involved in the estimation of risk are time-dependent. One of them is the structural capacity of infrastructures, which is affected by deterioration effects over time. This gradual deterioration can be regarded as an interceptable hazard, which may act simultaneously with non-interceptable hazards such as natural events (e.g. earthquakes). Therefore, a risk assessment framework should account for the probability of these multiple hazards acting during the service life of infrastructures. The aim of this paper is to conduct a risk assessment for a railway bridge subjected to a multi-hazard scenario, i.e. an observable, interceptable hazard corresponding to chloride-induced corrosion of the reinforcing steel in reinforced concrete elements, together with seismic hazard. The results of the study demonstrate the relevance of considering time-dependent deterioration effects in the risk assessment of bridges, as the increase in seismic fragility over time is significant. These findings are relevant for decision-making to plan and execute optimal interventions.

João Fernandes, Monica Santamaria, José C. Matos, Daniel V. Oliveira, António Abel Henriques

### Risk Assessment of Road Infrastructures as Key for Adaptability Measures Selection

Erica L. Arango, Hélder S. Sousa, José C. Matos

### Risk-Driven Decision Making Within the Observational Method: Case Study Based on the New International Airport of Mexico City

In geotechnical design, the Observational Method is an attractive option for reducing construction costs without compromising safety, especially when dealing with a high level of uncertainty. Additionally, the benefits of the Observational Method are enhanced when it is applied in a probabilistic setting. Designing the soil improvement of the runways for the New International Airport of Mexico City (Nuevo Aeropuerto Internacional de la Ciudad de México—NAICM) carried significant risk due to the extremely soft soil, the soil-related uncertainties and the strict pavement operation requirements. Instead of opting for an over-conservative and costly design, the Observational Method was adopted in order to steer the soil improvement works according to the monitored soil behaviour. The analysis presented in this paper, which is based on an example inspired by the NAICM, employs a probabilistic framework composed of several probabilistic tools in order to estimate the reliability of the design. Specifically, incoming monitoring (soil response) data are utilized in several reliability updating steps, giving insight into the probability of the design meeting the operational requirements. Moreover, assessing the reliability of a design allows for the quantification of risk, which can be a strong motivator during the decision-making process. Design decisions, such as the application of mitigation measures, can be made in the direction of risk minimization. Finally, the entire procedure of the Observational Method and the steering of the design throughout the soil improvement phase are illustrated in a decision tree. This paper draws conclusions on the benefits of incorporating probabilistic concepts in large-scale projects with strong uncertainties, as well as on utilizing risk as motivation for decision making, which eventually proves valuable for project management.

Antonios Mavritsakis, Martin de Kant, Joost van der Schrier

### Rockburst Risk Assessment Based on Soft Computing Algorithms

A key issue affecting many deep underground mines around the world is the rockburst phenomenon, which can have a strong impact in terms of costs and lives. Accordingly, it is important to understand rockbursts in order to support decision makers when such events occur. One way to obtain a deeper and better understanding of the mechanisms of rockburst is through laboratory experiments. Hence, a database of rockburst laboratory tests was compiled, which was then used to develop predictive models for the rockburst maximum stress and rockburst risk indexes through the application of soft computing techniques. The next step is to explore data gathered from in situ cases of rockburst. This study focuses on the analysis of such in situ information in order to build influence diagrams, enumerate the factors that interact in the occurrence of rockburst, and understand the relationships between them. In addition, the in situ rockburst data were also analyzed using different soft computing algorithms, namely artificial neural networks (ANNs). The aim was to predict the type of rockburst, that is, the rockburst level, based on the geologic and construction characteristics of the mine or tunnel. One of the main observations from the study is that a considerable percentage of accidents occur as a result of excessive loads, generally at depths greater than 1000 m. It was also observed that soft computing algorithms can contribute significantly to determining the rockburst level based on geologic and construction-related parameters.

Joaquim Tinoco, Luis Ribeiro e Sousa, Tiago Miranda, Rita Leal e Sousa

### Semi-empirical Based Response Surface Approach for Reliability Evaluation of Steel Plates with Random Fields of Corrosion

The paper presents a semi-empirical based response surface approach for structural reliability analysis of steel plates with non-uniform corrosion represented by random fields. The approach consists of using a semi-empirical design equation as simplified response surface model, which is then calibrated iteratively by means of the results of non-linear finite element analyses at the design points calculated by the First Order Reliability Method. This technique has been successfully applied to problems formulated in terms of discrete random variables and is now applied to problems involving spatial variability of structural parameters represented by random fields. The approach is first illustrated with an example of the ultimate strength of plates with random imperfections and material properties and then applied to plates with random fields of corrosion discretized using the Expansion Optimal Linear Estimation method. The results obtained by the semi-empirical based response surface approach and by coupling directly the First Order Reliability Method and the finite element code are compared.

Angelo P. Teixeira, Carlos Guedes Soares

### Spatial Variability of Rebar Corrosion and Performance Evaluation of Corroded RC Structures Using Probabilistic Analysis and Finite Element Method

Corrosion of steel reinforcement is spatially distributed over RC structures due to several factors, such as different environmental exposures, concrete quality and cover. Ignoring the effect of spatial variability is a drastic simplification in the prediction of the remaining service life of RC structures. Therefore, it is essential to identify the parameters influencing spatial steel corrosion and the structural performance of corroded RC structures. In this paper, experimental research was conducted to study the effects of current density, concrete cover, rebar diameter, and fly ash on the spatial variability of steel weight loss, corrosion cracking, and the structural behavior of corroded RC beams using X-ray and digital image processing techniques. The test results showed that low current density induced highly non-uniform corrosion associated with a few large pits and cracks at certain locations, while higher current density produced more uniform corrosion, with cracks occurring over the whole beam. Gumbel distribution parameters were derived from the experimental data of steel weight loss to model spatial steel corrosion. A novel approach was established to assess the reliability of RC structures using finite element analysis and probabilistic simulation considering the spatial variability in steel weight loss. Using the Gumbel distribution parameters derived from the steel weight loss data associated with higher current density may underestimate the non-uniformity of the corrosion distribution, which can lead to an overestimation of the load capacity of corroded RC structures.

Mitsuyoshi Akiyama, Dan M. Frangopol, Mingyang Zhang

### Statistical Dependence Investigation Related to Dowel-Type Timber Joints

The design of timber connections with dowel-type fasteners depends on knowledge of their mechanical behavior and failure modes. Concerning the main design parameters, the timber embedment strength and the dowel bending moment capacity are the parameters that govern the load-carrying capacity. The correlation between the timber embedment strength and the dowel bending moment capacity has not yet been sufficiently addressed in the literature. However, since they both share a common dependency on the timber density, they are probably correlated. To investigate this, traditional distribution fitting procedures, as well as copula functions, are implemented to capture the correlation between them. By doing so, it is aimed to evaluate the effectiveness of the different approaches in describing the dependence structure of the variables and their influence on the structural reliability. It was found that, for single dowel-type connections, the impact of the copulas on the results is small. This indicates that, unless significantly nonlinear correlations exist among the data, the results obtained by applying different copula functions will probably be very close.
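A Gaussian copula for this kind of dependence can be sketched by drawing correlated standard normals (via a 2×2 Cholesky factor) and mapping them to lognormal marginals for the embedment strength and dowel bending capacity; the correlation and marginal parameters below are assumed, not fitted values.

```python
import math
import random

random.seed(7)

# Gaussian copula sketch for timber embedment strength f_h [MPa] and dowel
# bending capacity M_y [kNm]; rho and the lognormal marginals are assumed.
rho = 0.6
ln_mu_fh, ln_sd_fh = math.log(25.0), 0.15
ln_mu_my, ln_sd_my = math.log(0.9), 0.10

def sample_pair():
    """Correlated standard normals (2x2 Cholesky), mapped to lognormal marginals."""
    z1 = random.gauss(0.0, 1.0)
    z2 = rho * z1 + math.sqrt(1.0 - rho ** 2) * random.gauss(0.0, 1.0)
    return math.exp(ln_mu_fh + ln_sd_fh * z1), math.exp(ln_mu_my + ln_sd_my * z2)

pairs = [sample_pair() for _ in range(20000)]
xs = [math.log(fh) for fh, _ in pairs]
ys = [math.log(my_) for _, my_ in pairs]
mx = sum(xs) / len(xs)
my_bar = sum(ys) / len(ys)
corr = (sum((x - mx) * (y - my_bar) for x, y in zip(xs, ys))
        / math.sqrt(sum((x - mx) ** 2 for x in xs)
                    * sum((y - my_bar) ** 2 for y in ys)))
print(round(corr, 2))   # sample correlation in ln-space, close to rho
```

Other copula families (Clayton, Gumbel) change only the joint sampling step while keeping the same marginals, which is what allows their influence on reliability to be compared.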

Caroline D. Aquino, Leonardo G. Rodrigues, Wellison S. Gomes, Jorge M. Branco

### Stochastic Carbon Dioxide Forecasting Model for Concrete Durability Applications

Over the Earth's history, the climate has changed considerably due to natural processes directly affecting the planet. In the last century, these changes have driven global warming. Carbon dioxide is the main trigger of climate change, as it represents up to approximately 80% of total greenhouse gas emissions. Climate change and concrete carbonation accelerate the corrosion process, increasing infrastructure maintenance and repair costs by hundreds of billions of dollars annually. The concrete carbonation process is based on the presence of carbon dioxide and moisture, which lowers the pH value to around 9, at which point the protective oxide layer surrounding the reinforcing steel bars is penetrated and corrosion takes place. Predicting the effective retained service life and the need for repairs of a concrete structure subjected to carbonation requires carbon dioxide forecasting in order to increase the lifespan of the bridge. In this paper, short-term memory process models were used to analyze a historical carbon dioxide database, specifically to fill in missing database values and perform predictions. Various models were used and their accuracy was compared. We found that the proposed Stochastic Markovian Seasonal Autoregressive Integrated Moving Average (MSARIMA) model provides an $R^2$ value of 98.8%, a forecasting accuracy of 89.7% and a variance of the individual errors of 0.12. When compared with the CO2 database values, the proposed MSARIMA model provides a variance value of −0.1 and a coefficient of variation value of $-8.0 \times 10^{-4}$.

Bassel Habeeb, Emilio Bastidas-Arteaga, Helena Gervásio, Maria Nogal

### Stochastic Degradation Model of Concrete Bridges Using Data Mining Tools

Bridges have significant importance within the transportation system, given that their functionality is vital for the economic and social development of countries. Therefore, a high level of safety and serviceability must be achieved to guarantee an operational state of the bridge network. In this regard, it is necessary to track the performance of bridges and obtain indicators that characterize the evolution of structural pathologies over time. In this paper, the time-dependent expected deterioration of bridge networks is investigated by means of Markov chain models. Bridges in a network are likely to share similar environmental conditions, but, depending on their functional class, may be exposed to different loading conditions that affect their structural deterioration over time in different ways. Moreover, the deterioration rate is known to increase with time due to aging. Hence, it is useful to identify and divide the bridge network into classes sharing similar deterioration trends in order to obtain a more accurate prediction. To this end, a data mining tool, two-step cluster analysis, is applied to a dataset obtained from the National Bridge Inventory (NBI) database in order to find associations among the bridge characteristics that could contribute to building a more specific degradation model which accurately explains and predicts the future condition of concrete bridges. The results demonstrate a particular deterioration path for each cluster, where it is evidenced that older bridges and those with higher Average Daily Traffic (ADT) deteriorate faster. Therefore, the degradation models developed following the proposed methodology provide a more accurate prediction when compared to a single degradation model without clustering analysis. These more reliable models facilitate the decision process of bridge management systems.
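A Markov chain deterioration model amounts to repeated multiplication of the condition-state distribution by an annual transition matrix; the four-state matrix below is an illustrative placeholder, not one calibrated on NBI data or on a specific cluster.

```python
# Four-state Markov deterioration sketch: states 1 (good) .. 4 (poor).
# The annual transition matrix is an illustrative placeholder, not one
# calibrated on NBI data or on a specific cluster.
P = [
    [0.95, 0.05, 0.00, 0.00],
    [0.00, 0.92, 0.08, 0.00],
    [0.00, 0.00, 0.90, 0.10],
    [0.00, 0.00, 0.00, 1.00],   # state 4 is absorbing
]

def step(probs, P):
    """One-year propagation of the condition-state distribution."""
    n = len(P)
    return [sum(probs[i] * P[i][j] for i in range(n)) for j in range(n)]

probs = [1.0, 0.0, 0.0, 0.0]   # a new bridge starts in state 1
for _ in range(30):            # 30 years of deterioration
    probs = step(probs, P)

expected_state = sum((i + 1) * p for i, p in enumerate(probs))
print([round(p, 3) for p in probs], round(expected_state, 2))
```

Clustering the network as described would simply swap in a different transition matrix P per cluster, which is how cluster-specific deterioration paths arise.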

Yina F. M. Moscoso, Monica Santamaria, Hélder S. Sousa, José C. Matos

### Stochastic Simulation of Clay Brick Masonry Walls with Spatially Variable Material Properties

In the assessment of existing masonry structures, a high variability of material properties can be observed. The variability is also present within a single wall, which raises the question of how this spatial variability influences the load-bearing capacity and the reliability of an assessed masonry wall. With regard to reliability, lower quantile values of the load-bearing capacity are decisive. For this reason, the influence of spatial variability on the probability distribution of the load-bearing capacity has to be known. In this paper, clay brick masonry walls in compression are investigated by Monte Carlo simulations utilising a nonlinear finite element model. For the validation of the finite element model, experimental investigations of the stress redistribution capability of masonry walls with weak spots were carried out. The numerical model follows a simplified micro-modeling approach with unit-to-unit variability of the material properties. Results of the stochastic simulations are shown for varying wall length, slenderness and coefficients of variation of the material properties. The obtained statistical distributions of the load-bearing capacity are evaluated with respect to acceptable design values for ensuring structural reliability. It is shown that spatial variability leads to a reduction of the mean load-bearing capacity, but the overall variability of the load-bearing capacity is much smaller than that of the spatially varying material properties. Compared to an approach assuming homogeneity within the wall, the consideration of spatial variability leads to an increase of suitable design values.

Dominik Müller, Tilo Proske, Carl-Alexander Graubner

### Study on the Accuracy of Chloride Determination Methods and Their Predictions

At ASFiNAG, most structures are made of reinforced or prestressed concrete. Austria regularly experiences severe winters, so de-icing measures with salt are used for thawing. These substances contain chlorides that ingress into the concrete and lead to degradation. Several elements are strongly exposed to chlorides, such as girders of overpasses and columns of bridges beside and between roadways. For the condition assessment and service life prediction of existing road structures, the determination of a reliable chloride content is key. The article presents the results of the chloride content of reinforced concrete obtained by two methods. One is the conventional Cl determination described in standards. The other is LA-ICP-MS (Laser Ablation Inductively Coupled Plasma Mass Spectrometry), which is a fast, reliable, accurate and high-resolution analysis method. This method allows the determination of the chloride content as a fraction of the cement and additionally distinguishes between the aggregate and the cement phase. The profiles were determined densely, at steps of 3 mm in depth. Regressions with different boundary conditions were used to fit the obtained data according to Fick's second law. For comparison and prediction purposes, the corresponding convection depth as well as the chloride diffusion coefficients were determined. These parameters, as well as the fluctuations of chloride profiles over one year, deliver important insights for assessment and prediction. Significant differences were observed between the results obtained by the two analysis methods. The study addresses the origins of these differences and shows the resulting variance in predictions of remaining service life. The results are compared and discussed to show the complex nature and sensitivity of the derived input parameters. They show, on the one hand, the importance of an accurate chloride analysis (LA-ICP-MS) and, on the other, give hints for an improved assessment of structures.
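The fitted relation is the closed-form solution of Fick's second law for a constant surface concentration, $C(x,t) = C_s \, \mathrm{erfc}\!\left(x / (2\sqrt{Dt})\right)$. A sketch with illustrative parameter values (not the measured ASFiNAG data):

```python
import math

# Fick's second law, constant surface concentration:
#   C(x, t) = Cs * erfc(x / (2 * sqrt(D * t)))
# Cs and D below are illustrative values, not the LA-ICP-MS results.
Cs = 0.8                        # surface chloride content [% / cement mass]
D = 1.0e-12                     # apparent diffusion coefficient [m^2/s]
t = 20 * 365.25 * 24 * 3600     # 20 years of exposure [s]

def chloride(x):
    """Chloride content at depth x [m] after exposure time t."""
    return Cs * math.erfc(x / (2.0 * math.sqrt(D * t)))

# Profile at 3 mm depth steps, mirroring the measurement resolution
profile = [(d, round(chloride(d * 1e-3), 3)) for d in range(0, 31, 3)]
print(profile)
```

Fitting $C_s$ and $D$ to a measured profile, after discarding the convection zone near the surface, is what allows extrapolation of the remaining service life.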

Fritz Binder, Stefan L. Burtscher, Alfred Strauss

### The Impact of Clustering in the Performance Prediction of Transportation Infrastructures

In the context of transportation infrastructure management, bridges are a critical asset due to their potential for becoming network bottlenecks. Unfortunately, this aspect has been emphasized by several bridge failures that occurred worldwide in recent years as a result of climate change-related hazards. Given this, it is important to establish accurate tools for predicting the structural condition and behavior of bridges during their lifetime. The present paper addresses this topic using one of the statistical models most used and generally accepted in existing bridge management systems—the Markov stochastic approach, which is further described. These statistical models are highly sensitive to the data that feed them. Quite often, the step related to data cleaning and clustering is not properly conducted, with the most readily available data sets being adopted in bridge performance prediction. This paper presents a comparative analysis between different performance predictions, where the only difference between consecutive scenarios is the subset of the bridge database used in each analysis. It was found that the development of good data clusters is of utmost importance. Conversely, the use of poor clusters can lead to deceiving results which obscure the actual deterioration tendency, thus leading to wrong maintenance decisions.

Carlos Santos, Sérgio Fernandes, Mário Coelho, José C. Matos

### Uncertainty Assessment in Building Physics Related Problems Using Stochastic Finite Element Method

In heat transfer calculations, the material parameters are usually based on laboratory tests of the given material. Afterwards, they are applied in the calculations as deterministic values, after taking into account the effects of relative humidity, temperature and material aging. However, one can distinguish various uncertainties in the material systems. In calculations of the energy demand, these may induce significant variations in the results. In this article, the uncertain thermal conductivity of expanded and extruded polystyrene, in relation to the values declared by the producer, as well as the density and thermal conductivity of the constructive material, is investigated. The possible variations of the thermal conductivity of the insulating materials are based on a statistical analysis of the database provided by the Construction Control Authority in Poland. Two methods are applied in order to determine the expected value and variance of the temperature field and of the heat flux on the internal side of the wall: the tenth-order perturbation stochastic Finite Element Method (SFEM) and the Monte Carlo method. The partial derivatives of temperature with respect to a random variable are determined using the Direct Differentiation Method. Whilst giving very accurate results, the perturbation SFEM is much more efficient than the Monte Carlo method for transient heat transport in a double-layer external envelope. The highest variance was calculated for a node situated between the constructive and the insulating layer, regardless of which material property was considered random. The heat loss variation is related to the thermal resistance of the layer.

Witold Grymin, Marcin Koniorczyk

### Uncertainty Associated to Regression Models used for Assessing the Stiffness of Structural Timber Elements

The evaluation of the mechanical behaviour of timber beams or glued laminated timber lamellas in service is generally a difficult task due to the different sources of uncertainty involved (limited knowledge of the initial quality of the timber, small samples, model uncertainty, human errors). The use of statistical methods that can incorporate part of this uncertainty is probably a suitable way to ensure that the predictions made provide a reliable estimate of the desired property. In most situations, when performing in situ assessment of timber structures, the application of non- or semi-destructive testing (NDT or SDT) methods relies on linear regression models with noticeably different coefficients of determination. Another source of uncertainty arises when in-situ testing relies on the application of existing regression models to timber members without certainty about the wood species or the origin of the wood. Can these models be used when it is commonly accepted that knowledge of timber's origin and species has a major impact on the capability to predict strength and stiffness? To cope with uncertainty, several studies have been trying to use statistical methods that can incorporate prior information (e.g. Bayesian methods) or uncertainty (e.g. Markov chain Monte Carlo—MCMC). In the present paper, the uncertainty associated with the use of linear regression models is discussed using as an example the prediction of the static modulus of elasticity from the dynamic modulus of elasticity. For that purpose, data taken from the literature and from studies conducted at LNEC are compared, analysed and discussed, with the aim of verifying the utility of applying a Bayesian linear regression approach and Markov chain Monte Carlo (MCMC) estimation.