About this Book

This book constitutes the refereed post-proceedings of the 10th IFIP WG 2.5 Working Conference on Uncertainty Quantification in Scientific Computing, WoCoUQ 2011, held in Boulder, CO, USA, in August 2011. The 24 revised papers were carefully reviewed and selected from numerous submissions. They are organized in the following topical sections: UQ need: risk, policy, and decision making; UQ theory; UQ tools; UQ practice; and hot topics. The papers are followed by records of the discussions between the participants and the speakers.

Table of Contents

Frontmatter

UQ Need: Risk, Policy, and Decision Making

Uncertainties Using Genomic Information for Evidence-Based Decisions

For the first time, technology exists to monitor the biological state of an organism at multiple levels. It is now possible to detect which genes are activated or deactivated when an organism is exposed to a chemical compound; to measure how these changes in gene expression cause the concentrations of cell metabolites to increase or decrease; and to record whether these changes influence the overall health of the organism. By integrating all this information, it may be possible not only to explain how a person’s genetic make-up might enhance her susceptibility to disease, but also to anticipate how drug therapy might affect that individual in a particularized manner.
But two related uncertainties obscure the path forward in using these advances to make regulatory decisions. These uncertainties relate to the unsettled notion of the term “evidence,” from both a scientific and a legal perspective. From a scientific perspective, as models based on genomic information are developed using multiple datasets and multiple studies, the weight of scientific evidence will need to be established not only on long-established protocols involving p-values, but will increasingly depend on still-evolving Bayesian measures of evidentiary value. From a legal perspective, new legislation for the Food and Drug Administration has only recently made it possible to consider information beyond randomized clinical trials when evaluating drug safety. More generally, regulatory agencies are mandated to issue rules based on a “rational basis,” which courts have construed to mean that a rule must be based, at least partially, on the scientific evidence. It is far from certain how judges will evaluate the use of genomic information if and when these rules are challenged in court.
Pasky Pascual

Considerations of Uncertainty in Regulatory Decision Making

The NRC ensures the safety of nuclear power through two complementary approaches, one more deterministic and one more probabilistic. The two approaches address the uncertainties in the underlying methods and data differently, and each has its strengths and limitations.
This paper provides some background on the historical evolution of deterministic and probabilistic methods in the regulation of nuclear power plants, describes the Commission’s policy on the use of probabilistic methods to complement the more traditional deterministic approach, and identifies some example challenges as a staff group considers a strategic vision of how the agency should regulate in the future.
Mark A. Cunningham

An Industrial Viewpoint on Uncertainty Quantification in Simulation: Stakes, Methods, Tools, Examples

Simulation is nowadays a major tool in R&D and engineering studies. In industrial practice, in both the design and operating stages, the behavior of a complex system is described and forecast by a computer model which is, most of the time, deterministic. Yet engineers seeking quantitative predictions from deterministic models actually have to deal with several sources of uncertainty affecting the inputs (and occasionally the model itself), which are transferred to the outputs. Uncertainty quantification in simulation has therefore gained increasing importance in recent years. In this paper we present an industrial viewpoint on this practice. After a reminder of the main stakes related to uncertainty quantification and probabilistic computing, we focus on the specific methodology and software tools which have been developed for treating this problem at EDF R&D. We conclude with examples, drawn from different physical domains, illustrating applied studies recently performed by EDF R&D engineers.
Alberto Pasanisi, Anne Dutfoy
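
As a rough illustration of the generic workflow sketched in this abstract (input uncertainties propagated through a deterministic computer model), the fragment below performs a plain Monte Carlo propagation. The cantilever-deflection model and all input distributions are illustrative assumptions and are not taken from the EDF studies or tools.

```python
import numpy as np

rng = np.random.default_rng(0)

def deflection(E, F, L, I):
    """Illustrative deterministic model: tip deflection of a cantilever beam."""
    return F * L**3 / (3.0 * E * I)

n = 100_000
# Hypothetical input uncertainties (units: Pa, N, m, m^4).
E = rng.lognormal(mean=np.log(2.1e11), sigma=0.05, size=n)   # Young's modulus
F = rng.normal(3.0e4, 3.0e3, size=n)                         # applied load
L = rng.uniform(4.9, 5.1, size=n)                            # beam length
I = rng.normal(4.0e-4, 2.0e-5, size=n)                       # second moment of area

y = deflection(E, F, L, I)   # propagate through the deterministic model

print(f"mean deflection : {y.mean():.4e} m")
print(f"std deviation   : {y.std(ddof=1):.4e} m")
print(f"95% quantile    : {np.quantile(y, 0.95):.4e} m")
```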

Living with Uncertainty

This paper describes 12 years of experience in developing simulation software for automotive companies. By building software from scratch, using boundary integral methods and other techniques, it has been possible to tailor the software to address specific issues that arise in painting processes applied to vehicles and to provide engineers with results for real-time optimization and manufacturing analysis. The title provides the focus and the paper describes how living under the shadow of uncertainty has made us more innovative and more resourceful in solving problems that we never really expected to encounter when we started on this journey in 1999.
Patrick Gaffney

Uncertainty and Sensitivity Analysis: From Regulatory Requirements to Conceptual Structure and Computational Implementation

An approach to the conversion of regulatory requirements into a conceptual and computational structure that permits meaningful uncertainty and sensitivity analyses is described. This approach is predicated on the description of the desired analysis in terms of three basic entities: (i) a probability space characterizing aleatory uncertainty, (ii) a probability space characterizing epistemic uncertainty, and (iii) a model that predicts system behavior. The presented approach is illustrated with results from the 2008 performance assessment for the proposed repository for high-level radioactive waste at Yucca Mountain, Nevada.
Jon C. Helton, Cédric J. Sallaberry
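
A minimal sketch of the conceptual structure described above, assuming a toy system: an outer loop samples the probability space characterizing epistemic uncertainty, an inner loop samples the probability space characterizing aleatory uncertainty, and the model predicting system behavior is evaluated inside. The model and distributions are hypothetical and unrelated to the Yucca Mountain assessment.

```python
import numpy as np

rng = np.random.default_rng(1)

def system_fails(demand, capacity):
    """Toy system model: failure when demand exceeds capacity."""
    return demand > capacity

n_epistemic, n_aleatory = 200, 5_000
failure_prob = np.empty(n_epistemic)

for i in range(n_epistemic):
    # Epistemic loop: imperfectly known parameters (illustrative ranges).
    mean_capacity = rng.uniform(8.0, 12.0)
    demand_scale = rng.uniform(1.5, 2.5)

    # Aleatory loop: inherent variability for this epistemic realization.
    capacity = rng.normal(mean_capacity, 1.0, size=n_aleatory)
    demand = rng.gumbel(loc=5.0, scale=demand_scale, size=n_aleatory)

    failure_prob[i] = system_fails(demand, capacity).mean()

# The spread over the outer loop expresses epistemic uncertainty about the
# (aleatory) failure probability.
print(f"median failure probability: {np.median(failure_prob):.3e}")
print(f"5%-95% epistemic interval : [{np.quantile(failure_prob, 0.05):.3e}, "
      f"{np.quantile(failure_prob, 0.95):.3e}]")
```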

UQ Theory

Bayes Linear Analysis for Complex Physical Systems Modeled by Computer Simulators

Most large and complex physical systems are studied by mathematical models, implemented as high dimensional computer simulators. While all such cases differ in physical description, each analysis of a physical system based on a computer simulator involves the same underlying sources of uncertainty. These sources are defined and described below. In addition, there is a growing field of study which aims to quantify and synthesize all of the uncertainties involved in relating models to physical systems, within the framework of Bayesian statistics, and to use the resultant uncertainty specification to address problems of forecasting and decision making based on the application of these methods. We present an overview of the current status and future challenges in this emerging methodology, illustrating with examples drawn from current areas of application including: asset management for oil reservoirs, galaxy modeling, and rapid climate change.
Michael Goldstein
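
The Bayes linear approach adjusts a prior second-order specification (expectations, variances, covariances) by observed data. The sketch below implements the standard adjusted expectation and adjusted variance formulas for a single quantity of interest and a single observable; all numbers are illustrative assumptions, not taken from the applications mentioned above.

```python
import numpy as np

# Prior second-order specification for y (quantity of interest) and z (observable).
# All numbers are illustrative.
E_y, E_z = np.array([1.0]), np.array([2.0])
Var_y = np.array([[4.0]])
Var_z = np.array([[1.5]])
Cov_yz = np.array([[1.2]])          # Cov(y, z)

z_obs = np.array([3.1])             # observed value of z

# Bayes linear adjusted expectation and variance:
#   E_z(y)   = E(y) + Cov(y,z) Var(z)^{-1} (z - E(z))
#   Var_z(y) = Var(y) - Cov(y,z) Var(z)^{-1} Cov(z,y)
K = Cov_yz @ np.linalg.inv(Var_z)
E_adj = E_y + K @ (z_obs - E_z)
Var_adj = Var_y - K @ Cov_yz.T

print("adjusted expectation:", E_adj)   # prior expectation shifted by the data
print("adjusted variance   :", Var_adj) # prior variance reduced by the adjustment
```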

Verified Computation with Probabilities

Because machine calculations are prone to errors that can sometimes accumulate disastrously, computer scientists use special strategies called verified computation to ensure output is reliable. Such strategies are needed for computing with probability distributions. In probabilistic calculations, analysts have routinely assumed (i) probabilities and probability distributions are precisely specified, (ii) most or all variables are independent or otherwise have well-known dependence, and (iii) model structure is known perfectly. These assumptions are usually made for mathematical convenience, rather than with empirical justification, even in sophisticated applications. Probability bounds analysis computes bounds guaranteed to enclose probabilities and probability distributions even when these assumptions are relaxed or removed. In many cases, results are best-possible bounds, i.e., tightening them requires additional empirical information. This paper presents an overview of probability bounds analysis as a computationally practical implementation of the theory of imprecise probabilities that represents verified computation of probabilities and distributions.
Scott Ferson, Jack Siegrist
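
One elementary instance of bounding probabilities when the dependence assumption is dropped, in the spirit of the abstract above, is given by the Fréchet inequalities. The sketch below is illustrative only and does not implement the full probability bounds (p-box) machinery.

```python
def frechet_and(p, q):
    """Best-possible bounds on P(A and B) given only P(A)=p and P(B)=q."""
    return max(0.0, p + q - 1.0), min(p, q)

def frechet_or(p, q):
    """Best-possible bounds on P(A or B) given only P(A)=p and P(B)=q."""
    return max(p, q), min(1.0, p + q)

# Illustrative marginal probabilities with unknown dependence.
p, q = 0.8, 0.7
print("P(A and B) lies in", frechet_and(p, q))   # (0.5, 0.7)
print("P(A or B)  lies in", frechet_or(p, q))    # (0.8, 1.0)
```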

Defects, Scientific Computation and the Scientific Method

Computation has grown rapidly in the last 50 years, so that in many scientific areas it is the dominant partner in the practice of science. Unfortunately, unlike the experimental sciences, it does not adhere well to the principles of the scientific method as espoused by, for example, the philosopher Karl Popper. Such principles are built around the notions of deniability and reproducibility. Although much research effort has been spent on measuring the density of software defects, much less has been spent on the more difficult problem of measuring their effect on the output of a program. This paper explores these issues with numerous examples suggesting how this situation might be improved to match the demands of modern science. Finally, it develops a theoretical model based on Shannon information which suggests that software systems have strong implementation-independent behaviour, and presents supporting evidence.
Les Hatton

Parametric and Uncertainty Computations with Tensor Product Representations

Computational uncertainty quantification in a probabilistic setting is a special case of a parametric problem. Parameter-dependent state vectors lead, via association with a linear operator, to analogues of covariance, its spectral decomposition, and the associated Karhunen-Loève expansion. From this one obtains a generalised tensor representation. The parameter in question may be a tuple of numbers, a function, a stochastic process, or a random tensor field. The tensor factorisation may be cascaded, leading to tensors of higher degree. When carried out on a discretised level, such factorisations in the form of low-rank approximations lead to very sparse representations of the high-dimensional quantities involved. Updating of uncertainty in the light of new information is an important part of uncertainty quantification. Formulated in terms of random variables instead of measures, the Bayesian update is a projection and allows the use of the tensor factorisations in this case as well.
Hermann G. Matthies, Alexander Litvinenko, Oliver Pajonk, Bojana V. Rosić, Elmar Zander
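
A minimal discrete sketch of the ideas above, under simplifying assumptions: snapshots of a parameter-dependent state vector are collected in a matrix, and a truncated singular value decomposition provides the low-rank, Karhunen-Loève-like representation. The random field generating the snapshots is purely illustrative.

```python
import numpy as np

rng = np.random.default_rng(2)

# Snapshots of a parameter-dependent state vector u(x; omega) on a 1D grid.
# The smooth random field below is purely illustrative.
n_grid, n_samples = 200, 500
x = np.linspace(0.0, 1.0, n_grid)
xi = rng.normal(size=(n_samples, 5))
modes = np.array([np.sin((k + 1) * np.pi * x) / (k + 1) ** 2 for k in range(5)])
U = xi @ modes                        # shape (n_samples, n_grid)

# Centred snapshots -> truncated SVD = discrete Karhunen-Loeve expansion.
U_c = U - U.mean(axis=0)
_, s, Vt = np.linalg.svd(U_c, full_matrices=False)

rank = 3
energy = (s[:rank] ** 2).sum() / (s ** 2).sum()
print(f"variance captured by the rank-{rank} representation: {energy:.4f}")

# Sparse (low-rank) reconstruction of the whole snapshot set.
U_lr = (U_c @ Vt[:rank].T) @ Vt[:rank] + U.mean(axis=0)
print("maximum reconstruction error:", np.abs(U - U_lr).max())
```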

UQ Tools

Using Emulators to Estimate Uncertainty in Complex Models

The Managing Uncertainty in Complex Models (MUCM) project has been developing methods for estimating uncertainty in complex models using emulators. Emulators are statistical descriptions of our beliefs about the models (or simulators). They can also be thought of as interpolators of simulator outputs between previous runs. Because they are quick to run, emulators can be used to carry out calculations that would otherwise require large numbers of simulator runs, for example Monte Carlo uncertainty calculations. Both Gaussian and Bayes linear emulators will be explained and examples given. One of the outputs of the MUCM project is the MUCM toolkit, an on-line recipe book for emulator-based methods. Using the toolkit as our basis, we will illustrate the breadth of applications that can be addressed by emulator methodology and detail some of the methodology. We will cover sensitivity and uncertainty analysis and describe in less detail other aspects, such as how emulators can also be used to calibrate complex computer simulators and how they can be modified for use with stochastic simulators.
Peter Challenor
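
A minimal Gaussian-process emulator in the spirit of the abstract: fitted to a handful of runs of a cheap stand-in "simulator", it returns a mean prediction and a variance at untried inputs. The kernel and its parameters are illustrative assumptions, and the code does not follow the MUCM toolkit procedures.

```python
import numpy as np

def simulator(x):
    """Cheap stand-in for an expensive simulator (illustrative)."""
    return np.sin(3.0 * x) + 0.5 * x

def sq_exp_kernel(a, b, variance=1.0, length=0.4):
    """Squared-exponential covariance between 1D input arrays a and b."""
    d = a[:, None] - b[None, :]
    return variance * np.exp(-0.5 * (d / length) ** 2)

# A small design of simulator runs.
x_train = np.linspace(0.0, 2.0, 8)
y_train = simulator(x_train)

# GP emulator with zero prior mean and a small nugget for numerical stability.
nugget = 1e-8
K = sq_exp_kernel(x_train, x_train) + nugget * np.eye(len(x_train))
L = np.linalg.cholesky(K)
alpha = np.linalg.solve(L.T, np.linalg.solve(L, y_train))

x_new = np.linspace(0.0, 2.0, 5)
K_s = sq_exp_kernel(x_new, x_train)
mean = K_s @ alpha
v = np.linalg.solve(L, K_s.T)
var = np.diag(sq_exp_kernel(x_new, x_new)) - np.sum(v ** 2, axis=0)

for xs, m, s2 in zip(x_new, mean, var):
    print(f"x={xs:.2f}  emulator mean={m:+.3f}  "
          f"true={simulator(xs):+.3f}  sd={np.sqrt(max(s2, 0.0)):.2e}")
```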

Measuring Uncertainty in Scientific Computation Using Numerica 21’s Test Harness

The test harness, TH, is a tool developed by Numerica 21 to facilitate the testing and evaluation of scientific software during the development and maintenance phases of such software. This paper describes how the tool can be used to measure uncertainty in scientific computations. It confirms that the actual behavior of the code, when subjected to typically small changes in its input data, reflects the formal analysis of the problem’s sensitivity to its input. Although motivated by studying small changes in the input data, the test harness can measure the impact of any changes, including those that go beyond the formal analysis.
Brian T. Smith

UQ Practice

Numerical Aspects in the Evaluation of Measurement Uncertainty

Numerical quantification of the results from a measurement uncertainty computation is considered in terms of the inputs to that computation. The primary output is often an approximation to the PDF (probability density function) for the univariate or multivariate measurand (the quantity intended to be measured). All results of interest can be derived from this PDF. We consider uncertainty elicitation, propagation of distributions through a computational model, Bayes’ rule and its implementation and other numerical considerations, representation of the PDF for the measurand, and sensitivities of the numerical results with respect to the inputs to the computation. Speculations are made regarding future requirements in the area and relationships to problems in uncertainty quantification for scientific computing.
Maurice Cox, Alistair Forbes, Peter Harris, Clare Matthews
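
A minimal sketch of one ingredient mentioned above, a numerical implementation of Bayes' rule on a grid: a uniform prior for a scalar measurand is combined with repeated indications subject to Gaussian noise, and an approximate 95 % coverage interval is read off the resulting PDF. All numbers are illustrative.

```python
import numpy as np

# Repeated indications of a scalar measurand (illustrative data, arbitrary units).
indications = np.array([10.12, 10.07, 10.15, 10.10])
sigma = 0.05                  # assumed known standard uncertainty of one indication

# Grid approximation of the posterior PDF for the measurand Y.
y = np.linspace(9.8, 10.4, 4001)
dy = y[1] - y[0]
log_lik = -0.5 * ((indications[:, None] - y[None, :]) / sigma) ** 2
post = np.exp(log_lik.sum(axis=0))      # uniform prior absorbed in the normalisation
post /= post.sum() * dy                 # normalise so the PDF integrates to 1

mean = (y * post).sum() * dy
sd = np.sqrt(((y - mean) ** 2 * post).sum() * dy)

# Probabilistically symmetric 95 % coverage interval read off the CDF.
cdf = np.cumsum(post) * dy
lo, hi = np.interp([0.025, 0.975], cdf, y)
print(f"estimate {mean:.4f}, standard uncertainty {sd:.4f}, "
      f"95% coverage interval [{lo:.4f}, {hi:.4f}]")
```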

Model-Based Interpolation, Prediction, and Approximation

Model-based interpolation, prediction, and approximation are contingent on the choice of model: since multiple alternative models typically can reasonably be entertained for each of these tasks, and the results are correspondingly varied, this often is a considerable source of uncertainty. Several statistical methods are illustrated that can be used to assess the contribution that this uncertainty component makes to the uncertainty budget: when interpolating concentrations of greenhouse gases over Indianapolis, predicting the viral load in a patient infected with influenza A, and approximating the solution of the kinetic equations that model the progression of the infection.
Antonio Possolo
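
A small illustration of the point made above: two equally defensible models fitted to the same data can give noticeably different predictions, and their spread is one crude indicator of the model-choice component of uncertainty. The data and models below are synthetic and do not reproduce the statistical methods or data sets of the paper.

```python
import numpy as np

rng = np.random.default_rng(6)

# Illustrative calibration-style data (not the Indianapolis or influenza data).
x = np.linspace(0.0, 10.0, 12)
y = 0.4 * x + 0.03 * x**2 + rng.normal(0.0, 0.2, size=x.size)

x_new = 12.0   # point at which a model-based prediction is required

# Two reasonable competing models: straight line and quadratic polynomial.
predictions = {}
for degree in (1, 2):
    coeffs = np.polyfit(x, y, degree)
    predictions[f"degree {degree}"] = np.polyval(coeffs, x_new)

for name, pred in predictions.items():
    print(f"{name} model prediction at x={x_new}: {pred:.3f}")

spread = max(predictions.values()) - min(predictions.values())
print(f"model-choice spread (one crude indicator of this uncertainty): {spread:.3f}")
```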

Uncertainty Quantification for Turbulent Mixing Flows: Rayleigh-Taylor Instability

Uncertainty Quantification (UQ) for fluid mixing depends on the length scales of observation: macro, meso and micro, each with its own UQ requirements. New results are presented here for macro and micro observables. For the micro observables, recent theories argue that convergence of numerical simulations in Large Eddy Simulations (LES) should be governed by space-time dependent probability distribution functions (PDFs, in the present context, Young measures) which satisfy the Euler equation. From a single deterministic simulation in the LES, or inertial, regime we extract a PDF by binning results from a space-time neighborhood of the convergence point. The binned state values constitute a discrete set of solution values which define an approximate PDF. The convergence of the associated cumulative distribution functions (CDFs) is assessed by standard function space metrics.
T. Kaman, R. Kaufman, J. Glimm, D. H. Sharp
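
A minimal sketch of the post-processing step described above, using synthetic stand-in data: state values collected from a neighborhood of a point are binned into an approximate PDF, and convergence of the resulting CDFs under refinement is measured with a standard function-space metric (here the L1 distance).

```python
import numpy as np

rng = np.random.default_rng(3)

def neighbourhood_cdf(values, bins=50):
    """Approximate CDF from state values binned over a space-time neighborhood."""
    hist, edges = np.histogram(values, bins=bins, range=(0.0, 1.0), density=True)
    cdf = np.cumsum(hist * np.diff(edges))
    return edges[1:], cdf

# Synthetic stand-ins for mixture-fraction values extracted from a coarse and a
# fine simulation (purely illustrative, no flow solver involved).
coarse = np.clip(rng.beta(2.0, 2.0, size=2_000)
                 + 0.02 * rng.normal(size=2_000), 0.0, 1.0)
fine = rng.beta(2.0, 2.0, size=16_000)

x, F_coarse = neighbourhood_cdf(coarse)
_, F_fine = neighbourhood_cdf(fine)

# L1 distance between the two CDFs as a convergence indicator.
l1 = np.sum(np.abs(F_coarse - F_fine)) * (x[1] - x[0])
print(f"L1 distance between coarse and fine CDFs: {l1:.4f}")
```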

From Quantification to Visualization: A Taxonomy of Uncertainty Visualization Approaches

Quantifying uncertainty is an increasingly important topic across many domains. The uncertainties present in data come in many diverse representations, originating from a wide variety of disciplines. Communicating these uncertainties is a task often left to visualization, without a clear connection between the quantification and the visualization. In this paper, we first identify frequently occurring types of uncertainty. Second, we connect those uncertainty representations to ones commonly used in visualization. We then look at various approaches to visualizing this uncertainty, partitioning the work based on the dimensionality of the data and the dimensionality of the uncertainty. We also discuss noteworthy exceptions to our taxonomy along with future research directions for the uncertainty visualization community.
Kristin Potter, Paul Rosen, Chris R. Johnson

Efficient Computation of Observation Impact in 4D-Var Data Assimilation

Data assimilation combines information from an imperfect model, sparse and noisy observations, and error statistics, to produce a best estimate of the state of a physical system. Different observational data points have different contributions to reducing the uncertainty with which the state is estimated. Quantifying the observation impact is important for analyzing the effectiveness of the assimilation system, for data pruning, and for designing future sensor networks. This paper is concerned with quantifying observation impact in the context of four dimensional variational data assimilation. The main computational challenge is posed by the solution of linear systems, where the system matrix is the Hessian of the variational cost function. This work discusses iterative strategies to efficiently solve this system and compute observation impacts.
Alexandru Cioaca, Adrian Sandu, Eric De Sturler, Emil Constantinescu
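
A minimal sketch of the computational kernel described above: a matrix-free conjugate gradient solve of a linear system whose matrix plays the role of the cost-function Hessian, accessed only through matrix-vector products. The small synthetic SPD matrix stands in for the operator that an actual 4D-Var system would supply through adjoint and tangent linear models.

```python
import numpy as np

rng = np.random.default_rng(4)

# Synthetic SPD stand-in for the 4D-Var cost-function Hessian (illustrative only).
n = 50
M = rng.normal(size=(n, n))
A = M @ M.T + n * np.eye(n)

def hessian_vec(v):
    """Hessian-vector product; in practice supplied by adjoint/TLM code."""
    return A @ v

def conjugate_gradient(matvec, b, tol=1e-10, maxiter=500):
    """Plain conjugate gradient for a symmetric positive definite operator."""
    x = np.zeros_like(b)
    r = b - matvec(x)
    p = r.copy()
    rs = r @ r
    for _ in range(maxiter):
        Ap = matvec(p)
        alpha = rs / (p @ Ap)
        x += alpha * p
        r -= alpha * Ap
        rs_new = r @ r
        if np.sqrt(rs_new) < tol * np.linalg.norm(b):
            break
        p = r + (rs_new / rs) * p
        rs = rs_new
    return x

# Right-hand side: gradient of a verification functional (synthetic here).
b = rng.normal(size=n)
mu = conjugate_gradient(hessian_vec, b)
print("residual norm:", np.linalg.norm(A @ mu - b))
# Observation impacts are then obtained from inner products involving mu;
# the details depend on the particular assimilation system.
```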

Interval Based Finite Elements for Uncertainty Quantification in Engineering Mechanics

This paper illustrates how interval analysis can be used as a basis for generalized models of uncertainty. When epistemic uncertainty is presented as a range and the aleatory uncertainty is based on available information, or when random variables are assigned an interval probability, the uncertainty has a Probability Bound (PB) structure. When Interval Monte Carlo (IMC) is used to sample random variables, interval random values are generated. The Interval Finite Element Method (FEM) is used to propagate intervals through the system, and sharp interval solutions are obtained. The interval solutions are sorted and PBs of the system response are constructed. All relevant statistics are calculated, characterizing both aleatory and epistemic uncertainty. The above-mentioned sequence is presented in this work and illustrative examples are solved.
Rafi L. Muhanna, Robert L. Mullen
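
A minimal sketch of the Interval Monte Carlo idea referred to above, under strong simplifying assumptions: an imprecise distribution parameter turns each inverse-CDF sample into an interval, the interval is propagated through a monotone response function (standing in for the interval finite element solve), and probability bounds on the response are read off the endpoint samples.

```python
import numpy as np

rng = np.random.default_rng(5)

def response(load):
    """Illustrative monotone response (e.g. a displacement proportional to load)."""
    return 2.5 * load

# Imprecise input: exponentially distributed load whose rate is only known to
# lie in [lam_lo, lam_hi].  Inverse-CDF sampling at both endpoints turns each
# random draw into an interval (Interval Monte Carlo).
lam_lo, lam_hi = 0.8, 1.2
n = 10_000
u = rng.random(n)
load_lo = -np.log(1.0 - u) / lam_hi      # smaller loads for the larger rate
load_hi = -np.log(1.0 - u) / lam_lo

# Propagate the interval samples through the (monotone) response function.
resp_lo = np.sort(response(load_lo))
resp_hi = np.sort(response(load_hi))

# Probability bounds on P(response <= t): the upper bound comes from the lower
# endpoints, the lower bound from the upper endpoints.
t = 3.0
p_upper = np.searchsorted(resp_lo, t, side="right") / n
p_lower = np.searchsorted(resp_hi, t, side="right") / n
print(f"P(response <= {t}) lies in [{p_lower:.3f}, {p_upper:.3f}]")
```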

Reducing the Uncertainty When Approximating the Solution of ODEs

One can reduce the uncertainty in the quality of an approximate solution of an ordinary differential equation (ODE) by implementing methods which have a more rigorous error control strategy and which deliver an approximate solution that is much more likely to satisfy the expectations of the user. We have developed such a class of ODE methods as well as a collection of software tools that will deliver a piecewise polynomial as the approximate solution and facilitate the investigation of various aspects of the problem that are often of as much interest as the approximate solution itself. We will introduce measures that can be used to quantify the reliability of an approximate solution and discuss how one can implement methods that, at some extra cost, can produce very reliable approximate solutions and therefore significantly reduce the uncertainty in the computed results.
Wayne H. Enright
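
A minimal sketch of the kind of reliability measure discussed above, under illustrative assumptions: a continuous approximate solution U(t) is built as a cubic Hermite interpolant over one Runge-Kutta step, and the defect U'(t) - f(t, U(t)) is sampled across the step; a small maximum defect indicates a trustworthy approximate solution. No defect-controlled solver is implemented here.

```python
import numpy as np

def f(t, y):
    """Illustrative ODE: y' = -2*t*y, with exact solution exp(-t**2) for y(0)=1."""
    return -2.0 * t * y

def rk4_step(t, y, h):
    """One classical fourth-order Runge-Kutta step."""
    k1 = f(t, y)
    k2 = f(t + h / 2, y + h / 2 * k1)
    k3 = f(t + h / 2, y + h / 2 * k2)
    k4 = f(t + h, y + h * k3)
    return y + h / 6 * (k1 + 2 * k2 + 2 * k3 + k4)

def hermite(t0, y0, f0, t1, y1, f1, t):
    """Cubic Hermite interpolant U(t) and its derivative U'(t) on [t0, t1]."""
    h = t1 - t0
    s = (t - t0) / h
    h00, h10 = 2 * s**3 - 3 * s**2 + 1, s**3 - 2 * s**2 + s
    h01, h11 = -2 * s**3 + 3 * s**2, s**3 - s**2
    U = h00 * y0 + h * h10 * f0 + h01 * y1 + h * h11 * f1
    dU = ((6 * s**2 - 6 * s) * y0 / h + (3 * s**2 - 4 * s + 1) * f0
          + (6 * s - 6 * s**2) * y1 / h + (3 * s**2 - 2 * s) * f1)
    return U, dU

t0, y0, h = 0.0, 1.0, 0.1
t1 = t0 + h
y1 = rk4_step(t0, y0, h)

# Sample the defect U'(t) - f(t, U(t)) across the step.
ts = np.linspace(t0, t1, 21)
U, dU = hermite(t0, y0, f(t0, y0), t1, y1, f(t1, y1), ts)
defect = np.abs(dU - f(ts, U))
print(f"maximum sampled defect on [{t0}, {t1}]: {defect.max():.2e}")
```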

Hot Topics

Uncertainties in Predictions of Material Performance Using Experimental Data That Is Only Distantly Related to the System of Interest

There is a need for predictive material “aging” models in the nuclear energy industry, where applications include life extension of existing reactors, the development of high burnup fuels, and dry cask storage of used nuclear fuel. These problems require extrapolating from the validation domain, where there is available experimental data, to the application domain, where there is little or no experimental data. The need for predictive material aging models will drive the need for associated assessments of the uncertainties in the predictions. Methods to quantify uncertainties in model predictions, using experimental data that is only distantly related to the application domain, are discussed in this paper.
Wayne E. King, Athanasios Arsenlis, Charles Tong, William L. Oberkampf

A Note on Uncertainty in Real-Time Analytics

Today real-time analytics of large data sets is invariably computer-assisted and often includes a “human-in-the-loop”. Humans differ from each other and all have a very limited innate capacity to process new information in real-time. This introduces statistical and systematic uncertainties into observations, analyses and decisions humans make when they are “in the loop”. Humans also have unconscious and conscious biases, and these can introduce (major) systematic errors into human assisted or human driven analytics. This note briefly discusses the issues and the (considerable) implications they can have on real-time analytics that involves humans, including software interfaces, learning, and reaction of humans in emergencies.
Mladen A. Vouk

Backmatter
