
2012 | Book

Belief Functions: Theory and Applications

Proceedings of the 2nd International Conference on Belief Functions, Compiègne, France, 9-11 May 2012

Editors: Thierry Denoeux, Marie-Hélène Masson

Publisher: Springer Berlin Heidelberg

Book Series: Advances in Intelligent and Soft Computing


About this book

The theory of belief functions, also known as evidence theory or Dempster-Shafer theory, was first introduced by Arthur P. Dempster in the context of statistical inference, and was later developed by Glenn Shafer as a general framework for modeling epistemic uncertainty. These early contributions have been the starting points of many important developments, including the Transferable Belief Model and the Theory of Hints. The theory of belief functions is now well established as a general framework for reasoning with uncertainty, and has well understood connections to other frameworks such as probability, possibility and imprecise probability theories.

This volume contains the proceedings of the 2nd International Conference on Belief Functions that was held in Compiègne, France on 9-11 May 2012. It gathers 51 contributions describing recent developments both on theoretical issues (including approximation methods, combination rules, continuous belief functions, graphical models and independence concepts) and applications in various areas including classification, image processing, statistics and intelligent vehicles.

Table of Contents

Frontmatter
On Belief Functions and Random Sets

We look back at how axiomatic belief functions were viewed as distributions of random sets, and address the problem of joint belief functions in terms of copulas. We outline the axiomatic development of belief functions in the setting of incidence algebras, and some aspects of decision-making with belief functions.

Hung T. Nguyen
Evidential Multi-label Classification Using the Random k-Label Sets Approach

Multi-label classification deals with problems in which each instance can be associated with a set of labels. An effective multi-label method, named RAkEL, randomly breaks the initial set of labels into smaller sets and trains a single-label classifier on each of these subsets. To classify an unseen instance, the predictions of all classifiers are combined using a voting process. In this paper, we adapt the RAkEL approach under the belief function framework applied to set-valued variables. Using evidence theory enables us to handle lack of information by associating a mass function with each classifier and combining them conjunctively. Experiments on real datasets demonstrate that our approach improves classification performance.

Sawsan Kanj, Fahed Abdallah, Thierry Denœux
An Evidential Improvement for Gender Profiling

CCTV systems are broadly deployed in the present world. To ensure in-time reaction for intelligent surveillance, it is a fundamental task for real-world applications to determine the gender of people of interest. However, normal video algorithms for gender profiling (usually face profiling) have three drawbacks. First, the profiling result is always uncertain. Second, for a time-lasting gender profiling algorithm, the result is not stable. The degree of certainty usually varies, sometimes even to the extent that a male is classified as a female, and vice versa. Third, for a robust profiling result in cases where a person’s face is not visible, other features, such as body shape, are required. These algorithms may provide different recognition results; at the very least, they will provide different degrees of certainty. To overcome these problems, in this paper, we introduce an evidential approach that makes use of profiling results from multiple algorithms over a period of time. Experiments show that this approach does provide better results than single profiling results and classic fusion results.

Jianbing Ma, Weiru Liu, Paul Miller
An Interval-Valued Dissimilarity Measure for Belief Functions Based on Credal Semantics

Evidence theory extends Bayesian probability theory by allowing for a more expressive model of subjective uncertainty. Besides the standard interpretation of belief functions, where uncertainty corresponds to probability masses which might refer to whole subsets of the possibility space, credal semantics can also be considered. Accordingly, a belief function can be identified with the whole set of probability mass functions consistent with the beliefs induced by the masses. Following this interpretation, a novel, set-valued dissimilarity measure with a clear behavioral interpretation can be defined. We describe the main features of this new measure and comment on its relation to other measures proposed in the literature.

Alessandro Antonucci
An Evidential Pattern Matching Approach for Vehicle Identification

In this paper, we propose a novel pattern matching approach for vehicle identification based on belief functions. Distances are computed within a belief decision space rather than directly in the feature space as traditionally done. The main goal of the paper is to compare performances obtained when using several distances between belief functions recently introduced by the authors. Belief functions are modeled using the outputs of a set of modality-based 1-NN classifiers, two distinct uncertainty modeling techniques and are combined with Dempster’s rule. Results are obtained on real data gathered from sensor nodes with 4 signal modalities and for 4 classes of vehicles (pedestrian, bicycle, car, truck). Main results show the importance of the uncertainty technique used and the interest of the proposed pattern matching approach in terms of performance and expressiveness.

Anne-Laure Jousselme, Patrick Maupin
A Comparison between a Bayesian Approach and a Method Based on Continuous Belief Functions for Pattern Recognition

The theory of belief functions in the discrete domain has been employed with success for pattern recognition. However, the Bayesian approach performs well only provided that the probability density functions are well estimated. Recently, the theory of belief functions has been increasingly extended to the continuous case. In this paper, we compare results obtained by a Bayesian approach and a method based on continuous belief functions to characterize seabed sediments. The probability density functions of each feature of seabed sediments are unimodal; they are estimated from a Gaussian model and compared with an α-stable model.

Anthony Fiche, Arnaud Martin, Jean-Christophe Cexus, Ali Khenchaf
Prognostic by Classification of Predictions Combining Similarity-Based Estimation and Belief Functions

Forecasting the future states of a complex system is of paramount importance in many industrial applications covered in the community of Prognostics and Health Management (PHM). Practically, states can be either continuous (the value of a signal) or discrete (functioning modes). For each case, specific techniques exist. In this paper, we propose an approach called EVIPRO-KNN, based on case-based reasoning and belief functions, that jointly estimates the future values of the continuous signal and the future discrete modes. A real dataset is used in order to assess the performance in estimating future breakdowns of a real system, where the combination of both strategies provides the best prediction accuracies, up to 90%.

Emmanuel Ramasso, Michèle Rombaut, Noureddine Zerhouni
Adaptive Initialization of an EvKNN Classification Algorithm

The establishment of the learning database is a long and tedious task that must be carried out before starting the classification process. An Evidential KNN (EvKNN) has been developed to help the user: it proposes the “best” samples to label according to a strategy. However, at the beginning of this task, the classes are not clearly defined and are represented by a number of labeled samples smaller than the k samples required for EvKNN. In this paper, we propose to take into account the available information on the classes using an adapted evidential model. The algorithm presented in this paper has been tested on the classification of an image collection.

Stefen Chan Wai Tim, Michèle Rombaut, Denis Pellerin
Classification Trees Based on Belief Functions

Decision tree classifiers are popular classification methods. In this paper, we extend to multi-class problems a decision tree method based on belief functions previously described for two-class problems only. We propose several possible extensions: combining multiple two-class trees together, and directly extending the estimation of belief functions within the tree to the multi-class setting. We provide experimental results and compare them to usual decision trees.

Nicolas Sutton-Charani, Sébastien Destercke, Thierry Denœux
Combination of Supervised and Unsupervised Classification Using the Theory of Belief Functions

In this paper, we propose to fuse clustering and supervised classification approaches in order to outperform the results of a single classification algorithm. Indeed, the results of learning in supervised classification depend on the method and on the parameters chosen. Moreover, the learning process is particularly difficult with few learning data and/or imprecise learning data. Hence, we define a classification approach using the theory of belief functions to fuse the results of one clustering and one supervised classification. This new approach, applied to real databases, yields good and promising results.

Fatma Karem, Mounir Dhibi, Arnaud Martin
Continuous Belief Functions: Focal Intervals Properties

The set of focal elements resulting from a conjunctive or disjunctive combination of consonant belief functions is unfortunately not consonant and is thus very difficult to represent.

In this paper, we propose a graphical representation of the cross product of two focal sets originating from univariate Gaussian pdfs. This representation covers the initial focal intervals as well as the focal intervals resulting from a combination operation. We show that, in the case of conjunctive or disjunctive combination operations, the whole domain can be separated into four subsets of intervals having the same properties. Finally, we focus on focal intervals of identical length resulting from a combination. We show that such intervals are organized in connected line segments on our graphical representation.

Jean-Marc Vannobel
Game-Theoretical Semantics of Epistemic Probability Transformations

Probability transformations of belief functions can be classified into different families, according to the operator they commute with. In particular, as they commute with Dempster’s rule, the relative plausibility and belief transforms form one such “epistemic” family, and possess natural rationales within Shafer’s formulation of the theory of evidence, while they are not consistent with the credal or probability-bound semantics of belief functions. We prove here, however, that in this latter case these transforms can be given an interesting rationale in terms of optimal strategies in a non-cooperative game.

Fabio Cuzzolin
Generalizations of the Relative Belief Transform

Probability transformations of belief functions can be classified into different families, according to the operator they commute with. In particular, as they commute with Dempster’s rule, the relative plausibility and belief transforms form one such “epistemic” family, and possess natural rationales within Shafer’s formulation of the theory of evidence. However, the relative belief transform only exists when some mass is assigned to singletons. We show here that relative belief is only one member of a class of “relative mass” mappings, which can be interpreted as low-cost proxies for both the plausibility and pignistic transforms.
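
To fix ideas, here is a minimal Python sketch (illustrative only, not taken from either paper) of the transforms named in these two abstracts, for a mass function represented as a dict mapping frozensets to masses:

    # Minimal sketch: probability transforms of a mass function m over a
    # frame omega, with focal elements represented as frozensets.
    def singleton_plausibilities(m, omega):
        """pl(x) = sum of m(A) over all focal sets A containing x."""
        return {x: sum(v for a, v in m.items() if x in a) for x in omega}

    def relative_plausibility(m, omega):
        pl = singleton_plausibilities(m, omega)
        total = sum(pl.values())
        return {x: p / total for x, p in pl.items()}

    def relative_belief(m, omega):
        """Only defined when some mass is assigned to singletons."""
        bel = {x: m.get(frozenset([x]), 0.0) for x in omega}
        total = sum(bel.values())
        if total == 0:
            raise ValueError("relative belief undefined: no mass on singletons")
        return {x: b / total for x, b in bel.items()}

    def pignistic(m, omega):
        """BetP(x) = sum over focal sets A containing x of m(A) / |A|."""
        return {x: sum(v / len(a) for a, v in m.items() if x in a) for x in omega}

    omega = {"a", "b", "c"}
    m = {frozenset("a"): 0.4, frozenset("ab"): 0.3, frozenset("abc"): 0.3}
    print(relative_plausibility(m, omega))
    print(relative_belief(m, omega))   # all mass on "a": only singleton with mass
    print(pignistic(m, omega))         # {a: 0.65, b: 0.25, c: 0.10}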

Fabio Cuzzolin
Choquet Integral as Maximum of Integrals with Respect to Belief Functions

We study the problem of representing the Choquet integral w.r.t. an arbitrary capacity as a maximum of integrals w.r.t. belief functions. We propose an algorithm and prove that, for 2-additive capacities, it obtains a decomposition with the lowest number of elements.
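
For readers who want a concrete handle on the object being decomposed, the following sketch (illustrative only, not the paper's algorithm) computes the Choquet integral of a function with respect to a capacity via the standard sorted telescoping formula:

    # Sketch: Choquet integral of f w.r.t. a capacity nu on a finite frame,
    # using sum over sorted values of f(x_(i)) * (nu(A_i) - nu(A_{i-1})).
    def choquet(f, nu, omega):
        xs = sorted(omega, key=lambda x: f[x], reverse=True)  # decreasing f
        total, prev = 0.0, 0.0
        chain = set()
        for x in xs:
            chain.add(x)                     # A_i = {x_(1), ..., x_(i)}
            cur = nu[frozenset(chain)]
            total += f[x] * (cur - prev)
            prev = cur
        return total

    omega = {"a", "b"}
    f = {"a": 3.0, "b": 1.0}
    nu = {frozenset(): 0.0, frozenset("a"): 0.5,
          frozenset("b"): 0.2, frozenset("ab"): 1.0}
    print(choquet(f, nu, omega))  # 3*0.5 + 1*(1.0 - 0.5) = 2.0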

Mikhail Timonin
Consonant Approximations in the Belief Space

In this paper we solve the problem of approximating a belief measure with a necessity measure or “consonant belief function” by minimizing appropriate distances from the consonant complex in the space of all belief functions. Partial approximations are first sought in each simplicial component of the consonant complex, while global solutions are obtained from the set of partial ones. The $L_1$, $L_2$ and $L_\infty$ consonant approximations in the belief space are here computed, discussed and interpreted as generalizations of the maximal outer consonant approximation. Results are also compared to other classical approximations in a ternary example.

Fabio Cuzzolin
Controlling the Number of Focal Elements
Some Combinatorial Considerations

A basic belief assignment can have up to $2^n$ focal elements, and combining them with a simple conjunctive operator will need ${\mathcal O}(2^{2n})$ operations. This article proposes some techniques to limit the size of the focal sets of the bbas to be combined while preserving a large part of the information they carry.

The first section revisits some well-known definitions from an algorithmic point of view. The second section proposes a matrix way of building the least committed isopignistic, and extends it to some other bodies of evidence. The third section adapts the k-means algorithm for an unsupervised clustering of the focal elements of a given bba.
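
As a rough illustration of the third technique (the encoding, distance and merging policy below are assumptions, not the article's algorithm), focal sets can be encoded as binary indicator vectors, clustered with k-means, and each cluster replaced by the union of its members carrying their summed mass, yielding an outer approximation with fewer focal elements:

    # Hedged sketch: clustering focal elements to limit their number.
    import numpy as np

    def cluster_focal_elements(m, omega, k, iters=20, seed=0):
        omega = sorted(omega)
        sets = list(m)
        # each focal set becomes a 0/1 indicator vector over the frame
        X = np.array([[1.0 if x in a else 0.0 for x in omega] for a in sets])
        rng = np.random.default_rng(seed)
        centers = X[rng.choice(len(X), size=k, replace=False)]
        for _ in range(iters):
            labels = np.argmin(((X[:, None] - centers[None]) ** 2).sum(-1), axis=1)
            for j in range(k):
                if (labels == j).any():
                    centers[j] = X[labels == j].mean(0)
        reduced = {}
        for a, j in zip(sets, labels):
            union = frozenset().union(*(s for s, l in zip(sets, labels) if l == j))
            reduced[union] = reduced.get(union, 0.0) + m[a]
        return reduced  # fewer focal sets; mass preserved (outer approximation)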

Christophe Osswald
Random Generation of Mass Functions: A Short Howto

As Dempster-Shafer theory spreads in different application fields involving complex systems, the need for algorithms randomly generating mass functions arises. As such random generation is often perceived as secondary, most proposed algorithms use procedures whose sample statistical properties are difficult to characterize. Thus, although they produce randomly generated mass functions, it is difficult to control the sample statistical laws. In this paper, we briefly review classical algorithms, explaining why their statistical properties are hard to characterize, and then provide simple procedures to perform efficient and controlled random generation.
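
For contrast with the controlled procedures advocated here, a typical naive generator picks random focal sets and splits mass via a flat Dirichlet draw; nothing in this hedged sketch controls the induced law over mass functions, which is precisely the problem the paper addresses:

    # Naive random mass-function generator (hedged sketch). The induced
    # distribution over mass functions is NOT uniform or otherwise controlled.
    import itertools, random

    def random_mass_function(omega, n_focal, seed=None):
        rng = random.Random(seed)
        subsets = [frozenset(s) for r in range(1, len(omega) + 1)
                   for s in itertools.combinations(sorted(omega), r)]
        focal = rng.sample(subsets, n_focal)
        # normalized exponentials ~ flat Dirichlet over the mass simplex
        weights = [rng.expovariate(1.0) for _ in focal]
        total = sum(weights)
        return {a: w / total for a, w in zip(focal, weights)}

    print(random_mass_function({"a", "b", "c"}, n_focal=3, seed=42))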

Thomas Burger, Sébastien Destercke
Revisiting the Notion of Conflicting Belief Functions

The problem of conflict measurement between information sources has seen renewed interest. In most works related to this issue, Dempster’s rule plays a central role. In this paper, we propose to revisit conflict from a different perspective. We do not make a priori assumptions about dependencies and start from the definition of conflicting sets, studying its possible extensions to the framework of belief functions.

Sébastien Destercke, Thomas Burger
About Conflict in the Theory of Belief Functions

In the theory of belief functions, conflict is an important concept. Indeed, combining several imperfect experts or sources can produce conflict. The mass appearing on the empty set during the conjunctive combination rule is generally considered as conflict, but it is not really a conflict. Some measures of conflict have been proposed; we recall some of them and show some counter-intuitive examples with these measures. We therefore define a conflict measure based on expected properties. This conflict measure is built from the distance-based conflict measure weighted by a degree of inclusion introduced in this paper.
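
For reference, the quantity these conflict papers start from, the mass left on the empty set by the unnormalized conjunctive rule, can be computed as in this minimal sketch (the dict-of-frozensets representation is an assumption):

    # Minimal sketch: unnormalized conjunctive combination of two mass
    # functions; the mass on the empty set is the usual "conflict".
    def conjunctive(m1, m2):
        out = {}
        for a, va in m1.items():
            for b, vb in m2.items():
                c = a & b  # intersection of focal sets (may be empty)
                out[c] = out.get(c, 0.0) + va * vb
        return out

    m1 = {frozenset("a"): 0.8, frozenset("ab"): 0.2}
    m2 = {frozenset("b"): 0.6, frozenset("ab"): 0.4}
    m12 = conjunctive(m1, m2)
    print(m12.get(frozenset(), 0.0))  # conflict mass: 0.8 * 0.6 = 0.48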

Arnaud Martin
The Internal Conflict of a Belief Function

In this paper we define and derive an internal conflict of a belief function. We decompose the belief function in question into a set of generalized simple support functions (GSSFs). Removing the single GSSF supporting the empty set, we obtain the base of the belief function as the remaining GSSFs. Combining all GSSFs of the base set, we obtain, by definition, a base belief function. We define the conflict in Dempster’s rule of the combination of the base set as the internal conflict of the belief function. Previously, the conflict of Dempster’s rule has been used as a distance measure only between consonant belief functions, on a conceptual level modeling the disagreement between two sources. Using the internal conflict of a belief function, we are able to extend this to non-consonant belief functions as well.

Johan Schubert
Plausibility in DSmT

In preparation for generalizing results on conflicts of classic belief functions to the DSm approach, we need normalized plausibility of singletons in DSmT as well. To enable this, the plausibility of DSm generalized belief functions is analyzed and compared on the entire spectrum of DSm models for various types of belief functions: from a simple uniform distribution, through general classic belief functions, to general generalized belief functions in full generality. Both numeric and comparative variability with respect to particular DSm models has been observed and described. This comparative study enables a deeper understanding of plausibility in the DSm approach and also underlines the sensitivity to the selection of particular DSm models.

Figures of elements of the DSm domain (the DSm hyper-power set) and figures representing particular DSm models (the free DSm model, hybrid DSm models, and Shafer’s model) throughout the text enable a better understanding of DSm principles.

Further, a notion of non-conflicting DSm model is introduced and characterized towards the end of the study.

Milan Daniel
A Belief Function Model for Pixel Data

Image data, i.e. pixel values, are notably corrupted with uncertainty. A pixel value can be seen as uncertain because of additional noise due to acquisition conditions or compression. It is possible to represent a pixel value in a more imprecise but less uncertain way by considering it as interval-valued instead of single-valued. The Belief Function Theory (BFT) makes it possible to handle such interval-based pixel representations. We provide in this paper a model describing how to define belief functions from image data. The consistency of this model is demonstrated on edge detection experiments, as conflicting pixel-based belief functions lead to the detection of image transitions.
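
A toy reading of this idea (the interval width, mass split and 1-D neighbourhood below are assumptions, not the authors' model): give each pixel mass on a noise interval around its value plus some residual ignorance, and flag an edge wherever neighbouring pixels' mass functions conflict:

    # Toy sketch: interval-valued pixels as mass functions; conflict between
    # horizontal neighbours (disjoint intervals) marks candidate edges.
    FULL = (0, 255)  # whole grey-level range = total ignorance

    def pixel_mass(v, delta=10, confidence=0.9):
        return {(max(0, v - delta), min(255, v + delta)): confidence,
                FULL: 1.0 - confidence}

    def overlap(i, j):
        return max(i[0], j[0]) <= min(i[1], j[1])

    def conflict(m1, m2):
        # mass of pairs of focal intervals with empty intersection
        return sum(v1 * v2 for i, v1 in m1.items()
                   for j, v2 in m2.items() if not overlap(i, j))

    row = [20, 22, 21, 200, 203]  # sharp transition between columns 2 and 3
    print([conflict(pixel_mass(a), pixel_mass(b)) for a, b in zip(row, row[1:])])
    # high conflict (0.81) only at the transition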

John Klein, Olivier Colot
Using Belief Function Theory to Deal with Uncertainties and Imprecisions in Image Processing

In imaging, physical phenomena and the acquisition system often induce an alteration of the information. This results in the presence of noise and partial volume effect, corresponding respectively to uncertainties and imprecisions. To cope with these different imperfections, we propose a method based on information fusion using belief function theory. First, it takes advantage of neighborhood information and combination rules on mono-modal images in order to reduce uncertainties due to noise, while considering imprecisions due to partial volume effect on disjunctions. Imprecisions are then reduced using information coming from multi-modal images. Results obtained on simulated images using various signal-to-noise ratios and on medical images show its ability to segment multi-modal images having both noise and partial volume effect.

Benoît Lelandais, Isabelle Gardin, Laurent Mouchard, Pierre Vera, Su Ruan
Belief Theory for Large-Scale Multi-label Image Classification

Classifier combination is known to generally perform better than each individual classifier by taking into account the complementarity between the input pieces of information. Dempster-Shafer theory is a framework of interest for making such a fusion at the decision level; in addition, it allows handling the conflict that can exist between the classifiers as well as the uncertainty that remains on the sources of information. In this contribution, we present an approach for classifier fusion in the context of large-scale multi-label and multi-modal image classification that improves the classification accuracy. The complexity of calculations is reduced by considering only a subset of the frame of discernment. The classification results on a large dataset of 18,000 images and 99 classes show that the proposed method gives higher performance than the classifiers considered separately, while keeping the computational cost tractable.

Amel Znaidia, Hervé Le Borgne, Céline Hudelot
Facial Expression Classification Based on Dempster-Shafer Theory of Evidence

Facial expression recognition is a well-discussed problem, and several machine learning methods are used in this regard. Among them, Adaboost is popular for its simplicity and considerable accuracy. In Adaboost, decisions are made based on the weighted majority vote of several weak classifiers. However, such a weighted combination may not give the expected accuracy due to the lack of proper uncertainty management. In this paper, we propose to adopt a solution based on the Dempster-Shafer theory (DST) of evidence, where mass values are calculated from k-nearest neighboring feature information based on some distance metric, and combined together using DST. Experiments on a renowned dataset demonstrate the effectiveness of the proposed method.

Mohammad Shoyaib, M. Abdullah-Al-Wadud, S. M. Zahid Ishraque, Oksam Chae
Compositional Models in Valuation-Based Systems

Compositional models were initially described for discrete probability theory, and later extended for possibility theory, and Dempster-Shafer (D-S) theory of evidence. Valuation-based systems (VBS) can be considered as a generic uncertainty framework that has many uncertainty calculi, such as probability theory, a version of possibility theory where combination is the product t-norm, Spohn’s epistemic belief theory, and D-S belief function theory, as special cases. In this paper, we describe compositional models for the VBS framework using the semantics of no-double counting. We show that the compositional model defined here for belief functions differs from the one studied by Jiroušek, Vejnarová, and Daniel. The latter model can be described in the VBS framework, but with a combination operation that is different from Dempster’s rule.

Radim Jiroušek, Prakash P. Shenoy
Ascribing Causality from Interventional Belief Function Knowledge

In many Artificial Intelligence applications, causality is an important issue. Interventions are external manipulations that alter the natural behavior of the system. They have been used as tools to distinguish causal relations from spurious correlations. This paper proposes a model allowing the detection of causal relationships under the belief function framework resulting from acting on some events. Facilitation and justification in the presence of interventions, concepts complementary to the concept of causality, are also discussed in this paper.

Imen Boukhris, Salem Benferhat, Zied Elouedi
About Sources Dependence in the Theory of Belief Functions

In the theory of belief functions, many combination rules have been proposed for the purpose of merging and confronting several sources’ opinions. Some combination rules are used when sources are cognitively independent, whereas others are specific to dependent sources. In this paper, we suggest a method to quantify sources’ degrees of dependence in order to choose the most appropriate combination rule. We used generated mass functions to test the proposed method.

Mouna Chebbah, Arnaud Martin, Boutheina Ben Yaghlane
On Random Sets Independence and Strong Independence in Evidence Theory

Belief and plausibility functions can be viewed as lower and upper probabilities possessing special properties. Therefore, (conditional) independence concepts from the framework of imprecise probabilities can also be applied to its sub-framework of evidence theory. In this paper we concentrate on random sets independence, which seems to be a natural concept in evidence theory, and strong independence, one of two principal concepts (together with epistemic independence) in the framework of credal sets. We show that applying strong independence to two bodies of evidence generally leads to a model which is beyond the framework of evidence theory. Nevertheless, if we add a condition on the resulting focal elements, then strong independence reduces to random sets independence. Unfortunately, this is no longer valid for conditional independence.

Jiřina Vejnarová
Combining Linear Equation Models via Dempster’s Rule

This paper proposes a concept of imaginary extreme numbers, which are like traditional complex numbers $a + bi$ but with $i = \sqrt{-1}$ replaced by $e = 1/0$, and defines usual operations such as addition, subtraction, and division on these numbers. It applies the concept to representing linear equations in knowledge-based systems. It proves that the combination of linear equations via Dempster’s rule is equivalent to solving a system of simultaneous equations, or to finding a least-squares estimate when they are overdetermined.

Liping Liu
Reliability in the Thresholded Dempster-Shafer Algorithm for ESM Data Fusion

The effectiveness of a multi-source information fusion process for decision making highly depends on the quality of the information that is received and processed. This paper proposes methods for incorporating reliability, as one of the attributes of the quality of information, into the Thresholded Dempster-Shafer fusion algorithm for Electronic Support Measure (ESM) data fusion, and delivers a quantitative assessment by statistically evaluating the performance of the fusion algorithm. The results suggest that accounting for the reliability of information in the fusion algorithm leads to improved decision making.

Melita Hadzagic, Marie-Odette St-Hilaire, Pierre Valin
Hierarchical Proportional Redistribution for bba Approximation

Dempster’s rule of combination is commonly used in the field of information fusion when dealing with belief functions. However, it generally incurs a high computational cost. To reduce it, a basic belief assignment (bba) approximation is needed. In this paper we present a new bba approximation approach called hierarchical proportional redistribution (HPR), which can approximate a bba at any given level of non-specificity. Two examples are given to show how our new HPR works.

Jean Dezert, Deqiang Han, Zhunga Liu, Jean-Marc Tacnet
On the α-Conjunctions for Combining Belief Functions

The α-conjunctions basically represent the set of associative, commutative and linear operators for belief functions with the vacuous belief function as neutral element. Besides, they include as a particular case the unnormalized Dempster’s rule. They are thus particularly interesting from a formal standpoint. However, they suffer from a main limitation: they lack a clear interpretation in general. In this paper, an interpretation for these combination rules is proposed, based on a new framework that allows the integration of meta-knowledge on the various forms of lack of truthfulness of the information sources.

Frédéric Pichon
Improvements to the GRP1 Combination Rule

The recursive use of belief function combination rules, as required with temporal data, is issue prone. Systems will either become unreactive, through a greedy empty set, or provide a false sense of security through applying a closed-world model to an open-world scenario. We improve on the previous combination rule GRP1 to enhance its ability to work with temporal data in an open world. Specifically, we have progressed with the dynamic self-adjustment properties of the rule, which allow it to gauge how fusion should take place dependent on the temporal information that it receives. Comparisons are made between the improved GRP1 rule and other rules which have been applied to temporal datasets.

Gavin Powell, Matthew Roberts, Dafni Stampouli
Consensus-Based Credibility Estimation of Soft Evidence for Robust Data Fusion

Due to its subjective nature, which can otherwise compromise the integrity of the fusion process, it is critical that soft evidence (generated by human sources) be validated prior to its incorporation into the fusion engine. The strategy of discounting evidence based on source reliability may not be applicable when dealing with soft sources because their reliability (e.g., an eyewitness’s account) is often unknown beforehand. In this paper, we propose a methodology based on the notion of consensus to estimate the credibility of (soft) evidence in the absence of a ‘ground truth.’ This estimated credibility can then be used for source reliability estimation, discounting or appropriately ‘weighting’ evidence for fusion. The consensus procedure is set up via Dempster-Shafer belief theoretic notions. Further, the proposed procedure allows one to constrain the consensus by an estimate of the ground truth if/when it is available. We illustrate several interesting and intuitively appealing properties of the consensus procedure via a numerical example.
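
One way to picture the consensus idea (a hedged sketch under simple assumptions, not the authors' procedure): score each source by its mean pairwise agreement, one minus the conjunctive conflict with each other source, and use that score as a classical discount factor:

    # Hedged sketch: consensus credibility as mean pairwise agreement,
    # then classical discounting by the resulting factor alpha.
    def conflict(m1, m2):
        return sum(v1 * v2 for a, v1 in m1.items()
                   for b, v2 in m2.items() if not (a & b))

    def discount(m, alpha, omega):
        out = {a: alpha * v for a, v in m.items()}
        out[omega] = out.get(omega, 0.0) + (1.0 - alpha)  # rest goes to ignorance
        return out

    def consensus_discount(masses, omega):
        n = len(masses)
        alphas = [1.0 - sum(conflict(masses[i], masses[j])
                            for j in range(n) if j != i) / (n - 1)
                  for i in range(n)]
        return [discount(m, a, omega) for m, a in zip(masses, alphas)]

    omega = frozenset("abc")
    sources = [{frozenset("a"): 1.0}, {frozenset("a"): 1.0}, {frozenset("b"): 1.0}]
    print(consensus_discount(sources, omega))  # the dissenting source is discounted most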

Thanuka L. Wickramarathne, Kamal Premaratne, Manohar N. Murthi
Ranking from Pairwise Comparisons in the Belief Functions Framework

The problem of deriving a binary relation over alternatives based on paired comparisons is studied. The problem is tackled in the framework of belief functions, which is well-suited to model and manipulate partial and uncertain information. Starting from the work of Tritchler and Lockwood [8], the paper proposes a general model of mass allocation and combination, and shows how to practically derive a complete or a partial ranking of the alternatives. A small example is provided as an illustration.

Marie-Hélène Masson, Thierry Denœux
Dempster-Shafer Fusion of Context Sources for Pedestrian Recognition

This contribution presents the design of an image-based contextual pedestrian classifier for an automotive application. Our previous work shows that local classifiers working with image cutouts are in many cases not sufficient to achieve satisfactory results in complex scenarios. As a solution the work proposed incorporating contextual knowledge into the classification task, significantly improving the classification results. Contextual knowledge is described by a set of different and independent context sources. This paper discusses the fusion of these sources on the basis of the Dempster-Shafer theory. It presents and compares different possibilities to model the frame of discernment and the mass function to achieve optimal results. Furthermore, it provides an elegant way to take uncertainties of the context sources into account. The methods are evaluated on simulated and on real data.

Magdalena Szczot, Otto Löhlein, Günther Palm
Multi-level Dempster-Shafer Speed Limit Assistant

This paper deals with a Speed Limit Assistant (SLA) performing the fusion of a Geographic Information System (GIS) and a vision system. The present strategy is based on multi-level data fusion using evidence theory. In a first step, the GIS reliability is estimated through GIS criteria related to the positioning, the localization and the digital map resolution. Contextual criteria, also extracted from the GIS, define the belief masses of the speed candidates. Afterwards, a multi-criterion fusion is processed to detect potential GIS incoherences (differences between the GIS speed and the road context). The second fusion level (the multi-sensor fusion) then combines the GIS and vision information by considering these sensors as specialized sources. In order to manage the conflict, the Proportional Conflict Redistribution Rule 5 (PCR5) has been chosen. The benefits of the proposed solution are shown through real experiments performed with a test vehicle.

Jérémie Daniel, Jean-Philippe Lauffenburger
A New Local Measure of Disagreement between Belief Functions – Application to Localization

In the theory of belief functions, the disagreement between sources is often measured in terms of conflict or dissimilarity. These measures are global to the sources, and provide little information about the origin of the disagreement. We propose in this paper a “finer” measure based on the decomposition of the global measure of conflict (or distance). It allows focusing the measure on some hypotheses of interest (namely the ones likely to be chosen after fusion). We apply the proposed so-called “local” measures of conflict and distance to the choice of sources for vehicle localization. We show that considering sources’ agreement/disagreement outperforms blind fusion.

Arnaud Roquel, Sylvie Le Hégarat-Mascle, Isabelle Bloch, Bastien Vincke
Map-Aided Fusion Using Evidential Grids for Mobile Perception in Urban Environment

Evidential grids have recently been used for mobile object perception. The novelty of this article is to propose a perception scheme using prior map knowledge. A geographic map is considered an additional source of information fused with a grid representing sensor data. Yager’s rule is adapted to exploit the Dempster-Shafer conflict information at large. In order to distinguish stationary and mobile objects, a counter is introduced and used as a factor for mass function specialisation. Contextual discounting is used, since we assume that different pieces of information become obsolete at different rates. Tests on real-world data are also presented.

Marek Kurdej, Julien Moras, Véronique Cherfaoui, Philippe Bonnifait
Distributed Data Fusion for Detecting Sybil Attacks in VANETs

Sybil attacks have become a serious threat as they can affect the functionality of VANETs (Vehicular Ad Hoc Networks). This paper presents a method for detecting such attacks in VANETs based on distributed data fusion. An algorithm has been developed in order to build distributed confidence over the network under the belief function framework. Our approach has been validated by simulation.

Nicole El Zoghby, Véronique Cherfaoui, Bertrand Ducourthial, Thierry Denœux
Partially-Hidden Markov Models

This paper addresses the problem of Hidden Markov Models (HMM) training and inference when the training data are composed of feature vectors plus uncertain and imprecise labels. The “soft” labels represent partial knowledge about the possible states at each time step and the “softness” is encoded by belief functions. For the obtained model, called a Partially-Hidden Markov Model (PHMM), the training algorithm is based on the Evidential Expectation-Maximisation (E2M) algorithm. The usual HMM model is recovered when the belief functions are vacuous and the obtained model includes supervised, unsupervised and semi-supervised learning as special cases.

Emmanuel Ramasso, Thierry Denœux, Noureddine Zerhouni
Large Scale Multinomial Inferences and Its Applications in Genome Wide Association Studies

Statistical analysis of multinomial counts with a large number K of categories and a small sample size n is challenging to both frequentist and Bayesian methods and requires thinking about statistical inference at a very fundamental level. Following the framework of the Dempster-Shafer theory of belief functions, a probabilistic inferential model is proposed for this “large K and small n” problem. Using a data-generating device, the inferential model produces a probability triplet (p, q, r) for an assertion conditional on observed data. The probabilities p and q are for and against the truth of the assertion, whereas r = 1 − p − q is the remaining probability, called the probability of “don’t know”. The new inference method is applied in a genome-wide association study with very-high-dimensional count data, to identify associations between genetic variants and the disease rheumatoid arthritis.
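
Read in standard belief-function terms, the triplet for an assertion A corresponds to p = Bel(A), q = Bel(not A) and r = Pl(A) − Bel(A). A minimal sketch of that reading (the authors' data-generating inferential model is not reproduced here):

    # Hedged sketch: the (p, q, r) triplet for an assertion A, read from a
    # mass function as p = Bel(A), q = Bel(not A), r = 1 - p - q.
    def pqr(m, assertion):
        p = sum(v for a, v in m.items() if a <= assertion)       # for
        q = sum(v for a, v in m.items() if not (a & assertion))  # against
        return p, q, 1.0 - p - q                                 # don't know

    omega = frozenset({"variant_assoc", "no_assoc"})
    A = frozenset({"variant_assoc"})
    m = {A: 0.55, omega - A: 0.25, omega: 0.20}
    print(pqr(m, A))  # approximately (0.55, 0.25, 0.20)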

Chuanhai Liu, Jun Xie
Belief Function Robustness in Estimation

We consider the case in which the available knowledge does not allow one to specify a precise probabilistic model for the prior and/or likelihood in statistical estimation. We assume that this imprecision can be represented by belief functions. Thus, we exploit the mathematical structure of belief functions and their equivalent representation in terms of closed convex sets of probability measures to derive robust posterior inferences.

Alessio Benavoli
Conditioning in Dempster-Shafer Theory: Prediction vs. Revision

We recall the existence of two methods for conditioning belief functions due to Dempster: one, known as Dempster conditioning, that applies Bayesian conditioning to the plausibility function, and one that performs a sensitivity analysis on a conditional probability. We recall that while the first one is dedicated to revising a belief function, the other one is tailored to a prediction problem when the belief function is a statistical model. We question the use of Dempster conditioning for prediction tasks in Smets’ generalized Bayes theorem approach to the modeling of statistical evidence, and propose a modified version of it that is more informative than the other conditioning rule.

Didier Dubois, Thierry Denœux
Combining Statistical and Expert Evidence within the D-S Framework: Application to Hydrological Return Level Estimation

Estimation of extreme sea levels and waves for high return periods is of prime importance in hydrological design and flood risk assessment. The common practice consists of inferring design levels from the available observations and assuming the distribution of extreme values to be stationary. However, in the recent decades, more concern has been given to the integration of the effect of climate change in environmental analysis. When estimating defense structure design parameters, sea level rise projections provided by experts now have to be combined with historical observations. Due to limited knowledge about the future world and the climate system, and also to the lack of sufficient sea records, uncertainty involved in extrapolating beyond available data and projecting in the future is considerable and should absolutely be accounted for in the estimation of design values.

In this paper, we present a methodology based on evidence theory to represent statistical and expert evidence in the estimation of the future extreme sea return level associated with a given return period. We represent the statistical evidence by likelihood-based belief functions [7] and the sea level rise projections provided by two sets of experts by a trapezoidal possibility distribution. A Monte Carlo simulation allows us to combine both belief measures to compute the future return level and a measure of the uncertainty of the estimations.

Nadia Ben Abdallah, Nassima Mouhous Voyneau, Thierry Denœux
Sigmoidal Model for Belief Function-Based Electre Tri Method

The main decision-making problems can be described as the choice, ranking or sorting of a set of alternatives or solutions. The principle of the Electre TRI (ET) method is to sort alternatives $a_i$ according to criteria $g_j$ into categories $C_h$ whose lower and upper limits are respectively $b_h$ and $b_{h+1}$. The sorting procedure is based on the evaluation of outranking relations, based firstly on the calculation of partial concordance and discordance indexes and secondly on global concordance and credibility indexes. In this paper, we propose to replace the calculation of the original concordance and discordance indexes of the ET method by a more effective sigmoidal model. Such a model is part of a new Belief Function ET (BF-ET) method under development and allows a comprehensive, elegant and continuous mathematical representation of the degree of concordance, the degree of discordance and the uncertainty level, which is not explicitly taken into account in the classical Electre Tri.

Jean Dezert, Jean-Marc Tacnet
Belief Inference with Timed Evidence
Methodology and Application Using Sensors in a Smart Home

Smart Homes need to sense their environment. Augmented appliances can help doing this, but sensors are also required. Then, data fusion is used to combine the gathered information. The theory of belief functions is well suited to the computation of small pieces of context such as the presence of people or their posture. In our application, we can assume that many sensors are immobile. Also, the physical properties of Smart Homes and people can justify maintaining beliefs beyond the exact moment of measurement.

Thus, in this paper, we present a simple way to apply the theory of belief functions to sensors, and a methodology to take timed evidence into account using the specificity of mass functions and the discounting operation. An application to presence detection in smart homes is presented as an example.
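
A minimal sketch of the timed-evidence idea (the exponential decay law and its rate are assumptions, not the paper's model): discount each mass function by a factor that shrinks with the age of the reading, so old evidence drifts toward total ignorance before combination:

    # Hedged sketch: age-dependent discounting of timed evidence.
    # alpha(dt) = exp(-lam * dt) sends old evidence toward the vacuous bba.
    import math

    def temporal_discount(m, age_s, omega, lam=0.1):
        alpha = math.exp(-lam * age_s)
        out = {a: alpha * v for a, v in m.items()}
        out[omega] = out.get(omega, 0.0) + (1.0 - alpha)
        return out

    omega = frozenset({"present", "absent"})
    m = {frozenset({"present"}): 0.9, omega: 0.1}
    for age in (0, 5, 30):  # seconds since the sensor reading
        print(age, temporal_discount(m, age, omega))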

Bastien Pietropaoli, Michele Dominici, Frédéric Weis
Evidential Network with Conditional Belief Functions for an Adaptive Training in Informed Virtual Environment

Simulators have been used for many years to learn driving, piloting, steering, etc., but they often provide the same training for each learner, regardless of his/her performance. In this paper, we present the GULLIVER system, which determines the most appropriate aids to display for guiding the learner in a fluvial-navigation training simulator. GULLIVER is a decision-making system based on an evidential network with conditional belief functions. This evidential network allows graphically representing inference rules on uncertain data coming from learner observation. Several sensors and a predictive model are used to collect these data about learner performance. The evidential network is then used to infer in real time the best guiding to display to the learner in an informed virtual environment.

Loïc Fricoteaux, Indira Thouvenin, Jérôme Olive, Paul George
Using the Belief Functions Theory to Deploy Static Wireless Sensor Networks

The location of sensors is one of the fundamental design issues in wireless sensor networks. It may affect the fulfillment of the system’s requirements and multiple network performance metrics. Assuming that an inherent uncertainty can be associated with sensor readings, it is very important to consider this issue in the deployment process in order to anticipate this sensing behavior. This paper addresses the issue of uncertainty-aware sensor network deployment by exploiting the belief functions reasoning framework. An evidence-based coverage model is proposed and some possible extensions are discussed. The deployment problem is formulated as an optimization problem and possible solutions are discussed. A preliminary experimental analysis demonstrates very promising results for the proposed methodology.

Mustapha Reda Senouci, Abdelhamid Mellouk, Latifa Oukhellou, Amar Aissani
A Quantitative Study of the Occurrence of a Railway Accident Based on Belief Functions

In the field of railway systems, there is great interest in including the human factor in the risk analysis process. Indeed, a great number of accidents are considered to be triggered by the human factors interacting in the situation. Several attempts have been made to include human factors in safety analysis, but they generally approach the problem in a qualitative way. The choice of qualitative methods arises from the difficulty of eliciting human behavior and its effects on system safety. This paper presents a first attempt to account for the human factor by using the generalized Bayesian theory and fault tree analysis.

Felipe Aguirre, Mohamed Sallak, Walter Schön, Fabien Belmonte
Backmatter
Metadata
Title
Belief Functions: Theory and Applications
Editors
Thierry Denoeux
Marie-Hélène Masson
Copyright Year
2012
Publisher
Springer Berlin Heidelberg
Electronic ISBN
978-3-642-29461-7
Print ISBN
978-3-642-29460-0
DOI
https://doi.org/10.1007/978-3-642-29461-7
