
About this book

Decision making is an omnipresent and crucial human activity, and also a core function of virtually all artificial, broadly perceived "intelligent" systems that try to mimic human behavior, reasoning and choice processes. This relevance has triggered a vast research effort on the very essence of decision making, and attempts to develop tools and techniques that mimic human decision-making acts, and even automate decision processes that had so far been reserved for human beings. The roots of these attempts at a scientific analysis can be traced back to ancient times, but they have clearly gained momentum over the last 50 to 100 years, following a general boom in science. Depending on the field of science, decision making can be viewed in different ways. The most general view is that decision making boils down to the cognitive, mental process(es) that lead to the selection of an option or a course of action among several alternatives. Looking deeper, from a psychological perspective this process proceeds in the context of the needs, preferences and rational choices of an individual, a group of individuals, or even an organization. From a cognitive perspective, the decision-making process proceeds in the context of various interactions with the environment.

Table of Contents


Continuous Utility Functions for Nontotal Preorders: A Review of Recent Results

We present some recent and significant results concerning the existence of a continuous utility function for a not necessarily total preorder on a topological space. We first recall an appropriate continuity concept (namely, weak continuity) relative to a preorder on a topological space. Then a general characterization of the existence of a continuous utility function for a not necessarily total preorder on a topological space is presented, and some consequences of this basic result are derived.
Gianni Bosi, Romano Isler

Risk Assessment of SLAs in Grid Computing with Predictive Probabilistic and Possibilistic Models

We developed a hybrid probabilistic and possibilistic technique for assessing the risk of an SLA for a computing task in a cluster/grid environment. The probability of success estimated with the hybrid model is higher than with the purely probabilistic model, since the hybrid model takes into consideration the possibility distribution for the maximal number of failures derived from a resource provider's observations. The hybrid model showed that we can increase or decrease the granularity of the model as needed; we can reduce the estimate of P(S*=1) by making a rougher, more conservative estimate of the more unlikely events of (M+1, N) node failures. We noted that M is an estimate which depends on the history of the nodes being used and can be calibrated to "a few" or to "many" nodes.
Christer Carlsson, Robert Fullér
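
The purely probabilistic side of such a risk estimate can be illustrated with a minimal sketch, assuming (as a hypothetical simplification not taken from the chapter) independent, identically distributed node failures, so that success means at most M of N nodes fail:

```python
from math import comb

def p_success(n_nodes, p_fail, max_failures):
    """Illustrative probability that a task succeeds, modelled as the
    chance that at most `max_failures` of `n_nodes` independent nodes
    fail, each with per-node failure probability `p_fail`."""
    return sum(
        comb(n_nodes, k) * p_fail**k * (1 - p_fail)**(n_nodes - k)
        for k in range(max_failures + 1)
    )

# Tolerating more failures (a larger M) can only raise the success estimate,
# which is the direction of adjustment the hybrid model exploits.
lenient = p_success(20, 0.05, 2)
strict = p_success(20, 0.05, 1)
assert lenient > strict
```

The chapter's hybrid model goes further by weighting the tail events (more than M failures) with a possibility distribution; the binomial sketch above only covers the baseline probabilistic estimate.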

From Benchmarks to Generalised Expectations

A random variable can be equivalently regarded as a function or as a "set", namely, the set of points lying below (when positive) or above (when negative) its graph. The second approach, proposed by Segal in 1989, is known as the measure (or measurement) representation approach. On a technical ground, it allows the use of measure-theoretic tools instead of functional-analytic ones, often making it possible to reach new and deeper conclusions. On an interpretative ground, it makes clear how expectation and expected utility, either classical or à la Choquet, are structurally analogous; moreover, it allows for dealing with new and more general types of expectations including, e.g., state dependence.
Starting from the measurement approach and from a decision-theoretical result by Castagnoli and LiCalzi (2006), we present a new representation theorem in the same perspective and, finally, we propose a definition of generalised expectations and two different concepts of associativity that can be imposed on them.
Erio Castagnoli, Gino Favero

Memory Property in Heterogeneously Populated Markets

This paper focuses on the long memory of prices and returns of an asset traded in a financial market. We consider a microeconomic model of the market, and we prove theoretical conditions on the parameters of the model that give rise to long memory. In particular, the long memory property is detected in an agents' aggregation framework under some distributional hypotheses on the market's parameters.
Roy Cerqueti, Giulia Rotundo

From Comparative Degrees of Belief to Conditional Measures

The aim of this paper is to contribute to the discussion about the "best" definition of a conditional model for plausibility functions and the subclass of possibility functions.
We propose to use the framework of the theory of measurements, by studying the comparative structure underlying different conditional models. This approach gives an estimate of the "goodness" and "effectiveness" of a model, by pointing out the rules necessarily accepted by the user. Moreover, the results related to the characterization of comparative degrees of belief by means of conditional uncertainty measures can be used in decision theory. They are in fact necessary when we need a model for a decision maker interested in choosing by taking into account, at the same moment, different scenarios.
Giulianella Coletti, Barbara Vantaggi

Delay and Interval Effects with Subadditive Discounting Functions

The delay effect appears as an anomaly of the traditional discounted utility model: the discount rate decreases as waiting time increases. In this description, however, it is not clear whether the benchmark (that is to say, the reference instant in the assessment process) or the availability of the discounted amount is fixed or variable. Accordingly, other authors use the term common difference effect (and immediacy effect, when the first outcome is available immediately), and this expression at least implies a variable availability of the discounted amount. Read introduces another, different effect, the interval effect: longer intervals lead to smaller values of the discount rate r. In terms of the parameter δ (the geometric mean of the discount factor), the interval effect implies larger values of δ. In this paper we try to clarify the concepts of delay and interval effect, and we deduce some relationships between these concepts and certain subadditive discounting functions.
Salvador Cruz Rambaud, María José Muñoz Torrecillas
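
Both effects can be made concrete with a small sketch. Assuming, purely for illustration, a hyperbolic discount function d(t) = 1/(1 + kt) (a standard subadditive example, not necessarily the family studied in the chapter), subdividing an interval produces more total discounting than treating it whole, and the per-period geometric-mean discount factor δ grows with the interval length:

```python
def hyperbolic(t, k=0.5):
    """Hyperbolic discount function d(t) = 1 / (1 + k t); k is an
    illustrative impatience parameter."""
    return 1.0 / (1.0 + k * t)

# Subadditivity: discounting two back-to-back unit intervals separately
# yields MORE total discounting than one two-unit interval.
whole = hyperbolic(2.0)                     # d(2)
split = hyperbolic(1.0) * hyperbolic(1.0)   # d(1) applied twice
assert whole > split

# Interval effect: delta(t) = d(t)**(1/t), the geometric mean of the
# discount factor, increases with the interval length t.
delta = lambda t: hyperbolic(t) ** (1.0 / t)
assert delta(1.0) < delta(2.0) < delta(4.0)
```

The assertions hold for any k > 0 with this functional form; the chapter's contribution is relating such effects to general subadditive discounting functions rather than to this particular one.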

Pairwise Comparison Matrices: Some Issues on Consistency and a New Consistency Index

In multicriteria decision making, pairwise comparisons are a useful starting point for determining a ranking on a set \(X = \{x_1, x_2, \ldots, x_n\}\) of alternatives or criteria; the pairwise comparison between \(x_i\) and \(x_j\) is quantified by a number \(a_{ij}\) expressing how much \(x_i\) is preferred to \(x_j\), and the quantitative preference relation is represented by means of the matrix \(A = (a_{ij})\). In the literature the number \(a_{ij}\) can assume different meanings (for instance a ratio or a difference), and so several kinds of pairwise comparison matrices have been proposed. A condition of consistency for the matrix \(A = (a_{ij})\) is also considered; this condition, if satisfied, makes it possible to determine a weighted ranking that perfectly represents the expressed preferences. The shape of the consistency condition depends on the meaning of the number \(a_{ij}\). In order to unify the different approaches and remove some drawbacks, related for example to fuzzy additive consistency, in a previous paper we considered pairwise comparison matrices over an abelian linearly ordered group; in this context we provided, for a pairwise comparison matrix, a general definition of consistency and a measure of closeness to consistency. With reference to this new general unifying context, in this paper we provide some results on consistent matrices and a new measure of consistency that is easier to compute; moreover, we provide an algorithm to check the consistency of a pairwise comparison matrix and an algorithm to build consistent matrices.
Bice Cavallo, Livia D’Apuzzo, Gabriella Marcarelli
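
For the multiplicative (ratio) reading of \(a_{ij}\), consistency means \(a_{ik} = a_{ij} \cdot a_{jk}\) for all triples. A minimal sketch (this is the classical multiplicative special case, not the authors' general alo-group formulation) checks that condition and builds consistent matrices from a single row:

```python
def consistency_deviation(A):
    """Largest |a_ik - a_ij * a_jk| over all triples (i, j, k);
    zero iff the multiplicative pairwise comparison matrix A is consistent."""
    n = len(A)
    return max(
        abs(A[i][k] - A[i][j] * A[j][k])
        for i in range(n) for j in range(n) for k in range(n)
    )

def build_consistent(first_row):
    """Build a consistent matrix from its first row via a_ij = a_1j / a_1i."""
    n = len(first_row)
    return [[first_row[j] / first_row[i] for j in range(n)] for i in range(n)]

A = build_consistent([1.0, 2.0, 6.0])  # forces a_23 = 6/2 = 3
assert consistency_deviation(A) < 1e-12
```

A naive deviation like this depends on the scale of the entries; the chapter's point is precisely to define a consistency measure that behaves uniformly across the different (ratio, difference, fuzzy) readings of \(a_{ij}\).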

On a Decision Model for a Life Insurance Company Rating

A rating system is a decision support tool for analysts, regulators and stakeholders to evaluate a firm's capital requirements under risky conditions. The aim of this paper is to define an actuarial model to measure the Economic Capital of a life insurance company; the model is developed in the Solvency II context and is based on option pricing theory.
In order to assess a life insurance company's Economic Capital, it is necessary to involve coherent risk measures already used in the assessment of banking Solvency Capital Requirements, according to the Basel II standards. The complexity of the options embedded in life insurance contracts requires finding operational solutions consistent with the Fair Value principle, as defined in the International Accounting Standards (IAS).
The paper is structured as follows: Section 1 describes the development of the Insurance Solvency Capital Requirement standards; Section 2 introduces the theoretical framework of Economic Capital related to risk measures; Section 3 formalizes the actuarial model for the assessment of a life insurance company rating; Section 4 offers some results from an application of the actuarial model to a portfolio of surrenderable participating policies with a minimum return guarantee and an option to annuitise.
Fabio Baione, Paolo De Angelis, Riccardo Ottaviani

Qualitative Bipolar Decision Rules: Toward More Expressive Settings

An approach to multicriteria decision making previously developed by the authors is reviewed. The idea is to choose between alternatives based on an analysis of the pros and the cons, i.e. positive or negative arguments of various strengths. Arguments correspond to criteria or affects of various levels of importance, taking values on a very crude scale containing only three elements: good, neutral or bad. The basic decision rule in this setting is based on two ideas: focusing on the most important affects, and, when comparing the merits of two alternatives, considering that an argument against one alternative can be counted as an argument in favour of the other. It relies on a bipolar extension of comparative possibility ordering. Lexicographic refinements of this crude decision rule turn out to be cognitively plausible, and to generalise a well-known choice heuristic. It can also be encoded in Cumulative Prospect Theory. The paper lays out several lines of future research, especially an alternative to the bicapacity approach to bipolar decision making that subsumes both Cumulative Prospect Theory and our qualitative bipolar choice rule. Moreover, an extension of the latter to non-Boolean arguments is outlined.
Didier Dubois, Hélène Fargier

The Dynamics of Consensus in Group Decision Making: Investigating the Pairwise Interactions between Fuzzy Preferences

In this paper we present an overview of the soft consensus model in group decision making, and we investigate the dynamical patterns generated by the fundamental pairwise preference interactions on which the model is based.
The dynamical mechanism of the soft consensus model is driven by the minimization of a cost function combining a collective measure of dissensus with an individual mechanism of opinion changing aversion. The dissensus measure plays a key role in the model and induces a network of pairwise interactions between the individual preferences.
The structure of fuzzy relations is present at both the individual and the collective levels of description of the soft consensus model: pairwise preference intensities between alternatives at the individual level, and pairwise interaction coefficients between decision makers at the collective level.
The collective measure of dissensus is based on nonlinear scaling functions of the linguistic quantifier type and expresses the degree to which most of the decision makers disagree with respect to their preferences regarding the most relevant alternatives. The graded notion of consensus underlying the dissensus measure is central to the dynamical unfolding of the model.
The original formulation of the soft consensus model in terms of standard numerical preferences has been recently extended in order to allow decision makers to express their preferences by means of triangular fuzzy numbers. An appropriate notion of distance between triangular fuzzy numbers has been chosen for the construction of the collective dissensus measure.
In the extended formulation of the soft consensus model, the extra degrees of freedom associated with the triangular fuzzy preferences, combined with the nonlinear nature of the pairwise preference interactions, generate various interesting and suggestive dynamical patterns. In the present paper we investigate these dynamical patterns, which are illustrated by means of a number of computer simulations.
Mario Fedrizzi, Michele Fedrizzi, R. A. Marques Pereira, Matteo Brunelli

Fuzzy Preference Relations Based on Differences

In this paper we introduce quaternary fuzzy relations in order to describe difference structures. Three models are developed and studied, based on three different interpretations of an implication. Functional forms of the quaternary relation are determined by solutions of functional equations of the same type.
János Fodor

Indices of Collusion among Judges and an Anti-collusion Average

We propose two indices of collusion among Judges of objects or events in a context of subjective evaluation, and an average based on these indices. The aim is manifold: to serve as a reference point for appeals against the results of voting already undertaken, to improve the quality of scores summarized for awards by eliminating those that are less certain, and, indirectly, to provide an incentive for reliable evaluations. An algorithm for automatic computation is supplied. The possible uses of this technique in various fields of application are pointed out: from Economics to Finance, Insurance, Arts, artistic sports and so on.
Cesarino Bertini, Gianfranco Gambarelli, Angelo Uristani

Scoring Rules and Consensus

In this paper we assume that voters rank order a set of alternatives and that a scoring rule is used to obtain a set of winning alternatives. The scoring rule we use is not fixed in advance; rather, we analyze how to select one in such a way that the collective utility is maximized. In order to generate that collective utility, we ask voters for additional information: agents declare which alternatives are good, along with their degree of optimism. With that information and a satisfaction function, for each scoring rule we generate individual utility functions. The utility an alternative has for a voter should depend on whether this alternative is a winner for that scoring rule and on the position this alternative has in the individual ranking. Taking into account all these individual utilities, we aggregate them by means of an OWA operator and generate a collective utility for each scoring rule. By maximizing the collective utility, we obtain the set of scoring rules that maximizes consensus among voters. Applying one of these scoring rules then yields a collective weak order on the set of alternatives, and thus a set of winning alternatives.
José Luis García-Lapresta, Bonifacio Llamazares, Teresa Peña
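
A positional scoring rule of the kind the chapter optimizes over can be sketched minimally; the Borda weights below are one hypothetical instance, and any non-increasing weight vector defines another rule in the family:

```python
def scoring_winners(rankings, weights=None):
    """Apply a positional scoring rule to strict rankings (best first).
    `weights[r]` is the score earned by the alternative at rank r;
    defaults to Borda weights m-1, m-2, ..., 0. Returns the winning set."""
    m = len(rankings[0])
    if weights is None:
        weights = list(range(m - 1, -1, -1))  # Borda scores
    scores = {}
    for ballot in rankings:
        for rank, alt in enumerate(ballot):
            scores[alt] = scores.get(alt, 0) + weights[rank]
    top = max(scores.values())
    return {alt for alt, s in scores.items() if s == top}

# Three voters over alternatives a, b, c; Borda scores: a=5, b=3, c=1.
ballots = [["a", "b", "c"], ["a", "c", "b"], ["b", "a", "c"]]
assert scoring_winners(ballots) == {"a"}
```

The chapter's method would evaluate each candidate weight vector through the voters' declared utilities and an OWA aggregation, then keep the rules attaining maximal collective utility; the sketch only shows the underlying winner computation.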

Dominance-Based Rough Set Approach to Interactive Evolutionary Multiobjective Optimization

We present an application of the Dominance-based Rough Set Approach (DRSA) to interactive Evolutionary Multiobjective Optimization (EMO). In the proposed methodology, the preference information elicited from the decision maker in successive iterations consists in sorting some solutions of the current population as "good" or "bad", or in comparing some pairs of solutions. "If ..., then ..." decision rules are then induced from this preference information using DRSA. The rules are used within EMO in order to focus on populations of solutions satisfying the preferences of the decision maker. This speeds up convergence to the most preferred region of the Pareto front. The resulting interactive schemes, corresponding to the two types of preference information, are called DRSA-EMO and DRSA-EMO-PCT, respectively. Within the same methodology, we propose the DARWIN and DARWIN-PCT methods, which make it possible to take robustness concerns into account in multiobjective optimization.
Salvatore Greco, Benedetto Matarazzo, Roman Słowiński

Supporting Consensus Reaching Processes under Fuzzy Preferences and a Fuzzy Majority via Linguistic Summaries

We consider the classic approach to the evaluation of degrees of consensus due to Kacprzyk and Fedrizzi [6], [7], [8], in which a soft degree of consensus was introduced. Its idea is to find a degree to which, for instance, "most of the important individuals agree as to almost all of the relevant options". The fuzzy majority, expressed through fuzzy linguistic quantifiers (most, almost all, ...), is handled via Zadeh's [46] classic calculus of linguistically quantified propositions and Yager's [44] OWA (ordered weighted average) operators. The soft degree of consensus is used to support the running of a moderated consensus reaching process along the lines of Fedrizzi, Kacprzyk and Zadrożny [3], Fedrizzi, Kacprzyk, Owsiński and Zadrożny [2], Kacprzyk and Zadrożny [22], and [24].
Linguistic data summaries in the sense of Yager [43], Kacprzyk and Yager [13], and Kacprzyk, Yager and Zadrożny [14], in particular in the protoform-based version proposed by Kacprzyk and Zadrożny [23], [25], are employed. These linguistic summaries indicate, in a human-consistent way, some interesting relations between individuals and options, to help the moderator identify crucial (pairs of) individuals and/or options with whom/which there are difficulties with respect to consensus. An extension using ontologies, representing both knowledge of the consensus reaching process and the domain of the decision problem, is indicated.
Janusz Kacprzyk, Sławomir Zadrożny

Decision Making in Social Actions

Decision making in a social ambit involves both individual optimal choices and social choices. Boudon's theory of "perverse effects" shows that the sum of rational individual choices can produce a very undesirable global effect. Decision making in social action must therefore take into account the theory of cooperative games with many players, in order to obtain the optimal strategies. Because of the semantic uncertainty in the definition of social actions, it is preferable to assume that the issues are represented by fuzzy numbers.
Gabriella Marcarelli, Viviana Ventre

Coherence for Fuzzy Measures and Applications to Decision Making

Coherence is a central issue in probability (de Finetti, 1970). The studies on non-additive models in decision making, e. g., non-expected utility models (Fishburn, 1988), lead to an extension of the coherence principle in nonadditive settings, such as fuzzy or ambiguous contexts. We consider coherence in a class of measures that are decomposable with respect to Archimedean t-conorms (Weber, 1984), in order to interpret the lack of coherence in probability. Coherent fuzzy measures are utilized for the aggregations of scores in multiperson and multiobjective decision making. Furthermore, a geometrical representation of fuzzy and probabilistic uncertainty is considered here in the framework of join spaces (Prenowitz and Jantosciak, 1979) and, more generally, algebraic hyperstructures (Corsini and Leoreanu, 2003); indeed coherent probability assessments and fuzzy sets are join spaces (Corsini and Leoreanu, 2003; Maturo et al., 2006a, 2006b).
Antonio Maturo, Massimo Squillante, Aldo G. S. Ventre

Measures for Firms Value in Random Scenarios

The value of a firm cannot be totally independent of the financial context in which the firm operates. In this paper we propose a set of axioms in order to characterize appropriate measures of the (random) value of a company, which provide a (sublinear) valuation functional consistent with the existence of a financial market. This makes it possible to give an upper and a lower bound on the value of a firm.
Finally, in a random context, we consider some classical valuation methods and test them with respect to the axioms.
Paola Modesti

Thin Rationality and Representation of Preferences with Implications to Spatial Voting Models

Much of current microeconomic theory and formal political science is based on the notion of thin rationality. This concept refers to the behavioral principle stating that rational people act according to their preferences. More precisely, a rational individual chooses A rather than B just in case he/she (hereafter he) prefers A to B. Provided that the individual's preference is a binary, connected and transitive relation over alternative courses of action, we can define a utility function that represents the individual's preferences, so that when acting rationally - i.e. in accordance with his preferences - he acts as if he were maximizing his utility. When considering risky alternatives, i.e. probability mixtures of certain outcomes, a similar representation theorem states that the individual's preferences can be represented by a utility function with the expected utility property. These utility functions assign risky prospects utility values that coincide with weighted sums of the utility values of those outcomes that may materialize in the prospect. The weights, in turn, are identical with the probabilities of the corresponding outcomes.
Hannu Nurmi
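
The expected utility property described above amounts to EU = Σ p_i · u(x_i). A minimal numerical sketch, with a square-root utility chosen only for illustration (any concave u would show the same risk-averse pattern):

```python
from math import sqrt

def expected_utility(lottery, u):
    """Expected utility of a lottery given as (probability, outcome) pairs:
    EU = sum_i p_i * u(x_i)."""
    return sum(p * u(x) for p, x in lottery)

u = sqrt  # an illustrative (concave, hence risk-averse) utility function
risky = [(0.5, 100.0), (0.5, 0.0)]  # fair coin: 100 or nothing
safe = [(1.0, 50.0)]                # 50 for sure

# A risk-averse agent prefers the sure amount to the fair gamble with the
# same expected monetary value: sqrt(50) > 0.5 * sqrt(100).
assert expected_utility(safe, u) > expected_utility(risky, u)
```

With a linear u the two lotteries would be indifferent, which is exactly the sense in which the curvature of u, not the weights, carries the risk attitude.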

Quantum Dynamics of Non Commutative Algebras: The SU(2) Case

By applying the Maximum Entropy Principle (MEP), the dynamics of Hamiltonians associated with noncommutative Lie algebras can be found. For the SU(2) case, it is easy to show that the Generalized Uncertainty Principle (GUP) is an invariant of the motion. The temporal evolution of the system is confined to Bloch spheres whose radii lie in the interval (0, 1). The GUP defines the fuzziness of these spheres inside the \(\hbar\) domain for the SU(2) Lie algebra.
C. M. Sarris, A. N. Proto

Benefits of Full-Reinforcement Operators for Spacecraft Target Landing

In this paper we discuss the benefits of using full reinforcement operators for site selection in spacecraft landing on planets. Specifically we discuss a modified Uninorm operator for evaluating sites and a Fimica operator to aggregate pixels for constructing regions that will act as sites to be selected at lower spacecraft altitude. An illustrative case study of spacecraft target landing is presented to clarify the details and usefulness of the proposed operators.
Rita A. Ribeiro, Tiago C. Pais, Luis F. Simões

Neural Networks for Non-independent Lotteries

Von Neumann-Morgenstern utility functions play a relevant role in the set of utility functions. This paper shows that the set of von Neumann-Morgenstern utility functions is dense in the set of utility functions that can represent, arbitrarily well, a given continuous but not independent preference relation over monetary lotteries. The main result is that, without independence, it is possible to approximate utility functions over monetary lotteries by von Neumann-Morgenstern ones with arbitrary precision. The approach used is a constructive one. Neural networks are used for their approximation properties in order to obtain the result, and their functional form provides both the von Neumann-Morgenstern representation and the necessary change of variables over the set of lotteries.
Giulia Rotundo

Weak Implication and Fuzzy Inclusion

The content of this talk is taken from joint papers with G. Coletti and B. Vantaggi.
We define weak implication \(H\longmapsto_P E\) (“H weakly implies E under P”) through the relation P(E|H) = 1, where P is a (coherent) conditional probability.
In particular (as a ... by-product) we obtain "inferential rules" that correspond to those of default logic. We also discuss connections between weak implication and fuzzy inclusion.
Romano Scozzafava

The TOPSIS Method and Its Application to Linguistic Variables

Most of the time the input of the decision process is linguistic, but this is not the case for the output. For that reason, we have modified the TOPSIS model so that the output of the process is of the same kind as the input, that is to say linguistic. This proposal is applied to the process of quality assessment and accreditation of the Industrial Engineering Schools within the Spanish university system.
M. Socorro García-Cascales, M. Teresa Lamata, J. Luís Verdegay
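
The classical numeric TOPSIS that the chapter takes as its starting point can be sketched as follows. This is the textbook version under simplifying assumptions (benefit criteria only, equal weights), not the authors' linguistic variant: vector-normalize the decision matrix, locate the ideal and anti-ideal points, and rank alternatives by relative closeness to the ideal.

```python
from math import sqrt

def topsis(matrix):
    """Classical TOPSIS for benefit criteria with equal weights.
    `matrix[i][j]` is the score of alternative i on criterion j.
    Returns the relative-closeness coefficient of each alternative."""
    n_alt, n_crit = len(matrix), len(matrix[0])
    # Vector-normalize each criterion column.
    norms = [sqrt(sum(matrix[i][j] ** 2 for i in range(n_alt)))
             for j in range(n_crit)]
    r = [[matrix[i][j] / norms[j] for j in range(n_crit)]
         for i in range(n_alt)]
    # Ideal and anti-ideal points (all criteria treated as benefits).
    ideal = [max(r[i][j] for i in range(n_alt)) for j in range(n_crit)]
    anti = [min(r[i][j] for i in range(n_alt)) for j in range(n_crit)]
    closeness = []
    for row in r:
        d_plus = sqrt(sum((x - y) ** 2 for x, y in zip(row, ideal)))
        d_minus = sqrt(sum((x - y) ** 2 for x, y in zip(row, anti)))
        closeness.append(d_minus / (d_plus + d_minus))
    return closeness

# Higher closeness (always in [0, 1]) means a better alternative.
cc = topsis([[7.0, 9.0], [8.0, 7.0], [9.0, 6.0]])
```

In the linguistic modification, the crisp scores and the closeness coefficient are replaced by linguistic terms so that the output stays in the same vocabulary as the input.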

Information Fusion with the Power Average Operator

The power average provides an aggregation operator that allows argument values to support each other in the aggregation process. The properties of this operator are described. We see that it mixes some of the properties of the mode with those of the mean. Some formulations for the support function used in the power average are described. We extend this facility of empowerment to a wider class of mean operators, such as the OWA operator and the generalized mean.
Ronald R. Yager
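
A minimal sketch of the power average: PA(a) = Σ_i (1 + T(a_i)) a_i / Σ_i (1 + T(a_i)), where T(a_i) = Σ_{j≠i} Sup(a_i, a_j). The Gaussian support function below is one illustrative choice among the formulations the chapter describes:

```python
from math import exp

def power_average(values, alpha=1.0):
    """Yager's power average with an illustrative support function
    Sup(a, b) = exp(-alpha * (a - b)**2): nearby arguments support each
    other, so outliers receive relatively less weight."""
    def sup(a, b):
        return exp(-alpha * (a - b) ** 2)
    weights = [
        1.0 + sum(sup(a, b) for j, b in enumerate(values) if j != i)
        for i, a in enumerate(values)
    ]
    return sum(w * a for w, a in zip(weights, values)) / sum(weights)

# Identical arguments reproduce themselves, and an outlier is
# down-weighted relative to the plain arithmetic mean.
assert power_average([4.0, 4.0, 4.0]) == 4.0
cluster_with_outlier = [1.0, 1.1, 0.9, 10.0]
assert power_average(cluster_with_outlier) < sum(cluster_with_outlier) / 4
```

This behaviour is the "mode-like" side of the operator: mutually supporting values dominate, while with zero support (alpha → ∞ and all values distinct) it degrades to the ordinary mean.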

