
2015 | Book

Fifty Years of Fuzzy Logic and its Applications


About this book

This book presents a comprehensive report on the evolution of Fuzzy Logic since its formulation in Lotfi Zadeh’s seminal paper on “fuzzy sets,” published in 1965. In addition, it features a stimulating sampling from the broad field of research and development inspired by Zadeh’s paper. The chapters, written by pioneers and prominent scholars in the field, show how fuzzy sets have been successfully applied to artificial intelligence, control theory, inference, and reasoning. The book also reports on theoretical issues; features recent applications of Fuzzy Logic in the fields of neural networks, clustering, data mining and software testing; and highlights an important paradigm shift caused by Fuzzy Logic in the area of uncertainty management. Conceived by the editors as an academic celebration of the fiftieth anniversary of the 1965 paper, this work is a must-have for students and researchers who want an inspiring picture of the potentialities, limitations, achievements and accomplishments of Fuzzy Logic-based systems.

Table of Contents

Frontmatter
Toward a Restriction-Centered Theory of Truth and Meaning (RCT)

What is truth? The question does not admit a simple, precise answer. A dictionary-style definition is: The truth value of a proposition, p, is the degree to which the meaning of p is in agreement with factual information, F. A precise definition of truth will be formulated at a later point in this paper. The theory outlined in the following, call it RCT for short, is a departure from traditional theories of truth and meaning. In RCT, truth values are allowed to be described in natural language. Examples. Quite true, more or less true, almost true, largely true, possibly true, probably true, usually true, etc. Such truth values are referred to as linguistic truth values. Linguistic truth values are not allowed in traditional logical systems, but are routinely used by humans in everyday reasoning and everyday discourse. The centerpiece of RCT is a deceptively simple concept—the concept of a restriction. Informally, a restriction, R(X), on a variable, X, is an answer to a question of the form: What is the value of X? Possible answers: X = 10, X is between 3 and 20, X is much larger than 2, X is large, probably X is large, usually X is large, etc. In RCT, restrictions are preponderantly described in natural language. An example of a fairly complex description is: It is very unlikely that there will be a significant increase in the price of oil in the near future. The canonical form of a restriction, R(X), is X isr R, where X is the restricted variable, R is the restricting relation, and r is an indexical variable which defines the way in which R restricts X. X may be an n-ary variable and R may be an n-ary relation. The canonical form may be interpreted as a generalized assignment statement in which what is assigned to X is not a value of X, but a restriction on the values which X can take. A restriction, R(X), is a carrier of information about X. A restriction is precisiated if X, R and r are mathematically well defined. A key idea which underlies RCT is referred to as the meaning postulate, MP. MP postulates that the meaning of a proposition drawn from a natural language, p—or simply p—may be represented as a restriction,

$$ \text{p} \to \text{X isr R} $$

This expression is referred to as the canonical form of p, CF(p). Generally, the variables X, R and r are implicit in p. Simply stated, MP postulates that a proposition drawn from a natural language may be interpreted as an implicit assignment statement. MP plays an essential role in defining the meaning of, and computing with, propositions drawn from natural language. What should be underscored is that in RCT understanding of meaning is taken for granted. What really matters is not understanding of meaning but precisiation of meaning. Precisiation of meaning is a prerequisite to reasoning and computation with information described in natural language. Precisiation of meaning is a desideratum in robotics, mechanization of decision-making, legal reasoning, precisiated linguistic summarization with application to data mining, and other fields. It should be noted that most—but not all—propositions drawn from natural language are precisiable. In RCT, truth values form a hierarchy. First order (ground level) truth values are numerical, lying in the unit interval. Linguistic truth values are second order truth values and are restrictions on first order truth values. nth order truth values are restrictions on (n-1)th order truth values, etc. Another key idea is embodied in what is referred to as the truth postulate, TP. The truth postulate, TP, equates the truth value of p to the degree to which X satisfies R. This definition of truth value plays an essential role in RCT. A distinguishing feature of RCT is that in RCT a proposition, p, is associated with two distinct truth values—internal truth value and external truth value. The internal truth value relates to the meaning of p. The external truth value relates to the degree of agreement of p with factual information. To compute the degree to which X satisfies R, it is necessary to precisiate X, R and r. In RCT, what is used for this purpose is the concept of an explanatory database, ED. Informally, ED is a collection of relations which represent the information which is needed to precisiate X and R or, equivalently, to compute the truth value of p. Precisiated X, R and p are denoted as X*, R* and p*, respectively. X and R are precisiated by expressing them as functions of ED. The precisiated canonical form, CF*(p), is expressed as X* isr* R*. At this point, the numerical truth value of p, nt_p, may be computed as the degree to which X* satisfies R*. In RCT, the factual information, F, is assumed to be represented as a restriction on ED. The restriction on ED induces a restriction, t, on nt_p, which can be computed through the use of the extension principle. The computed restriction on nt_p is approximated by a linguistic truth value, lt_p. Precisiation of propositions drawn from natural language opens the door to construction of mathematical solutions of computational problems which are stated in natural language.
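To make the computation concrete, here is a minimal sketch in Python under purely illustrative assumptions: a hypothetical proposition p = “Vera is middle-aged” with X = Age(Vera), a trapezoidal membership function standing in for the precisiated R*, and an invented linguistic scale for approximating nt_p by lt_p.

```python
# A minimal sketch of RCT-style truth-value computation. The proposition,
# the membership function and the linguistic scale are all illustrative
# assumptions, not definitions taken from the chapter.

def middle_aged(age: float) -> float:
    """Trapezoidal membership for the restricting relation R* = middle-aged."""
    if 40 <= age <= 55:
        return 1.0
    if 30 < age < 40:
        return (age - 30) / 10
    if 55 < age < 65:
        return (65 - age) / 10
    return 0.0

def linguistic_truth(nt: float) -> str:
    """Approximate a numerical truth value nt_p by a linguistic truth value lt_p."""
    scale = [(0.95, "true"), (0.8, "almost true"), (0.6, "largely true"),
             (0.4, "more or less true"), (0.0, "not true")]
    return next(label for threshold, label in scale if nt >= threshold)

# Factual information F (a one-fact explanatory database ED):
age_of_vera = 43                    # X* as a function of ED
nt_p = middle_aged(age_of_vera)     # degree to which X* satisfies R*
print(nt_p, "->", linguistic_truth(nt_p))   # 1.0 -> true
```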

Lotfi A. Zadeh
Functional Solution of the Knowledge Level Control Problem: The Principles of Fuzzy Logic Rules and Linguistic Variables

This paper addresses the problem of imitating a teacher evaluating students’ levels of knowledge. It proposes the application of fuzzy logic to construct and manage a knowledge control system used for generating evaluation questions. The system contains a knowledge base with relevant information and a set of rules; the rules are built from the analysis of students’ answers and their reactions to questions. The algorithms governing the system allow for an automatic selection of sequences of appropriate and customized questions. The presented system for knowledge control and question generation is comparable in quality and efficiency to a real teacher’s questioning process.

Ali M. Abbasov, Shahnaz N. Shahbazova
Learning Systems with FUZZY

Fuzzy techniques have been proven to effectively tackle the problems of uncertainty in relationships among variables in systems that learn to adapt to a changing environment. This paper outlines our work over the last 25 years on designing learning systems with fuzzy techniques and their applications to many real-world problems. We then focus on the development of human-in-the-loop systems, such as a smart home or an assistive robotic environment, that involve different types of learning strategies. This warrants a full consideration of the learning mechanisms in humans that mediate action selection. We envisage that the principles of fuzzy theory, when combined with what we know about computational learning mechanisms in the human brain, will offer practical guidance on how to design learning systems that advance the user’s experience in real-world scenarios.

Sang Wan Lee, Z. Zenn Bien
Fuzzy Modifiers at the Core of Interpretable Fuzzy Systems

Fuzzy modifiers associated with linguistic hedges were introduced by L.A. Zadeh at the early stage of approximate reasoning, and they are fundamental elements in the management of interpretable systems. They can be regarded as a means of constructing fuzzy sets slightly different from the original ones. We first present the main definitions of modifiers based on mathematical transformations of membership functions, mainly focusing on so-called post-modifiers and pre-modifiers, as well as definitions based on fuzzy relations. We show that measures of similarity are useful to evaluate the proximity between the original fuzzy sets and their modified forms, and we point out links between modifiers and similarities. We then propose an overview of application domains which can take advantage of fuzzy modifiers, for instance analogy-based reasoning, rule-based systems, gradual systems, databases, machine learning, image processing, and description logic. It can be observed that fuzzy modifiers are either constructed a priori by means of formal definitions or automatically learnt or tuned, for instance in hybrid systems involving genetic algorithm-based methods.
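As a concrete illustration of post-modifiers, the following sketch implements Zadeh’s classical hedges: “very” as concentration (squaring the membership function) and “more or less” as dilation (taking its square root). The underlying fuzzy set “tall” is an invented example.

```python
# A minimal sketch of Zadeh's classical post-modifiers applied to an
# illustrative fuzzy set "tall".

import math

def tall(height_cm: float) -> float:
    """Membership of 'tall', rising linearly from 170 cm to 190 cm."""
    return min(1.0, max(0.0, (height_cm - 170) / 20))

def very(mu):                       # concentration: mu^2
    return lambda x: mu(x) ** 2

def more_or_less(mu):               # dilation: sqrt(mu)
    return lambda x: math.sqrt(mu(x))

h = 180.0
print(tall(h))                # 0.5
print(very(tall)(h))          # 0.25  (stricter set)
print(more_or_less(tall)(h))  # ~0.71 (looser set)
```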

Bernadette Bouchon-Meunier, Christophe Marsala
Human and Machine Intelligence — Between Fuzzy Logic and Daoist Thought

The theory of fuzziness, offering an important scientific approach in building intelligent machines, has been researched and developed in the past fifty years from various perspectives and applied for real world problem solving in many areas. Daoist thought, being one of the most influential schools of Chinese philosophy, has been studied for more than two thousand years and its wisdom exploited from generation to generation. Would a natural echo exist between the modern fuzzy thinking and the ancient oriental Daoist thought?

Liya Ding, Xiaogan Liu
Developing Fuzzy State Models as Markov Chain Models with Fuzzy Encoding

This paper examines the relationship and establishes the equivalence between a class of dynamic fuzzy models, called Fuzzy State Models (FSM), and recently introduced Markov Chain models with fuzzy encoding. The equivalence between the two models leads to a methodology for learning FSMs from data and a systematic way for model-based design of rule-based fuzzy controllers. The proposed approach is demonstrated on a case study of a vehicle adaptive cruise control system in which an FSM is identified from simulation data and a fuzzy feedback controller is generated by exploiting Stochastic Dynamic Programming (SDP).
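The paper’s exact construction is not reproduced here, but the following sketch conveys the flavor of fuzzy encoding under our own simplifying assumptions: a crisp reading is encoded as a normalized membership vector over fuzzy states and propagated with a Markov-style transition matrix. The fuzzy partition and the matrix are invented.

```python
# A minimal, assumption-laden sketch of fuzzy encoding plus Markov-style
# propagation; states, membership functions and matrix are illustrative.

import numpy as np

def encode(speed: float) -> np.ndarray:
    """Fuzzy encoding of a crisp speed into states (slow, medium, fast)."""
    slow   = max(0.0, min(1.0, (40 - speed) / 20))
    fast   = max(0.0, min(1.0, (speed - 60) / 20))
    medium = max(0.0, 1.0 - slow - fast)
    v = np.array([slow, medium, fast])
    return v / v.sum()              # normalize so the encoding sums to 1

T = np.array([[0.8, 0.2, 0.0],      # illustrative state-transition matrix
              [0.1, 0.8, 0.1],
              [0.0, 0.3, 0.7]])

state = encode(35.0)                # mostly "medium", a little "slow"
next_state = state @ T              # one-step prediction of the fuzzy state
print(state, next_state)
```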

Dimitar Filev, Ilya Kolmanovsky, Ronald Yager
Incremental Granular Fuzzy Modeling Using Imprecise Data Streams

System modeling in dynamic environments requires processing of streams of sensor data and incremental learning algorithms. This paper suggests an incremental granular fuzzy rule-based modeling approach using streams of fuzzy interval data. Incremental granular modeling is an adaptive modeling framework that uses fuzzy granular data that originate from unreliable sensors, imprecise perceptions, or descriptions of imprecise values of a variable in the form of fuzzy intervals. The incremental learning algorithm builds the antecedent of functional fuzzy rules and the rule base of the fuzzy model. A recursive least squares algorithm revises the parameters of a state-space representation of the fuzzy rule consequents. Imprecision in data is accounted for using specificity measures. An illustrative example concerning the Rössler attractor is given.
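The consequent-update step can be illustrated with a generic recursive least squares (RLS) routine; the notation, forgetting factor, and toy data below are generic textbook choices, not the paper’s exact formulation.

```python
# A minimal sketch of recursive least squares with exponential forgetting,
# the kind of update used to revise consequent parameters from a stream.

import numpy as np

def rls_update(theta, P, x, y, lam=0.98):
    """One RLS step: theta = parameters, P = covariance, (x, y) = new sample."""
    x = x.reshape(-1, 1)
    k = P @ x / (lam + x.T @ P @ x)          # gain vector
    err = y - float(x.T @ theta)             # prediction error on the new sample
    theta = theta + k * err                  # parameter correction
    P = (P - k @ x.T @ P) / lam              # covariance update with forgetting
    return theta, P

theta = np.zeros((2, 1))
P = 1000.0 * np.eye(2)
for x1, y in [(1.0, 3.1), (2.0, 5.0), (3.0, 7.2)]:   # y is roughly 2*x1 + 1
    theta, P = rls_update(theta, P, np.array([x1, 1.0]), y)
print(theta.ravel())   # approaches [2, 1]
```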

Daniel Leite, Fernando Gomide
Fuzzy Measures and Integrals: Recent Developments

This paper surveys the basic notions and most important results around fuzzy measures and integrals, as proposed independently by Choquet and Sugeno, as well as recent developments. The latter include bases and transforms on set functions, fuzzy measures on set systems, the notion of horizontal additivity, basic Choquet calculus on the nonnegative real line introduced by Sugeno, the extension of the Choquet integral for nonmeasurable functions, and the notion of universal integral.
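For readers unfamiliar with the Choquet integral, the following sketch computes the discrete Choquet integral of a function with respect to a fuzzy measure; the measure values are illustrative.

```python
# A minimal sketch of the discrete Choquet integral of f: {1..n} -> R+
# with respect to a fuzzy measure mu given on subsets (dict keyed by
# frozenset). Measure values below are illustrative.

def choquet(f, mu):
    """Choquet integral: sum of (f(i) - previous value) * mu(criteria with f >= f(i))."""
    items = sorted(f, key=f.get)                 # sort criteria by value, ascending
    total, prev = 0.0, 0.0
    for k, i in enumerate(items):
        a_k = frozenset(items[k:])               # criteria whose value is >= f(i)
        total += (f[i] - prev) * mu[a_k]
        prev = f[i]
    return total

f = {"math": 0.6, "physics": 0.9}                # scores per criterion
mu = {frozenset({"math", "physics"}): 1.0,       # a monotone fuzzy measure
      frozenset({"math"}): 0.5,
      frozenset({"physics"}): 0.4,
      frozenset(): 0.0}
print(choquet(f, mu))   # 0.6*1.0 + (0.9-0.6)*0.4 = 0.72
```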

Michel Grabisch
Important New Terms and Classifications in Uncertainty and Fuzzy Logic

Human cognitive and perception processes have a great tolerance for imprecision or uncertainty. For this reason, the notions of perception and cognition have great importance in solving many decision making problems in engineering, medicine, science, and social science as there are innumerable uncertainties in real-world phenomena. These uncertainties can be broadly classified as either type one uncertainty arising from the random behavior of physical processes or type two uncertainty arising from human perception and cognition processes. Statistical theory can be used to model the former, but lacks the sophistication to process the latter. The theory of fuzzy logic has proven to be very effective in processing type two uncertainty. New computing methods based on fuzzy logic can lead to greater adaptability, tractability, robustness, a lower cost solution, and better rapport with reality in the development of intelligent systems. Fuzzy logic is needed to properly pose and answer queries about quantitatively defining imprecise linguistic terms like middle class, poor, low inflation, medium inflation, and high inflation. Imprecise terms like these in natural languages should be considered to have qualitative definitions, quantitative definitions, crisp quantitative definitions, fuzzy quantitative definitions, type-one fuzzy quantitative definitions, and interval type-two fuzzy quantitative definitions. There can be crisp queries, crisp answers, type-one fuzzy queries, type-one fuzzy answers, interval type-two fuzzy queries, and interval type-two fuzzy answers.
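The distinction between a crisp and a type-one fuzzy quantitative definition can be made concrete with a small sketch; the income figures below are invented for illustration.

```python
# A minimal sketch of a crisp versus a type-one fuzzy quantitative
# definition of the linguistic term "middle class" (annual income, USD).
# All thresholds are illustrative assumptions.

def middle_class_crisp(income: float) -> bool:
    """Crisp definition: a hard interval with abrupt boundaries."""
    return 40_000 <= income <= 120_000

def middle_class_fuzzy(income: float) -> float:
    """Type-one fuzzy definition: trapezoidal membership, gradual boundaries."""
    if 40_000 <= income <= 120_000:
        return 1.0
    if 25_000 < income < 40_000:
        return (income - 25_000) / 15_000
    if 120_000 < income < 150_000:
        return (150_000 - income) / 30_000
    return 0.0

# A crisp query gets a crisp answer; a fuzzy query gets a graded answer:
print(middle_class_crisp(39_000))   # False  (just under the hard boundary)
print(middle_class_fuzzy(39_000))   # ~0.93  (almost fully middle class)
```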

Madan M. Gupta, Ashu M. G. Solo
Formalization and Visualization of Kansei Information Based on Fuzzy Set Approach

Kansei or affective-computing related information is easy to express in terms of fuzzy sets. Three examples of Kansei information, e.g., emotion, atmosphere, and Kansei texture, are formalized by using the fuzzy set concept on the [-1,1]^3 space. They are also visualized by using shape-brightness-size, shape-color-size, and contour-shape-gradation models, respectively. Their applications to agent-to-agent communication, multiagent communication, and online shopping are also introduced.

Fangyan Dong, Kaoru Hirota
Cognitive Informatics: A Proper Framework for the Use of Fuzzy Dynamic Programming for the Modeling of Regional Development?

We advocate Wang’s cognitive informatics as a potentially powerful general approach and paradigm to formulate, analyze and solve human centric systems modeling, decision and control problems. We show the use of fuzzy dynamic programming for solving a regional development problem in which many crucial aspects, in particular life quality indicators, are subject to objective and subjective human judgments and evaluations, which are closely related to human perceptions and cognitive abilities. We consider how a best (optimal) investment policy can be obtained under different development scenarios.
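The fuzzy-decision machinery underlying fuzzy dynamic programming can be illustrated with the classical Bellman-Zadeh formulation, in which the fuzzy decision is the intersection of fuzzy goals and constraints; the alternatives and membership values below are invented.

```python
# A minimal sketch of the Bellman-Zadeh fuzzy decision: D = G AND C,
# evaluated as the pointwise minimum, maximized over the alternatives.
# Alternatives and membership values are illustrative.

alternatives = ["invest in roads", "invest in schools", "invest in housing"]
goal       = {"invest in roads": 0.6, "invest in schools": 0.9, "invest in housing": 0.7}
constraint = {"invest in roads": 0.8, "invest in schools": 0.5, "invest in housing": 0.7}

decision = {a: min(goal[a], constraint[a]) for a in alternatives}
best = max(decision, key=decision.get)
print(decision, "->", best)   # housing wins with membership 0.7
```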

Janusz Kacprzyk
On Discord Between Expected and Actual Developments in Applications of Fuzzy Logic During Its First Fifty Years

Developments of applications of fuzzy logic during the first fifty years of its existence are examined in this paper with the aim of comparing the actual developments with the expected ones in various areas of human affairs. It is shown that in many of the examined areas the actual developments turned out to be very different from the expected ones. In each area, an attempt is made to explain reasons for this surprising discord between reasonable expectations and the actual developments.

George J. Klir
Meta-Heuristic Optimization of a Fuzzy Character Recognizer

Meta-heuristic algorithms are well researched and widely used in optimization problems. There are several meta-heuristic optimization algorithms with various concepts, and each has its own advantages and disadvantages. Still, it is difficult to decide which method fits a given problem best. In this study, the optimization of the fuzzy rule base of a classifier, more specifically a fuzzy character recognizer, is used as the reference problem, and the aim of the research is to investigate the behavior of selected meta-heuristic optimization techniques in order to develop a multi meta-heuristic algorithm.

Alex Tormási, László T. Kóczy
Additive Fuzzy Systems as Generalized Probability Mixture Models

Additive fuzzy systems generalize the popular mixture-density models of machine learning. Additive fuzzy systems map inputs to outputs by summing fired then-part sets and then taking the centroid of the sum. This additive structure produces a simple convex structure: outputs are convex combinations of the centroids of the fired then-part sets. Additive systems are uniform function approximators and admit simple learning laws that grow and tune rules from sample data. They also behave as conditional expectations, with conditional variances and other higher moments that describe their uncertainty. But they suffer from exponential rule explosion in high dimensions. Extending finite-rule additive systems to fuzzy systems with continuum-many rules overcomes the problem of rule explosion if a higher-level mixture structure acts as a system of tunable meta-rules. Monte Carlo sampling can then compute fuzzy-system outputs.
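The convex-combination structure can be shown directly; the following sketch implements a generic standard additive model with two invented rules.

```python
# A minimal sketch of a standard additive model (SAM): the output is a
# convex combination of then-part centroids, weighted by fired if-part
# values and then-part volumes. The two rules are illustrative.

def sam(x, rules):
    """rules: list of (if_part a_j, then-part centroid c_j, volume V_j)."""
    num = den = 0.0
    for a, c, v in rules:
        num += a(x) * v * c
        den += a(x) * v
    return num / den if den else 0.0

def tri(l, m, r):
    """Triangular if-part membership function with feet l, r and peak m."""
    return lambda x: max(0.0, min((x - l) / (m - l), (r - x) / (r - m)))

rules = [(tri(0, 2, 4), 1.0, 1.0),    # if x is "low"  then y centers at 1
         (tri(2, 4, 6), 3.0, 1.0)]    # if x is "high" then y centers at 3
print(sam(3.0, rules))   # 2.0: equal firing, so the midpoint of the centroids
```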

Bart Kosko
Fuzzy Information Retrieval Systems: A Historical Perspective

Fuzzy set theory has been applied to information retrieval, specifically to extend Boolean models. This includes fuzzy indexing procedures defined to represent the varying significance of terms in synthesizing the documents’ contents, the definition of query languages to allow the expression of soft selection conditions, and associative retrieval mechanisms to model fuzzy pseudo-thesauri, fuzzy ontologies, and fuzzy categorizations of documents.
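A minimal sketch of the fuzzy Boolean retrieval idea, with an invented two-document index: term weights in [0,1] express indexing significance, and AND/OR are evaluated with min/max.

```python
# A minimal sketch of fuzzy Boolean retrieval over a tiny illustrative index.

index = {                       # document -> {term: significance weight}
    "d1": {"fuzzy": 0.9, "logic": 0.7, "control": 0.1},
    "d2": {"fuzzy": 0.4, "logic": 0.2, "control": 0.8},
}

def weight(doc, term):
    return index[doc].get(term, 0.0)

def rsv(doc):
    """Retrieval status value for the query: fuzzy AND (logic OR control)."""
    return min(weight(doc, "fuzzy"),
               max(weight(doc, "logic"), weight(doc, "control")))

ranking = sorted(index, key=rsv, reverse=True)
print([(d, rsv(d)) for d in ranking])   # d1 scores min(0.9, 0.7) = 0.7; d2 scores 0.4
```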

Donald H. Kraft, Erin Colvin, Gloria Bordogna, Gabriella Pasi
Is the World Itself Fuzzy? Physical Arguments and Unexpected Computational Consequences of Zadeh’s Vision

Fuzzy methodology was invented to describe imprecise (“fuzzy”) human statements about the world, statements that use imprecise words from natural language like “small” or “large”. Usual applications of fuzzy techniques assume that the world itself is “crisp”, that there are exact equations describing the world, and that the fuzziness of our statements is caused by the incompleteness of our knowledge. But what if the world itself is fuzzy? What if there is no perfect system of equations describing the physical world, in the sense that no matter what system of equations we try, there will always be cases when this system leads to wrong predictions? This is not just a speculation: the idea is actually supported by many physicists. At first glance, this is a pessimistic idea: no matter how much we try, we will never be able to find the Ultimate Theory of Everything. But it turns out that this idea also has its optimistic aspects: namely, in this chapter, we show, somewhat unexpectedly, that if such a no-perfect-theory principle is true, then the use of physical data can drastically enhance computations.

Vladik Kreinovich, Olga Kosheleva
Handling Noise and Outliers in Fuzzy Clustering

Since it is an unsupervised data analysis approach, clustering relies solely on the location of the data points in the data space or, alternatively, on their relative distances or similarities. As a consequence, clustering can suffer from the presence of noisy data points and outliers, which can obscure the structure of the clusters in the data and thus may drive clustering algorithms to yield suboptimal or even misleading results. Fuzzy clustering is no exception in this respect, although it features an aspect of robustness, due to which outliers and generally data points that are atypical for the clusters in the data have a lesser influence on the cluster parameters. Starting from this aspect, we provide in this paper an overview of different approaches with which fuzzy clustering can be made less sensitive to noise and outliers and categorize them according to the component of standard fuzzy clustering they modify.
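One representative approach from this family is Dave’s noise cluster, sketched below: a virtual prototype sits at a fixed distance delta from every data point, so points far from all real prototypes shift their membership mass to the noise cluster. Data and parameters are illustrative.

```python
# A minimal sketch of fuzzy c-means memberships extended with a noise
# cluster at constant distance delta (Dave's noise-clustering idea).

import numpy as np

def noisy_fcm_memberships(X, centers, delta=2.0, m=2.0):
    """Memberships to k real clusters plus one noise cluster (last column)."""
    d = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2)  # (n, k)
    d = np.hstack([d, np.full((len(X), 1), delta)])                  # noise distance
    d = np.fmax(d, 1e-12)                                            # avoid division by 0
    inv = d ** (-2.0 / (m - 1.0))
    return inv / inv.sum(axis=1, keepdims=True)

X = np.array([[0.0, 0.0], [0.1, 0.0], [5.0, 5.0], [50.0, 50.0]])  # last point = outlier
centers = np.array([[0.0, 0.0], [5.0, 5.0]])
U = noisy_fcm_memberships(X, centers)
print(U.round(2))   # the outlier's membership mass goes to the noise column
```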

Christian Borgelt, Christian Braune, Marie-Jeanne Lesot, Rudolf Kruse
A Fuzzy-Based Approach to Survival Data Mining

Traditional data mining algorithms assume that all data on a given object becomes available simultaneously (e.g., by accessing the object record in a database). However, certain real-world applications, known as survival analysis, or event history analysis (EHA), deal with monitoring specific objects, such as medical patients, in the course of their lifetime. The data streams produced by such applications contain various events related to the monitored objects. When we observe an infinite stream of events, at each point in time (the “cut-off point”), some of the monitored entities are “right-censored”, since they have not experienced the event of interest yet and we do not know when the event will occur in the future. In snapshot monitoring, the data stream is observed as a sequence of periodic snapshots. Given each snapshot, we are interested in estimating the probability of a critical event (e.g., patient death or equipment failure) as a function of time for every monitored object. In this research, we use fuzzy class label adjustment so that standard classification algorithms can seamlessly handle a snapshot stream of both censored and non-censored data. The objective is to provide reasonably accurate predictions after observing relatively few snapshots of the data stream and to improve the classification performance with additional information obtained from each incoming snapshot. The proposed fuzzy-based methodology is evaluated on real-world snapshot streams from two different domains of survival analysis.

Mark Last, Hezi Halpert
Knowledge Extraction from Support Vector Machines: A Fuzzy Logic Approach

Support vector machines (SVMs) proved to be highly efficient computational tools in various classification tasks. However, SVMs are nonlinear classifiers and the knowledge learned by an SVM is encoded in a long list of parameter values, making it difficult to comprehend what the SVM is actually computing. We show that certain types of SVMs are mathematically equivalent to a specific fuzzy rule base, called the fuzzy all-permutations rule base (FARB). The equivalent FARB provides a symbolic representation of the SVM functioning. This leads to a new approach for knowledge extraction from SVMs. An important advantage of this approach is that the number of extracted fuzzy rules depends on the number of support vectors in the SVM. Several simple examples demonstrate the effectiveness of this approach.

Shahaf Duenyas, Michael Margaliot
On Type-Reduction Versus Direct Defuzzification for Type-2 Fuzzy Logic Systems

This chapter examines type-reduction and direct defuzzification for interval type-2 fuzzy logic systems. It provides critiques of type-reduction as an end in itself as well as of direct defuzzification, and concludes that: (1) a good way to categorize type-reduction/direct defuzzification algorithm papers is as papers that focus either on algorithms that lead to a type-reduced set or on algorithms that lead directly to a defuzzified value; (2) research on type-reduction as an end in itself has led to results that are arguably of very little value; and (3) the practice of base-lining an IT2 FLS that uses direct defuzzification against one that uses type-reduction followed by defuzzification is unnecessary.
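The two routes can be contrasted on a toy sampled output set. In the sketch below, the type-reduction branch is a crude stand-in (centroids of the lower and upper membership functions, then averaging) rather than a full Karnik-Mendel computation, while the direct branch follows the Nie-Tan idea of defuzzifying the average membership function.

```python
# A minimal sketch contrasting a stand-in for type-reduction with
# Nie-Tan style direct defuzzification; the sampled IT2 set is illustrative.

import numpy as np

x     = np.linspace(0.0, 10.0, 101)              # sampled output domain
lower = np.exp(-((x - 5.0) ** 2) / 2.0) * 0.6    # lower membership function
upper = np.exp(-((x - 5.0) ** 2) / 8.0)          # upper membership function

def centroid(mu):
    return float((x * mu).sum() / mu.sum())

# (1) crude stand-in for a type-reduced interval, then its midpoint:
c_l, c_u = centroid(lower), centroid(upper)
y_tr = 0.5 * (c_l + c_u)

# (2) direct defuzzification of the average of the two membership functions:
y_nt = centroid(0.5 * (lower + upper))

print(y_tr, y_nt)   # both 5.0 here, by symmetry of the illustrative set
```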

Jerry M. Mendel
On Fuzzy Theory for Econometrics

This paper aims mainly at informing statisticians and econometricians of relevant concepts and methods in fuzzy theory that are useful in addressing economic problems. We emphasize three recent significant contributions of fuzzy theory to economics, namely fuzzy games for capital risk allocations, fuzzy rule bases and compositional rule of inference for causal inference, and a statistical setting for fuzzy data based on continuous lattices.

Hung T. Nguyen, Songsak Sriboonchitta
On Z-numbers and the Machine-Mind for Natural Language Comprehension

This article is centred on two themes. The first is the extension of Zadeh’s basic Z-numbers into a tool for level-2 Computing With Words (CWW) and consequently subjective natural language understanding. We describe an algorithm and new operators (leading to complex or spectral Z-numbers), use them to simulate differential diagnosis, and highlight the inherent strengths and challenges of the Z-numbers. The second theme deals with the design of a natural-language-comprehending machine-mind architecture based on Minsky’s Society of Mind. We enumerate its macro-components (function modules and memory units) and illustrate its working mechanism through simulation of metaphor understanding, validating system outputs against human-comprehension responses. The framework uses the aforementioned new Z-number paradigm to precisiate knowledge-frames. The completeness of the conceptualized architecture is analyzed through its coverage of mind-layers (Minsky) and cerebral cortex regions. The research described here draws from multiple disciplines and seeks to contribute to cognitive-systems design initiatives for man-machine symbiosis.

Romi Banerjee, Sankar K. Pal
Evolutionary Reduction of Fuzzy Rule-Based Models

In the design of fuzzy rule-based models we strive to develop models that are both accurate and interpretable (transparent). The approach proposed here is aimed at enhancing the transparency of a fuzzy model already constructed with the accuracy criterion in mind, by proposing two modifications to the rules. First, we introduce a mechanism of reduction of the input space by eliminating some less essential input variables. This results in rules with reduced subspaces of input variables, making the rules more transparent. The second approach is concerned with an isolation of input variables: fuzzy sets defined in the n-dimensional input space and forming the condition part of the rules are subject to a decomposition process in which some variables are isolated and interpreted separately. The reduced dimensionality of the input subspaces in the first approach and the number of isolated input variables in the second one are the essential parameters controlling the impact of enhanced transparency on the accuracy of the obtained fuzzy model. The two problems identified above are of combinatorial character, and the optimization tasks emerging there are handled with the use of Genetic Algorithms (GAs). A series of numeric experiments is reported in which we demonstrate the effectiveness of the two approaches and quantify the relationships between the criteria of accuracy and interpretability.

Witold Pedrycz, Kuwen Li, Marek Reformat
Geospatial Uncertainty Representation: Fuzzy and Rough Set Approaches

Uncertainty in geospatial data is often considered in the context of geographical information systems, which enable a variety of operations on and manipulations of spatial data. Here we consider how both fuzzy set and rough set theory have been used to represent geospatial data with uncertainty. Terrain modeling and triangulated irregular network techniques utilizing fuzzy sets are presented. Rough set theory is overviewed and its application to spatial data is described. Issues of uncertainty in the representation of spatial relationships, such as topological and directional relationships, are discussed.

Frederick Petry, Paul Elmore
How to Efficiently Diagnose and Repair Fuzzy Database Queries that Fail

Telling the user that there is no result for his/her query is poorly informative and corresponds to the kind of situation cooperative systems try to avoid. Cooperative systems should rather explain the reason(s) for the failure, materialized by Minimal Failing Subqueries (MFS), and build alternative succeeding queries, called maXimal Succeeding Subqueries (XSS), that are as close as possible to the original query. In the particular context of fuzzy querying, we propose an efficient unified approach to the computation of gradual MFSs and XSSs that relies on a fuzzy-cardinality-based summary of the relevant part of the database.
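A brute-force picture of MFS computation, on a crisp conjunctive query for brevity: enumerate subqueries, keep the failing ones, and retain those with no failing proper subset. The paper’s fuzzy-cardinality-based summary is precisely what replaces this naive scan.

```python
# A minimal brute-force sketch of Minimal Failing Subqueries on a toy
# relation; conditions are crisp here for brevity, and all data is invented.

from itertools import combinations

rows = [{"price": 12, "year": 2012}, {"price": 30, "year": 2001}]
query = {"cheap": lambda r: r["price"] < 15,
         "recent": lambda r: r["year"] > 2010,
         "powerful": lambda r: r.get("hp", 0) > 200}

def fails(conds):
    """A subquery fails when no row satisfies all of its conditions."""
    return not any(all(query[c](r) for c in conds) for r in rows)

failing = [s for k in range(1, len(query) + 1)
           for s in combinations(sorted(query), k) if fails(s)]
mfs = [s for s in failing
       if not any(set(t) < set(s) for t in failing)]
print(mfs)   # [('powerful',)]: the single condition no row satisfies
```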

Olivier Pivert, Grégory Smits
The Web, Similarity, and Fuzziness

The Internet is perceived as a source of multiple types of information, a large shopping mall, a social forum and an entertainment hub. Users constantly browse and search the web in order to find things of interest. Keyword-based search becomes less and less efficient in the case of more refined searches, where details of items become important. The introduction of the Resource Description Framework (RDF) as a relation-based format for data representation allows us to propose a different way of performing a relevancy-based search. The approach proposed is based on representation of items as sets of features. This means that evaluation of items’ relevance is based on a feature-based comparison. A more realistic relevancy determination can be achieved via categorization and ranking of features. The calculated relevancy measures for individual categories of features are aggregated using a fuzzy-based approach.
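The feature-based, category-weighted relevance idea can be sketched as follows; the Jaccard-style per-category similarity and the catalog entries are our own illustrative choices, not the paper’s exact measures.

```python
# A minimal sketch of feature-based relevance with category weighting;
# similarity measure, categories, weights and items are illustrative.

def jaccard(a: set, b: set) -> float:
    return len(a & b) / len(a | b) if a | b else 0.0

def relevance(query, item, weights):
    """Weighted aggregation of per-category feature similarities."""
    total = sum(weights.values())
    return sum(w * jaccard(query.get(c, set()), item.get(c, set()))
               for c, w in weights.items()) / total

query = {"brand": {"acme"}, "specs": {"wifi", "gps", "4k"}}
item  = {"brand": {"acme"}, "specs": {"wifi", "gps"}}
weights = {"brand": 1.0, "specs": 2.0}   # specs matter twice as much
print(relevance(query, item, weights))   # (1*1.0 + 2*(2/3)) / 3, about 0.78
```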

Parisa D. Hossein Zadeh, Marek Z. Reformat
The Genesis of Fuzzy Sets and Systems – Aspects in Science and Philosophy

In 1965 Lotfi A. Zadeh founded the theory of Fuzzy Sets and Systems. This chapter deals with developments in the history of philosophy, logic, and mathematics during the time before and up to the beginning of fuzzy logic, and it also gives a view of its first application in control theory. Regarding the term “fuzzy”, we note that older concepts of “vagueness” and “haziness” had previously been discussed in philosophy, logic, and mathematics. This chapter delineates some specific paths through the history of the use of these “loose concepts”. Haziness and fuzziness were concepts of interest in mathematics and philosophy during the second half of the 20th century. The logico-philosophical history presented here covers the work of Russell, Black, Hertz, Wittgenstein and others. The mathematical-technical history deals with the theories founded by Menger and Zadeh. Menger’s concepts of probabilistic metrics, hazy sets (ensembles flous) and micro-geometry as well as Zadeh’s theory of Fuzzy Sets paved the way for the establishment of Soft Computing methods. In the first decade of Fuzzy Sets and Systems, nobody thought that this theory would be successful in the field of applied sciences and technology. Zadeh expected that his theory would have a role in the future of computer systems as well as in the Humanities and Social Sciences. When Mamdani and Assilian picked up the idea of Fuzzy Algorithms to establish the first Fuzzy Control system for a small steam engine, this was the kick-off for the “Fuzzy Boom”, and Zadeh’s primary intention trailed away for years. Then, in the new millennium, a new movement for Fuzzy Sets in the Social Sciences and Humanities was launched.

Rudolf Seising
Fuzzy Logic in Speech Technology - Introductory and Overviewing Glimpses

The chapter critically reviews several applications of fuzzy logic and fuzzy systems in speech technology, along the main directions of the field: speech synthesis, speech recognition, and speech analysis. A brief incursion into the use of mixed techniques, combining fuzzy logic, fuzzy classifiers and nonlinear dynamics, is included. A rich list of references complements the chapter.

Horia-Nicolai Teodorescu
Fuzzy Sets: Towards the Scientific Domestication of Imprecision

This paper considers fuzzy sets just as science tries to domesticate concepts once abstracted from reality: identifying them with quantities, using these quantities to build mathematical models, escaping from a purely formal logic setting, and testing the models against reality before provisionally accepting them. Its aim is to move towards a new experimental science of ‘the imprecise’, towards something like a ‘physics of linguistic imprecision’. It contains a way of looking at fuzzy sets that, if continued, could offer a new perspective on linguistic imprecision, and whose possible value lies in the idea that several forms of theorizing can always be better than a single one.

Enric Trillas
Type 1 and Full Type 2 Fuzzy System Models

We first present a brief review of the essential fuzzy system models, namely: (1) Zadeh’s rule base model, (2) Takagi and Sugeno’s model, which is partly a rule base and partly a regression function, and (3) Türkşen’s fuzzy regression functions, where a fuzzy regression function corresponds to each fuzzy rule. Next we review the well-known FCM algorithm, which lets one extract Type 1 membership values from a given data set for the development of Type 1 fuzzy system models as a foundation for the development of Full Type 2 fuzzy system models. For this purpose, we provide an algorithm which lets one generate Full Type 2 membership value distributions for the development of second order fuzzy system models with our proposed second order data analysis. If required, one can generate Full Type 3, …, Full Type n fuzzy system models by iterative execution of our algorithm. We present our application results graphically for TD_Stockprice data with respect to two validity indices, namely: (1) the Çelikyılmaz-Türkşen and (2) the Bezdek indices.
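The FCM membership-extraction step the chapter builds on can be sketched in a few lines; data and parameters below are illustrative.

```python
# A minimal sketch of one FCM iteration: membership extraction
# u_ik = 1 / sum_j (d_ik / d_jk)^(2/(m-1)), then the weighted-center update.

import numpy as np

def fcm_step(X, centers, m=2.0):
    d = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2)
    d = np.fmax(d, 1e-12)                             # avoid division by zero
    inv = d ** (-2.0 / (m - 1.0))
    U = inv / inv.sum(axis=1, keepdims=True)          # Type 1 memberships
    W = U ** m
    centers = (W.T @ X) / W.sum(axis=0)[:, None]      # new cluster centers
    return U, centers

X = np.array([[0.0], [0.2], [3.8], [4.0]])
centers = np.array([[0.5], [3.5]])
for _ in range(10):
    U, centers = fcm_step(X, centers)
print(U.round(2), centers.ravel())
```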

I. Burhan Türkşen
Complex Fuzzy Sets and Complex Fuzzy Logic: An Overview of Theory and Applications

Fuzzy Logic, introduced by Zadeh along with his introduction of fuzzy sets, is a continuous multi-valued logic system. Hence, it is a generalization of classical logic and of classical discrete multi-valued logic (e.g. Łukasiewicz’s three/many-valued logic). Throughout the years, Zadeh and other researchers have introduced extensions to the theory of fuzzy sets and fuzzy logic. Notable extensions include linguistic variables, type-2 fuzzy sets, complex fuzzy numbers, and Z-numbers. Another important extension to the theory, namely the concepts of complex fuzzy logic and complex fuzzy sets, has been investigated by Kandel et al. This extension provides the basis for control and inference systems relating to complex phenomena that cannot be readily formalized via type-1 or type-2 fuzzy sets. Hence, in recent years, several researchers have used the new formalism, often in the context of hybrid neuro-fuzzy systems, to develop advanced complex fuzzy logic-based inference applications. In this chapter we reintroduce the concepts of complex fuzzy sets and complex fuzzy logic and survey the current state of complex fuzzy logic, complex fuzzy set theory, and related applications.
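The basic object of the theory, a complex membership grade in the style of Ramot et al., can be sketched as follows; the intersection rule shown is one common convention, as phase handling varies across the literature.

```python
# A minimal sketch of complex fuzzy membership grades
# mu(x) = r(x) * exp(j * omega(x)), with amplitude r in [0, 1].

import cmath

def grade(r: float, omega: float) -> complex:
    """Complex membership grade with amplitude r and phase omega."""
    assert 0.0 <= r <= 1.0
    return r * cmath.exp(1j * omega)

def intersect(g1: complex, g2: complex) -> complex:
    """Amplitude-min intersection; phase handling varies across the literature."""
    return g1 if abs(g1) <= abs(g2) else g2

g1 = grade(0.8, cmath.pi / 4)
g2 = grade(0.5, cmath.pi / 2)
g = intersect(g1, g2)
print(abs(g), cmath.phase(g))   # amplitude 0.5, phase pi/2
```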

Dan E. Tamir, Naphtali D. Rishe, Abraham Kandel
Backmatter
Metadata
Title
Fifty Years of Fuzzy Logic and its Applications
edited by
Dan E. Tamir
Naphtali D. Rishe
Abraham Kandel
Copyright year
2015
Electronic ISBN
978-3-319-19683-1
Print ISBN
978-3-319-19682-4
DOI
https://doi.org/10.1007/978-3-319-19683-1