2003 | Book

Entropy Measures, Maximum Entropy Principle and Emerging Applications

Edited by: Professor Karmeshu

Publisher: Springer Berlin Heidelberg

Book series: Studies in Fuzziness and Soft Computing


About this book

The last two decades have witnessed an enormous growth with regard to applications of the information theoretic framework in areas of the physical, biological, engineering and even social sciences. In particular, growth has been spectacular in the fields of information technology, soft computing, nonlinear systems and molecular biology. Claude Shannon in 1948 laid the foundation of the field of information theory in the context of communication theory. It is indeed remarkable that his framework is as relevant today as it was when he proposed it. Shannon died on Feb 24, 2001. Arun Netravali observes: "As if assuming that inexpensive, high-speed processing would come to pass, Shannon figured out the upper limits on communication rates. First in telephone channels, then in optical communications, and now in wireless, Shannon has had the utmost value in defining the engineering limits we face." Shannon introduced the concept of entropy. The notable feature of the entropy framework is that it enables quantification of the uncertainty present in a system. In many realistic situations one is confronted with only partial or incomplete information, in the form of moments or bounds on their values; it is then required to construct a probabilistic model from this partial information. In such situations, the principle of maximum entropy provides a rational basis for constructing a probabilistic model. It is thus necessary and important to keep track of advances in the applications of the maximum entropy principle to ever expanding areas of knowledge.
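The construction alluded to above can be made concrete in a few lines. A minimal sketch, assuming NumPy and SciPy: given only a mean value on a finite support, the maximum entropy model is an exponential tilt whose Lagrange multiplier is found by minimizing the convex dual (the support and target mean below are illustrative, not taken from the book).

```python
import numpy as np
from scipy.optimize import minimize_scalar

x = np.arange(10)            # support of the unknown distribution (assumed)
target_mean = 2.5            # the only piece of partial information (assumed)

def dual(lam):
    # Convex dual: log-partition function plus lam * target_mean.
    # Its minimizer is the multiplier of the max-ent solution.
    return np.log(np.sum(np.exp(-lam * x))) + lam * target_mean

lam = minimize_scalar(dual, bounds=(-20, 20), method="bounded").x
p = np.exp(-lam * x)
p /= p.sum()                 # maximum entropy model: p_i ~ exp(-lam * x_i)

print("mean check:", p @ x)                      # ~ 2.5
print("entropy:", -(p * np.log(p)).sum())
```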

Table of contents

Frontmatter
1. Uncertainty, Entropy and Maximum Entropy Principle — An Overview
Abstract
This introductory chapter aims to spell out the basics of entropy and maximum entropy frameworks and their applications in some fields. Applications are selected so as to avoid possible overlaps with the ones contained in this volume. Though the contents of this book are largely concerned with the uncertainty of random phenomena, a brief overview of uncertainty manifesting in various other forms is also given.
Karmeshu, N. R. Pal
2. Facets of Generalized Uncertainty-based Information
Abstract
This chapter reviews current research on the broad concepts of uncertainty and of uncertainty-based information, understood as information obtained by uncertainty reduction. A formalization of these concepts is conceived, in general, within a broad framework consisting of fuzzy set theory and fuzzy measure theory. This framework is sufficiently general to capture virtually all conceivable types of uncertainty via appropriate uncertainty theories. Three levels are distinguished in each uncertainty theory: uncertainty formalization, uncertainty measurement, and uncertainty utilization. Thus far, uncertainty (and uncertainty-based information) has been investigated at each of these three levels only in the Dempster-Shafer theory of evidence and its special branches. Results of these investigations are reviewed.
George J. Klir
3. Application of the Maximum (Information) Entropy Principle to Stochastic Processes far from Thermal Equilibrium
Abstract
This paper shows how the maximum (information) entropy principle allows the derivation of the short-time propagator from experimental data provided the process is Markovian. From the propagator, the Fokker-Planck equation can be derived. The Lagrange parameters that are used in the maximum information entropy principle can be derived by minimizing the Kullback information.
Hermann Haken
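The route from data to a Fokker-Planck equation can be illustrated with the standard conditional-moment (Kramers-Moyal) estimate of the short-time propagator. This is a generic sketch, not Haken's derivation itself, run on a simulated Ornstein-Uhlenbeck process with assumed parameters.

```python
import numpy as np

rng = np.random.default_rng(0)
dt, n = 1e-2, 200_000
x = np.empty(n); x[0] = 0.0
for i in range(n - 1):                      # dX = -X dt + sqrt(2) dW  (OU process)
    x[i+1] = x[i] - x[i]*dt + np.sqrt(2*dt)*rng.standard_normal()

dx = np.diff(x)
for c in np.linspace(-2, 2, 9):
    sel = np.abs(x[:-1] - c) < 0.25         # condition on the state being near c
    if sel.sum() < 200:
        continue
    drift = dx[sel].mean() / dt             # estimates D1(c), here -c
    diffusion = (dx[sel]**2).mean() / (2*dt)  # estimates D2(c), here 1
    print(f"x~{c:+.1f}: drift {drift:+.2f} (true {-c:+.1f}), D2 {diffusion:.2f}")
```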
4. Maximum Entropy Principle, Information of Non-Random Functions and Complex Fractals
Abstract
In this paper, we give two new applications of the maximum entropy principle (MEP). First, we show how the MEP provides a meaningful approach to defining the entropy of non-random functions. Then, we use the MEP to obtain an estimate of the probability distribution of complex-valued fractional Brownian motion, defined as the limit of a random walk on the complex roots of unity in the complex plane, thus exhibiting a relation between complex fractals and thermodynamics of order n.
Guy Jumarie
5. Geometric Ideas in Minimum Cross-Entropy
Abstract
This article reviews three geometric approaches to the understanding of the minimum cross-entropy method for estimating a probability distribution. The first approach is to regard the method as a projection based on an analogue of Pythagoras’ Theorem. The second is to regard the set of probability distributions as a differentiable manifold and to introduce a Riemannian geometry on this manifold. The third uses the idea of Hausdorff dimension to support the use of the method.
L. Lorne Campbell
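The first of the chapter's geometric approaches, the Pythagorean identity for the minimum cross-entropy projection, can be checked numerically. A sketch assuming NumPy/SciPy, with an assumed uniform prior and mean constraint: the projection p* of q onto the constraint set satisfies D(p||q) = D(p||p*) + D(p*||q) for any other feasible p.

```python
import numpy as np
from scipy.optimize import minimize_scalar

x = np.arange(6.0)
q = np.full(6, 1/6)                  # prior to be projected (assumed uniform)
m = 2.0                              # moment constraint E[x] = m (assumed)

dual = lambda l: np.log(np.sum(q * np.exp(l * x))) - l * m
lam = minimize_scalar(dual, bounds=(-20, 20), method="bounded").x
p_star = q * np.exp(lam * x)
p_star /= p_star.sum()               # I-projection of q onto the constraint set

def D(a, b):                         # Kullback-Leibler divergence
    return float(np.sum(a * np.log(a / b)))

p = np.array([0.30, 0.10, 0.20, 0.20, 0.10, 0.10])   # another feasible point
print("mean of p:", p @ x)                           # exactly 2.0
print("D(p||q)            =", D(p, q))
print("D(p||p*) + D(p*||q) =", D(p, p_star) + D(p_star, q))  # equal: Pythagoras
```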
6. Information-Theoretic Measures for Knowledge Discovery and Data Mining
Abstract
A database may be considered as a statistical population, and an attribute as a statistical variable taking values from its domain. One can carry out statistical and information-theoretic analysis on a database. Based on the attribute values, a database can be partitioned into smaller populations. An attribute is deemed important if it partitions the database such that previously unknown regularities and patterns are observable. Many information-theoretic measures have been proposed and applied to quantify the importance of attributes and relationships between attributes in various fields. In the context of knowledge discovery and data mining (KDD), we present a critical review and analysis of information-theoretic measures of attribute importance and attribute association, with emphasis on their interpretations and connections.
Y. Y. Yao
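The most common measure the chapter reviews, mutual information between an attribute and a class label, reduces to entropies of count tables. A minimal sketch on an assumed toy database (the attribute names and rows are illustrative):

```python
import math
from collections import Counter

rows = [("sunny", "hot", "no"), ("sunny", "mild", "no"), ("rain", "mild", "yes"),
        ("rain", "hot", "yes"), ("overcast", "hot", "yes"), ("rain", "cool", "no")]

def H(seq):                                   # Shannon entropy of a column
    n, counts = len(seq), Counter(seq)
    return -sum(k/n * math.log2(k/n) for k in counts.values())

def mutual_information(attr, cls):            # I(A;C) = H(A) + H(C) - H(A,C)
    return H(attr) + H(cls) - H(list(zip(attr, cls)))

outlook = [r[0] for r in rows]
temp    = [r[1] for r in rows]
label   = [r[2] for r in rows]
print("I(outlook; class) =", mutual_information(outlook, label))
print("I(temp; class)    =", mutual_information(temp, label))
```

Attributes that partition the database into more regular sub-populations score higher, which is the sense of "importance" the chapter analyzes.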
7. A Universal Maximum Entropy Solution for Complex Queueing Systems and Networks
Abstract
An analytic framework is presented for a unified exposition of entropy maximization and complex queueing systems and networks. In this context, a universal maximum entropy (ME) solution is characterized, subject to appropriate mean value constraints, for the joint state probability distribution of a complex single server queueing system with finite capacity, distinct priority or non-priority classes of jobs, general (G-type) class interarrival and service time processes, and either complete (CBS) or partial (PBS) buffer sharing schemes. The ME solution leads to closed-form expressions for the aggregate and marginal state probabilities; moreover, it is stochastically implemented by making use of the generalized exponential (GE) distribution towards the least biased approximation of G-type continuous time distributions with known first two moments. Subsequently, explicit analytic formulae are presented for the estimation of the Lagrangian coefficients via asymptotic connections to the corresponding infinite capacity queue and GE-type formulae for the blocking probabilities per class. Furthermore, it is shown that the ME solution can be utilized, in conjunction with GE-type flow approximation formulae, as a cost effective building block towards the determination of an extended ME product-form approximation and a queue-by-queue decomposition algorithm for the performance analysis of complex open queueing network models (QNMs) with arbitrary configuration and repetitive service (RS) blocking.
Demetres Kouvatsos
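A drastically simplified instance of the chapter's theme, assuming NumPy/SciPy: for a finite-capacity queue, maximizing the entropy of the state distribution p(n), n = 0..N, subject only to a known mean queue length yields a truncated geometric form p(n) ~ g**n, with the multiplier g found numerically. Capacity and mean below are assumptions for illustration.

```python
import numpy as np
from scipy.optimize import brentq

N, mean_n = 10, 3.2
n = np.arange(N + 1)

def mean_of(g):                      # mean of the tilted distribution g**n
    w = g ** n
    return (w * n).sum() / w.sum()

g = brentq(lambda g: mean_of(g) - mean_n, 1e-9, 50.0)
p = g ** n
p /= p.sum()                         # max-ent state probabilities
print("g =", g, " mean check:", p @ n)       # reproduces mean_n
print("blocking probability p(N) =", p[N])
```

The full chapter adds per-class constraints, priorities and GE-type service, but the Lagrangian structure of the solution is the same.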
8. Minimum Mean Deviation from the Steady-State Condition in Queueing Theory
Abstract
For the frequent cases when random perturbations alter the steady-state condition of a queueing system, the paper proposes corrections: new probability distributions for the number of arrivals, the interarrival time, and/or the service time are constructed by minimizing the mean chi-square deviation from the corresponding steady-state probability distributions, subject to given constraints represented by generalized moments or generalized mixed moments induced by the random fluctuations.
Silviu Guiasu
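With a single perturbed mean, the minimum chi-square correction is a small quadratic program whose multipliers come from a 2x2 linear system. A sketch with assumed numbers (not from the chapter):

```python
import numpy as np

x = np.arange(5.0)
q = np.array([0.35, 0.25, 0.20, 0.12, 0.08])   # steady-state distribution (assumed)
m = 1.6                                        # perturbed mean (assumed)

# p_i = q_i * (1 + (a + b*x_i)/2) minimizes sum (p-q)^2/q subject to
# sum(p) = 1 and sum(p*x) = m; the multipliers a, b solve a linear system.
Ex, Ex2 = q @ x, q @ x**2
A = np.array([[1.0, Ex], [Ex, Ex2]]) / 2.0
rhs = np.array([0.0, m - Ex])
a, b = np.linalg.solve(A, rhs)
p = q * (1 + (a + b * x) / 2)

print(p, p.sum(), p @ x)                       # normalized, mean equals m
print("mean chi-square deviation:", ((p - q)**2 / q).sum())
```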
9. On the Utility of Different Entropy Measures in Image Thresholding
Abstract
Over the last few years, several methods have been proposed for image thresholding based on entropy maximization. Some of these methods use the gray level histogram, while others use the entropy associated with the two-dimensional histogram or the co-occurrence matrix. A few recent methods also use cross entropy or divergence. Most of these attempts are based on Shannon’s entropy, except a few which use the exponential entropy or quadratic entropy. There are many other measures of information or entropy definitions whose utility in image processing has not been explored. This paper attempts to review some of these non-Shannon entropic measures and investigates their usefulness in image segmentation. Most of these “non-Shannonian” entropy measures have some parameters, whose influence on the performance of the thresholding algorithms is investigated. In this regard we consider two types of algorithms: one based on global image information (the histogram) and the other based on local image information (the co-occurrence matrix or two-dimensional histogram). Our findings are: (i) the co-occurrence-based entropy methods perform better than histogram-based methods for image thresholding; (ii) some of the entropy measures are not very sensitive to their parameters, and a few of them are not at all useful, at least for histogram-based thresholding; and (iii) maximization of the histogram entropy of a partitioned image, at least the way it is being used in the literature, is not a good principle for image segmentation.
D. P. Mandal, N. R. Pal
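The Shannon-entropy baseline the chapter compares against is the classic histogram-based maximum-entropy threshold of Kapur et al.: pick the gray level that maximizes the sum of the entropies of the two normalized histogram halves. A sketch on an assumed bimodal test image:

```python
import numpy as np

rng = np.random.default_rng(1)
img = np.concatenate([rng.normal(60, 12, 4000), rng.normal(170, 15, 6000)])
img = np.clip(img, 0, 255).astype(int)         # synthetic gray levels (assumed)
p = np.bincount(img, minlength=256).astype(float)
p /= p.sum()

def split_entropy(p, t):
    # entropy of the background (<= t) plus entropy of the foreground (> t)
    w0, w1 = p[:t+1].sum(), p[t+1:].sum()
    if w0 <= 0 or w1 <= 0:
        return -np.inf
    a = p[:t+1][p[:t+1] > 0] / w0
    b = p[t+1:][p[t+1:] > 0] / w1
    return -(a * np.log(a)).sum() - (b * np.log(b)).sum()

t_best = max(range(255), key=lambda t: split_entropy(p, t))
print("maximum-entropy threshold:", t_best)    # lands between the two modes
```

The non-Shannon measures studied in the chapter replace the two log-entropies with parametric alternatives while keeping this same search.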
10. Entropic Thresholding Algorithms and their Optimizations
Abstract
Entropic thresholding algorithms are important tools in image processing. In this article, fast iterative methods are derived for the minimum cross entropy thresholding and maximum entropy thresholding algorithms using the one-point iteration scheme. Simulations performed using synthetically generated histograms and a real image show the speed advantage and the accuracy of the iterated methods.
C. H. Li, C. K. Lee, P. K. S. Tam
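A sketch of a one-point fixed-point iteration for the minimum cross-entropy threshold, in the spirit of the chapter (the update rule follows Li and Tam's published scheme); the synthetic bimodal data are an assumption:

```python
import numpy as np

rng = np.random.default_rng(2)
img = np.concatenate([rng.normal(70, 10, 5000), rng.normal(160, 20, 5000)])
img = np.clip(img, 1, 255)           # keep gray levels positive for the logs

t = img.mean()                       # initial guess
for _ in range(100):
    m_back = img[img <= t].mean()    # mean gray level below the threshold
    m_fore = img[img > t].mean()     # mean gray level above it
    # one-point update: the logarithmic mean of the two class means
    t_new = (m_back - m_fore) / (np.log(m_back) - np.log(m_fore))
    if abs(t_new - t) < 0.5:
        break
    t = t_new
print("minimum cross-entropy threshold:", t_new)
```

Each step costs two class means instead of a full scan over candidate thresholds, which is the source of the speed advantage reported in the chapter.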
11. Entropy and Complexity of Sequences
Abstract
We analyze and discuss sequences of letters, and time series coded as sequences of letters on certain alphabets. The main subjects are macromolecular sequences (e.g., nucleotides in DNA or amino acids in proteins), neural spike trains, and financial time series. Several sequence representations are introduced, including return plots, surrogate sequences and surrogate processes. We give a short review of the definitions of entropies and some other informational concepts. We also point out that entropies have to be considered as fluctuating quantities and study the corresponding distributions. In the last part we consider grammatical concepts. We discuss algorithms to evaluate syntactic complexity and information content and apply them to several special sequences. We compare the data from seven neurons, before and after penicillin treatment, by encoding their inter-spike intervals and comparing their entropies, syntactic complexities and information content. Using these measures to classify the sequences with respect to their structure or randomness gives similar results. The other examples show significantly less order.
Werner Ebeling, Miguel Jimenez-Montano, Thomas Pohl
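The basic quantity behind this kind of analysis is the block (n-gram) entropy of a symbol sequence; the differences H_n - H_{n-1} are conditional entropies that reveal structure and memory. A minimal sketch on an assumed test string:

```python
import math
from collections import Counter

def block_entropy(seq, n):
    # Shannon entropy of the empirical distribution of length-n blocks
    blocks = [seq[i:i+n] for i in range(len(seq) - n + 1)]
    counts, total = Counter(blocks), len(blocks)
    return -sum(k/total * math.log2(k/total) for k in counts.values())

seq = "ATGCGATACGATTAGCGATCGATAGGATCCGATATCGATGCA" * 20   # assumed toy sequence
prev = 0.0
for n in range(1, 6):
    Hn = block_entropy(seq, n)
    print(f"H_{n} = {Hn:.3f} bits,  conditional h = H_{n} - H_{n-1} = {Hn - prev:.3f}")
    prev = Hn
```

For a structured sequence the conditional entropies drop with n; for a random one they stay near log2 of the alphabet size.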
12. Some Lessons for Molecular Biology from Information Theory
Abstract
Applying information theory to molecular biology problems clarifies many issues. The topics addressed are: how there can be precision in molecular interactions, how much pattern is stored in the DNA for genetic control systems, and the roles of theory violations, instrumentation, and models in science.
Thomas D. Schneider
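The "how much pattern is stored in the DNA" question is quantified by the per-position information content of aligned binding sites, R_i = 2 - H_i bits for a four-letter alphabet, the quantity behind Schneider's sequence logos. A sketch on an assumed tiny alignment (no small-sample correction applied):

```python
import math
from collections import Counter

sites = ["TATAAT", "TATGAT", "TACAAT", "TATAAT", "TATATT", "GATAAT"]  # assumed

total_bits = 0.0
for pos in range(len(sites[0])):
    col = [s[pos] for s in sites]
    counts = Counter(col)
    H = -sum(k/len(col) * math.log2(k/len(col)) for k in counts.values())
    R = 2.0 - H                      # 2 bits is the maximum for DNA letters
    total_bits += R
    print(f"position {pos}: H = {H:.2f} bits, R = {R:.2f} bits")
print("information per site: about", round(total_bits, 2), "bits")
```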
13. Computation of the MinMax Measure
Abstract
The MinMax measure of information, defined by Kapur, Baciu and Kesavan [6], is a quantitative measure of the information contained in a given set of moment constraints. It is based on both maximum and minimum entropy. Computational difficulties in the determination of minimum entropy probability distributions (MinEPD) have inhibited exploration of the full potential of minimum entropy and, hence, the MinMax measure. Initial attempts to solve the minimum entropy problem were directed towards finding analytical solutions for some specific set of constraints. Here, we present a numerical solution to the general minimum entropy problem and discuss the significance of minimum entropy and the MinMax measure. Some numerical examples are given for illustration.
M. Srikanth, H. K. Kesavan, Peter Roe
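A toy version of the computation, assuming NumPy/SciPy and reading the MinMax measure as the gap between the largest and smallest entropies consistent with the constraints: the maximum comes from the convex dual, while with one mean constraint the minimum is attained at a vertex of the constraint polytope, i.e. a distribution with at most two nonzero probabilities, so it can be found by enumeration. Support and mean are assumptions.

```python
import numpy as np
from itertools import combinations
from scipy.optimize import minimize_scalar

x, m = np.arange(5.0), 1.8

def entropy(p):
    p = p[p > 0]
    return -(p * np.log(p)).sum()

# maximum entropy: exponential tilt, multiplier from the convex dual
lam = minimize_scalar(lambda l: np.log(np.sum(np.exp(l * x))) - l * m,
                      bounds=(-20, 20), method="bounded").x
p_max = np.exp(lam * x); p_max /= p_max.sum()
S_max = entropy(p_max)

# minimum entropy: best two-point distribution meeting the mean
S_min = np.inf
for i, j in combinations(range(len(x)), 2):
    if not (x[i] <= m <= x[j]):
        continue
    w = (x[j] - m) / (x[j] - x[i])   # weight on x_i so the mean equals m
    p = np.zeros_like(x); p[i], p[j] = w, 1 - w
    S_min = min(S_min, entropy(p))

print("S_max =", S_max, " S_min =", S_min)
print("MinMax gap for this constraint set:", S_max - S_min)
```

The general problem in the chapter is harder precisely because, with several constraints, this vertex enumeration is no longer tractable and a numerical scheme is needed.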
14. On Three Functional Equations Related to the Bose-Einstein Entropy
Abstract
In this paper, we study three functional equations that arise from the representation of the Bose-Einstein entropy.
PL. Kannappan, P. K. Sahoo
15. The Entropy Theory as a Decision Making Tool in Environmental and Water Resources
Abstract
Since the development of the entropy theory by Shannon in the late 1940s, and of the principle of maximum entropy (POME) by Jaynes in the late 1950s, there has been a proliferation of applications of entropy in a wide spectrum of areas, including environmental and water resources. The real impetus to entropy-based modelling in environmental and water resources was, however, provided in the early 1970s, and a great variety of entropy-based applications have since been reported, with new applications continuing to unfold. Most of these applications have, however, been in the realm of modelling, and relatively few applications have been reported on decision making. This note revisits the entropy theory and emphasizes its usefulness in the realm of decision making in environmental and water resources, and concludes with comments on its implications in developing countries.
Vijay P. Singh
Metadata
Title
Entropy Measures, Maximum Entropy Principle and Emerging Applications
Edited by
Professor Karmeshu
Copyright year
2003
Publisher
Springer Berlin Heidelberg
Electronic ISBN
978-3-540-36212-8
Print ISBN
978-3-642-05531-7
DOI
https://doi.org/10.1007/978-3-540-36212-8