
2014 | Book

Rough Sets and Intelligent Systems Paradigms

Second International Conference, RSEISP 2014, Held as Part of JRS 2014, Granada and Madrid, Spain, July 9-13, 2014. Proceedings

Edited by: Marzena Kryszkiewicz, Chris Cornelis, Davide Ciucci, Jesús Medina-Moreno, Hiroshi Motoda, Zbigniew W. Raś

Publisher: Springer International Publishing

Book series: Lecture Notes in Computer Science


About this book

This book constitutes the refereed proceedings of the Second International Conference on Rough Sets and Intelligent Systems Paradigms, RSEISP 2014, held in Granada and Madrid, Spain, in July 2014. RSEISP 2014 was held along with the 9th International Conference on Rough Sets and Current Trends in Computing, RSCTC 2014, as a major part of the 2014 Joint Rough Set Symposium, JRS 2014. The 40 revised full papers and 37 revised short papers presented at JRS 2014 were carefully reviewed and selected from 120 submissions and are published in two volumes. This volume contains the papers accepted for the conference RSEISP 2014, as well as the three invited papers presented at the conference. The papers are organized in topical sections on plenary lecture and tutorial papers; foundations of rough set theory; granular computing and covering-based rough sets; applications of rough sets; induction of decision rules - theory and practice; knowledge discovery; spatial data analysis and spatial databases; and information extraction from images.

Table of Contents

Frontmatter
Correction to: Interactive Computations on Complex Granules

Correction to: Chapter “Interactive Computations on Complex Granules” in: M. Kryszkiewicz et al. (Eds.): Rough Sets and Intelligent Systems Paradigms, LNAI 8537, https://doi.org/10.1007/978-3-319-08729-0_11. The acknowledgement section of this chapter originally referred to grant DEC-2013/09/B/ST6/01568. The reference to this grant has been removed from the acknowledgement section at the request of one of the authors.

Andrzej Jankowski, Andrzej Skowron, Roman Swiniarski

Plenary Lecture and Tutorial Papers

The Impact of Local Data Characteristics on Learning from Imbalanced Data

Problems of learning classifiers from imbalanced data are discussed. First, we look at different data difficulty factors corresponding to complex distributions of the minority class and show that they can be approximated by analysing the neighbourhood of the learning examples from the minority class. We claim that the results of this analysis could be a basis for developing new algorithms. In this paper we show such possibilities by discussing modifications of the informed pre-processing method LN-SMOTE as well as by incorporating types of examples into the rule induction algorithm BRACID.
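To illustrate the family of pre-processing methods the abstract refers to, the following is a minimal SMOTE-style oversampling sketch: each synthetic point interpolates between a minority example and one of its nearest minority neighbours. This is the generic idea only, not the LN-SMOTE variant discussed in the paper; the function name and parameters are illustrative.

```python
import random

def smote_like(minority, k=2, n_new=4, seed=0):
    """Minimal SMOTE-style oversampling sketch (not the paper's LN-SMOTE):
    each synthetic point lies on a segment between a minority example
    and one of its k nearest minority neighbours."""
    rng = random.Random(seed)
    out = []
    for _ in range(n_new):
        x = rng.choice(minority)
        # k nearest minority neighbours by squared Euclidean distance
        neigh = sorted((p for p in minority if p is not x),
                       key=lambda p: sum((a - b) ** 2 for a, b in zip(x, p)))[:k]
        z = rng.choice(neigh)
        t = rng.random()
        # interpolate: x + t * (z - x)
        out.append(tuple(a + t * (b - a) for a, b in zip(x, z)))
    return out

minority = [(0.0, 0.0), (1.0, 0.0), (0.0, 1.0)]
synthetic = smote_like(minority)  # points lie on segments between neighbours
```

Informed variants such as LN-SMOTE additionally inspect the local neighbourhood (including majority examples) before deciding where to generate synthetic points.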

Jerzy Stefanowski
The Relational Construction of Conceptual Patterns - Tools, Implementation and Theory

Different conceptual ways to analyse information are here defined by means of the fundamental notion of a relation. This approach makes it possible to compare different mathematical notions and tools used in qualitative data analysis. Moreover, since relations are representable by Boolean matrices, computing the conceptual-oriented operators is straightforward. Finally, the relational-based approach makes it possible to conceptually analyse not only sets but relations themselves.

Piero Pagliani
SQL-Based KDD with Infobright’s RDBMS: Attributes, Reducts, Trees

We present a framework for the KDD process, implemented using SQL procedures and consisting of constructing new attributes, finding rough set-based reducts, and inducing decision trees. We focus particularly on attribute reduction, which is especially important for high-dimensional data sets. The main technical contribution of this paper is a complete framework for calculating short reducts using SQL queries on data stored in relational form, without the need for any external tools generating or modifying their syntax. A case study on large real-world data is presented. The paper also recalls some other examples of SQL-based data mining implementations. The experimental results are based on Infobright's analytic RDBMS, whose performance characteristics fit the requirements of the presented algorithms well.
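The core SQL idea can be sketched with a GROUP BY query: an attribute subset preserves decision consistency exactly when no group of equal attribute values maps to more than one decision. The sketch below uses an in-memory SQLite table with a hypothetical two-attribute decision table; the paper's framework targets Infobright's RDBMS and is considerably richer.

```python
import sqlite3

# Hypothetical decision table: a, b are condition attributes, d is the decision.
rows = [(0, 0, 'x'), (0, 1, 'y'), (1, 0, 'x'), (1, 1, 'x')]
con = sqlite3.connect(':memory:')
con.execute('CREATE TABLE t (a INT, b INT, d TEXT)')
con.executemany('INSERT INTO t VALUES (?, ?, ?)', rows)

def determines(attrs):
    """True if the attribute subset functionally determines d: no group
    of equal attribute values yields more than one distinct decision."""
    cols = ', '.join(attrs)
    q = (f'SELECT COUNT(*) FROM (SELECT {cols} FROM t '
         f'GROUP BY {cols} HAVING COUNT(DISTINCT d) > 1)')
    return con.execute(q).fetchone()[0] == 0

determines(['a'])       # False: a=0 yields both x and y
determines(['a', 'b'])  # True: the full attribute set is consistent
```

A reduct search then amounts to probing such queries over candidate subsets, dropping attributes while `determines` stays true.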

Jakub Wróblewski, Sebastian Stawicki

Foundations of Rough Set Theory

From Vagueness to Rough Sets in Partial Approximation Spaces

Vagueness has a central role in the motivational basis of rough set theory. Following Frege, Pawlak's information-based proposal for expressing vagueness was the boundary regions of sets. In rough set theory, Pawlak represented boundaries by the differences of upper and lower approximations and defined the exactness and roughness of sets via these differences. In general, however, there are several possibilities for defining the exactness/roughness of sets. In this paper, categories of vagueness, i.e., different kinds of rough sets, are identified in partial approximation spaces. Their formal definitions and intuitive meanings are given under sensible restrictions.
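The Pawlak boundary-region notion the abstract builds on can be sketched in a few lines: partition the universe into indiscernibility classes, then collect the classes contained in the target concept (lower approximation) and those merely overlapping it (upper approximation). This is the classical total-space construction only; the paper itself works in partial approximation spaces.

```python
from collections import defaultdict

def approximations(universe, key, target):
    """Pawlak lower/upper approximations of `target` under the
    indiscernibility relation induced by `key`, which maps each
    object to its attribute-value description."""
    classes = defaultdict(set)
    for x in universe:
        classes[key(x)].add(x)
    lower, upper = set(), set()
    for block in classes.values():
        if block <= target:
            lower |= block   # block lies entirely inside the concept
        if block & target:
            upper |= block   # block overlaps the concept
    return lower, upper

# Toy decision table: five objects described by a single attribute value
objs = {1: 'a', 2: 'a', 3: 'b', 4: 'b', 5: 'c'}
lo, up = approximations(objs, lambda x: objs[x], {1, 2, 3})
boundary = up - lo  # the region where membership is undecided
```

The set is exact when the boundary is empty; otherwise it is rough, and different ways of measuring that boundary yield the different exactness/roughness definitions the paper categorizes.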

Zoltán Ernő Csajbók, Tamás Mihálydeák
Random Probes in Computation and Assessment of Approximate Reducts

We discuss applications of random probes in the process of computation and assessment of approximate reducts. By random probes we mean artificial attributes, generated independently of the decision vector but having value distributions similar to the attributes in the original data. We introduce the concept of a randomized reduct, which is a reduct constructed solely from random probes, and we show how to use it for unsupervised evaluation of attribute sets. We also propose a modification of the greedy heuristic for the computation of approximate reducts, which reduces the chance of including irrelevant attributes in a reduct. To support our claims we present results of experiments on high-dimensional data. Analysis of the obtained results confirms the usefulness of random probes in the search for informative attribute sets.
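One simple way to generate probes with the stated property is to permute each attribute column independently: the permuted column keeps the original value distribution but breaks any link to the decision. The sketch below is an illustrative assumption about the construction, not necessarily the paper's exact procedure.

```python
import random

def add_random_probes(rows, n_attrs, seed=0):
    """Append one random probe per original attribute: each probe is a
    permutation of that attribute's column, so it preserves the value
    distribution while being independent of the decision."""
    rng = random.Random(seed)
    cols = [[r[i] for r in rows] for i in range(n_attrs)]
    probes = []
    for col in cols:
        p = col[:]
        rng.shuffle(p)  # permuting destroys any attribute-decision link
        probes.append(p)
    return [r + tuple(p[j] for p in probes) for j, r in enumerate(rows)]

data = [(1, 'a'), (2, 'b'), (3, 'a')]
extended = add_random_probes(data, 2)
# each row now holds 2 original attribute values followed by 2 probe values
```

An attribute subset that scores no better than a comparable set of such probes can then be flagged as uninformative.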

Andrzej Janusz, Dominik Ślęzak
Applications of Boolean Kernels in Rough Sets

Rough Sets (RS) and Support Vector Machines (SVM) are two large, independent research areas in AI. Originally, rough set theory dealt with the concept approximation problem under uncertainty. The basic idea of RS is related to lower and upper approximations, and it can be applied to classification problems. At first sight, RS and SVM offer different approaches to the classification problem. Most RS methods are based on minimal decision rules, while SVM converts linear classifiers into instance-based classifiers. This paper presents a comparative analysis of these areas and shows that, despite the differences, there are quite a few analogies between the two approaches. We show that some rough set classifiers are in fact SVMs with a Boolean kernel, and propose some hybrid methods that combine the advantages of these two machine learning approaches.

Sinh Hoa Nguyen, Hung Son Nguyen
Robust Ordinal Regression for Dominance-Based Rough Set Approach under Uncertainty

We consider decision under uncertainty where preference information provided by a Decision Maker (DM) is a classification of some reference acts, relatively well-known to the DM, described by outcomes to be gained with given probabilities. We structure the classification data using a variant of the Dominance-based Rough Set Approach. Then, we induce from this data all possible minimal-cover sets of rules which correspond to all instances of the preference model compatible with the input preference information. We apply these instances on a set of unseen acts, and draw robust conclusions about their quality using the Robust Ordinal Regression paradigm. Specifically, for each act we derive the necessary and possible assignments specifying the range of classes to which the act is assigned by all or at least one compatible set of rules, respectively, as well as class acceptability indices. The whole approach is illustrated by a didactic example.

Roman Słowiński, Miłosz Kadziński, Salvatore Greco
Monotonic Uncertainty Measures in Probabilistic Rough Set Model

Uncertainty measurement is one of the key research issues in rough set theory. In the Pawlak rough set model, the accuracy measure, the roughness measure and the approximation accuracy measure are used as uncertainty measures. Monotonicity is a basic property of these measures. However, the monotonicity of these measures does not hold in the probabilistic rough set model, which makes them less suitable for evaluating uncertainty. The main objective of this paper is to address the uncertainty measure problem in the probabilistic rough set model. We propose three monotonic uncertainty measures, called the probabilistic accuracy measure, the probabilistic roughness measure and the probabilistic approximation accuracy measure, respectively. The monotonicity of the proposed uncertainty measures is proved to hold. Finally, an example is used to verify the validity of the proposed uncertainty measures.
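For readers unfamiliar with the probabilistic model: it relaxes Pawlak's all-or-nothing inclusion with two thresholds, admitting a block into the lower approximation when its overlap ratio with the target reaches alpha and into the upper approximation when it exceeds beta. The sketch below shows the threshold mechanism and a rough accuracy-style ratio; the paper's three monotonic measures are defined differently and are not reproduced here.

```python
from collections import defaultdict

def prob_approximations(universe, key, target, alpha=0.75, beta=0.25):
    """Probabilistic lower/upper approximations: a block enters the lower
    approximation when its overlap ratio with the target reaches alpha,
    and the upper approximation when it exceeds beta. Pawlak's model is
    the limiting case alpha = 1, beta = 0."""
    classes = defaultdict(set)
    for x in universe:
        classes[key(x)].add(x)
    lower, upper = set(), set()
    for block in classes.values():
        p = len(block & target) / len(block)  # conditional probability P(X | block)
        if p >= alpha:
            lower |= block
        if p > beta:
            upper |= block
    return lower, upper

objs = {1: 'a', 2: 'a', 3: 'a', 4: 'a', 5: 'b'}
lo, up = prob_approximations(objs, lambda x: objs[x], {1, 2, 3})
# a Pawlak-style accuracy ratio (illustrative, not the paper's measure)
accuracy = len(lo) / len(up) if up else 1.0
```

With alpha = 0.75, the class {1, 2, 3, 4} (overlap 3/4) already enters the lower approximation, which is exactly the kind of threshold effect that breaks the monotonicity of the classical measures.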

Guoyin Wang, Xi’ao Ma, Hong Yu
Attribute Subset Quality Functions over a Universe of Weighted Objects

We consider a rough set inspired approach to deriving meaningful attribute subsets from data organized in a form of a decision system. We focus on quality functions measuring degrees in which particular attribute subsets determine the values of a decision attribute. We follow a well known idea of assigning weights to the training objects in order to reflect their importance in the attribute subset selection and new case classification processes. We discuss an example of an object weighting strategy related to probabilities of decision classes in the training data. We show that two attribute subset quality functions used in our earlier research are the same function computed using two different weighting techniques. We also investigate whether it is worth using the same weights during the processes of attribute selection and new case classification.

Sebastian Widz, Dominik Ślęzak
A Definition of Structured Rough Set Approximations

Pawlak lower and upper approximations are unions of equivalence classes. By explicitly expressing individual equivalence classes in the approximations, Bryniarski uses a pair of families of equivalence classes as rough set approximations. Although the latter takes into consideration the structural information of the approximations, it has not received its due attention. The main objective of this paper is to further explore the Bryniarski definition and propose a generalized definition of structured rough set approximations by using a family of conjunctively definable sets. The connections to covering-based rough sets and Grzymala-Busse's LERS systems are investigated.

Yiyu Yao, Mengjun Hu

Granular Computing and Covering-Based Rough Sets

Interactive Computations on Complex Granules

Information granules (infogranules, for short) are widely discussed in the literature. In particular, let us mention here the rough granular computing approach based on the rough set approach and its combination with other approaches to soft computing. However, the issues related to interactions of infogranules with the physical world and to perception of interactions in the physical world represented by infogranules are not well elaborated yet. On the other hand, the understanding of interactions is the critical issue of complex systems. We propose to model complex systems by interactive computational systems (ICS) created by societies of agents. Computations in ICS are based on complex granules (c-granules, for short). In the paper we concentrate on some basic issues related to interactive computations based on c-granules performed by agents in the physical world.

Andrzej Jankowski, Andrzej Skowron, Roman Swiniarski
Formulation and Simplification of Multi-Granulation Covering Rough Sets

The theory of multi-granulation rough sets is an effective method for knowledge discovery in multiple granular structures. Based on rough sets on a single granular structure, various kinds of multi-granulation rough set models have been proposed over the past decades. In this paper, based on two kinds of covering rough sets on single-granulation covering approximation spaces, four types of multi-granulation covering rough set models are defined. Properties of the new models are examined in detail, and the multi-granulation covering approximation operators are compared. Finally, the simplification of the four types of multi-granulation covering rough sets is investigated.

Tong-Jun Li, Xing-Xing Zhao, Wei-Zhi Wu
Covering Based Rough Sets and Relation Based Rough Sets

Relation-based rough sets and covering-based rough sets are two important extensions of classical rough sets. This paper investigates relationships between relation-based rough sets and covering-based rough sets in a particular framework of approximation operators, presents a new group of approximation operators obtained by combining coverings and neighborhood operators, and establishes some relationships between covering-based rough sets and relation-based rough sets.

Mauricio Restrepo, Jonatan Gómez

Applications of Rough Sets

A Rough Set Approach to Novel Compounds Activity Prediction Based on Surface Active Properties and Molecular Descriptors

The aim of this paper is to study the relationship between the biological activity of a group of 140 gemini-imidazolium chlorides and three types of parameters: structural, surface active, and molecular ones. The dominance-based rough set approach is applied to obtain decision rules, which describe dependencies between the analyzed parameters and make it possible to create a model of the chemical structure with the best biological activity. Moreover, the presented study allowed us to identify attributes relevant to the high antimicrobial activity of compounds. Finally, we have shown that decision rules that involve only structural and surface active attributes are sufficient to plan effective synthesis pathways of active molecules.

Jerzy Błaszczyński, Łukasz Pałkowski, Andrzej Skrzypczak, Jan Błaszczak, Alicja Nowaczyk, Roman Słowiński, Jerzy Krysiński
Rough Sets in Ortholog Gene Detection
Selection of Feature Subsets and Case Reduction Considering Imbalance

Ortholog detection should be improved because of the real value of ortholog genes in the prediction of protein functions. Datasets in the binary classification problem can be represented as information systems. We use a gene-pair extended similarity relation, based on an extension of rough set theory, and aggregated gene similarity measures as gene features, to select feature subsets with the aid of quality measures that take imbalance into account. The proposed procedure can be useful for datasets with few features and discrete parameters. The case reduction obtained from the approximation of ortholog and non-ortholog concepts might be an effective method to cope with extremely high imbalance in supervised classification.

Deborah Galpert Cañizares, Reinier Millo Sánchez, María Matilde García Lorenzo, Gladys Casas Cardoso, Ricardo Grau Abalo, Leticia Arco García
Hybrid Model Based on Rough Sets Theory and Fuzzy Cognitive Maps for Decision-Making

Decision-making can be defined as the process of choosing a suitable decision among a set of possible alternatives in a given activity. It is a relevant subject in numerous disciplines such as engineering, psychology, risk analysis, and operations research. However, most real-life problems are unstructured in nature, often involving vagueness and uncertainty. This makes it difficult to apply exact models, making it necessary to adopt approximate algorithms based on Artificial Intelligence and Soft Computing techniques. In this paper we present a novel decision-making model called Rough Cognitive Networks. It combines the capability of rough set theory for handling inconsistent patterns with the modeling and simulation features of Fuzzy Cognitive Maps. In the end, we obtain an accurate hybrid model that allows solving non-trivial continuous, discrete, or mixed-variable decision-making problems.

Gonzalo Nápoles, Isel Grau, Koen Vanhoof, Rafael Bello
Decision Rules-Based Probabilistic MCDM Evaluation Method – An Empirical Case from Semiconductor Industry

The dominance-based rough set approach has been widely applied to multiple criteria classification problems, and its major advantage is the induced decision rules, which can consider multiple attributes in different contexts. However, if decision makers need to rank or select among alternatives that belong to the same decision class (a typical multiple criteria decision making problem), the obtained decision rules are not enough to resolve the ranking problem. Using a group of semiconductor companies in Taiwan, this study proposes a decision rules-based probabilistic evaluation method, which transforms the strong decision rules into a probabilistically weighted model that explores the performance gap of each alternative on each criterion, to support improvement and selection. Five example companies were tested and illustrated by the transformed evaluation model, and the result indicates the effectiveness of the proposed method. The proposed evaluation method may act as a bridge to transform decision rules (from a data-mining approach) into a decision model for practical applications.

Kao-Yi Shen, Gwo-Hshiung Tzeng

Induction of Decision Rules - Theory and Practice

Decision Rule Classifiers for Multi-label Decision Tables

Recently, the multi-label classification problem has received significant attention in the research community. This paper is devoted to studying the effect of the considered rule heuristic parameters on the generalization error. The results of experiments on decision tables from the UCI Machine Learning Repository and the KEEL Repository show that rule heuristics taking into account both coverage and uncertainty perform better than strategies taking into account a single criterion.

Fawaz Alsolami, Mohammad Azad, Igor Chikalov, Mikhail Moshkov
Considerations on Rule Induction Procedures by STRIM and Their Relationship to VPRS

STRIM (Statistical Test Rule Induction Method) has been proposed as a method to effectively induce if-then rules from a decision table. The method was studied independently of the conventional rough set methods. This paper summarizes the basic notions of STRIM and the conventional rule induction methods, considers the relationship between STRIM and these conventional methods, especially VPRS (Variable Precision Rough Set), and shows that STRIM develops the notion of VPRS into a statistical principle. In a simulation experiment, we also consider the conditions under which STRIM induces the true rules specified in advance. These conditions have not yet been studied, even for VPRS. Examination of the conditions is very important if STRIM is to be properly applied to real-world data sets.

Yuichi Kato, Tetsuro Saeki, Shoutarou Mizuno
Generating Core in Rough Set Theory: Design and Implementation on FPGA

In this paper we propose an FPGA-based device for data processing using rough set methods. The presented architecture has been tested on real-world data. The obtained results confirm a significant acceleration of computation time when core generation is supported in hardware, in comparison to a software implementation.

Maciej Kopczynski, Tomasz Grzes, Jaroslaw Stepaniuk
Attribute Ranking Driven Filtering of Decision Rules

In decision rule induction approaches, either minimal, complete, or satisfying sets of constituent rules are inferred, with the aim of providing predictive properties while offering descriptive capabilities for the learned concepts. Instead of limiting rules at the induction phase, we can also post-process the set of generated decision rules (whether it is complete or not) by filtering out those that meet some constraints. The paper presents research on rule filtering that follows a ranking of conditional attributes, obtained in the process of sequential forward selection of input features for ANN classifiers.

Urszula Stańczyk
Evaluation of Leave-One Out Method Based on Incremental Sampling Scheme

This paper proposes a new framework for the evaluation of leave-one-out methods based on an incremental sampling scheme. Although the incremental sampling scheme is used for incremental rule induction, this paper shows that the same idea can be used for the deletion of examples. We then apply this technique to the leave-one-out method for rules defined by propositions whose constraints are given by inequalities on accuracy and coverage. The results show that the framework is a powerful tool for the evaluation of the leave-one-out method.

Shusaku Tsumoto, Shoji Hirano
Optimization of Decision Rules Relative to Coverage - Comparative Study

In the paper, we present a modification of the dynamic programming algorithm for the optimization of decision rules relative to coverage. The aims of the paper are: (i) to study the coverage of decision rules, and (ii) to study the size of the directed acyclic graph (the number of nodes and edges) for the proposed algorithm. The paper contains experimental results for decision tables from the UCI Machine Learning Repository.

Beata Zielosko

Knowledge Discovery

MedVir: An Interactive Representation System of Multidimensional Medical Data Applied to Traumatic Brain Injury’s Rehabilitation Prediction

Clinicians could model the brain injury of a patient through the patient's brain activity. However, how this model is defined and how it changes as the patient recovers are questions that remain unanswered. In this paper, the MedVir framework is proposed with the aim of answering these questions. Based on complex data mining techniques, it provides not only the differentiation between TBI patients and control subjects (with 72% accuracy using 0.632 bootstrap validation), but also the ability to detect whether a patient may recover or not, all in a quick and easy way through a visualization technique that allows interaction.

Santiago Gonzalez, Antonio Gracia, Pilar Herrero, Nazareth Castellanos, Nuria Paul
SnS: A Novel Word Sense Induction Method

The paper is devoted to the word sense induction problem. We propose a knowledge-poor method, called SenseSearcher (SnS), which induces the senses of words from text corpora based on closed frequent sets. The algorithm discovers a hierarchy of senses, rather than a flat list of concepts, so the results are easier to comprehend. We have evaluated the quality of SnS by performing experiments on the web search result clustering task with the datasets from SemEval-2013 Task 11.

Marek Kozłowski, Henryk Rybiński
Review on Context Classification in Robotics

In this paper, a review of context and environment classification is presented, with a focus on autonomous service robots. Comprehensive research was carried out in order to classify the most relevant techniques, models and frameworks in use today, as well as to present possible future applications of these previous works. Most of the work done in this area has focused on general classification. In this sense, a new possible application scenario is described, along with the corresponding supporting architecture.

Fábio Miranda, Tiago Cabral Ferreira, João Paulo Pimentão, Pedro Sousa
Interestingness Measures for Actionable Patterns

The ability to make mined patterns actionable is becoming increasingly important in today's competitive world. Standard data mining focuses on patterns that summarize data, and these patterns must be further processed in order to determine opportunities for action. To address this problem, it is essential to extract patterns by comparing the profiles of two sets of relevant objects to obtain useful, understandable, and workable strategies. In this paper, we present the definition of actionable rules by integrating action rules and reclassification rules to build a framework for analyzing big data. In addition, three new interestingness measures, coverage, leverage, and lift, are proposed to address the limitations of minimum left support, right support and confidence thresholds for gauging the importance of discovered actionable rules.
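For orientation, the standard association-rule definitions of the three measure names can be sketched as follows; the paper's actionable-rule variants may differ in detail, and the rule and data below are purely illustrative.

```python
def rule_measures(rows, antecedent, consequent):
    """Coverage, leverage and lift of a rule A -> C over a dataset,
    using the standard association-rule definitions (the paper's
    actionable-rule versions may be defined differently)."""
    n = len(rows)
    a = sum(1 for r in rows if antecedent(r))
    c = sum(1 for r in rows if consequent(r))
    ac = sum(1 for r in rows if antecedent(r) and consequent(r))
    coverage = a / n                       # P(A)
    leverage = ac / n - (a / n) * (c / n)  # P(A,C) - P(A)P(C)
    lift = (ac / n) / ((a / n) * (c / n))  # P(A,C) / (P(A)P(C))
    return coverage, leverage, lift

# Toy transactions: (customer group, action); rule "young -> buys"
rows = [('young', 'buys'), ('young', 'buys'), ('old', 'skips'), ('old', 'buys')]
cov, lev, lift = rule_measures(rows,
                               lambda r: r[0] == 'young',
                               lambda r: r[1] == 'buys')
```

Here lift > 1 and leverage > 0 both indicate that the antecedent and consequent co-occur more often than independence would predict.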

Li-Shiang Tsay
Meta-learning: Can It Be Suitable to Automatise the KDD Process for the Educational Domain?

The use of e-learning platforms is practically universal at all educational levels. Moreover, virtual teaching is currently acquiring a relevance never seen before. The information that these systems record is a rich source which, once suitably analysed, allows both instructors and academic authorities to make more informed decisions. However, these individuals are not experts in data mining techniques, so they require tools which automatise the KDD process and, at the same time, hide its complexity. In this paper, we show how meta-learning can be a suitable alternative for selecting the algorithm to be used in the KDD process, which will later be wrapped and deployed as a web service, making it easily accessible to the educational community. Our case study focuses on predicting student performance from the activity performed by students in courses hosted on the Moodle platform.

Marta Zorrilla, Diego García-Saiz

Spatial Data Analysis and Spatial Databases

Discovering Collocation Rules and Spatial Association Rules in Spatial Data with Extended Objects Using Delaunay Diagrams

The paper discusses issues related to mining spatial association rules and collocations. In particular, it presents a new method of mining spatial association rules and collocations in spatial data with extended objects using Delaunay diagrams. The method requires neither prior knowledge of the analyzed data nor any space-related input parameters, and it is efficient in terms of execution time.

Robert Bembenik, Aneta Ruszczyk, Grzegorz Protaziuk
Potential Application of the Rough Set Theory in Indoor Navigation

The paper presents concepts of using rough set theory in indoor navigation. In particular, attention is drawn to the potential verification of a position received from the positioning system and to the generation of better-quality navigation guidelines. The authors propose the use of expert systems, spatial data for buildings, and spatial analysis techniques typical of Geographic Information Systems (GIS). The presented analysis lies within the scope of the research conducted at the Laboratory of Mobile Cartography at the Warsaw University of Technology.

Dariusz Gotlib, Jacek Marciniak
Mobile Indicators in GIS and GPS Positioning Accuracy in Cities

The publication describes the possible use of tele-geoprocessing as a synergy of modern IT solutions, telecommunications and GIS algorithms. The paper presents a possibility of urban traffic monitoring with the use of mobile GIS indicators of a dedicated monitoring system designed for a taxi corporation. The system is based on a stationary and mobile software package. The optimal and minimal assumptions for the monitoring of urban traffic are described. They can be implemented as a verification or supplementary tool for complex and high-cost transportation management systems or for systems monitoring the throughput of city streets. The authors show the limitations of standard monitoring and GNSS positioning in urban areas. They indicate a possible improvement in the functionality of the application: the calculation of supplementary vector data of possible trajectories and, based on it, the correction of the data received from satellite positioning.

Artur Janowski, Aleksander Nowak, Marek Przyborski, Jakub Szulwic
Analysis of the Possibility of Using Radar Tracking Method Based on GRNN for Processing Sonar Spatial Data

This paper presents an approach to applying radar tracking methods to the tracking of underwater objects using stationary sonar. The authors review target tracking methods existing in navigation, with particular attention to methods based on neural filters. Their specific implementation for sonar spatial data is also described. The results of experiments conducted with the use of real sonograms are presented.

Witold Kazimierski, Grzegorz Zaniewicz
Supporting the Process of Monument Classification Based on Reducts, Decision Rules and Neural Networks

The present article attempts to support the process of classification of multi-characteristic spatial data in order to develop the correct cartographic visualisation of complex geographical information in a thematic geoportal. Rough sets, decision rules and artificial neural networks were selected as relevant methods for spatially distributed monument classification. Based on the obtained results, it was determined that the attributes reflecting the spatial relations between specific objects play an extremely significant role in the process of classification, that reducts allow selecting only the essential attributes of objects, and that neural networks and decision rules are highly useful for the classification of multi-characteristic spatial data.

Robert Olszewski, Anna Fiedukowicz
Self-organizing Artificial Neural Networks into Hydrographic Big Data Reduction Process

The article presents the reduction problems of hydrographic big data for the needs of gathering sounding information for Electronic Navigational Chart (ENC) production. For the purposes of this article, data from an interferometric sonar, which is a modification of a multi-beam sonar, was used. Data reduction is a procedure meant to reduce the size of a data set in order to make it easier and more effective to analyse. The authors' aim is to examine whether artificial neural networks can be used for clustering data in the resulting algorithm. A proposed solution based on the Kohonen network is tested and described. Experimental results of the investigation of the optimal network configuration are presented.

Andrzej Stateczny, Marta Wlodarczyk-Sielicka
Managing Depth Information Uncertainty in Inland Mobile Navigation Systems

Rough set theory allows modeling uncertainty in decision support systems. Electronic Chart Display and Information Systems are based on spatial data and, together with built-in analysis tools, constitute a primary aid to navigation. Mobile applications for inland waters use the same spatial information in the form of Electronic Navigational Charts. In this paper we present a new approach to the designation of a safety depth contour in inland mobile navigation. In place of manually setting a safety depth value for the navigation-aid algorithm, an automatic solution is proposed. The solution is based on spatial characteristics and values derived from bathymetric data and the system itself. Rough set theory is used to reduce the number of conditional attributes and to build a rule matrix for the decision-support algorithm.

Natalia Wawrzyniak, Tomasz Hyla

Information Extraction from Images

Optimal Scale in a Hierarchical Segmentation Method for Satellite Images

Even though images with high and very high spatial resolution exhibit higher levels of detailed features, traditional image processing algorithms based on single-pixel analysis are often not capable of extracting all their information. To overcome this limitation, object-based image analysis (OBIA) approaches have been proposed in recent years.

One of the most important steps in the OBIA approach is the segmentation process, whose aim is to group neighboring pixels according to some homogeneity criteria. Different segmentations will allow extracting different information from the same image at multiple scales. Thus, the major challenge is to determine the adequate segmentation scale that allows characterizing different objects or phenomena in a single image.

In this work, an adaptation of the SLIC algorithm to perform a hierarchical segmentation of the image is proposed. An evaluation method, consisting of an objective function that considers the intra-variability and inter-heterogeneity of the objects, is implemented to select the optimal size of each region in the image. The preliminary results show that the proposed algorithm is capable of detecting objects at different scales and representing them in a single image, allowing a better comprehension of the land cover, its objects and phenomena.

David Fonseca-Luengo, Angel García-Pedrero, Mario Lillo-Saavedra, Roberto Costumero, Ernestina Menasalvas, Consuelo Gonzalo-Martín
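The objective function described above trades off intra-variability against inter-heterogeneity. A hedged sketch of that kind of score on a toy grayscale image follows; the exact formula and the equal weighting are assumptions for illustration, not the paper's criterion.

```python
import numpy as np

def segmentation_score(image, labels, weight=0.5):
    """Score a labeling: low within-region variance and widely spread
    region means both improve the score (higher is better)."""
    region_ids = np.unique(labels)
    means, intra = [], []
    for r in region_ids:
        vals = image[labels == r]
        means.append(vals.mean())
        intra.append(vals.var())
    intra_var = float(np.mean(intra))   # within-region variability
    inter_het = float(np.var(means))    # between-region heterogeneity
    return weight * inter_het - (1 - weight) * intra_var

# Toy image: left half dark, right half bright.
img = np.zeros((8, 8))
img[:, 4:] = 1.0
good = (np.arange(64).reshape(8, 8) % 8 >= 4).astype(int)  # splits the halves
bad = np.zeros((8, 8), dtype=int)                          # single region
print(segmentation_score(img, good) > segmentation_score(img, bad))
```

A hierarchical scheme would evaluate such a score per region across scales and keep each region at the scale where the score is maximized.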
3D Dendrite Spine Detection - A Supervoxel Based Approach

In neurobiology, the identification and reconstruction of dendritic spines from large microscopy image datasets is an important tool for the study of neuronal functions and biophysical properties. However, the problem of how to automatically and accurately detect and analyse structural information from dendrite images in 3D confocal microscopy has not been completely solved. We propose a novel approach to detect and extract dendritic spines, regardless of their size or type, from image stacks resulting from 3D confocal microscopy. This method is based on supervoxel segmentation and supervoxel classification using a number of different, complementary algorithms.

César Antonio Ortiz, Consuelo Gonzalo-Martín, José Maria Peña, Ernestina Menasalvas
Histogram of Bunched Intensity Values Based Thermal Face Recognition

A robust thermal face recognition method is discussed in this work. A new feature extraction technique named Histogram of Bunched Intensity Values (HBIVs) is proposed. A heterogeneous classifier ensemble is also presented, consisting of three different classifiers: a five-layer feed-forward backpropagation neural network (ANN), a Minimum Distance Classifier (MDC), and a Linear Regression Classifier (LRC). A comparative study has been made against other feature extraction techniques for image description, namely the Harris detector, the Hessian matrix, Steer, a shape descriptor, and SIFT. In the classification stage, ANN, MDC, and LRC are used separately to identify the class label of probe thermal face images. A further class label is assigned by a majority voting technique based on the three classifiers. The proposed method is validated on the UGC-JU thermal face database. Matching with the majority voting technique of the HBIVs approach achieved a recognition rate of 100% for frontal face images, which include different facial expressions such as happy and angry. A recognition rate of 96.05% was achieved over all images, including variations in pose and occlusion as well as frontal face images. The highly accurate results obtained in the matching process clearly demonstrate the ability of the thermal infrared system to be extended to other thermal-imaging-based systems.

Ayan Seal, Debotosh Bhattacharjee, Mita Nasipuri, Consuelo Gonzalo-Martin, Ernestina Menasalvas
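The majority-voting fusion step described above can be sketched as follows. The three classifiers (ANN, MDC, LRC) are stubbed out as precomputed label lists, which is an assumption for illustration; the tie-breaking rule is also an assumption, not the paper's.

```python
from collections import Counter

def majority_vote(*label_lists):
    """Fuse per-sample predictions from several classifiers; a tie
    falls back to the first classifier's label."""
    fused = []
    for labels in zip(*label_lists):
        top, count = Counter(labels).most_common(1)[0]
        fused.append(top if count > 1 else labels[0])
    return fused

# Hypothetical per-sample predictions from the three classifiers.
ann = ["A", "B", "A", "C"]
mdc = ["A", "B", "C", "B"]
lrc = ["B", "B", "A", "A"]
print(majority_vote(ann, mdc, lrc))  # ['A', 'B', 'A', 'C']
```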
Cost-Sensitive Sequential Three-Way Decision for Face Recognition

Recent years have witnessed an increasing interest in the Three-Way Decision (TWD) model. In contrast to the traditional two-way decision model, TWD incorporates a boundary decision, which offers a delayed-decision choice when the available information is insufficient for a precise decision. The boundary decision can be transformed into a positive or negative decision as available information increases, thus forming a sequential three-way decision process. In real-world decision problems, such sequential three-way decision strategies are frequently used in human decision processes. In this paper, we propose a framework of cost-sensitive sequential three-way decision making to simulate the human decision process in face recognition: a sequential decision process from rough granularity to precise granularity. Both theoretical analysis and experimental verification are presented.

Libo Zhang, Huaxiong Li, Xianzhong Zhou, Bing Huang, Lin Shang
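The sequential three-way decision process described above can be sketched with simple probability thresholds: accept or reject when the estimate is decisive, otherwise defer and re-decide with more information. The thresholds and the refined probabilities below are illustrative assumptions, not the paper's cost-derived values.

```python
def three_way(prob, alpha=0.8, beta=0.2):
    """Classify a probability estimate into one of three decisions."""
    if prob >= alpha:
        return "positive"   # enough evidence to accept
    if prob <= beta:
        return "negative"   # enough evidence to reject
    return "boundary"       # delay: gather more information

# Coarse-granularity estimates; boundary cases get refined estimates
# (hypothetical values standing in for a finer-granularity stage).
coarse = [0.9, 0.5, 0.1, 0.6]
refined = {1: 0.85, 3: 0.15}

decisions = []
for i, p in enumerate(coarse):
    d = three_way(p)
    if d == "boundary" and i in refined:
        d = three_way(refined[i])  # second stage with more information
    decisions.append(d)
print(decisions)  # ['positive', 'positive', 'negative', 'negative']
```

In the cost-sensitive setting, alpha and beta would be derived from misclassification and delay costs rather than fixed by hand.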
Image Enhancement Based on Quotient Space

Histogram equalization (HE) is a simple and widely used method in the field of image enhancement. Recently, various improved HE methods have been developed to improve enhancement performance, such as BBHE, DSIHE and PC-CE. However, these methods fail to preserve the brightness of the original image. To address this insufficiency, an image enhancement method based on quotient space (IEQS) is proposed in this paper. Quotient space is an effective approach that can partition the original problem into different granularity spaces. In this method, different quotient spaces are combined and the final granularity space is generated using a granularity synthesis algorithm. The gray levels in each interval are mapped to the appropriate output gray-level interval. Experimental results show that IEQS can enhance the contrast of the original image while preserving its brightness.

Tong Zhao, Guoyin Wang, Bin Xiao
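For context, the classic histogram-equalization baseline that IEQS improves upon can be sketched in a few lines of NumPy; the low-contrast toy image is an assumption for illustration.

```python
import numpy as np

def hist_equalize(img, levels=256):
    """Map gray levels through the normalized cumulative histogram."""
    hist = np.bincount(img.ravel(), minlength=levels)
    cdf = hist.cumsum().astype(float)
    cdf /= cdf[-1]                                   # normalize to [0, 1]
    lut = np.round(cdf * (levels - 1)).astype(img.dtype)
    return lut[img]                                  # apply lookup table

# Low-contrast toy image: values crowded in [100, 103].
img = np.array([[100, 100, 101, 101],
                [102, 102, 103, 103]], dtype=np.uint8)
out = hist_equalize(img)
print(out.min(), out.max())  # contrast is stretched toward the full range
```

Note how the output mean shifts away from the input mean; this brightness drift is exactly the shortcoming the abstract attributes to HE-style methods.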
Backmatter
Metadata
Title
Rough Sets and Intelligent Systems Paradigms
Edited by
Marzena Kryszkiewicz
Chris Cornelis
Davide Ciucci
Jesús Medina-Moreno
Hiroshi Motoda
Zbigniew W. Raś
Copyright year
2014
Publisher
Springer International Publishing
Electronic ISBN
978-3-319-08729-0
Print ISBN
978-3-319-08728-3
DOI
https://doi.org/10.1007/978-3-319-08729-0
