
2016 | Book

Computational Intelligence

Revised and Selected Papers of the International Joint Conference, IJCCI 2013, Vilamoura, Portugal, September 20-22, 2013

Editors: Kurosh Madani, António Dourado, Agostinho Rosa, Joaquim Filipe, Janusz Kacprzyk

Publisher: Springer International Publishing

Book Series: Studies in Computational Intelligence


About this book

The present book includes a set of selected extended papers from the fifth International Joint Conference on Computational Intelligence (IJCCI 2013), held in Vilamoura, Algarve, Portugal, from 20 to 22 September 2013. The conference was composed of three co-located conferences: the International Conference on Evolutionary Computation Theory and Applications (ECTA), the International Conference on Fuzzy Computation Theory and Applications (FCTA), and the International Conference on Neural Computation Theory and Applications (NCTA). Recent progress in scientific developments and applications in these three areas is reported in this book. IJCCI received 111 submissions from 30 countries across all continents. After a double-blind review performed by the Program Committee, only 24 submissions were accepted as full papers and thus selected for oral presentation, yielding a full-paper acceptance ratio of 22%. Additional papers were accepted as short papers and posters. A further selection was made after the conference, based also on the assessment of presentation quality and audience interest, so this book includes the extended and revised versions of the very best papers of IJCCI 2013. Commitment to high quality standards is a major concern of IJCCI and will be maintained in future editions, reflected not only in the stringent paper acceptance ratios but also in the quality of the program committee, keynote lectures, participation level and logistics.

Table of Contents


Evolutionary Computation Theory and Applications

Incremental Hough Transform: A New Method for Circle Detection
The circle Hough transform (CHT) is a fundamental tool in image processing applications for industrial parts or tools. Because of its drawbacks, various modifications have been suggested to increase its performance. Most of them run into the problem of implicitly evaluating trigonometric functions, which makes implementation difficult. The CORDIC algorithm is used to simplify the trigonometric calculations when the basic CHT algorithm is implemented on a digital device such as an FPGA. However, this solution costs computation time and device resources for the CORDIC IP implementation. This paper presents a modified CHT method, called the Incremental circle Hough transform (ICHT), suitable for hardware implementation. This method is mainly intended to avoid implementing the CORDIC IP. The paper also provides an error analysis of the proposed method against the basic CHT method to show that it can replace the basic CHT for small values of the resolution \(\varepsilon \) of the angle \(\theta \).
A. Oualid Djekoune, Khadidja Messaoudi, Mahmoud Belhocine
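The trick of avoiding per-step trigonometric evaluation can be illustrated with a rotation recurrence: cosine and sine are computed once for the angular resolution \(\varepsilon \), and each subsequent circle point follows from the previous one by two multiply-adds. This is a minimal sketch of that general idea, not the authors' exact ICHT formulation:

```python
import math

def circle_points(cx, cy, r, eps):
    """Points on a circle of radius r around (cx, cy), generated with an
    incremental rotation recurrence: cos/sin are evaluated once for the
    angular resolution eps, then each point follows from the previous one
    by two multiply-adds (no trig call inside the loop)."""
    c, s = math.cos(eps), math.sin(eps)
    dx, dy = r, 0.0                      # start at angle 0
    pts = []
    for _ in range(int(round(2 * math.pi / eps))):
        pts.append((cx + dx, cy + dy))
        dx, dy = dx * c - dy * s, dx * s + dy * c   # rotate by eps
    return pts
```

In a CHT voting loop, each edge pixel would cast votes for the candidate centres obtained from the same recurrence, so the accumulator can be filled without any trigonometric hardware.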
Self-adaptive Evolutionary Many-Objective Optimization Based on Relation \({\varepsilon }\)-Preferred
Many real-world optimization problems consist of several mutually dependent subproblems. When more than three optimization objectives are involved in the optimization process, so-called Many-Objective Optimization becomes a challenge within multi-objective optimization. Often the objectives have different levels of importance that have to be considered. For this, the relation \({\varepsilon }\)-\(\textit{Preferred} \) has been presented, which enables comparing and ranking multi-dimensional solutions. \({\varepsilon }\)-\(\textit{Preferred} \) is controlled by a parameter \({\varepsilon }\) that influences the quality of the results. In this paper, three heuristics for setting the \({\varepsilon }\) values are investigated. To demonstrate the behavior and efficiency of these methods, an Evolutionary Algorithm for the multi-dimensional Nurse Rostering Problem is proposed. Experiments show that heuristics based on self-adaptive mechanisms outperform former approaches.
Nicole Drechsler
Automated Graphical User Interface Testing Framework—EvoGUITest—Based on Evolutionary Algorithms
Software testing has become an important phase in a software application's lifecycle. Graphical User Interface (GUI) components can be found in a large number of desktop and web applications, as well as in a wide variety of mobile devices. In recent years GUIs have become more and more complex and interactive, and their testing requires interaction with the GUI components, mainly by generating mouse, keyboard and touch events. Given their increased importance, verifying GUIs for correctness contributes to establishing the correct functionality of the corresponding software application. Current research on GUI testing methodologies primarily focuses on automated testing. This paper presents EvoGUITest, a novel automated GUI testing framework based on evolutionary algorithms which tests the GUI independently from the application code itself. The framework is designed for testing GUIs of web applications. Results have been compared, based on specific metrics, with other existing frameworks.
Gentiana Ioana Latiu, Octavian Augustin Cret, Lucia Vacariu
Evolving Protection Measures for Lava Risk Management Decision Making
Many volcanic areas around the world are densely populated and urbanized. For instance, Mount Etna (Italy) is home to approximately one million people, despite being the most active volcano in Europe. Mapping both the physical threat and the exposure and vulnerability of people and material properties to volcanic hazards can help local authorities decide a priori where to locate critical infrastructures (e.g. hospitals, power plants, railroads) and human settlements, and devise appropriate mitigation measures for existing locations and facilities. Here we present the application of Parallel Genetic Algorithms to optimizing earth-barrier construction by morphological evolution, in order to divert a case-study lava flow simulated by the numerical Cellular Automata model Sciara-fv2 at Mt Etna volcano (Sicily, Italy). The studied area surrounds Rifugio Sapienza, a touristic facility located near the summit of the volcano, where the methodology was applied to optimize the position, orientation and extension of an earth barrier built to protect the zone. The study has produced extremely positive results, providing insights and scenarios for the area, and represents, to our knowledge, the first application of morphological evolution to lava flow mitigation.
Giuseppe Filippone, Donato D’Ambrosio, Davide Marocco, William Spataro
A Targeted Estimation of Distribution Algorithm Compared to Traditional Methods in Feature Selection
The Targeted Estimation of Distribution Algorithm (TEDA) introduces a 'targeting' process into an EDA/GA hybrid framework, whereby the number of active genes, or 'control points', in a solution is driven in an optimal direction. For larger feature selection problems with over a thousand features, traditional methods such as forward and backward selection are inefficient. Traditional EAs may perform better, but they are slow to optimize if a problem is sufficiently noisy that most large solutions are equally ineffective, so that effective optimization can only begin once much smaller solutions are discovered. By using targeting, TEDA is able to drive down the feature set size quickly and so speeds up this process. TEDA and the above approaches were tested on feature selection problems with between 500 and 20,000 features, and it was confirmed that TEDA finds effective solutions significantly faster than the other approaches.
Geoffrey Neumann, David Cairns
Genetic Programming Model Regularization
We propose a tool for controlling the complexity of Genetic Programming models. The tool is supported by the theory of the Vapnik-Chervonenkis dimension (VCD) and is combined with a novel representation of models named straight line programs. Experimental results, obtained on conventional algebraic structures (such as polynomials), show that the empirical risk, penalized by suitable upper bounds for the Vapnik-Chervonenkis dimension, gives a generalization error smaller than conventional statistical techniques such as the Bayesian or Akaike information criteria.
César L. Alonso, José Luis Montaña, Cruz Enrique Borges
A Radial Basis Function Neural Network-Based Coevolutionary Algorithm for Short-Term to Long-Term Time Series Forecasting
This work analyzes the behavior and effectiveness of the L-Co-R method when predicting over a growing horizon. The algorithm pursues a double goal: on the one hand, it builds the architecture of the net with a set of RBFNs, and on the other hand, it selects a set of time lags in order to forecast future values of a given time series. For this purpose, a set of 20 time series, 6 different methods from the literature, 4 distinct forecast horizons, and 3 distinct quality measures have been used to check the results. In addition, a statistical study has been carried out to confirm the good results of the L-Co-R method.
E. Parras-Gutierrez, V. M. Rivas, J. J. Merelo
Tree Automata Mining
This paper [an essentially revised version of the conference paper (Przybylek (2013) International Conference on Evolutionary Computation Theory and Applications)] describes a new approach to mining business processes. We define bidirectional tree languages together with their finite models and show how they represent business processes. We offer an algebraic explanation for the phenomenon of the evolutionary metaheuristic of "skeletal algorithms", and show how this explanation gives rise to algorithms for the recognition of bidirectional tree automata. We use the algorithms in process mining and in discovering mathematical theories.
Michal R. Przybylek
Alternative Topologies for GREEN-PSO
The expense of evaluating the function to be optimized can make it difficult to apply the Particle Swarm Optimization (PSO) algorithm in the real world. Approximating the function is one way to address this issue, but an alternative is conservation of function evaluations. GREEN-PSO (GR-PSO) adopts the latter approach: given a fixed number of function evaluations, GR-PSO conserves them by probabilistically choosing a subset of particles smaller than the entire swarm on each iteration and allowing only those particles to perform function evaluations. Since fewer function evaluations are used on each iteration, the algorithm can use more particles and/or more iterations for a given number of function evaluations. GR-PSO has been shown to be effective using the global topology, performing as well as, or better than, the standard PSO algorithm (S-PSO) [7]. We extend these results by showing that GR-PSO can achieve significantly better performance than S-PSO, in terms of both best function value achieved and rate of error reduction, using three other topologies—ring, von Neumann, and Moore—on a set of six standard benchmark functions, and that the von Neumann and Moore topologies can be more effective topologies for GR-PSO than the global topology.
Stephen M. Majercik
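The conservation mechanism described above is easy to sketch: on each iteration every particle moves as usual, but it performs a (costly) function evaluation only with some probability, so the fixed budget stretches over more iterations. The following toy loop uses a ring topology and hypothetical parameter values of our own choosing; it is a sketch of the idea, not the authors' GR-PSO code:

```python
import random

def gr_pso(f, dim, n_particles, budget, p_eval=0.5, bounds=(-5.0, 5.0)):
    """Toy GR-PSO-style optimizer (minimization, ring topology).

    Each iteration every particle moves, but it performs a costly
    function evaluation only with probability p_eval, so the fixed
    evaluation budget stretches over more iterations."""
    lo, hi = bounds
    pos = [[random.uniform(lo, hi) for _ in range(dim)]
           for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [list(x) for x in pos]
    pval = [f(x) for x in pos]
    evals = n_particles
    while evals < budget:
        for i in range(n_particles):
            # ring topology: best personal best among {i-1, i, i+1}
            nb = min((i - 1) % n_particles, i, (i + 1) % n_particles,
                     key=lambda j: pval[j])
            for d in range(dim):
                r1, r2 = random.random(), random.random()
                vel[i][d] = (0.7 * vel[i][d]
                             + 1.4 * r1 * (pbest[i][d] - pos[i][d])
                             + 1.4 * r2 * (pbest[nb][d] - pos[i][d]))
                pos[i][d] = min(hi, max(lo, pos[i][d] + vel[i][d]))
            # conservation step: evaluate only a random subset of particles
            if evals < budget and random.random() < p_eval:
                v = f(pos[i])
                evals += 1
                if v < pval[i]:
                    pval[i], pbest[i] = v, list(pos[i])
    return min(pval)
```

Because only about half the particles spend an evaluation per iteration, the same budget buys roughly twice as many iterations (or a larger swarm), which is the trade-off the paper studies across topologies.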
A Hybrid Shuffled Frog-Leaping Algorithm for the University Examination Timetabling Problem
The problem of examination timetabling is studied in this work. We propose a hybrid solution heuristic based on the Shuffled Frog-Leaping Algorithm (SFLA) for minimising conflicts in the students' exams. The hybrid algorithm, named Hybrid SFLA (HSFLA), improves a population of frogs (solutions) by iteratively optimising each memeplex, and then shuffling the memeplexes in order to distribute the best-performing frogs among the memeplexes. In each iteration the frogs are improved by three operators: crossover and mutation operators, and a local search operator based on the Simulated Annealing metaheuristic. For mutation and local search, we use two well-known neighbourhood structures. The performance of the proposed method is evaluated on the 13 instances of the Toronto datasets from the literature. Computational results show that the HSFLA is competitive with state-of-the-art methods, obtaining the best results on average on seven of the 13 instances.
Nuno Leite, Fernando Melício, Agostinho C. Rosa
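The memeplex construction in standard SFLA is simple to state: frogs are sorted by fitness and dealt round-robin into the memeplexes, so each memeplex receives a comparable mix of good and bad frogs. A minimal sketch of that partitioning step (the function name is ours, not the authors'):

```python
def partition_memeplexes(frogs, fitness, m):
    """Standard SFLA partitioning: sort frogs by fitness (lower is
    better here) and deal them round-robin into m memeplexes, so each
    memeplex gets a comparable spread of good and bad frogs."""
    order = sorted(range(len(frogs)), key=lambda i: fitness[i])
    memeplexes = [[] for _ in range(m)]
    for rank, i in enumerate(order):
        memeplexes[rank % m].append(frogs[i])
    return memeplexes
```

After each memeplex is improved locally, the memeplexes are merged, the frogs re-sorted, and the deal repeated; that shuffle is what spreads memetic information across the population.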

Fuzzy Computation Theory and Applications

Model-Based Fuzzy System for Multimodal Image Segmentation
In this paper, a new model-based fuzzy system for multimodal 3-D image segmentation of MR series is introduced. The presented fuzzy system calculates affinity values for the fuzzy connectedness segmentation procedure, which is the main stage of the processing. The fuzzy rules, generated for the system to simulate a radiological analysis, are structured on the basis of a Gaussian mixture model of the analyzed image regions. For estimating the model parameters, different MR modalities acquired during a single examination are used. The segmentation abilities of a prototype system have been tested on two medical databases. The first one consists of 27 examinations with bone tumors, which are visualized with two different MR sequences. The second one is a database of brain tumors with ground-truth descriptions obtained from the MICCAI 2012 Challenge on Multimodal Brain Tumor Segmentation.
Joanna Czajkowska
Multiple Fuzzy Roles: Analysis of Their Evolution in a Fuzzy Agent-Based Collaborative Design Platform
Design for configurations is a highly collaborative and distributed process. The use of fuzzy agents, which implement collaborative and distributed design by means of fuzzy logic, is highly recommended due to the fuzzy nature of collaboration, distribution, interaction and design problems. In this paper, we propose a fuzzy agent model in which fuzzy agents grouped in communities interact and perform multiple fuzzy design roles to converge towards solutions of product configuration. An analysis of both the interactions and the multiple fuzzy roles of fuzzy agents during product configuration in a collaborative design platform is proposed. The modelling of fuzzy agents and its illustration for a collaborative design platform are presented. The results of the analysis show the important influence of fuzzy solution agents on the organization of the agent-based collaborative design-for-configurations platform. The more the fuzzy agents share their knowledge, the more complete their fuzzy roles are in every domain of design for configurations. The degree of interaction between fuzzy agents in the design-for-configurations process has an impact on the emergence of increased activity of some fuzzy agents. The fuzzy function agents, influenced by many fuzzy requirement agents, are the most active in the design process. The simulation shows that this observation can be extended to the fuzzy solution agents: the most active fuzzy solution agents are those which create the best consensual solution. Simulations show that consensus can be found principally by increasing the degree of interaction.
Alain-Jérôme Fougères, Egon Ostrosi
Multi-distance and Fuzzy Similarity Based Fuzzy TOPSIS
This article introduces a new extension to fuzzy TOPSIS. The extension builds on a fuzzy-similarity-based fuzzy TOPSIS and adds a multi-distance component in forming the closeness coefficient. Ordered weighted averaging is also used in the aggregation process over fuzzy similarity values. To generate the weights of the ordered weighted averaging operator, we use O'Hagan's method to find optimal weights. Several different predefined orness values are tested and an overall ranking is computed, based on the rankings resulting from the multiple orness values. The presented method is numerically applied to a research and development project selection problem.
Mikael Collan, Mario Fedrizzi, Pasi Luukka
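O'Hagan's method picks, for a desired orness level, the OWA weights with maximal entropy. Under the linear orness constraint the max-entropy weights form a geometric progression, so the ratio can be found by a one-dimensional search. A sketch of that derivation, as our own implementation rather than the paper's code:

```python
def meowa_weights(n, orness, iters=200):
    """Maximal-entropy OWA weights for a target orness in (0, 1).

    Under the orness constraint the max-entropy weights form a geometric
    progression w_j proportional to r**j; orness decreases monotonically
    in r, so the ratio r is found by bisection in log space."""
    def w_of(r):
        w = [r ** j for j in range(n)]
        s = sum(w)
        return [x / s for x in w]
    def orness_of(w):
        return sum((n - 1 - j) * wj for j, wj in enumerate(w)) / (n - 1)
    lo, hi = 1e-12, 1e12
    for _ in range(iters):
        mid = (lo * hi) ** 0.5               # geometric-mean bisection
        if orness_of(w_of(mid)) > orness:
            lo = mid                         # orness too high: larger ratio
        else:
            hi = mid
    return w_of((lo * hi) ** 0.5)

def owa(values, weights):
    """OWA aggregation: sort values in descending order, dot with weights."""
    return sum(w * v for w, v in zip(weights, sorted(values, reverse=True)))
```

With orness 0.5 the weights come out uniform (plain averaging); orness near 1 approaches the maximum operator, orness near 0 the minimum, which is how the predefined orness values in the paper tune optimism versus pessimism.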
Methodology of Virtual Wood Piece Quality Evaluation
This paper presents a way to evaluate the quality of virtual wood products from their tomographic images. The main objective is to anticipate a sawmill's divergent process in order to improve the production plan. From a virtual representation of the product, singularity features are extracted and their impact on the product's virtual quality is assessed using Choquet integrals. Next, the visual quality is evaluated by merging the singularity impacts and the singularity-number criterion using suitable operators. Three operators are compared to the mean operator, which is the one commonly used when there is little knowledge of the decision process. Finally, the measure is expressed in the sawmill expert's language using linguistic variables, which give the possibility degree that the product belongs to each quality class. This degree can be understood as the risk of attributing the quality concerned. It is finally used by a linear programming algorithm to determine which quality to attribute to the product in order to satisfy customer needs and maximize the sawyer's benefit.
Jeremy Jover, Vincent Bombardier, Andre Thomas
Fuzzy Discretization Process from Small Datasets
A classification problem involves selecting a training dataset with class labels, developing an accurate description or model for each class using the attributes available in the data, and then evaluating the prediction quality of the induced model. In this paper, we focus on supervised classification and on models obtained from datasets with few examples relative to the number of attributes. Specifically, we propose a fuzzy discretization method for numerical attributes in datasets with few examples. The discretization of numerical attributes can be a crucial step, since some classifiers cannot deal with numerical attributes at all and others perform better when these attributes are discretized. We also show, by means of several experiments, the benefits of the fuzzy discretization method on datasets with few examples. The experiments have been validated by means of statistical tests.
José M. Cadenas, M. Carmen Garrido, Raquel Martínez
A Framework for Modelling Real-World Knowledge Capable of Obtaining Answers to Fuzzy and Flexible Searches
The Internet has become a place where massive amounts of information and data are generated every day. This information is most of the time stored in a non-structured way, and even when it is structured in databases it cannot be retrieved with simple fuzzy queries: human intervention is needed to determine how the stored non-fuzzy information must be combined and processed to answer a fuzzy query. We present a web interface for posing fuzzy and flexible queries, and a framework. Our framework makes it possible to represent non-fuzzy concepts, fuzzy concepts and relations between them, giving the programmer the ability to model any real-world knowledge. It is this representation in the framework's language that is used to (1) determine how to answer the query without any human intervention and (2) provide the search engine with the information it needs to present the user a friendly and easy-to-use query form. We expect this work to contribute to the development of more human-oriented fuzzy search engines.
Víctor Pablos-Ceruelo, Susana Munoz-Hernandez
On the Deduction Problem in Gödel and Product Logics
We investigate the deduction problem in Gödel and Product logics, both equipped with Gödel negation, in the countable case. Our approach is based on translating a formula to an equivalent satisfiable finite order clausal theory, consisting of order clauses. An order clause is a finite set of order literals of the form \(\varepsilon _1\diamond \varepsilon _2\), where \(\diamond \) is one of the connectives equality or \(\prec \), interpreted as the equality and the standard strict linear order on [0, 1], respectively. We generalise the well-known hyperresolution principle to standard first-order Gödel logic and devise a calculus operating over order clausal theories. A variant of the DPLL procedure in propositional Product logic, exploiting trichotomy and operating over order clausal theories, is also proposed. Both calculi are refutation sound and complete for the countable case.
Dušan Guller
Fuzzy Optimization Models for Seaside Port Logistics: Berthing and Quay Crane Scheduling
The service time of container vessels is the main indicator of the competitiveness of a maritime container terminal. Vessels have to be berthed along the quay, a subset of quay cranes must be assigned to them and work schedules have to be planned for unloading the import containers and loading the export containers. This work addresses the Tactical Berth Allocation Problem, in which the vessels are assigned to a given berth, and the Quay Crane Scheduling Problem, for which the work schedules of the quay cranes are determined. The nature of this environment gives rise to inaccurate knowledge about the information related to the incoming vessels. Therefore, the aforementioned optimization problems can be tackled by considering fuzzy arrival times for the vessels and fuzzy processing times for the loading/unloading operations. Two fuzzy mathematical models are provided to solve the problems at hand. The computational experiments carried out in this work corroborate the effectiveness of the proposed methodologies.
Christopher Expósito-Izquierdo, Eduardo Lalla-Ruiz, Teresa Lamata, Belén Melián-Batista, J. Marcos Moreno-Vega
Obtaining the Decision Criteria and Evaluation of Optimal Sites for Renewable Energy Facilities Through a Decision Support System
In projects involving renewable energy facilities, decision making is an essential activity that lends greater consistency and viability to the project. The first step that any promoter of such facilities faces is selecting an optimal location. To do so, it is necessary to consider all the criteria that influence the decision. However, not all criteria are equally important, so determining their weights is extremely important. The objective of this chapter is to obtain the weights of the decision criteria that influence the location problems of wind farms and solar photovoltaic and thermoelectric plants. For this, a Decision Support System (DSS) has been designed that extracts knowledge from a group of experts using the Fuzzy AHP methodology. Finally, the DSS ranks the viable locations based on the importance of the criteria that influence the decision.
Juan M. Sánchez-Lozano, Jose Angel Jiménez-Pérez, M. Socorro García-Cascales, M. Teresa Lamata
Gene Prioritization for Tumor Classification Using an Embedded Method
The application of microarray technology to cancer diagnosis has been a challenge for computational techniques because the datasets obtained have high dimensionality and few examples. In this paper two computational techniques are applied to tumor datasets in order to carry out the task of cancer diagnosis (classification) and to identify the most promising candidates among large lists of genes (gene prioritization). Both techniques obtain good classification results, but only one provides a ranking of genes as additional information and thus more interpretable models, making it more suitable for jointly addressing both tasks.
Jose M. Cadenas, M. Carmen Garrido, Raquel Martínez, David Pelta, Piero P. Bonissone
Generalizing and Formalizing Precisiation Language to Facilitate Human-Robot Interaction
We develop a formal logic as a generalized precisiation language. This formal logic can serve as a middle ground between the natural-language-based mode of human communication and the low-level mode of machine communication. Syntactic structures in natural language are incorporated in the syntax of the formal logic. As regards the semantics, we establish the formal logic as a many-valued logic. We present examples that illustrate how our formal logic can facilitate human-robot interaction.
Takehiko Nakama, Enrique Muñoz, Kevin LeBlanc, Enrique Ruspini

Neural Computation Theory and Applications

Growing Surface Structures: A Topology Focused Learning Scheme
Iterative refinement approaches derived from unsupervised artificial neural network (ANN) methods, such as Growing Cell Structures (GCS), have proven very efficient for surface reconstruction from scattered 3D points. The Growing Surface Structures (GSS) algorithm is a major conceptual change to the GCS approach. Instead of "adjusting" the learning behavior, the central learning scheme is shifted from optimizing the distribution of vertices to the creation of a valid surface model. Whereas in former GCS approaches the created topology is only implicitly represented in the process, it is explicitly integrated and represented in the refinement process of the GSS approach. Here the closest surface structure, such as a vertex, an edge or a triangle, is found for a given sample and the actual sample-to-surface distance is measured. With this additional information the adaptation process can be focused on the created topology. We demonstrate the performance of the novel concept in the area of surface reconstruction.
Hendrik Annuth, Christian-A. Bohn
Selective Image Compression Using MSIC Algorithm
This paper presents a new algorithm, Magnitude Sensitive Image Compression (MSIC), as a reliable and efficient approach to selective image compression. The algorithm uses MSCL neural networks (in direct and masked versions). This kind of neural network tends to focus the learning process on data-space zones with high values of a user-defined magnitude function. This property can be used in image compression to divide the image into irregular blocks, with higher resolution in areas of interest. These blocks are compressed by Vector Quantization in a later step, so that different areas of the image receive distinct compression ratios. Results on several examples demonstrate the better performance of MSIC compared to JPEG and other SOM-based image compression algorithms.
Enrique Pelayo, David Buldain, Carlos Orrite
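The Vector Quantization step mentioned above can be sketched with plain k-means: learn a small codebook from the block vectors, then store each block as the index of its nearest codeword. This is a generic VQ sketch of our own, not the MSCL network or the paper's implementation:

```python
import random

def vq_codebook(vectors, k, iters=20):
    """Plain k-means vector quantization: learn a k-entry codebook from
    the input vectors; each vector can then be stored as the index of
    its nearest codeword, which is the compression step."""
    codebook = [tuple(v) for v in random.sample(vectors, k)]
    for _ in range(iters):
        buckets = [[] for _ in range(k)]
        for v in vectors:
            # assign each vector to its nearest codeword (squared L2)
            j = min(range(k),
                    key=lambda c: sum((a - b) ** 2
                                      for a, b in zip(v, codebook[c])))
            buckets[j].append(v)
        for c, bucket in enumerate(buckets):
            if bucket:                   # keep old codeword if bucket empty
                dim = len(bucket[0])
                codebook[c] = tuple(sum(v[d] for v in bucket) / len(bucket)
                                    for d in range(dim))
    return codebook
```

The selective part of MSIC then amounts to spending more codewords (or smaller blocks) on the image regions the magnitude function marks as interesting, and fewer elsewhere.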
Unsupervised Analysis of Morphological ECG Features for Attention Detection
Physiological Computing augments the information bandwidth between a computer and its user by continuous, real-time monitoring of the user’s physiological traits and responses. This is especially interesting in a context of emotional assessment during human-computer interaction. The electroencephalogram (EEG) signal, acquired on the scalp, has been extensively used to understand cognitive function, and in particular emotion. However, this type of signal has several drawbacks, being susceptible to noise and requiring the use of impractical head-mounted apparatuses. For these reasons, the electrocardiogram (ECG) has been proposed as an alternative source to assess emotion, which is continuously available, and related with the psychophysiological state of the subject. In this paper we analyze morphological features of the ECG signal acquired from subjects performing an attention-demanding task. The analysis is based on various unsupervised learning techniques, which are validated against evidence found in a previous study by our team, where EEG signals collected for the same task exhibit distinct patterns as the subjects progress in the task.
Carlos Carreiras, André Lourenço, Helena Aidos, Hugo Plácido da Silva, Ana L. N. Fred
Autonomous Learning Needs a Second Environmental Feedback Loop
Deriving a successful neural control of behavior of autonomous and embodied systems poses a great challenge. The difficulty lies in finding suitable learning mechanisms, and in specifying under what conditions learning becomes necessary. Here, we provide a solution to the second issue in the form of an additional feedback loop that augments the sensorimotor loop in which autonomous systems live. The second feedback loop provides proprioceptive signals, allowing the assessment of behavior through self-monitoring, and accordingly, the control of learning. We show how the behaviors can be defined with the aid of this framework, and we show that, in combination with simple stochastic plasticity mechanisms, behaviors are successfully learned.
Hazem Toutounji, Frank Pasemann
Prediction Capabilities of Evolino RNN Ensembles
Modern portfolio theory and investment-based financial market forecasting use probability distributions. This investigation used an ensemble of genetic-algorithm-based recurrent neural networks (RNNs), which makes it possible to obtain a multi-modal distribution for predictions. Comparison of the two different models, scattered-points-based prediction and distribution-based prediction, opens new opportunities to create a profitable investment tool, which was tested on a real-time demo market. The dependence of forecasting accuracy on the number of Evolino recurrent neural networks in the ensemble was obtained for five forecasting points ahead. This study makes it possible to optimize the cluster-based computational time and resources required for sufficiently accurate prediction.
Nijolė Maknickienė, Algirdas Maknickas
Gene Ontology Analysis on Behalf of Improved Classification of Different Colorectal Cancer Stages
Colorectal cancer is a serious cause of death worldwide. Diagnosing the current colorectal cancer stage is crucial for early prognosis and adequate treatment of patients. Even though scientists have developed various techniques, determining the real colorectal cancer stage remains difficult. In this paper we use Gene Ontology analysis to address this issue. We compose a set of special genes that is used to obtain two main results: we show the distinction between carcinogenic and healthy tissue by the difference in the range of their DNA gene expressions, and we propose a novel methodology that improves colorectal cancer stage classification.
Monika Simjanoska, Ana Madevska Bogdanova, Sasho Panov
Artificial Curiosity Emerging Human-Like Behavior: Toward Fully Autonomous Cognitive Robots
This chapter is devoted to autonomous cognitive machines by means of the design of an artificial-curiosity-based cognitive system for autonomous high-level knowledge acquisition from visual information. Playing a chief role in visual attention as well as in interactive high-level knowledge construction, artificial curiosity is realized by combining visual saliency detection and machine-learning-based approaches. Experimental results validating the deployment of the investigated system have been obtained using both simulation facilities and a real humanoid robot visually acquiring knowledge about its surrounding environment while interacting with a human tutor. As the reported results and experiments show, the proposed cognitive system allows the machine to autonomously discover the surrounding world in which it may evolve, to learn new knowledge about it, and to describe it using human-like natural utterances.
Kurosh Madani, Christophe Sabourin, Dominik M. Ramík