
2009 | Book

Methods and Models in Artificial and Natural Computation. A Homage to Professor Mira’s Scientific Legacy

Third International Work-Conference on the Interplay Between Natural and Artificial Computation, IWINAC 2009, Santiago de Compostela, Spain, June 22-26, 2009, Proceedings, Part I

Edited by: José Mira, José Manuel Ferrández, José R. Álvarez, Félix de la Paz, F. Javier Toledo

Publisher: Springer Berlin Heidelberg

Book series: Lecture Notes in Computer Science


About this book

The two-volume set LNCS 5601 and LNCS 5602 constitutes the refereed proceedings of the Third International Work-Conference on the Interplay between Natural and Artificial Computation, IWINAC 2009, held in Santiago de Compostela, Spain, in June 2009. The 108 revised papers presented are thematically divided into two volumes. The first volume includes papers relating the most recent collaborations with Professor Mira and contributions mainly related with theoretical, conceptual and methodological aspects linking AI and knowledge engineering with neurophysiology, clinics and cognition. The second volume contains all the contributions connected with biologically inspired methods and techniques for solving AI and knowledge engineering problems in different application domains.

Table of Contents

Frontmatter
A Model of Low Level Co-operativity in Cerebral Dynamic

We present a conceptual model of cerebral dynamics based on the neuropsychological findings of Lashley, Luria, and J. Gonzalo on residual function after traumatic and surgical lesions in animals and man. The model proposes a co-operative structure with polyfunctional modules distributed over the same anatomical substratum. The level of co-operation depends closely on the language used to describe the neural dialogue: analogic, logic or symbolic. At the low level we can use a nonlinear convolution-like formulation with adaptive kernels to explain some of the experimental results, although the more relevant properties of cerebral dynamics need more sophisticated formulations.

J. Mira, A. E. Delgado
On Bridging the Gap between Human Knowledge and Computers: A Personal Tribute in Memory of Prof. J. Mira

This paper is an attempt at analyzing the scientific trajectory of Prof. Mira. It is also an attempt at understanding how Prof. Mira’s scientific contributions arose from the different pathways of study along which he advanced during his professional lifetime.

R. Marin
Personal Notes about the Figure and Legate of Professor Mira

This contribution is intended as a retrospective look, inevitably partial and subjective, at the scientific legacy that Professor José Mira has left us. Pepe Mira was not only the alma mater of this Conference but also a maestro for many of us. This is my personal tribute and my most heartfelt remembrance of him.

Ramón Ruiz-Merino
Intelligent Patient Monitoring: From Hardware to Learnware

Some time ago, in response to Proust’s musings on the role of the writer in language, the French philosopher Gilles Deleuze described literary creation as the result of a series of events that take place on the edge of language [11]. This figure is easily recognisable by anyone who, at some point, has decided to move to that territory in permanent tension which is the edge of knowledge; that fine line which many reach seeing naught but the path by which they have arrived, and which very few are capable of taking further away. Prof. José Mira knew that place well. And those fortunate enough to accompany him had the opportunity to learn through him part of what makes up the essential baggage for undertaking that journey: simple things, such as honesty, creativity and enthusiasm.

Senén Barro
A Look toward the Past of My Work with the Professor José Mira

José Mira's influence has been very important in my personal and professional life. I was his student in the faculty and during the development of my doctoral thesis, a co-worker on his first research projects and, moreover, his friend. I deeply regret his sudden passing, and I want these notes to be an acknowledgment and a homage to his work and dedication in scientific matters, as well as to his human quality, his generosity and his capacity to bring together people and ideas in a friendly, family-like environment. I present a summary of the topics on which I worked with him, the research lines he opened that affected me directly between 1981 and 1989 during his periods in Granada and Santiago de Compostela, and how his work projected onto mine and onto my later professional career. I include some anecdotes that I remembered while writing.

Francisco Javier Ríos Gómez
Remembering José Mira

Here I emphasize the importance of the help, advice, suggestions and backing that I received from José Mira in the task of disseminating and formalizing the research of Justo Gonzalo (1910-1986). This survey is intended as an expression of my profound gratitude to José Mira.

Isabel Gonzalo-Fonrodona
Revisiting Algorithmic Lateral Inhibition and Accumulative Computation

Certainly, one of the prominent ideas of Professor Mira was that it is absolutely mandatory to specify the mechanisms and/or processes underlying each task and inference mentioned in an architecture in order to make that architecture operational. The conjecture of the last fifteen years of joint research between Professor Mira and our team at the University of Castilla-La Mancha has been that any bottom-up organization may be made operational using two biologically inspired methods called "algorithmic lateral inhibition", a generalization of lateral inhibition anatomical circuits, and "accumulative computation", a working memory related to the temporal evolution of the membrane potential. This paper is dedicated to the computational formulations of both methods, which have led to quite efficient solutions of problems related to motion-based computer vision.

Antonio Fernández-Caballero, María T. López, Miguel A. Fernández, José M. López-Valles
Detection of Speech Dynamics by Neuromorphic Units

Speech and voice technologies are undergoing a profound revision as new paradigms are sought to overcome specific problems that cannot be completely solved by classical approaches. Neuromorphic Speech Processing is an emerging area in which research turns to the natural neural processing of speech by the human auditory system in order to capture the basic mechanisms that solve difficult tasks efficiently. In the present paper a further step is taken in the approach to mimicking basic neural speech processing by simple neuromorphic units, building on previous work to show how formant dynamics, and hence consonantal features, can be detected using a general neuromorphic unit that mimics the functionality of certain neurons found in the upper auditory pathways. Using these simple building blocks, a general speech processing architecture can be synthesized as a layered structure. Results from different simulation stages are provided, as well as a discussion of implementation details. Conclusions and future work describe the functionality to be covered in the next research steps.

Pedro Gómez-Vilda, José Manuel Ferrández-Vicente, Victoria Rodellar-Biarge, Agustín Álvarez-Marquina, Luis Miguel Mazaira-Fernández, Rafael Martínez-Olalla, Cristina Muñoz-Mulas
Spatio-temporal Computation with Neural Sensorial Maps

In this work, the connectionist computational paradigm used in Artificial Intelligence (AI) and Autonomous Robotics is reviewed. A neural computational paradigm based on perceptual association maps is also analysed with regard to its close relation to the connectionist paradigm. We explore Singer's hypothesis about the evidence of the same neural mechanism prior to retinotopic projections. Finally, the implications of this map-based computational paradigm for the definition, design, building and evaluation of bioinspired systems are discussed.

J. M. Ferrández-Vicente, A. Delgado, J. Mira
Knowledge-Based Systems: A Tool for Distance Education

This work describes how, starting from the PhD thesis "A Knowledge-Based System for Distance Education", written under the supervision of Professor José Mira Mira, different ideas have evolved, opening new research directions. The thesis emphasized the usefulness of artificial intelligence (AI) techniques in developing systems to support distance education; in particular, the modelling of tasks and methods at the knowledge level, and the use of hybrid (symbolic and connectionist) procedures to solve those tasks that recurrently appear when designing systems to support teaching-learning processes.

Pedro Salcedo L., M. Angélica Pinninghoff J., Ricardo Contreras A.
ANLAGIS: Adaptive Neuron-Like Network Based on Learning Automata Theory and Granular Inference Systems with Applications to Pattern Recognition and Machine Learning

In this paper the fusion of artificial neural networks, granular computing and learning automata theory is proposed, and we present as a final result ANLAGIS, an adaptive neuron-like network based on learning automata and granular inference systems. ANLAGIS can be applied to both pattern recognition and learning control problems. Another interesting contribution of this paper is the distinction between pre-synaptic and post-synaptic learning in artificial neural networks. To illustrate the capabilities of ANLAGIS, some experiments on knowledge discovery in data mining and machine learning are presented. Previous work of Jang et al. [1] on adaptive network-based fuzzy inference systems (ANFIS) can be considered a precursor of ANLAGIS. The main, novel contribution of ANLAGIS is the incorporation of learning automata theory within its structure.

Darío Maravall, Javier de Lope
Neural Prosthetic Interfaces with the Central Nervous System: Current Status and Future Prospects

Rehabilitation of sensory and/or motor functions in patients with neurological diseases increasingly relies on artificial electrical stimulation of, and recording from, populations of neurons using biocompatible chronic implants. For example, deep brain stimulators have been implanted successfully in patients for pain management and for control of motor disorders such as Parkinson’s disease. Moreover, advances in artificial limbs and brain-machine interfaces are now providing hope of increased mobility and independence for amputees and paralysed patients. As more and more patients have benefited from these approaches, interest in neural interfaces has grown significantly. However, many problems have to be solved before a neuroprosthesis can be considered a viable clinical option. We discuss some of the exciting opportunities and challenges that lie at this intersection of neuroscience research, bioengineering, and information and communication technologies.

E. Fernández
Analytical Models for Transient Responses in the Retina

We propose analytical models for slow-potential transients as they are recorded at different layers of the retina, but mostly as they are integrated to produce ganglion cell outputs. First, two possible pathways from photoreceptors to bipolar cells and on to ganglion cells are formulated, their formal equivalence being shown for quasi-linear behaviour. Secondly, a linear oscillator analytical model is introduced, which is also shown to be equivalent to the first under certain circumstances. Finally, local instantaneous nonlinearities are introduced into the models. By tuning their parameters, the models are very versatile in describing the different neurophysiological situations and responses.

Gabriel de Blasio, Arminda Moreno-Díaz, Roberto Moreno-Díaz
Analysis of Retinal Ganglion Cells Population Responses Using Information Theory and Artificial Neural Networks: Towards Functional Cell Identification

In this paper, we analyse retinal population data with a focus on behaviour. The method is based on creating population subsets using the autocorrelograms of the cells and grouping them according to a minimal Euclidean distance. These subpopulations share functional properties and may be used for data reduction, extracting the relevant information from the code. Information theory (IT) and artificial neural networks (ANNs) have been used to quantify the coding goodness of every subpopulation, showing a strong correlation between both methods. All cells belonging to a given subpopulation showed very small variances in the information they conveyed, while these values were significantly different across subpopulations, suggesting that the functional separation worked around the capacity of each cell to code different stimuli.

M. P. Bonomini, J. M. Ferrández, J. Rueda, E. Fernández
On Cumulative Entropies and Lifetime Estimations

The cumulative entropy is a new measure of information, alternative to the classical differential entropy. It has been recently proposed in analogy with the cumulative residual entropy studied by Wang et al. (2003a, 2003b). After recalling its main properties, including a connection to reliability theory, we discuss estimates of random lifetimes based on the empirical cumulative entropy, which is suitably expressed in terms of the dual normalized sample spacings.

Antonio Di Crescenzo, Maria Longobardi
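The empirical cumulative entropy mentioned in the abstract has a simple closed form over the order statistics: with sorted sample x(1) <= ... <= x(n), it is -sum over i of (x(i+1) - x(i)) * (i/n) * log(i/n). The sketch below implements this form; the dual normalized sample-spacings expression used in the paper is a rearrangement of the same quantity, so details of normalization may differ.

```python
import math

def empirical_cumulative_entropy(sample):
    """Empirical cumulative entropy of a sample: minus the sum of
    (x(i+1) - x(i)) * (i/n) * log(i/n) over the order statistics,
    i.e. the plug-in version of -integral F(x) log F(x) dx."""
    xs = sorted(sample)
    n = len(xs)
    total = 0.0
    for i in range(1, n):          # i = 1 .. n-1; F_n = i/n on [x(i), x(i+1))
        u = i / n
        total -= (xs[i] - xs[i - 1]) * u * math.log(u)
    return total
```

For the two-point sample {0, 1} this gives 0.5 * log 2, and it is nonnegative for any sample, mirroring the nonnegativity of the population quantity.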
Activity Modulation in Human Neuroblastoma Cultured Cells: Towards a Biological Neuroprocessor

The main objective of this work is to analyze the computing capabilities of human neuroblastoma cultured cells and to define stimulation patterns able to modulate the neural activity in response to external stimuli. Multielectrode array (MEA) setups have been designed for culturing neural cells directly over silicon or glass substrates, providing the capability to stimulate and record from populations of neural cells simultaneously. This paper describes the process of growing human neuroblastoma cells over MEA substrates and attempts to modulate the natural physiological responses of these cells by tetanic stimulation of the culture. If we are able to modify the selective responses of some cells with an external stimulation pattern over different time scales, the neuroblastoma culture could be trained to process pre-programmed spatio-temporal patterns. We show that the large neuroblastoma networks developed in cultured MEAs are capable of learning: establishing numerous and dynamic connections, with modifiability induced by external stimuli.

J. M. Ferrández-Vicente, M. Bongard, V. Lorente, J. Abarca, R. Villa, E. Fernández
A Network of Coupled Pyramidal Neurons Behaves as a Coincidence Detector

The transmission of excitatory inputs by a network of coupled pyramidal cells is investigated by means of numerical simulations. The pyramidal cell models are coupled by excitatory synapses, and each one receives an excitatory pulse at a random time drawn from a Gaussian distribution. Moreover, each cell model is injected with a noisy current. It was found that the excitatory coupling promotes the transmission of the synaptic inputs on a time scale of a few ms.

Santi Chillemi, Michele Barbi, Angelo Di Garbo
Characterisation of Multiple Patterns of Activity in Networks of Relaxation Oscillators with Inhibitory and Electrical Coupling

Fully-connected neural networks of non-linear half-center oscillators coupled both electrically and synaptically may exhibit a variety of modes of oscillation despite fixed topology and parameters. In suitable circumstances it is possible to switch between these modes in a controlled fashion. Previous work has investigated this phenomenon in the simplest possible 2-cell network. In this paper we show that the 4-cell network, like the 2-cell one, exhibits a variety of symmetrical and asymmetrical behaviours. In general, with increasing electrical coupling the number of possible behaviours is reduced until finally the only expressed behaviour is in-phase oscillation of all neurons. Our analysis enables us to predict general rules governing the behaviour of larger networks, for instance the types of co-existing activity patterns and the subspace of parameters where they emerge.

Tiaza Bem, John Hallam
Scaling Power Laws in the Restoration of Perception with Increasing Stimulus in Deficitary Natural Neural Network

Measurements of the restoration of visual and tactile perceptions with increasing stimulus, carried out by Justo Gonzalo (1910-1986) in patients with lesions in the cerebral cortex, constitute exceptional examples of quantification of perception. From an analysis of the data for different types of stimulus, we find that, at high enough intensity of stimulus, perception follows scaling power laws with dominance of quarter exponents, which are similar to the scaling laws found in the improvement of perception by multisensory facilitation, reflecting general mechanisms in the respective neural networks of the cortex. The analysis supports the idea that the integrative cerebral process, initiated in the projection path, reaches regions of the cortex of less specificity.

Isabel Gonzalo-Fonrodona, Miguel A. Porras
Neuron-Less Neural-Like Networks with Exponential Association Capacity at Tabula Rasa

Artificial neural networks have been used as models of associative memory but their storage capacity is severely limited. Alternative machine-learning approaches perform better in classification tasks but require long learning sessions to build an optimized representational space. Here we present a radically new approach to the problem of classification based on the fact that networks associated with random hard constraint satisfaction problems naturally display an exponentially large number of attractor clusters. We introduce a warning propagation dynamics that allows selective mapping of arbitrary input vectors onto these well-separated clusters of states, without the need for training. The potential for such networks with exponential capacity to handle inputs with a combinatorially complex structure is finally explored with a toy example.

Demian Battaglia
Brain Complexity: Analysis, Models and Limits of Understanding

Manifold initiatives try to utilize the operational principles of organisms and brains to develop alternative, biologically inspired computing paradigms. This paper reviews key features of the standard method applied to complexity in the cognitive and brain sciences, i.e. decompositional analysis. Projects investigating the nature of computations by cortical columns are discussed which exemplify the application of this standard method. New findings are mentioned indicating that the concept of the basic uniformity of the cortex is untenable. The claim is discussed that non-decomposability is not an intrinsic property of complex, integrated systems but is only in our eyes, due to insufficient mathematical techniques. Using Rosen’s modeling relation, the scientific analysis method itself is made a subject of discussion. It is concluded that the fundamental assumption of cognitive science, i.e., cognitive and other complex systems are decomposable, must be abandoned.

Andreas Schierwagen
Classifying a New Descriptor Based on Marr’s Visual Theory

Descriptors are a powerful tool in digital image analysis. Performance in tasks such as image matching and object recognition depends strongly on the visual descriptors used. The dimension of the descriptor has a direct impact on how long the analysis takes, and fewer dimensions are desirable for fast matching. In this paper we use a type of region called a curvilinear region. This approach is based on Marr’s visual theory. Marr supposed that every object can be divided into its constituent parts, these parts being cylinders. We therefore also suppose that every image must contain curvilinear regions that are easy to detect. We propose a very short descriptor to use with these curvilinear regions in order to classify them for higher visual tasks.

J. M. Pérez-Lorenzo, S. García Galán, A. Bandera, R. Vázquez-Martín, R. Marfil
Solving the Independent Set Problem by Using Tissue-Like P Systems with Cell Division

Tissue-like P systems with cell division are a computing model in the framework of Membrane Computing inspired by intercellular communication and neuronal synapses. The model treats cells as unit processors, and the computation is performed by the parallel application of given rules. Division rules allow the number of cells to increase during the computation. We present a polynomial-time solution to the Independent Set problem via a uniform family of such systems.

Daniel Díaz-Pernil, Miguel A. Gutiérrez-Naranjo, Mario J. Pérez-Jiménez, Agustín Riscos-Núñez
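For readers unfamiliar with the target problem, a brute-force reference solver clarifies what the P-system family computes. The sketch below is only an illustration of the Independent Set problem itself: the P systems trade exponential space (cells created by division rules) for polynomial time, which this sequential version does not attempt.

```python
from itertools import combinations

def max_independent_set(n, edges):
    """Brute-force maximum independent set on a graph with vertices
    0..n-1: return a largest vertex subset with no edge inside it."""
    adj = set(frozenset(e) for e in edges)
    for size in range(n, 0, -1):          # try largest sizes first
        for subset in combinations(range(n), size):
            if all(frozenset((u, v)) not in adj
                   for u, v in combinations(subset, 2)):
                return set(subset)
    return set()
```

On the path graph 0-1-2 this returns {0, 2}; on a triangle, any single vertex.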
How to Do Recombination in Evolution Strategies: An Empirical Study

In Evolution Strategies (ES) mutation is often considered to be the main variation operator, and relatively little attention has been paid to the choice of recombination operators. This study compares advanced recombination operators for ES, including multi-parent weighted recombination. Both the canonical $(\mu{+\atop,}\lambda)$-ES with mutative self-adaptation and the CMA-ES are considered. The results achieved on scalable (non-)separable test problems indicate that the right choice of recombination has a considerable impact on the performance of the ES. Moreover, the study provides empirical evidence that weighted multi-parent recombination is a favorable choice for both ES variants.

Juan Chen, Michael T. M. Emmerich, Rui Li, Joost Kok, Thomas Bäck
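Weighted multi-parent recombination of the kind compared in this study can be sketched in a few lines. The log-rank weights below are the CMA-ES-style default and are an assumption for illustration, not necessarily the exact scheme used in the paper.

```python
import numpy as np

def weighted_recombination(parents, fitnesses, mu):
    """Weighted multi-parent recombination: the mu best parents
    (minimisation) are averaged with log-rank weights, so better
    ranks contribute more to the recombined centroid."""
    order = np.argsort(fitnesses)                    # best first
    selected = np.asarray(parents, dtype=float)[order[:mu]]
    w = np.log(mu + 0.5) - np.log(np.arange(1, mu + 1))
    w /= w.sum()                                     # weights sum to 1
    return w @ selected                              # weighted centroid
```

The result lies inside the convex hull of the mu best parents and is pulled toward the best one.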
Serial Evolution

Genetic algorithms (GA) represent an algorithmic optimization technique inspired by biological evolution. A major strength of this meta-heuristic is its ability to explore the search space along independent parallel search routes, rendering the algorithm highly efficient if implemented on a parallel architecture. Sequential simulations of GAs frequently result in enormous computational costs. To alleviate this problem, we propose a serial evolution strategy which requires a much smaller number of fitness function evaluations, thereby speeding up the computation considerably. If implemented on a parallel architecture, the savings in computational cost are even more pronounced. We present the algorithm in full mathematical detail and prove the corresponding schema theorem for a simple case without cross-over operations. A toy example illustrates the operation of serial evolution and the performance improvement over a canonical genetic algorithm.

V. Fischer, A. M. Tomé, E. W. Lang
A Sensitivity Clustering Method for Hybrid Evolutionary Algorithms

The machine learning community has traditionally used the correct classification rate or accuracy to measure the performance of a classifier, generally avoiding the presentation of the sensitivities (i.e. the classification level of each class) in the results, especially in problems with more than two classes. Evolutionary Algorithms (EAs) are powerful global optimization techniques, but their convergence performance is very poor. Consequently, they are frequently combined with Local Search (LS) algorithms that can converge in a few iterations. This paper proposes a new method for hybridizing EAs and LS techniques in a classification context, based on a clustering method that applies the k-means algorithm in the sensitivity space, obtaining groups of individuals that perform similarly for the different classes. A representative of each group is then selected and improved by an LS procedure. This method is applied at specific stages of the evolutionary process, and we consider the minimum sensitivity and the accuracy as the evaluation measures. The proposed method is found to obtain classifiers with better accuracy for each class than applying the LS to the best individual of the population.

F. Fernández-Navarro, P. A. Gutiérrez, C. Hervás-Martínez, J. C. Fernández
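The grouping step described above can be sketched as follows: each classifier in the population becomes a point in sensitivity space (one coordinate per class), the points are clustered, and the most accurate member of each cluster is picked for local search. The minimal k-means and the helper names are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def kmeans(points, k, iters=20, seed=0):
    """Minimal k-means in sensitivity space (one point per classifier,
    one coordinate per class sensitivity); returns cluster labels."""
    rng = np.random.default_rng(seed)
    centers = points[rng.choice(len(points), k, replace=False)]
    for _ in range(iters):
        labels = np.argmin(((points[:, None] - centers[None]) ** 2).sum(-1), axis=1)
        for j in range(k):
            if np.any(labels == j):
                centers[j] = points[labels == j].mean(axis=0)
    return labels

def select_representatives(sensitivities, accuracies, k):
    """Group individuals with similar per-class sensitivities and return
    the index of the most accurate member of each group; these
    representatives would then be refined by a local search procedure."""
    pts = np.asarray(sensitivities, dtype=float)
    labels = kmeans(pts, k)
    reps = []
    for j in range(k):
        members = np.flatnonzero(labels == j)
        if members.size:
            reps.append(members[np.argmax(np.asarray(accuracies)[members])])
    return reps
```

Running local search only on one representative per cluster, rather than on every individual, keeps the cost of the hybridization bounded.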
A Genetic Algorithm for the Open Shop Problem with Uncertain Durations

We consider a variation of the open shop problem where task durations are allowed to be uncertain and where uncertainty is modelled using fuzzy numbers. We propose a genetic approach to minimise the expected makespan: we consider different possibilities for the genetic operators and analyse their performance, in order to obtain a competitive configuration. Finally, the performance of the proposed genetic algorithm is tested on several benchmark problems, modified so as to have fuzzy durations, and compared with a greedy heuristic from the literature.

Juan José Palacios, Jorge Puente, Camino R. Vela, Inés González-Rodríguez
Genetic Algorithm Combined with Tabu Search for the Job Shop Scheduling Problem with Setup Times

We face the Job Shop Scheduling Problem with Sequence Dependent Setup Times and makespan minimization. To solve this problem we propose a new approach that combines a Genetic Algorithm with a Tabu Search method. We report results from an experimental study across conventional benchmark instances showing that this hybrid approach outperforms the current state-of-the-art methods.

Miguel A. González, Camino R. Vela, Ramiro Varela
Prediction and Inheritance of Phenotypes

In the search for functional relationships between genotypes and phenotypes, there are two possible findings. A phenotype may be heritable when it depends on a reduced set of genetic markers. Or it may be predictable from a wide genomic description. The distinction between these two kinds of functional relationships is very important since the computational tools used to find them are quite different. In this paper we present a general framework to deal with phenotypes and genotypes, and we study the case of the height of barley plants: a predictable phenotype whose heritability is quite reduced.

Antonio Bahamonde, Jaime Alonso, Juan José del Coz, Jorge Díez, José Ramón Quevedo, Oscar Luaces
Controlling Particle Trajectories in a Multi-swarm Approach for Dynamic Optimization Problems

In recent years, particle swarm optimization (PSO) has emerged as a suitable optimization technique for dynamic environments, mainly in its multi-swarm variant. However, in the search for good solutions some particles may produce transitions between non-improving ones. Although this is usual in stochastic algorithms like PSO, when the problem at hand is dynamic one can consider that those particles are wasting resources (evaluations, time, etc.). To overcome this problem, a novel operator for controlling particle trajectories is introduced into a multi-swarm PSO algorithm. Experimental studies on a benchmark problem show the benefits of the proposal.

Pavel Novoa, David A. Pelta, Carlos Cruz, Ignacio García del Amo
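As background, this is the canonical PSO update that multi-swarm variants build on; the trajectory-control operator proposed in the paper is not reproduced here, and the coefficient values are common defaults rather than those of the study.

```python
import random

def pso_step(position, velocity, personal_best, global_best,
             w=0.7, c1=1.5, c2=1.5):
    """One canonical PSO update per dimension: inertia w plus a
    cognitive pull toward the particle's personal best (c1) and a
    social pull toward the swarm's global best (c2)."""
    new_v, new_x = [], []
    for x, v, pb, gb in zip(position, velocity, personal_best, global_best):
        v = (w * v
             + c1 * random.random() * (pb - x)
             + c2 * random.random() * (gb - x))
        new_v.append(v)
        new_x.append(x + v)
    return new_x, new_v
```

A particle sitting exactly on both bests with zero velocity stays put; this stagnation is one reason trajectory-control operators are of interest in dynamic settings.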
Clustering Ensembles Using Ants Algorithm

Cluster ensembles combine different clustering outputs to obtain a better partition of the data. There are two distinct steps in cluster ensembles: generating a set of initial partitions that differ from one another, and combining the partitions via a consensus function to generate the final partition. Most previous consensus functions require the number of clusters to be specified a priori to obtain a good final partition. In this paper we introduce a new consensus function based on ant colony algorithms, which can automatically determine the number of clusters and produce highly competitive final clusters. In addition, the proposed method provides a natural way to identify outlier and marginal examples in the data. Experimental results on both synthetic and real-world benchmark data sets demonstrate the effectiveness of the proposed method in predicting the number of clusters and generating the final partition, as well as in detecting outlier and marginal examples.

Javad Azimi, Paul Cull, Xiaoli Fern
The kNN-TD Reinforcement Learning Algorithm

A reinforcement learning algorithm called kNN-TD is introduced. This algorithm has been developed using the classical formulation of temporal difference methods and a k-nearest-neighbors scheme as its expectations memory. By means of this kind of memory the algorithm is able to generalize properly over continuous state spaces and also to benefit from collective action selection and learning processes. Furthermore, with the addition of probability traces, we obtain the kNN-TD(λ) algorithm, which exhibits state-of-the-art performance. Finally, the proposed algorithm has been tested on a series of well-known reinforcement learning problems and also at the Second Annual RL Competition, with excellent results.

José Antonio Martín H., Javier de Lope, Darío Maravall
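A rough sketch of TD(0) value learning with a k-nearest-neighbour memory, in the spirit of the abstract: the value of a continuous state is read from its k nearest stored prototypes, and the TD error is shared among them. The uniform error-sharing and the fixed prototype set are simplifying assumptions, not the authors' exact scheme.

```python
import numpy as np

class KNNTDValue:
    """TD(0) value learning over a k-nearest-neighbour memory: a
    continuous state's value is the mean of the values stored at its
    k nearest prototypes, and updates spread the TD error over them."""
    def __init__(self, prototypes, k=3, alpha=0.2, gamma=0.95):
        self.centers = np.asarray(prototypes, dtype=float)
        self.values = np.zeros(len(self.centers))
        self.k, self.alpha, self.gamma = k, alpha, gamma

    def _neighbours(self, state):
        d = np.linalg.norm(self.centers - np.asarray(state), axis=1)
        return np.argsort(d)[:self.k]

    def value(self, state):
        return self.values[self._neighbours(state)].mean()

    def update(self, state, reward, next_state):
        idx = self._neighbours(state)
        td_error = reward + self.gamma * self.value(next_state) - self.value(state)
        self.values[idx] += self.alpha * td_error / self.k  # share the error
```

Because neighbouring prototypes share updates, nearby states generalize to each other, which is the point of using this memory over a tabular one.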
Recombination Patterns for Natural Syntax

The main goal of this paper is to show that the mechanism of recombination, found in biology, also works in natural languages. We aim to demonstrate that it may be possible to generate a natural language, or at least an important part of it, by starting with a base composed of a finite number of simple sentences and words, and by applying a small number of recombination rules.

Recombination patterns are introduced as a simple and efficient formalism, based on the behavior of DNA molecules, to describe the syntax of natural languages. In this paper we deal with the description of complex sentences, and we have used English, Italian and Spanish to define our formalism.

Vincenzo Manca, M. Dolores Jiménez-López
Computing Natural Language with Biomolecules: Overview and Challenges

This paper presents the formal framework in which the interdisciplinary study of natural language is conducted by integrating linguistics, computer science and biology. It provides an overview of the field, conveying the main biological ideas that have influenced research in linguistics. In particular, this work highlights the main methods of molecular computing that have been applied to the processing and study of the structure of natural language. Among them, DNA computing, membrane computing and NEPs are the most relevant computational architectures that have been adapted to account for one of the least understood cognitive capacities of human beings.

Gemma Bel-Enguix
A Christiansen Grammar for Universal Splicing Systems

The main goal of this work is to formally describe splicing systems. This is a necessary step before applying Christiansen Grammar Evolution (an evolutionary tool developed by the authors) to the automatic design of splicing systems. The large number of variants suggests a decision: to select as simple a family of splicing systems as possible that is equivalent to Turing machines. This property ensures that the kind of systems our grammar can generate is able to solve any arbitrary problem. Some components of these universal splicing systems depend on other components, so a formal representation able to handle context-dependent constructions is needed. Our work uses Christiansen grammars to describe splicing systems.

Marina de la Cruz Echeandía, Alfonso Ortega de la Puente
DNA Replication as a Model for Computational Linguistics

We examine some common threads between biological sequence analysis and AI methods, and propose a model of human language processing inspired by biological sequence replication and nucleotide binding. It can express and implement both analysis and synthesis in the same stroke, much as biological mechanisms can analyse a string and synthesize it elsewhere, e.g. for repairing damaged DNA substrings.

Veronica Dahl, Erez Maharshak
jNEPView: A Graphical Trace Viewer for the Simulations of NEPs

jNEP, a Network of Evolutionary Processors (NEP) simulator, has been improved with several visualization facilities. jNEPView displays the network topology in a friendly manner and shows the complete description of the simulation state at each step. Using this tool it is easier to program and study NEPs, whose dynamics are quite complex, facilitating theoretical and practical advances on the NEP model.

Emilio del Rosal, Miguel Cuéllar
The Problem of Constructing General-Purpose Semantic Search Engines

This work proposes the basic ideas needed to achieve truly semantic search. First, the number of semantic web sites must be increased; these should, on the one hand, maintain compatibility with the current Web and, on the other, offer different interpretations (now with semantics) of the same information according to different ontologies. We therefore propose the design of tools that facilitate this translation of HTML content into OWL content, possibly, as noted, according to different ontologies. The article continues by analysing the functionalities that we consider a semantic search engine based on the Semantic Web paradigm must support, and by presenting a general-purpose search engine prototype that operates on both structures, current and semantic.

Luis Criado Fernández, Rafael Martínez-Tomás
Computational Agents to Model Knowledge - Theory, and Practice in Visual Surveillance

In this work the concept of computational agent is located within the methodological framework of levels and domains of description of a calculus, in the context of the usual paradigms in Artificial Intelligence (symbolic, situated, connectionist, and hybrid). Emphasis is placed on the computable aspects of agent theory, leaving open the possibility of incorporating other aspects that are still pure cognitive nomenclature without any computational counterpart of equivalent semantic richness. These ideas are currently being implemented in semi-automatic video surveillance.

José Mira, Ana E. Delgado, Antonio Fernández-Caballero, José M. Gascueña, María T. López
Knowledge and Event-Based System for Video-Surveillance Tasks

This work describes an event-based system supported by knowledge to compose high-level abstraction events from intermediate agent events. The agents are in a level that interprets multi-sensory signals according to the scenario ontology, particularly from video-sequence identification and monitoring. The target task is surveillance understood in its entirety, from the identification of pre-alarm signals to planned action. The work describes the system architecture for surveillance based on this composition knowledge, how the knowledge base is organised, tools for its management, and examples of event inference/composition characterising a scene situation.

Rafael Martínez Tomás, Angel Rivas Casado
ARDIS: Knowledge-Based Dynamic Architecture for Real-Time Surface Visual Inspection

This work presents an approach to surface dynamic inspection in laminated materials based on the configuration of a visual system to obtain a good quality control of the manufacturing surface. The configuration task for surface inspection is solved as a Configuration-Design task following the CommonKADS methodology, which supports the proposed knowledge-based dynamic architecture (ARDIS).

The task is analysed at the knowledge level and is decomposed into simple subtasks to reach the inference level. All the generic knowledge involved in the surface inspection process is differentiated among environment, real-time, image quality and computer vision techniques, to be integrated in ARDIS.

An application has been developed that integrates four operation modes relating to visual system configuration and specific surface inspection. Examples are shown for configuring a stainless steel inspection system and for wood inspection.

D. Martín, M. Rincón, M. C. García-Alegre, D. Guinea
SONAR: A Semantically Empowered Financial Search Engine

The increasingly huge volume of financial information found in a number of heterogeneous business sources is characterized by unstructured content, disparate data models and implicit knowledge. As Semantic Technologies mature, they provide a consistent and reliable basis for delivering financial knowledge properly to the end user. In this paper, we present SONAR, a semantically enhanced financial search engine empowered by semi-structured crawling, inference-driven and ontology-population strategies that bypass the caveats and shortcomings of present state-of-the-art technology.

Juan Miguel Gómez, Francisco García-Sánchez, Rafael Valencia-García, Ioan Toma, Carlos García Moreno
KBS in Context Aware Applications: Commercial Tools

Knowledge-based systems are advanced systems for representing complex problems. Their architecture and representation formalisms are the basis of today's systems. The nature of the knowledge is usually derived from experience in specific areas, and its validation requires a methodology different from the one used in conventional systems because of the symbolic character of the knowledge. On the other hand, context-aware applications are designed to react to constant changes in the environment and to adapt their behavior to their users' situation, needs and objectives. In this contribution, we describe the design, definition and evaluation process of a knowledge-based system using the CommonKADS methodology in order to represent contextual information in a formal way for the Appear platform. We also validate the prototype of the context-aware system in different realistic environments (an airport, an intelligent home and elderly care), which is a significant step towards formally built KBS applications.

Nayat Sánchez-Pi, Javier Carbó, José Manuel Molina
An Architecture Proposal for Adaptive Neuropsychological Assessment

In this work we present the architecture of a special sort of software capable of changing its presentation depending on the behavior of a particular user. Although the proposed solution is applicable as the basis of a general adaptive application, we want to expose the peculiarities of a model designed specifically to work in a special domain, cognitive neuropsychology. In this domain, one of the most important topics is the design of tasks for the assessment of patients with cerebral damage. Depending on the patient and his or her particular characteristics, therapists must adjust the exposure time of several stimuli in order to obtain the best parameter values, and therefore to obtain the profile in a precise way. This is very important because correct assessment implies successful rehabilitation, but in practice this process is time-consuming and difficult. So, with the aim of helping therapists in the task-tuning process, we propose the use of artificial intelligence-based techniques to detect the patient's profile automatically at execution time, adapting the values of the parameters dynamically. In particular, we propose temporal similarity techniques as the foundation for designing adaptive software.

María M. Antequera, M. Teresa Daza, Francisco Guil, Jose M. Juárez, Ginesa López-Crespo
A Study of Applying Knowledge Modelling to Evidence-Based Guidelines

This paper reports on a case study of applying the general-purpose and widely accepted CommonKADS methodology to a clinical practice guideline. CommonKADS is focused on obtaining a compact knowledge model. However, guidelines usually contain incomplete and ambiguous knowledge, so the resulting knowledge model will be incomplete and we will need to detect which parts of the guideline knowledge are missing. A complementary alternative, which we propose in this work, is to reconstruct the process of knowledge model construction proposed by CommonKADS, in order to force the knowledge engineer to keep the transformation paths during knowledge modeling. That is to say, we propose to establish explicit mappings between original medical texts and the knowledge model, storing these correspondences in a structured way. This alternative will reduce the existing gap between the natural language representation and the corresponding knowledge model.

M. Taboada, M. Meizoso, D. Martínez, S. Tellado
Fuzzy Classification of Mortality by Infection of Severe Burnt Patients Using Multiobjective Evolutionary Algorithms

The classification of survival in severely burnt patients is an ongoing problem. In this paper we propose a multiobjective optimisation model with constraints to obtain fuzzy classification models based on the criteria of accuracy and interpretability. We also describe a multiobjective evolutionary approach to fuzzy classification based on data with real and discrete attributes. This approach is evaluated using three different evolutionary schemes: pre-selection with niches, NSGA-II and ENORA. The results are compared in terms of efficacy using statistical techniques.

F. Jiménez, G. Sánchez, J. M. Juárez, J. M. Alcaraz, J. F. Sánchez
Knowledge Based Information Retrieval with an Adaptive Hypermedia System

This paper describes research on information retrieval with an adaptive hypermedia system (AHS) used during three higher education courses taught in a blended learning (BL) environment. The system generates a different work plan for each student according to their profile; work plans are adapted by means of an algorithm. The AHS enables course contents to be adapted to the learning needs of each student and structured in a way that leads to many different learning paths. The case study method was used in this research. The results suggest that the AHS has a positive impact on the learning process, although further research is needed to confirm them.

Francisca Grimón, Josep Maria Monguet, Jordi Ojeda
Reveal the Collaboration in a Open Learning Environment

The management and characterization of collaboration to improve students' learning is still an open issue, which needs standardized models and inference methods for effective collaboration indicators, especially when online courses follow open approaches in which students are not following CSCL scripts. We have supplied our students with a scrutable (manageable and understandable) web application that shows an ontology including collaborative features. The ontology structures collaboration context information, which has been obtained from explicit methods (based on questionnaires) and implicit methods (supported by several machine learning techniques). Over two consecutive years of experiences with hundreds of students, we studied students' interactions to find implicit methods to identify and characterize their collaboration. Based on the outcomes of our experiments, we claim that showing useful and structured information about students' collaborative features to students and tutors can have a twofold beneficial impact: on students' learning and on the management of their collaboration.

Antonio R. Anaya, Jesús G. Boticario
Reasoning on the Evaluation of Wildfires Risk Using the Receiver Operating Characteristic Curve and MODIS Images

This paper presents a method to evaluate wildfire risk using the Receiver Operating Characteristic (ROC) curve and Terra moderate resolution imaging spectroradiometer (MODIS) images. Fuel moisture content (FMC) was used to evaluate the risk; the relationship between satellite images and field-collected FMC data was based on two methodologies: empirical relations and statistical models based on simulated reflectances derived from radiative transfer models (RTM). Both models were applied to the same validation data set to compare their performance. The FMC of grasslands and shrublands was estimated using a 5-year time series (2001-2005) of MODIS images. The simulated reflectances were based on the leaf-level PROSPECT model coupled with the canopy-level SAILH RTM. The simulated spectra were generated for grasslands and shrublands according to their biophysical parameter traits and FMC range. Both RTM-based approaches, empirical and statistical, offered similar accuracy, with better determination coefficients for grasslands. In this work, we have evaluated the capacity of MODIS images to discriminate between situations of high and low fire risk based on the FMC by using the ROC curve. Our results show that none of the MODIS bands has good discriminatory capacity when used separately, but the joint information they provide offers very small misclassification errors (0.9984).

L. Usero, M. Xose Rodriguez-Alvarez
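The ROC-based evaluation mentioned above reduces to computing the area under the curve (AUC) for a scored discriminator. As an illustrative sketch (not the authors' code, with hypothetical scores standing in for FMC-derived values), the AUC can be obtained directly from ranks via the Mann-Whitney identity:

```python
# Minimal AUC computation for a binary fire-risk discriminator.
# labels: 1 = high-risk site, 0 = low-risk site; scores: classifier output.

def roc_auc(labels, scores):
    """Area under the ROC curve via the rank-sum (Mann-Whitney) identity."""
    pairs = sorted(zip(scores, labels))          # sort by score
    pos = sum(labels)
    neg = len(labels) - pos
    # sum of 1-based ranks of the positive examples (ties ignored for brevity)
    rank_sum = sum(i + 1 for i, (_, lab) in enumerate(pairs) if lab == 1)
    return (rank_sum - pos * (pos + 1) / 2) / (pos * neg)

labels = [0, 0, 1, 1, 1, 0]
scores = [0.1, 0.3, 0.6, 0.8, 0.9, 0.2]          # hypothetical band values
print(roc_auc(labels, scores))                   # perfect separation -> 1.0
```

An AUC near 0.5 indicates no discriminatory capacity, while values near 1.0 (such as the joint-band figure reported in the abstract) indicate near-perfect separation of high- and low-risk situations.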
Optimised Particle Filter Approaches to Object Tracking in Video Sequences

In this paper, ways of optimising a Particle Filter video tracking algorithm are investigated. The optimisation scheme discussed in this work is based on hybridising a Particle Filter tracker with a deterministic mode-search technique applied to the particle distribution. Within this scheme, an extension of the recently introduced structural similarity tracker is proposed and compared with approaches based on separate and combined colour and mean-shift trackers. The new approach is especially applicable to real-world video surveillance scenarios, in which the presence of multiple targets and complex backgrounds poses a non-trivial challenge to automated trackers. The preliminary results indicate that a considerable improvement in tracking is achieved by applying the optimisation scheme, at the price of a moderate increase in the computational complexity of the algorithm.

Artur Loza, Fanglin Wang, Miguel A. Patricio, Jesús García, José M. Molina
Towards Interoperability in Tracking Systems: An Ontology-Based Approach

Current video surveillance systems are expected to allow for management of video sequences recorded by distributed cameras. In order to obtain a complete view of the scene, it is necessary to apply data fusion techniques to the acquired data. This increases the need to communicate among various and possibly heterogeneous components in a vision system. A solution is to use a common and well-defined vocabulary to achieve understanding between them. In this work, we present an OWL ontology aimed at the symbolic representation of tracking data. The ontological representation of tracking data improves system interoperability and scalability, and facilitates the development of new functionalities. The ontology can be used to express the tracking information contained in the messages exchanged by the agents in CS-MAS, a multi-agent framework for the development of multi-camera vision systems.

Juan Gómez-Romero, Miguel A. Patricio, Jesús García, José M. Molina
Multimodal Agents in Second Life and the New Agents of Virtual 3D Environments

The confluence of 3D virtual worlds with social networks demands from software agents, in addition to their conversational functions, the same behaviors as those common to human-driven avatars. In this paper we explore the possibilities of using metabots (metaverse robots) in virtual 3D worlds, and we introduce the concept of AvatarRank as a measure of an avatar's popularity and the concept of the extended Turing test to assess the anthropomorphism of the metaverse's elements.

A. Arroyo, F. Serradilla, O. Calvo
Application of Artificial Neural Networks to Complex Dielectric Constant Estimation from Free-Space Measurements

Adequate characterization of materials allows the engineer to select the best option for each application. Apart from mechanical or environmental characterization, the last decades' rise in the exploitation of the electromagnetic spectrum has made it increasingly important to understand and explain the behavior of materials in that ambit as well. The electromagnetic properties of non-magnetic materials are governed by their intrinsic permittivity or dielectric constant, and free-space measurement is one of the various methods employed to estimate this quantity at microwave frequencies. This paper proposes the application of Artificial Neural Networks (ANNs) to extract the dielectric constant of materials from the reflection coefficient obtained by free-space measurements. In this context, two kinds of ANNs are examined: Multilayer Perceptron (MLP) and Radial Basis Function (RBF) networks. Simulated materials are used to train the networks with and without noise, and performance is tested on an actual material sample measured by the authors in an anechoic chamber.

Antonio Jurado, David Escot, David Poyatos, Ignacio Montiel
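The inverse problem described above (reflection coefficient in, permittivity out) can be sketched with a toy MLP. Everything here is an assumption for illustration: a simplified lossless normal-incidence reflection formula stands in for the full free-space measurement model, and the network is a hand-rolled one-hidden-layer perceptron trained by plain stochastic gradient descent, not the authors' architecture.

```python
import math, random

def reflection(eps):
    """|R| at a lossless air/dielectric interface at normal incidence."""
    n = math.sqrt(eps)
    return abs((1 - n) / (1 + n))

random.seed(0)
# Training set: (|R|, eps_r) pairs for simulated materials, eps_r in [1.5, 7.4]
train = [(reflection(e), e) for e in [1.5 + 0.1 * i for i in range(60)]]

H = 8                                            # hidden units
w1 = [random.uniform(-1, 1) for _ in range(H)]   # input -> hidden weights
b1 = [random.uniform(-1, 1) for _ in range(H)]
w2 = [random.uniform(-1, 1) for _ in range(H)]   # hidden -> output weights
b2 = 0.0

def forward(x):
    h = [math.tanh(w1[j] * x + b1[j]) for j in range(H)]
    return h, b2 + sum(w2[j] * h[j] for j in range(H))

lr = 0.05
for _ in range(3000):                            # SGD over squared error
    for x, y in train:
        h, yhat = forward(x)
        err = yhat - y
        b2 -= lr * err
        for j in range(H):
            w2[j] -= lr * err * h[j]
            g = err * w2[j] * (1 - h[j] ** 2)    # backprop through tanh
            w1[j] -= lr * g * x
            b1[j] -= lr * g

_, pred = forward(reflection(4.0))               # recover eps_r of a test slab
print(pred)                                      # expect a value near 4.0
```

The real problem is harder (complex-valued S-parameters, frequency dependence, noise), which is why the paper compares MLP against RBF networks and trains on simulated materials with and without added noise.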
Backmatter
Metadata
Title
Methods and Models in Artificial and Natural Computation. A Homage to Professor Mira’s Scientific Legacy
Edited by
José Mira
José Manuel Ferrández
José R. Álvarez
Félix de la Paz
F. Javier Toledo
Copyright year
2009
Publisher
Springer Berlin Heidelberg
Electronic ISBN
978-3-642-02264-7
Print ISBN
978-3-642-02263-0
DOI
https://doi.org/10.1007/978-3-642-02264-7