
2012 | Book

Computer Aided Systems Theory – EUROCAST 2011

13th International Conference, Las Palmas de Gran Canaria, Spain, February 6-11, 2011, Revised Selected Papers, Part I

Edited by: Roberto Moreno-Díaz, Franz Pichler, Alexis Quesada-Arencibia

Publisher: Springer Berlin Heidelberg

Book series: Lecture Notes in Computer Science


About this book

The two-volume proceedings, LNCS 6927 and LNCS 6928, constitute the papers presented at the 13th International Conference on Computer Aided Systems Theory, EUROCAST 2011, held in February 2011 in Las Palmas de Gran Canaria, Spain. A total of 160 papers were carefully reviewed and selected for inclusion in the books. The contributions are organized in topical sections on concepts and formal tools; software applications; computation and simulation in modelling biological systems; intelligent information processing; heuristic problem solving; computer aided systems optimization; model-based system design, simulation, and verification; computer vision and image processing; modelling and control of mechatronic systems; biomimetic software systems; computer-based methods for clinical and academic medicine; modeling and design of complex digital systems; mobile and autonomous transportation systems; traffic behaviour, modelling and optimization; mobile computing platforms and technologies; and engineering systems applications.

Table of contents

Frontmatter

Concepts and Formal Tools

A Framework for Combining Multivalued Data: A Practical Approach

Scientific and engineering disciplines are developing daily. This progress is strongly connected to complex techniques and methods. Over the years, countless efforts have been made to process data. However, it is surprising to observe that the majority of methods and techniques used are binary-based. In recent years, due to the development of Knowledge Discovery in Databases, many techniques and tools that discover useful information and knowledge in databases have been designed. Knowledge Discovery in Databases is the nontrivial process of identifying valid, novel, potentially useful, and understandable patterns in data [1].

Margaret Miró-Julià
On Modelling Metabolism-Repair by Convolution and Partial Realization

John Casti introduced in several papers [1], [2], [3] a mathematical modelling method for metabolism-repair in biological cells following the approach of Robert Rosen [4]. As a result, Casti was able to determine algebraic criteria which describe, for the linear time-invariant case, functional conditions for repair and replication. Furthermore, Casti was interested in computing, for the metabolism map f, the repair map P and the replication map β, a state space representation given by the associated minimal dynamical system, by means of the realization algorithm of Kalman-Ho [5], which was originally developed for applications in control engineering. Different authors, coming mainly from the field of mathematical biology, have made use of the results of Casti and have tried to extend them [6], [7]. In this lecture and in the paper we repeat in part the results of John Casti but take the narrower point of view of describing the relevant I/O operations by discrete-time convolution. Furthermore, Casti computes, on the basis of the impulse response h and by the method of Kalman-Ho, the associated state space representations (F, G, H). By this approach he gets algorithmic representations for the metabolism map f, the repair map P and the replication map β, with the expectation of gaining additional modelling means for the solution of associated biological problems. The application of the Kalman-Ho algorithm for realization requires, however, that the Hankel matrix of the associated impulse responses is finite dimensional. In the case of biological cells, the validity of this assumption seems to be difficult to prove. Any biological cell is in its inner workings a highly complex system, which is reflected by its metabolism map and the associated impulse response. John Casti stated on this point that it would be desirable to be able to compute partial realizations, which do not require finite dimensionality of the impulse responses. Our presentation follows this direction. We make use of the partial realization method as introduced for the scalar case by Rissanen [8] and for the multi-variable case by Rissanen-Kailath [9]. In an earlier paper we used the partial realization method of Rissanen to generalize the Massey-Berlekamp algorithm of linear cryptanalysis to the case of multi-variable pseudo-random sequences [10]. The implementation of the partial realization method of Rissanen-Kailath was reported by Jochinger [11]. It is our hope that our work can stimulate biological research in metabolism-repair by the use of the modelling approach of Rosen-Casti.
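
The realization step at the heart of this discussion is easy to sketch. The following minimal Kalman-Ho realization recovers a state space triple (F, G, H) from scalar impulse-response samples, assuming exact data and a known finite order; the function name and truncation details are illustrative and not taken from the paper.

    import numpy as np

    def ho_kalman(h, order):
        """Kalman-Ho realization sketch: recover (F, G, H) from scalar
        impulse-response samples h(1), h(2), ...; `h` must contain at
        least 2*order samples."""
        n = order
        # Hankel matrix of the impulse response and its one-step shift.
        Hk = np.array([[h[i + j] for j in range(n)] for i in range(n)])
        H2 = np.array([[h[i + j + 1] for j in range(n)] for i in range(n)])
        U, s, Vt = np.linalg.svd(Hk)
        sq = np.sqrt(s[:n])
        Ob = U[:, :n] * sq            # observability factor
        Cb = (Vt[:n, :].T * sq).T     # controllability factor
        F = np.linalg.pinv(Ob) @ H2 @ np.linalg.pinv(Cb)
        G = Cb[:, :1]                 # input map
        H = Ob[:1, :]                 # output map
        return F, G, H

A partial realization in the sense of Rissanen would instead work incrementally with only the leading samples, avoiding the finite-dimensionality assumption discussed above.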

Franz Pichler
Cost Oriented Humanoid Robots

Currently there are three categories of humanoid robots available: Professional humanoid robots developed by large companies with a huge amount of research resources; they are expensive and not available on the market. Research humanoid robots: the robots in this category are usually prototypes developed by scientists to implement and test new ideas. Humanoid “toy” robots: there are a lot of humanoid toy robots, mostly developed by small or medium sized companies, available on the market; usually they have extremely limited capabilities in hardware as well as in software. Because of these facts, in this paper a new, fourth category – cost oriented humanoid robots (COHR) – is introduced. These robots will be able to support humans in everyday life, e.g. in the workplace, in the household, and in leisure and entertainment, and should be available on the market for a reasonable price.

P. Kopacek
New Biomimetic Neural Structures for Artificial Neural Nets

The general aim is to formalize known properties of real neurons, formulating them into appropriate mathematical models. These will converge into, hopefully, more powerful, neurophysiologically founded distributed computation units of artificial neural nets. Redundancy and distributed computation are key factors to be embodied in the corresponding biomimetic structures.

We focus on two neurophysiological processes: first, the dendro-dendritic or afferent nonlinear interactions prior to the synapses with the cell body, where computational redundancy (and reliability as a consequence) is to be expected. Second, distributed computation, also provoked by a dendritic-like computational structure, to generate arbitrary receptive field weights or profiles, where a kind of reliability is also expected as a result of the distributed nature of the computation.

Gabriel de Blasio, Arminda Moreno-Díaz, Roberto Moreno-Díaz Jr., Roberto Moreno-Díaz

Software Applications

Extending OLSR Functionalities to PKI Management

In self-organized environments such as mobile ad-hoc networks, network members are responsible for carrying out security service management correctly and efficiently, among many other management tasks. The specific objective of the proposal described here is the enhancement of the processes for building local certificate repositories. The approach taken to meet this goal is to define an authentication solution based on the web-of-trust concept combined with some tools belonging to the core of the Optimized Link State Routing (OLSR) protocol. The experimental results obtained show a considerable decrease in resource consumption during the certificate verification process.

C. Hernández-Goya, P. Caballero-Gil, J. Molina-Gil, C. Caballero-Gil
Bandwidth Usage Optimization for NNTP Protocol

Streaming extensions of the NNTP protocol provide methods of pipelining article transfers to maximize connection throughput. If redundant server links are used for service reliability improvement, this causes substantial bandwidth loss, as the same articles may be transferred in parallel on different links. An extension of the existing NNTP protocol is described which can circumvent this problem by actively shaping the bandwidth of the incoming connections. Also, a method for remote measurement of NNTP protocol bandwidth is described.

Tomasz Surmacz
The Dilemma of Choice in Management of Communication Processes in WSN

In this paper we concentrate on developing the formal methods and techniques necessary to model and evaluate situations in Wireless Sensor Networks (WSN). We present results of research on choices in distributed, complex systems like WSNs. In general, three types of choices are analyzed: the choice of an appropriate mathematical abstraction as a tool capable of managing WSN communication activity, the choice of a mathematical abstraction as a tool capable of managing the WSN communication structure, and the local-global dilemma.

Jan Nikodem
A Distributed Authorization System with Mobile Usage Control Policies

Distributed systems, such as the Cloud, are widely used for solving large problems, because they provide big computational power at a low cost. From the security point of view, distributed systems pose new challenges, because the applications running on the components of the system could cooperate to access the system’s resources. Hence, the security support should consider all the accesses performed by the applications run by the same user on distinct nodes of a distributed system as the behaviour of that user. To address this problem, this paper proposes mobile usage control policies that, besides regulating the usage of the system resources, also define the exchange of some policy fragments among the nodes of the distributed system. In this way, the usage of resources performed on one node of the distributed system affects the right of accessing resources on other nodes of the system. A reference scenario where mobile usage control policies could be successfully adopted is the Cloud environment.

Fabio Martinelli, Paolo Mori
Fuzzy Logic for the Performance Assessment of the Innovation Management in Tourism

The innovative performance of companies has been studied quite extensively for a long time. However, the results of those studies have not yet led to a generally accepted indicator of innovative performance or a methodology to assess Innovation Management. This paper assumes the multidimensionality of Innovation Management as an outcome of complex interactions. Because of this, a methodology to assess Innovation Management based on the principles of Fuzzy Logic is proposed. The stages of the established methodology constitute a guide for developing Innovation Management assessment. In the course of the research, it was necessary to contextualize the theories, models and systems used in the analyzed subject to the conditions of the touristic sector.

Dayana Lozada, Jose Manuel Castillo, Alberto Salguero, Francisco Araque, Cecilia Delgado, Marcia Noda, Gilberto Hernández

Computation and Simulation in Modelling Biological Systems

Neuronal Data Analysis Based on the Empirical Cumulative Entropy

We propose the empirical cumulative entropy as a variability measure suitable to describe the information content in neuronal firing data. Some useful characteristics and an application to a real dataset are also discussed.
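
For readers unfamiliar with the measure, a minimal sketch of one common estimator follows, assuming the definition CE(X) = -∫ F(x) log F(x) dx with the empirical distribution function plugged in; the paper's exact formulation may differ in details.

    import numpy as np

    def empirical_cumulative_entropy(sample):
        """Plug-in estimate of CE(X) = -int F(x) log F(x) dx: the
        empirical CDF equals i/n between the i-th and (i+1)-th order
        statistics, so the integral reduces to a finite sum."""
        x = np.sort(np.asarray(sample, dtype=float))
        n = len(x)
        i = np.arange(1, n)         # CDF level i/n on [x_(i), x_(i+1))
        gaps = np.diff(x)           # spacings between order statistics
        return -np.sum(gaps * (i / n) * np.log(i / n))

    # Hypothetical interspike intervals (illustrative data only)
    print(empirical_cumulative_entropy([0.8, 1.1, 0.9, 1.5, 2.0]))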

Antonio Di Crescenzo, Maria Longobardi
On the Construction of Densities for Time Non-homogeneous Diffusion Processes

A new procedure for constructing transition probability density functions and first passage time densities through constant boundaries is proposed for diffusion processes that are, in general, time non-homogeneous. A special diffusion process with periodic drift and infinitesimal variance is considered.

Virginia Giorno, Amelia G. Nobile, Luigi M. Ricciardi
Rational Function Systems in ECG Processing

In this paper we propose to use rational function systems for processing ECG signals. In several respects our approach has advantages over the previously used transformation methods. For instance, the system is designed particularly for ECG signals in the sense that the shapes of the individual terms correspond to the natural segmentation of the signal. Moreover, our method is more flexible. This means that not only the coefficients but also the system itself can be optimized, even from heartbeat to heartbeat. Another property worth emphasizing is the simplicity of the system. The terms are simple rational functions that can be generated by a small number of complex arithmetic operations, i.e. they are easy to work with on computers.

Sándor Fridli, Levente Lócsi, Ferenc Schipp
First-Passage-Time for Gauss-Diffusion Processes via Integrated Analytical, Simulation and Numerical Methods

In the study of the dynamics of a system subject to stochastic evolution, the attention is usually focused on the laws regulating the time course of the distribution function (df), or the transition probability density function (pdf), by which one can express the probability for the system to occupy at given times any preassigned configuration of the state space. In other words, it is customary to pay attention to the time evolution of the considered system through the state space starting either from a uniquely preassigned initial state or from a state whose probability distribution is assumed to be given.

Aniello Buonocore, Luigia Caputo, Enrica Pirozzi
Modelling Aspects and Structural Properties of a Fed-Batch Bioprocess

This work deals with the modelling and the derivation of some structural properties of a nonlinear fed-batch prototype bioprocess: an aerobic microbial growth process coupled with an enzyme-catalyzed reaction. As the modelling procedure, the Bond Graph approach is used. The structural properties of the model are analyzed, such as the partition into linear and nonlinear parts, the decoupling of kinetics, etc. Numerical simulations of the evolution of the fed-batch bioprocess are provided using both the 20sim and MATLAB environments.

Monica Roman

Intelligent Information Processing

A Certified Module to Study Digital Images with the Kenzo System

Kenzo is a Computer Algebra system devoted to Algebraic Topology, written in the Common Lisp programming language. In this paper, programs which allow us to analyze monochromatic digital images with the Kenzo system are presented. Besides, a completely automated proof of the correctness of our programs is provided. The proof is carried out using ACL2, a system for proving properties of programs written in (a subset of) Common Lisp.

Jónathan Heras, Vico Pascual, Julio Rubio
Modelling the Psychographic Behaviour of Users Using Ontologies in Web Marketing Services

Web marketing is a form of advertising geared to reach its target audience using fewer commercials. Any recommendation model intended to provide a personalized outcome is based on accurate segmentation strategies that rely heavily on how the users’ characteristics and behaviour are modelled. Although our proposal distributes the domain information among several ontologies, in this paper we focus on how the psychographic data can be used to properly segment the user.

Abraham Rodríguez Rodríguez, Nicolás Iglesias García, José María Quinteiro-González
Understanding the System Dynamics of High-Technology Markets: Pólya Processes with Positive Feedback, Path Dependence and Lock-In

This paper relies on complexity theory to gain new insights into the dynamics of high-technology markets. We are making use of the Pólya process model to explain these dynamics. This classical model highlights the “mechanism” of positive feedback, which gives rise to the phenomenon of path dependence and lock-in.

Markus Schwaninger, Christoph Mandl
R2RIF - Rule Integration Plugin for Protégé OWL

Rules and ontologies are two parts of the Semantic Web that call for highly adaptable and generic approaches to their interoperability and unified application. Within this paper we present a proposal and conceptualization of R2RIF, a rule integration plugin for Protégé OWL, aiming to combine ontologies and rules to build appropriate conceptual models for a domain of interest. This task raises several specific topics: transforming different rule languages into one common interoperable W3C Rule Interchange Format (RIF), converting domain knowledge within ontologies into Horn logic axioms and terms, and merging them with semantically compatible RIF rules. Subsumption reasoning and consistency checks of such ontology/rule models deliver expressive domain models. For the transformation and conversion tasks, R2RIF delivers generic algorithms and workflows that are adaptable to different ontology and rule languages.

Andreas Pomarolli, Stefan Anderlik, Josef Küng
GenComp – A Generic Transformation System

A transformation system can be viewed as a general case of a compiler. This paper introduces a new concept to generate such transformation systems (e.g., compilers), a so-called generic compiler. A first version of the generic compiler is described: GenComp, which is the abbreviation for “generic compiler”. It first reads a formal description in the form of an attributed grammar and then interprets this description in order to transform an input in the source language to an output in the target language.

Qiao Chen, Heinz Dobler
Implementing the Universal Virtual Computer

In order to keep digital objects for an indefinite period of time, one needs a very sophisticated archiving system. One challenge is to sustain the accessibility of document formats that are becoming obsolete; another is to guarantee their authenticity. The Universal Virtual Computer (UVC) is a simple yet powerful approach to preserving digital objects on a very long-term scale. Its main attraction is that documents do not have to be processed and transformed during their whole archive lifetime. In contrast, when using the migration approach, all documents have to be processed at regular intervals. This is not only time-consuming; after a number of migration steps, a document’s authenticity is also seriously threatened. UVC does not share these problems. With UVC, the main effort occurs before ingest time: rendering software for interpreting documents of a given format on the UVC must be developed and archived. The effort spent in the development of the rendering software will determine the degree of authenticity. In order to access archived objects, an implementation of the UVC must be available. The focus of this paper is on implementing the UVC. So far, only proof-of-concept implementations of the UVC have been available. We have gained practical experience by implementing the UVC on different platforms based on a collection of vintage, but still fully working supercomputers. These efforts have led to an improved specification of the UVC, which simplifies implementation even more.

Nico Krebs, Lothar Schmitz, Uwe M. Borghoff
Using GPS Trajectories to Create a Dynamic Network of Significant Locations as an Abstraction of Road Maps

This contribution discusses an approach to finding significant locations, called Hotspots here, from a set of GPS tracks. Based on that, a model is suggested in which traffic infrastructure is represented by a dynamic network of Hotspots. Besides the locations of the Hotspots, information about travel times between these Hotspot nodes also comes along with the extracted significant places. This information can be used to improve or enrich traffic management and/or navigation systems by achieving a more precise estimation of travel times compared to current systems.

Reinhard Stumptner, Bernhard Freudenthaler, Jürgen Hönigl, Karl Rehrl, Josef Küng
On the Confluence of the Graphic Calculus with Penrose Diagrams (I)

In Molinelli et al. (1998), a general model allowing the integration of different kinds of calculus with diagrams appearing in several fields of science and engineering was introduced, and a computer aided system enabling some manipulation of this graphical material was presented.

Traditionally, most of these diagrams have been used as an aid in the development of complex calculi, although the lack of a solid theoretical foundation has prevented the existence of practical tools.

As a contribution to that necessary background, we present here an implementation of the diagrams using Coq and a first discussion on the confluence of the rewriting based on the interchange law.

J. L. Freire Nistal, A. Blanco Ferro, J. M. Molinelli Barba, E. Freire Brañas
System for Recommendation of Information Based on a Management Content Model Using Software Agents

In recent years researchers have been working to develop tools that filter the available information and give users the content that is relevant to them. This paper presents a recommendation system developed on a software agent architecture and based on a management content model in blended learning environments. It also uses clustering algorithms to find relevant information in the content, building a search space tailored to the interests of the learner. To validate the proposed architecture we worked with the Action Research methodology, and a prototype system called SisMA was developed to retrieve information in the educational setting. To measure the effectiveness of the application and its impact on the learning process, the system was tested in two scenarios. The results showed that the relevance of the content and the profile generated by its work plan have had a positive effect on the learning process.

Francisca Grimón, Marylin Giugni, Joaquín Fernández, Joseph Monguet
Dynamic Cellular Automata-Based S-Boxes

The most important elements of many block ciphers are nonlinear functions known as substitution boxes (S-boxes). Classical S-boxes are usually represented by numerical tables, which are used today in current cryptographic standards such as the Data Encryption Standard (DES) or the Advanced Encryption Standard (AES), but as methods of cryptanalysis develop, they no longer ensure sufficient safety of ciphers. Therefore, an open research issue now is to design new, more sophisticated classes of S-boxes, in particular dynamic ones. In this paper we propose a methodology to design dynamic cellular automata (CA)-based S-boxes, which can be considered a generator of CA-based S-boxes. We provide an exhaustive experimental analysis of the proposed CA-based S-boxes in terms of non-linearity, autocorrelation, balance and the strict avalanche criterion. We show that the proposed S-boxes have high quality cryptographic properties (high non-linearity and balance, as well as low autocorrelation and low distance to fulfilling the strict avalanche criterion). An interesting feature of the proposed S-boxes is their dynamic, flexible structure, fully functionally realized by CA, whereas classical S-boxes are represented by predefined, unchangeable table structures.
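
To make the idea tangible, here is a toy generator in the spirit described: an elementary cellular automaton iterated on each input byte yields the substitution table, and a basic balance check follows. This is an illustrative sketch only; the authors' construction, rules and quality criteria are more elaborate.

    def ca_step(state, rule, n=8):
        """One synchronous step of an elementary CA on n bits with
        cyclic boundary; `rule` is a Wolfram rule number, e.g. 30."""
        out = 0
        for i in range(n):
            left = (state >> ((i + 1) % n)) & 1
            mid = (state >> i) & 1
            right = (state >> ((i - 1) % n)) & 1
            if (rule >> ((left << 2) | (mid << 1) | right)) & 1:
                out |= 1 << i
        return out

    def ca_sbox(rule=30, steps=4):
        """Derive an 8x8 S-box by iterating the CA on every input byte
        (rule and step count are illustrative parameters)."""
        box = []
        for x in range(256):
            s = x
            for _ in range(steps):
                s = ca_step(s, rule)
            box.append(s)
        return box

    sbox = ca_sbox()
    # Balance check: a bijective S-box hits every output exactly once.
    print("bijective:", len(set(sbox)) == 256)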

Miroslaw Szaban, Franciszek Seredynski
People Transfer in City Transport Modeled via CPN

The main goal of the paper is to present an optimized model of a complex City Transport System via CPN Tools (Coloured Petri Nets) in order to improve the whole system with respect to its specific characteristics and environment (e.g. rush hours, car accidents, etc.). Our approach is based on timed Coloured Petri nets and simulation techniques exploiting such nets. In particular, we demonstrate on a simplified model from the real environment that the whole system is very prone to insufficiency: in the case of even a small delay, the whole system begins to fail to transfer people fluently. First of all, the main preliminaries are presented, and after that we describe our approach and model in more detail.

Dušan Kolář, Šárka Květoňová
Adaptive Change Estimation in the Context of Online Market Monitoring

In the Internet-based economy, the (relative) transparency of e-markets and increasing online market dynamics call for more responsive and encompassing approaches to the monitoring of markets and competitors. Accordingly, this paper proposes to continuously observe a preselected set of e-commerce Web channels, or online portals, to gather as comprehensive a picture of market dynamics as possible. In so doing, a historical market data repository is accumulated based on an adaptive scheme of harvesting Web data online in order to provide dynamic information about both market structure and prices. A description of the proposed estimator for online data sampling based on observed (price) change frequencies is given. Numerical simulations highlight the virtues of the proposed adaptive estimator compared to established Web page change frequency estimators, even more so when considering constraints on (observation) resources. As an example, the methodology is applied to the online hotel room booking market.
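
A flavor of the underlying estimation problem: under a Poisson change model with periodic visits, an established bias-reduced estimator of the change rate (due to Cho and Garcia-Molina, of the family the paper benchmarks against) fits in a few lines. The adaptive scheme proposed in the paper then reallocates observation resources based on such estimates.

    import math

    def poisson_change_rate(n_visits, n_changed, interval):
        """Estimate a page's change rate from periodic observations
        under a Poisson model; `interval` is the time between visits."""
        # Fraction of visits that found the page unchanged, smoothed
        # (+0.5) so the estimate stays finite when every visit changed.
        ratio = (n_visits - n_changed + 0.5) / (n_visits + 0.5)
        return -math.log(ratio) / interval

    # E.g. 3 detected changes in 10 daily visits -> changes per day
    print(poisson_change_rate(10, 3, 1.0))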

Norbert Walchhofer, Karl Anton Froeschl, Kurt Hornik
On Transforming a Knowledge Base from Topic Maps to OWL

In this contribution we show how we could overcome the shortcomings of the Topic Map representation by applying our transformation concept to the particular system VCEDECIS, a passive decision support system using Topic Maps as the semantic technology for representing its knowledge, where it was decided to transform the knowledge representation to the Web Ontology Language. We introduce a transformation concept that starts with a sound analysis of the source system. For typical patterns in the topic map, the best corresponding patterns in OWL-DL were defined. A combination of general considerations and real examples acts as a proof of concept.

Kamil Matoušek, Petr Křemen, Josef Küng, Reinhard Stumptner, Stefan Anderlik, Bernhard Freudenthaler

Heuristic Problem Solving

Automated Building Construction Design Optimization for Reduction of Construction Costs and Energy Demand

Considering both the ecological and the economical aspects in building construction engineering is of high importance for a balanced and efficient building design. For high competitiveness on the markets, the need for novel efficiency metrics and automated optimization of the building plan arises. We introduce an efficiency measure for balancing the trade-off between construction cost and heating demand as the energy efficiency aspect. Thereby the building physics is precisely modeled and all possible variations of the particular material choice can be evaluated via simulation. By considering all possible variations of the particular plan, a large multi-dimensional search space can be defined, allowing search for the optimal design utilizing heuristic methods. Exploration of the search space allows the quantitative assessment of plans with respect to the theoretical optimum and, generally, the qualitative comparison of different construction settings and different material choices for the development of current best standard practice in building construction engineering.

Gerald Zwettler, Paul Track, Florian Waschaurek, Richard Woschitz, Elmar Hagmann, Stefan Hinterholzer
Using a Multiobjective OpenMP+MPI DE for the Static RWA Problem

The most promising technology for exploiting optical networks is Wavelength Division Multiplexing (WDM). When a set of demands must be established, a problem known as the Routing and Wavelength Assignment (RWA) problem arises. In this paper, we present a hybrid OpenMP+MPI version of Differential Evolution (DE), adapted to a multiobjective context (DEPT), for solving this problem. We have studied the behavior of the hybrid DEPT algorithm by comparing it with an MPI version of the DEPT algorithm. For this comparison, we have used a real-world topology (NTT, Japan) and a homogeneous cluster composed of 16 multi-core nodes, each with 8 cores (a total of 128 cores). We conclude that a hybrid OpenMP+MPI version of the DEPT algorithm is very suitable for solving the RWA problem in a reasonable time.
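
As background, the generation step of classical DE/rand/1/bin, the scheme DEPT builds on, fits in a few lines; the Pareto-based selection of the multiobjective DEPT variant and the OpenMP+MPI layers are deliberately omitted from this sketch.

    import random

    def de_generation(pop, fitness, F=0.5, CR=0.9):
        """One generation of DE/rand/1/bin on real vectors, minimizing
        `fitness` (single-objective sketch; population size >= 4)."""
        dim = len(pop[0])
        next_pop = []
        for i, target in enumerate(pop):
            r1, r2, r3 = random.sample(
                [p for j, p in enumerate(pop) if j != i], 3)
            jrand = random.randrange(dim)   # force one mutated gene
            trial = [r1[k] + F * (r2[k] - r3[k])
                     if random.random() < CR or k == jrand else target[k]
                     for k in range(dim)]
            next_pop.append(trial if fitness(trial) <= fitness(target)
                            else target)
        return next_pop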

Álvaro Rubio-Largo, Miguel A. Vega-Rodríguez, Juan A. Gómez-Pulido, Juan M. Sánchez-Pérez
Discovering DNA Motifs with a Parallel Shared Memory Differential Evolution

The usefulness and efficiency of an algorithm for solving an optimization problem are not determined only by the quality of the results obtained; the computational time and the resources required to obtain them are also important. In this paper we present a parallel implementation of Differential Evolution (DE) to solve the Motif Discovery Problem (MDP). MDP is an important biological problem that can have a high computational cost when working with large amounts of nucleotides, so fine-grained parallelism on a shared memory machine can help us achieve results quickly. To ensure that our heuristic obtains relevant results, we have compared them with those obtained by the standard algorithm NSGA-II and with fourteen other well-known biological methods. As we will see, the structure of the algorithm makes it well suited for parallelization, achieving good results and efficiencies of up to 95%.

David L. González-Álvarez, Miguel A. Vega-Rodríguez, Juan A. Gómez-Pulido, Juan M. Sánchez-Pérez
Optimization of Parameter Settings for Genetic Algorithms in Music Segmentation

Genetic algorithms have been introduced to the field of media segmentation, including image, video, and music segmentation, since segmentation problems usually have complex fitness landscapes. Music segmentation can provide insight into the structure of a music composition, so it is an important task in music information retrieval (MIR). The authors have already presented the application of genetic algorithms to the music segmentation problem in an earlier paper. This paper focuses on the optimization of parameter settings for genetic algorithms in the field of MIR as well as on the comparison of their results.

Brigitte Rafael, Stefan Oertl, Michael Affenzeller, Stefan Wagner
Automatic Generation of 2-AntWars Players with Genetic Programming

In this work, we show how Genetic Programming can be used to create game playing strategies for 2-AntWars, a deterministic turn-based two player game with local information. We evaluate the created strategies against fixed, human created strategies as well as in a coevolutionary setting, where both players evolve simultaneously. We show that genetic programming is able to create competent players which can beat the static playing strategies, sometimes even in a creative way. Both mutation and crossover are shown to be essential for creating superior game playing strategies.

Johannes Inführ, Günther R. Raidl
A Multilevel Heuristic for the Rooted Delay-Constrained Minimum Spanning Tree Problem

The rooted delay-constrained minimum spanning tree problem is an NP-hard combinatorial optimization problem. The problem appears in practice, for example, when designing a distribution network with a guarantee of timely delivery. Another example is a centralized broadcasting network where the delay bound represents a quality of service constraint. We introduce a multilevel-based construction heuristic which uses a new measurement for the suitability of edges to create a solution for the problem. In comparison to existing heuristics, the main intention is not to create a minimum cost spanning tree, but a solution with a high potential for further improvement. Experimental results indicate that in most cases our approach produces solutions that, after local improvement, are of higher quality than those of other existing construction techniques.

Martin Berlakovich, Mario Ruthmair, Günther R. Raidl
Improving the Parsimony of Regression Models for an Enhanced Genetic Programming Process

This research is focused on reducing the average size of the solutions generated by an enhanced GP process without affecting the high predictive accuracy the method exhibits when applied to a complex, industry-proposed regression problem. As such, the effects the GP enhancements have on bloat have been studied and, finally, a bloat control system based on dynamic depth limiting (DDL) and iterated tournament pruning (ITP) was designed. The resulting bloat control system is able to improve the average GP solution parsimony by ≃ 40% without impacting average solution accuracy.

Alexandru-Ciprian Zăvoianu, Gabriel Kronberger, Michael Kommenda, Daniela Zaharie, Michael Affenzeller
GPU-Based Evaluation to Accelerate Particle Swarm Algorithm

With the advent of GPU cards, many computational problems have seen a marked increase in performance. Nevertheless, the improvement depends strongly on how the technology is used and on the porting process applied when adapting the problem. These aspects are critical for the performance improvement of code adapted to the GPU to be significant. This article focuses on the study of strategies for porting a Particle Swarm Algorithm with parallel evaluation of the Schwefel Problem 1.2 and the Rosenbrock function. The implementation evaluates the population on the GPU, whereas the other intrinsic operators of the algorithm are executed on the CPU. The design, the implementation and the associated issues related to the GPU execution context are evaluated and presented. The results demonstrate the effectiveness of the proposed approach and its capability to effectively exploit the GPU architecture.
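
The split the article describes, evaluation on the GPU and the remaining operators on the CPU, can be illustrated with the canonical PSO update; plain NumPy stands in for the CUDA kernel here, and all parameter values are illustrative.

    import numpy as np

    def rosenbrock(pop):
        """Vectorized Rosenbrock fitness (rows = particles); this
        population-wide evaluation is the part offloaded to the GPU."""
        x, y = pop[:, :-1], pop[:, 1:]
        return np.sum(100.0 * (y - x ** 2) ** 2 + (1.0 - x) ** 2, axis=1)

    def pso_step(x, v, pbest, pbest_f, w=0.7, c1=1.5, c2=1.5):
        """Canonical CPU-side PSO update (a sketch of the described
        CPU/GPU split, not the article's CUDA implementation)."""
        gbest = pbest[np.argmin(pbest_f)]
        r1, r2 = np.random.rand(*x.shape), np.random.rand(*x.shape)
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
        x = x + v
        f = rosenbrock(x)                 # bulk evaluation ("GPU" part)
        better = f < pbest_f
        pbest[better], pbest_f[better] = x[better], f[better]
        return x, v, pbest, pbest_f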

Miguel Cárdenas-Montes, Miguel A. Vega-Rodríguez, Juan José Rodríguez-Vázquez, Antonio Gómez-Iglesias
Simulation-Based Fitness Landscape Analysis and Optimisation for Vehicle Scheduling Problem

The paper presents a simulation optimisation methodology and tools for the vehicle scheduling problem (VSP) with time windows. The optimisation problem statement is given. Fitness landscape analysis is used to evaluate the hardness of the problem, and a tool for fitness landscape analysis is built. To evaluate the fitness of solutions, a vehicle schedule simulation model is developed in AnyLogic 6, and Java applications generate landscape path solutions and analyse their fitness series. A genetic algorithm is applied for simulation-based vehicle schedule optimisation. The results of the experimental study are described.

Galina Merkuryeva, Vitaly Bolshakov
An Evolutionary Algorithm with Solution Archive for the Generalized Minimum Spanning Tree Problem

We propose a concept of enhancing an evolutionary algorithm (EA) with a complete solution archive that stores evaluated solutions during the optimization in a trie in order to detect duplicates and to efficiently convert them into yet unconsidered solutions. As an application we consider the generalized minimum spanning tree problem where we are given a graph with nodes partitioned into clusters and exactly one node from each cluster must be connected. For this problem there exist two compact solution representations that can be efficiently decoded, and we use them jointly in our EA. The solution archive contains two tries – each is based on one representation, respectively. We show that these two tries complement each other well. Test results on TSPlib instances document the strength of this concept and that it can match up with the leading state-of-the-art metaheuristic approaches from the literature.
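
The archive concept can be sketched for a single fixed-length binary representation: a trie records every evaluated solution, detects duplicates in time linear in the solution length, and converts a duplicate by branching into an unexplored subtree. The paper's version maintains two such tries (one per representation) and a more refined conversion; this is a simplified illustration.

    class ArchiveTrie:
        """Complete solution archive for fixed-length 0/1 encodings
        (simplified sketch of the concept)."""

        def __init__(self):
            self.root = {}

        def insert(self, sol):
            """Store `sol`; return True if it was new, False if it is
            a duplicate of an already evaluated solution."""
            node, new = self.root, False
            for bit in sol:
                if bit not in node:
                    node[bit] = {}
                    new = True
                node = node[bit]
            return new

        def convert(self, sol):
            """Greedily turn a duplicate into an unvisited solution by
            flipping the first bit whose sibling branch is unexplored."""
            node = self.root
            for i, bit in enumerate(sol):
                if (1 - bit) not in node:
                    return sol[:i] + [1 - bit] + sol[i + 1:]
                node = node[bit]
            return sol    # every sibling explored along this path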

Bin Hu, Günther R. Raidl
Variable Neighborhood and Greedy Randomized Adaptive Search for Capacitated Connected Facility Location

The Connected Facility Location problem combining facility location and Steiner trees has recently gained stronger scientific interest as it can be used to model the extension of last mile communication networks in so-called fiber-to-the-curb scenarios. We consider a generalization of this problem which considers capacity constraints on potential facilities and aims at maximizing the resulting profit by potentially supplying only a subset of all customers. In this work, we discuss two metaheuristic approaches for this problem based on variable neighborhood search and greedy randomized adaptive search. Computational results show that both approaches allow for computing high quality solutions in relatively short time.

Markus Leitner, Günther R. Raidl
Effectively Evolving Finite State Machines Compared to Enumeration

We want to answer the question of how effectively finite state machines (FSMs) controlling the behavior of local agents in a multi-agent system with a global task can be evolved by a genetic algorithm (GA). Different variants of the GA were used, varying the mutation techniques and the population size. In order to evaluate the effectiveness of the GA, the optimal behavior is used for comparison; this optimal behavior can be found by a sophisticated enumeration technique. The agents’ global task is to explore an unknown area with obstacles. The number of states of the controlling FSM was restricted to five in order to keep the computation time for the enumeration acceptable. The results show that the GA is reliable and almost as effective as enumeration while being significantly faster.

Patrick Ediger, Rolf Hoffmann, Sylvia Grüner
Heuristic Power Scheduling of Electric Vehicle Battery Charging Based on Discrete Event Simulation

Since the electrification of individual traffic may place a critical load on power grids, methods capable of handling its highly stochastic behaviour have to be investigated. From a power grid’s point of view, forecasting applications are needed for computing optimal power generation schedules that satisfy end-users’ energy needs while considering the installed capacities in the grid. In this paper, an optimization framework is proposed that uses metaheuristic algorithms to find these schedules based on individual traffic simulation using discrete-event methodology. The Evolution Strategy implemented in HeuristicLab is used as the optimization algorithm; the parameterization used and the achieved results are shown.

Stephan Hutterer, Michael Affenzeller, Franz Auinger
Exploring the Accuracy of a Parallel Cooperative Model for Trajectory-Based Metaheuristics

Classical cooperative parallel models for metaheuristics have one major issue when the underlying search method is based on the exploration of the neighborhood of one single solution, i.e., a trajectory-based metaheuristic. Whenever a cooperation step takes place by exchanging solutions, either the incoming or the local solution has to be discarded, because the subalgorithm only works with a single solution. Therefore, important information may be lost. A recent parallel model for trajectory-based metaheuristics has faced this issue by adding a crossover operator aimed at combining valuable information from both the incoming and the local solution. This work further evaluates this parallel model by addressing two well-known, hard optimization problems (MAXSAT and RND) using Simulated Annealing as the search method in each subalgorithm. The results show that the new model is able to outperform the classical cooperative method under the experimental conditions used.

Gabriel Luque, Francisco Luna, Enrique Alba, Sergio Nesmachnow
Combination and Comparison of Different Genetic Encodings for the Vehicle Routing Problem

Unlike for other problems, such as the traveling salesman problem, no widely accepted encodings for the vehicle routing problem have been developed yet. In this work, we examine different encodings and operations for vehicle routing problems. We show how different encodings can be combined in one algorithm run and compare the individual encodings in terms of runtime and solution quality. Based on those results, we perform extensive tests on different benchmark instances and show how the combination of different encodings and operations can be beneficial and provide a balance between solution quality and runtime.

Stefan Vonolfen, Andreas Beham, Michael Affenzeller, Stefan Wagner, Andreas Mayr
Analysis of Selected Evolutionary Algorithms in Feature Selection and Parameter Optimization for Data Based Tumor Marker Modeling

In this paper we report on the use of evolutionary algorithms for optimizing the identification of classification models for selected tumor markers. Our goal is to identify mathematical models that can be used for classifying tumor marker values as normal or as elevated; evolutionary algorithms are used for optimizing the parameters for learning classification models. The sets of variables used as well as the parameter settings for concrete modeling methods are optimized using evolution strategies and genetic algorithms. The performance of these algorithms is analyzed as well as the population diversity progress. In the empirical part of this paper we document modeling results achieved for the tumor markers CA 125 and CYFRA using a medical data base provided by the Central Laboratory of the General Hospital Linz; empirical tests are executed using HeuristicLab.

Stephan M. Winkler, Michael Affenzeller, Gabriel Kronberger, Michael Kommenda, Stefan Wagner, Witold Jacak, Herbert Stekel
Neural Networks Based System for Cancer Diagnosis Support

The paper presents an analysis of two different approaches for a system to support cancer diagnosis. The first one uses only tumor marker data containing missing values to predict cancer occurrence, and the second one also includes standard blood parameters. Both systems are based on several heterogeneous artificial neural networks for estimating missing values of tumor markers, and they finally calculate possibilities of different tumor diseases.

Witold Jacak, Karin Pröll
A Memetic Algorithm and a Solution Archive for the Rooted Delay-Constrained Minimum Spanning Tree Problem

We present a memetic algorithm for a combinatorial optimization problem called rooted delay-constrained minimum spanning tree problem arising for example in centralized broadcasting networks where quality of service constraints are of concern. The memetic algorithm is based on a specialized solution representation and a simple and effective decoding mechanism. Solutions are locally improved by a variable neighborhood descent in two neighborhood structures. Furthermore, to tackle the problem of repeated examination of already visited solutions we investigate a simple hash-based method to only detect duplicates or, alternatively, a trie-based complete solution archive to additionally derive new unvisited solutions. Experimental results show that our memetic algorithm outperforms existing heuristic approaches for this problem in most cases. Including the hash-based duplicate detection mostly further improves solution quality whereas the solution archive can only rarely obtain better results due to its operational overhead.

Mario Ruthmair, Günther R. Raidl
Effects of Data Grouping on Calibration Measures of Classifiers

The calibration of a probabilistic classifier refers to the extent to which its probability estimates match the true class membership probabilities. Measuring the calibration of a classifier usually relies on performing chi-squared goodness-of-fit tests between grouped probabilities and the observations in these groups.

We considered alternatives to the Hosmer-Lemeshow test, the standard chi-squared test with groups based on sorted model outputs. Since this grouping does not represent “natural” groupings in data space, we investigated a chi-squared test with grouping strategies in data space. Using a series of artificial data sets for which the correct models are known, and one real-world data set, we analyzed the performance of the Pigeon-Heyse test with groupings by self-organizing maps, k-means clustering, and random assignment of points to groups. We observed that the Pigeon-Heyse test offers slightly better performance than the Hosmer-Lemeshow test while being able to locate regions of poor calibration in data space.
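
For concreteness, the Hosmer-Lemeshow statistic that serves as the baseline here can be computed as follows, assuming NumPy arrays and the usual g - 2 degrees of freedom; the data-space groupings studied in the paper would simply supply different group indices.

    import numpy as np
    from scipy.stats import chi2

    def hosmer_lemeshow(p, y, groups=10):
        """HL test: group cases by sorted predicted probability and
        compare observed vs. expected positives per group."""
        p, y = np.asarray(p, dtype=float), np.asarray(y)
        stat = 0.0
        for idx in np.array_split(np.argsort(p), groups):
            obs = y[idx].sum()            # observed positives
            exp = p[idx].sum()            # expected positives
            n = len(idx)
            stat += (obs - exp) ** 2 / (exp * (1 - exp / n))
        return stat, chi2.sf(stat, groups - 2)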

Stephan Dreiseitl, Melanie Osl
Parameter Meta-optimization of Metaheuristic Optimization Algorithms

The quality of a heuristic optimization algorithm is strongly dependent on its parameter values. Finding the optimal parameter values is a laborious task which requires expertise and knowledge about the algorithm, its parameters and the problem. This paper describes how the optimization of parameters can be automated by using another optimization algorithm on a meta-level. To demonstrate this, a meta-optimization problem which is algorithm-independent and allows any kind of algorithm on the meta- and base-level is implemented for the open source optimization environment HeuristicLab. Experimental results of the optimization of a genetic algorithm for different sets of base-level problems with different complexities are shown.

Christoph Neumüller, Stefan Wagner, Gabriel Kronberger, Michael Affenzeller
Systolic Optimization on GPU Platforms

The article presents a systolic algorithm implemented using NVIDIA’s Compute Unified Device Architecture (CUDA). The algorithm works as a general disposition of the elements in a mesh, synchronously computing basic solutions among processing elements. We have used instances of the Subset Sum Problem to study the behavior of the proposed model. The experimental results show that the approach is very efficient, especially for large problem instances, and consumes less time than other algorithms such as parallel Genetic Algorithms and Random Search.

Enrique Alba, Pablo Vidal
Applying Heuristic Approaches for Predicting Defect-Prone Software Components

Effective and efficient quality assurance has to focus on those parts of a software system that are most likely to fail. Defect prediction promises to indicate the defect-prone components of a software system. In this paper we investigate the viability of predicting defect-prone components in upcoming releases of a large industrial software system. Prediction models constructed with heuristic machine learning are used to classify the components of future versions of the software system as defective or defect-free. It could be shown that the accuracy of the predictions made for the next version is significantly higher (around 74%) than guessing even when taking only new or modified components into account. Furthermore, the results reveal that, depending on the specific prediction model, acceptable accuracy can be achieved for up to three versions in the future.

Rudolf Ramler, Thomas Natschläger
Improved Packing and Routing of Vehicles with Compartments

We present a variable neighborhood search for the vehicle routing problem with compartments in which we incorporate some features specifically aimed at the packing aspect. Among them, we use a measure to distinguish packings and favor solutions with a denser packing, propose new neighborhood structures for shaking, and employ best-fit and best-fit-decreasing methods for inserting orders. Our approach yields encouraging results on a large set of test instances, obtaining new best known solutions for almost two thirds of them.

Sandro Pirkwieser, Günther R. Raidl, Jens Gottlieb
Application of Symbolic Regression on Blast Furnace and Temper Mill Datasets

This work concentrates on three different modifications of a genetic programming system for symbolic regression analysis. The coefficient of correlation R² is used as fitness function instead of the mean squared error, and offspring selection is used to ensure a steady improvement of the achieved solutions. Additionally, as the fitness evaluation consumes most of the execution time, the generated solutions are evaluated only on parts of the training data to speed up the whole algorithm. These three algorithmic adaptations are incorporated in the symbolic regression algorithm and their impact is tested on two real world datasets describing a blast furnace and a temper mill process. The effect on the achieved solution quality as well as on the produced models is compared to results generated by a symbolic regression algorithm without the mentioned modifications, and the benefits are highlighted.
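
The first modification is simple to state in code: the fitness of a candidate model is the squared correlation between its estimates and the targets, which, unlike the mean squared error, is invariant to linear scaling. A sketch, with the subsampling speed-up included as an illustrative second function (names and the 20% fraction are assumptions, not from the paper):

    import numpy as np

    def r_squared(estimated, observed):
        """Squared Pearson correlation used as GP fitness."""
        c = np.corrcoef(estimated, observed)[0, 1]
        return c * c

    def subset_fitness(model, X, y, fraction=0.2, rng=None):
        """Evaluate a candidate on a random part of the training data
        only, mirroring the paper's speed-up (fraction illustrative)."""
        rng = rng or np.random.default_rng()
        idx = rng.choice(len(y), size=max(2, int(fraction * len(y))),
                         replace=False)
        return r_squared(model(X[idx]), y[idx])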

Michael Kommenda, Gabriel Kronberger, Christoph Feilmayr, Leonhard Schickmair, Michael Affenzeller, Stephan M. Winkler, Stefan Wagner
Analysis of Single-Objective and Multi-Objective Evolutionary Algorithms in Keyword Cluster Optimization

As it is not trivial to cope with the fast-growing number of papers published in the fields of medicine and biology, intelligent search strategies are needed to access the required information as quickly and accurately as possible. In [5] we proposed a method for keyword clustering as a first step towards an intelligent search strategy in biomedical information retrieval. In this paper we focus on the analysis of the internal dynamics of the evolutionary algorithms applied here, using solution-encoding-specific population diversity analysis, which is also defined in this paper. The population diversity results obtained using evolution strategies, genetic algorithms, genetic algorithms with offspring selection, and also a multi-objective approach, the NSGA-II, are discussed here. We see that the diversity of the populations is preserved over the generations, decreasing towards the end of the runs, which indicates a good performance of the selection process.

Viktoria Dorfer, Stephan M. Winkler, Thomas Kern, Gerald Petz, Patrizia Faschang
A Heuristic Scheduling and Resource Management System for Solving Bioinformatical Problems via High Performance Computing on Heterogeneous Multi-platform Hardware

To process the data available in Bioinformatics, High Performance Computing is required. To efficiently calculate the necessary data, the computational tasks need to be scheduled and maintained. We propose a method for predicting runtimes in a heterogeneous high performance computing environment, as well as scheduling methods for the execution of high performance tasks. The heuristic method used is a feedforward artificial neural network, which utilizes a collected history of real-life data to predict and schedule upcoming jobs.

Andreas Hölzlwimmer, Hannes Brandstätter-Müller, Bahram Parsapour, Gerald Lirk, Peter Kulczycki
Comprehensive and Automatic Fitness Landscape Analysis Using HeuristicLab

Many different techniques have been developed for fitness landscape analysis. We present a consolidated and uniform implementation inside the heuristic optimization platform HeuristicLab. On top of these analysis methods a new approach to empirical measurement of isotropy is presented that can be used to extend existing methods. Results are shown using the existing implementation within HeuristicLab 3.3.

Erik Pitzer, Michael Affenzeller, Andreas Beham, Stefan Wagner
Particle Swarm Optimization with Two Swarms for the Discrete (r|p)-Centroid Problem

The (r|p)-centroid problem is a competitive location problem that consists of determining optimal strategies for two competing firms, the leader and the follower, which make decisions sequentially. We propose a particle swarm optimization procedure with two swarms to solve the discrete (r|p)-centroid problem. The proposed algorithm considers a swarm for each player in the location game.

Clara Campos-Rodríguez, José A. Moreno-Pérez, Dolores R. Santos-Peñate
ACO-GRASP-VNS Metaheuristic for VRP with Fuzzy Windows Time Constraints

In this article we propose a methodological approach based on Soft Computing to obtain solutions to the vehicle routing problem when time window constraints are imprecise and flexible. A fuzzy model and method are obtained by applying a Fuzzy Optimization approach. A new hybrid algorithm, combining Ant Colony Optimization, Variable Neighbourhood Search and a Greedy Randomized Adaptive Search Procedure for the corresponding fuzzy optimization problem, is presented and applied to real problem instances of a distribution company.
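
The fuzzy treatment of a time window can be illustrated with a common trapezoidal membership function: arrivals inside the window satisfy it fully, while arrivals within a tolerance band satisfy it to a degree that the fuzzy optimization then trades off against cost. The concrete shape used in the paper may differ; this is an illustrative model.

    def window_membership(arrival, earliest, latest, tolerance):
        """Degree in [0, 1] to which an arrival time satisfies a soft
        time window (linear decay over the tolerance band)."""
        if earliest <= arrival <= latest:
            return 1.0
        slack = (earliest - arrival) if arrival < earliest else (arrival - latest)
        return max(0.0, 1.0 - slack / tolerance)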

J. Brito, F. J. Martínez, José A. Moreno-Pérez, J. L. Verdegay
Using Statistical Tests for Improving State-of-the-Art Heuristics for the Probabilistic Traveling Salesman Problem with Deadlines

The Probabilistic Traveling Salesman Problem with Deadlines (PTSPD) is a Stochastic Vehicle Routing Problem with a computationally demanding objective function. Currently, heuristics using an approximation of the objective function based on Monte Carlo sampling are the state-of-the-art methods for the PTSPD. We show that those heuristics can be significantly improved by using statistical tests in combination with the sampling-based evaluation of solutions for the pairwise comparison of solutions.
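
One way to realize such a comparison, assuming both solutions are evaluated on the same set of sampled scenarios: a paired test on the per-scenario costs decides whether an apparent improvement is statistically meaningful before a move is accepted. The concrete test and significance level used in the paper may differ.

    from scipy.stats import ttest_rel

    def accept_move(costs_current, costs_candidate, alpha=0.05):
        """Accept the candidate only if its mean sampled cost is lower
        and the paired difference is significant (illustrative rule)."""
        t, p = ttest_rel(costs_current, costs_candidate)
        return p < alpha and sum(costs_candidate) < sum(costs_current)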

Dennis Weyland, Roberto Montemanni, Luca Maria Gambardella
Solving the Two-Dimensional Bin-Packing Problem with Variable Bin Sizes by Greedy Randomized Adaptive Search Procedures and Variable Neighborhood Search

In this work we present new metaheuristic algorithms for a special variant of the two-dimensional bin-packing, or cutting-stock, problem, where a given set of rectangular items (demand) must be produced out of heterogeneous stock items (bins). The items can optionally be rotated; guillotine-cuttable and free layouts are considered. The proposed algorithms use various packing heuristics which are embedded in a greedy randomized adaptive search procedure (GRASP) and a variable neighborhood search (VNS) framework. Our results for existing benchmark instances show the superior performance of our algorithms, in particular the VNS, with respect to previous approaches.

Andreas M. Chwatal, Sandro Pirkwieser
Market Basket Analysis of Retail Data: Supervised Learning Approach

In this work we discuss a supervised learning approach for identification of frequent itemsets and association rules from transactional data. This task is typically encountered in market basket analysis, where the goal is to find subsets of products that are frequently purchased in combination.

In this work we compare the traditional approach and the supervised learning approach to finding association rules in a real-world retail data set using two well-known algorithms, namely Apriori and PRIM.
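
As a reminder of the traditional baseline, the level-wise Apriori search can be sketched compactly; PRIM and the supervised formulation compared in the paper are not shown, and the example data is illustrative.

    def frequent_itemsets(transactions, min_support):
        """Plain Apriori: grow frequent itemsets level by level,
        pruning by relative support. `transactions` are frozensets."""
        def support(itemset):
            return sum(itemset <= t for t in transactions) / len(transactions)

        items = {frozenset([i]) for t in transactions for i in t}
        level = {s for s in items if support(s) >= min_support}
        frequent = set(level)
        while level:
            size = len(next(iter(level))) + 1
            candidates = {a | b for a in level for b in level
                          if len(a | b) == size}
            level = {c for c in candidates if support(c) >= min_support}
            frequent |= level
        return frequent

    baskets = [frozenset(b) for b in
               [{"milk", "bread"}, {"milk", "beer"},
                {"milk", "bread", "beer"}]]
    print(frequent_itemsets(baskets, 0.5))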

Gabriel Kronberger, Michael Affenzeller

Computer Aided Systems Optimization

A Flexible and Reliable Radar Simulator in Matlab OOP for Optimizing Tracking Algorithms

In recent years, many techniques and algorithms have been developed to detect small and slow targets in radar systems, but quantitative measures to evaluate their performance and to optimize algorithm parameters are very hard to provide. Therefore a flexible and reliable Radar Simulator has been implemented in Matlab OOP to provide datasets which will be used to alleviate these problems. The main feature of the proposed simulator is that it combines real clutter data with artificial target trajectories.

Andreas Weiss
Frequency Estimation beyond Nyquist Using Sparse Approximation Methods

In this work, Sparse Approximation (SA) methods for the frequency estimation of complex exponentials in white Gaussian noise are evaluated and compared against classical frequency estimation approaches. We use a non-equidistant sampling scheme which allows reconstructing frequencies far beyond the Nyquist rate. The evaluation is done for signals composed of one single complex exponential or the sum of two complex exponentials. We show that for the latter case the SA methods outperform the classical approaches. Especially when only a small number of signal samples is available, the performance gain becomes significant.
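
A minimal sketch of the sparse-approximation idea: greedy Orthogonal Matching Pursuit over a dictionary of complex exponentials evaluated at the non-equidistant sample times. The frequency grid, the fixed number of atoms and all names are illustrative; the SA methods evaluated in the paper are more involved.

    import numpy as np

    def omp_frequencies(t, y, freq_grid, n_freqs):
        """Pick `n_freqs` frequencies by OMP: repeatedly choose the
        exponential best correlated with the residual, then re-fit.
        `t`, `y`, `freq_grid` are NumPy arrays."""
        A = np.exp(2j * np.pi * np.outer(t, freq_grid))    # dictionary
        residual, chosen = y.astype(complex), []
        for _ in range(n_freqs):
            k = int(np.argmax(np.abs(A.conj().T @ residual)))
            chosen.append(k)
            sub = A[:, chosen]
            coef, *_ = np.linalg.lstsq(sub, y, rcond=None)
            residual = y - sub @ coef
        return freq_grid[chosen]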

Alexander Onic, Mario Huemer
Refinement of Simulation Models for Point-of-Load DC-DC Converters to Enable Accurate Simulation-Based Compensator Design

The starting point for each controller design exercise is a reliable and accurate model of the system under investigation. In this paper, an enhanced model of a Switched-Mode power supply for the application of Point-of-Load conversion is presented. After identifying the critical parameters, they are included in a refined model. Subsequently, the model parameters are verified against experimental results. We will show that the derived model accurately reflects the real system and is a reliable starting point for controller design tasks and optimizations.

Robert Priewasser, Matteo Agostinelli, Stefano Marsili
Optimized Filter Design for a Filter Bank Based Blocker Detection Concept for LTE Systems

For mobile communication systems, power efficiency is a very important issue. Especially for mobile user equipment, careful management and efficient use of the limited energy resources are mandatory. In today’s user equipment quite an amount of energy is wasted. The reason for this is that the analog and digital frontends in communication systems are engineered for extracting the wanted signal from a spectral environment which is defined in the corresponding communication standards with strict requirements. In a real receiving process those requirements can typically be considered less critical. Sensing the environmental transmission conditions and adapting the receiver architecture to the actual needs allows saving energy during the receiving process. An efficient architecture able to fulfill this task for a typical Long Term Evolution scenario has been discussed recently. For the implementation of this architecture, highly efficient filter approaches had to be investigated. This paper gives an overview of the basic properties of those approaches and compares them to well-known filter types.

Thomas Schlechter
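The blocker-sensing idea can be illustrated with a simple uniform DFT filter bank that measures per-subband energy and flags strong interferers. A minimal sketch, not the paper's optimized filters; the sample rate and signal frequencies are illustrative assumptions:

```python
import numpy as np

fs, n_fft = 30.72e6, 1024            # LTE-like sample rate (illustrative)
t = np.arange(8 * n_fft) / fs

# Wanted signal at 1 MHz plus a strong blocker at 9 MHz.
x = np.exp(2j * np.pi * 1e6 * t) + 10 * np.exp(2j * np.pi * 9e6 * t)

# Uniform DFT filter bank: windowed, averaged periodograms per subband.
frames = x.reshape(-1, n_fft) * np.hanning(n_fft)
power = np.mean(np.abs(np.fft.fft(frames, axis=1)) ** 2, axis=0)

# Flag subbands whose energy sticks out far above the median floor.
threshold = 100 * np.median(power)
flagged = np.fft.fftfreq(n_fft, 1 / fs)[power > threshold]
print("subbands flagged as strong signals (Hz):", flagged)
```

A receiver could then relax its filtering requirements whenever no blocker is flagged, which is where the energy saving comes from.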
Clustering and Data Aggregation as Factors of Wireless Sensor Network Lifetime

Wireless Sensor Networks (WSNs) are of great interest to the scientific community. Having many potential applications, they also present many research problems concerning the radio channel and collision avoidance, network management and self-organization, routing, and node energy preservation. This work focuses on determining how WSN lifetime depends on network clustering and aggregation algorithms. We show that the high inherent variability of WSN lifetime is caused by the randomness of node deployment.

Bartosz Wojciechowski, Maciej Nikodem, Tomasz Surmacz
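The lifetime-variability claim lends itself to a quick Monte Carlo experiment. The sketch below uses a toy LEACH-like energy model with random cluster heads; it is not the paper's model, and all energy constants are invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(2)

def lifetime(n_nodes=100, n_clusters=10, agg_factor=0.1):
    """Rounds until the first node dies, under random deployment,
    random cluster-head rotation, and in-cluster data aggregation."""
    pos = rng.uniform(0, 100, (n_nodes, 2))       # random deployment
    energy = np.full(n_nodes, 1.0)
    rounds = 0
    while energy.min() > 0:
        heads = rng.choice(n_nodes, n_clusters, replace=False)
        d = np.linalg.norm(pos[:, None] - pos[heads][None], axis=2)
        energy -= 1e-3 * d.min(axis=1)            # member -> nearest head
        energy[heads] -= agg_factor * 1e-2 * n_nodes / n_clusters
        rounds += 1
    return rounds

samples = [lifetime() for _ in range(20)]
print("mean lifetime:", np.mean(samples), "rounds; std:", np.std(samples))
```

Re-running with different deployments (different seeds) exposes the lifetime spread that the abstract attributes to deployment randomness.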
Synthesis of Logic Circuits Based on Negative Differential Resistance Property

This paper deals with negative differential resistance (NDR) and its application to the construction of threshold gates. We present an evaluation of two synthesis algorithms for Generalised Threshold Gates and formulate properties and general steps of synthesis for Multi-Threshold Threshold Gates.

Marek A. Bawiec, Bartosz Wojciechowski, Maciej Nikodem, Janusz Biernat
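Functionally, a multi-threshold threshold gate outputs 1 whenever the weighted input sum falls into one of the "on" intervals delimited by its thresholds. A behavioural Python sketch of this abstraction (not the NDR circuit-level synthesis from the paper):

```python
def mttg(weights, thresholds, inputs):
    """Multi-threshold threshold gate: intervals between successive
    thresholds alternate off/on/off/on as the weighted sum grows."""
    s = sum(w * x for w, x in zip(weights, inputs))
    level = sum(s >= t for t in sorted(thresholds))
    return level % 2

# A 2-input XOR realised with weights (1, 1) and thresholds (1, 2):
# the gate is 'on' only for a weighted sum in [1, 2).
for a in (0, 1):
    for b in (0, 1):
        print(a, b, "->", mttg([1, 1], [1, 2], [a, b]))
```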
Simulation Based Optimization of Signal Processing for RFID

Over recent years, an increasing use of near field radio frequency identification (RFID) technologies could be observed. In a near field RFID system, data is transmitted between an RFID reader and an inductively coupled RFID tag. The reader outputs a magnetic field that is used for the power supply of the RFID tag as well as for data transmission. In this work we focus on tag-to-reader communication by a method called load modulation: data is transmitted by switching a resistor at the tag, leading to different loads on the magnetic field. This modulates a voltage level in the reader circuit that is detected and further processed at the reader. When the data transmission speed is increased, inter-symbol interference (ISI) occurs, and equalization algorithms are used to counteract it. Designing such algorithms requires a fast and precise simulation model. This work presents a novel simulation model for tag-to-reader communication. We show performance results of this simulation model, which achieves a significant improvement in simulation speed compared to a traditionally used circuit simulator, as well as performance results of an equalization algorithm developed using this model.

Michael Lunglmayr, Mario Huemer
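The signal chain described above (load modulation, ISI, equalization) can be mocked up at baseband in a few lines. This is a hedged behavioural sketch, not the paper's simulation model; the channel taps, amplitude levels, and noise level are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(3)

# Tag switches a load resistor -> two amplitude levels at the reader coil.
bits = rng.integers(0, 2, 200)
sym = 1.0 + 0.2 * bits                    # unloaded vs. loaded level

# Inductive coupling smears symbols: simple FIR channel models the ISI.
h = np.array([0.6, 0.3, 0.1])             # illustrative impulse response
rx = np.convolve(sym, h)[: len(sym)] + 0.01 * rng.standard_normal(len(sym))

# Zero-forcing FIR equalizer: least-squares inverse of the channel.
n_taps = 8
H = np.zeros((n_taps + len(h) - 1, n_taps))
for i in range(n_taps):
    H[i:i + len(h), i] = h                # columns are shifted copies of h
target = np.zeros(n_taps + len(h) - 1)
target[0] = 1.0                           # aim for conv(h, w) ~ unit impulse
w = np.linalg.lstsq(H, target, rcond=None)[0]

eq = np.convolve(rx, w)[: len(sym)]
decided = (eq > 1.1).astype(int)          # threshold between the two levels
print("bit errors:", int(np.sum(decided != bits)))
```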

Model-Based System Design, Simulation, and Verification

A Uniform Classification of Common Concurrency Errors

Nowadays, multi-threaded programs are quite common, and so are concurrency-related errors. Many works devoted to the detection of concurrency errors have been published in recent years, and many of them present definitions of the concurrency errors that the proposed algorithms are able to handle. These definitions are usually expressed in terms suited to describing the particular algorithm considered, and they surprisingly often differ from each other in the meaning they assign to particular errors. To help in understanding these errors and in developing techniques for detecting them, this paper strives to provide a uniform taxonomy of the concurrency errors common in current programs, with a stress on those written in Java, together with a brief overview of the techniques proposed so far for detecting such errors.

Jan Fiedor, Bohuslav Křena, Zdeněk Letko, Tomáš Vojnar
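One of the error classes such a taxonomy covers is the lost update caused by an unsynchronized read-modify-write. The paper focuses on Java; here is the same pattern in a small Python demonstration (not taken from the paper):

```python
import threading

# Classic atomicity violation: `counter += 1` is a read-modify-write,
# so concurrent increments can interleave and updates get lost.
counter = 0

def worker(n):
    global counter
    for _ in range(n):
        counter += 1          # unsynchronized update -> lost-update error

threads = [threading.Thread(target=worker, args=(100_000,)) for _ in range(4)]
for t in threads:
    t.start()
for t in threads:
    t.join()
# Often prints less than 400000; guarding the update with a
# threading.Lock() restores the expected result.
print("expected 400000, got", counter)
```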
An Easy to Use Infrastructure for Building Static Analysis Tools

This paper deals with the design and implementation of an easy to use infrastructure for building static analyzers. The infrastructure provides an abstraction layer called Code Listener over existing source code parsers such as GCC or Sparse. It is distributed as a C++ library that can be used to build static analyzers in the form of GCC plug-ins. The interface exposed to analyzers is, however, completely independent of GCC, which allows one to run the same analyzer on top of different code parsers without the need to change anything in the analyzer. We describe the key design principles of the infrastructure and briefly introduce the application programming interface that it makes available to analyzers. The infrastructure is already used in the research prototypes Predator and Forester, which implement advanced shape analyses intended to operate on real industrial code.

Kamil Dudka, Petr Peringer, Tomáš Vojnar
Choice of Directions for the Approximation of Reachable Sets for Hybrid Systems

In this paper we propose an approach to over-approximate the reachable set (with bounded time and number of transitions) of a hybrid system by a finite set of polytopes. The constraints of the polytopes are determined by a direction choice method. For hybrid systems whose (1) continuous dynamics are linear, (2) invariants and guards are defined by linear inequalities, and (3) variable resets are expressed by invertible affine maps, we show that the over-approximations can be computed in polynomial time, and the overestimation can be arbitrarily reduced by decreasing the discretization time step if the continuous dynamics are all deterministic. Some experimental results are also presented to show the effectiveness of our approach.

Xin Chen, Erika Ábrahám
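The template-polytope idea can be illustrated for purely continuous linear dynamics: for each chosen direction, bound the support of the flow of a box of initial states over sampled times. A coarse Python sketch (not the paper's algorithm, which adds transitions and sound bloating between time samples; the dynamics and horizon are invented):

```python
import numpy as np
from scipy.linalg import expm

# Linear dynamics x' = Ax from a box of initial states.
A = np.array([[0.0, 1.0], [-2.0, -0.5]])
X0 = np.array([[0.9, 1.1], [-0.1, 0.1]])     # intervals for x1, x2
corners = np.array([[a, b] for a in X0[0] for b in X0[1]])

# Template directions: the paper's contribution is choosing these well.
dirs = [np.array([np.cos(th), np.sin(th)])
        for th in np.linspace(0, 2 * np.pi, 8, endpoint=False)]

bounds = []
for d in dirs:
    best = -np.inf
    for t in np.linspace(0.0, 2.0, 201):     # sampled time horizon
        Phi = expm(A * t)                    # linear map of the box:
        best = max(best, max(d @ (Phi @ c) for c in corners))
    bounds.append(best)                      # support maximized at corners

for d, b in zip(dirs, bounds):
    print(f"max {d.round(2)} . x <= {b:.3f}")   # one polytope constraint each
```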
Unfoldings of Bounded Hybrid Petri Nets

The unfolding is a useful partial-order-based method for the analysis and verification of Petri net properties. This technique copes well with the so-called state space explosion problem, especially for Petri nets with a high degree of concurrency. The paper formalizes the concept of the unfolding for bounded hybrid Petri nets and introduces an algorithm for computing it.

Petr Novosad, Milan Češka
State Encoding and Minimization Methodology for Self-Checking Sequential Machines

A state encoding methodology and a minimization procedure for the design of reliable digital circuits, namely Totally Self-Checking Sequential Machines, are considered in this article. We limit our considerations to Mealy sequential circuits whose inputs, internal states, and outputs are encoded with an arbitrary unordered code.

Agata Brzozowska, Jerzy Greblicki, Jerzy Kotowski
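Unordered codes, such as constant-weight m-out-of-n codes, detect all unidirectional errors: flipping only 0-to-1 or only 1-to-0 bits necessarily changes the codeword weight. A small Python sketch of an m-out-of-n membership check (illustrative, not the paper's encoding procedure):

```python
from itertools import combinations

def codewords(m, n):
    """All n-bit words of weight exactly m (an m-out-of-n code)."""
    return {tuple(1 if i in ones else 0 for i in range(n))
            for ones in combinations(range(n), m)}

CODE = codewords(2, 4)            # six codewords, weight exactly 2

def check(word):
    """A self-checking circuit would raise an error signal on False."""
    return tuple(word) in CODE

print(check((1, 0, 1, 0)))        # True: valid state encoding
print(check((1, 1, 1, 0)))        # False: unidirectional 0->1 error caught
```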
A Novel Approach to Modechart Verification of Real-Time Systems

Because real-time systems are often time-critical applications whose failure can have fatal consequences, it is important to ensure their correct behaviour. Many approaches exist for the verification of real-time systems. Some use graphical formalisms, others various kinds of logics, to describe the system being verified. While a graphical description can be significantly easier to use, it prevents the use of many powerful methods for analysis and verification. In this paper, we propose a new approach to the verification of real-time systems described in the Modechart graphical formalism, transforming the computation of the system into a set of restricted real-time logic (RRTL) formulae. Moreover, if the verified property is known in advance, we are able to reduce the number of resulting RRTL formulae.

Jan Fiedor, Marek Gach, Milan Češka
Cloud Computing in Educational Applications: Methods of Virtual Desktops Deployment

Task scheduling is one of the main parts of Cloud Computing (CC). We understand this process as mapping users' tasks to appropriate resources for execution. Since virtual machines (VMs) are used to hide the physical resources from the user, the critical problem is how to schedule these machines (subtasks) on the given resources (i.e., host machines). Task scheduling algorithms for CC systems are discussed in the literature [2][4][5]. Most authors use a mathematical model that does not consider many of the problems of desktop virtualization. In this paper we consider a new model for the task scheduling of VM deployment for business and educational purposes. We also introduce a genetic algorithm [1] for task scheduling in CC.

Agata Brzozowska, Jerzy Greblicki, Jerzy Kotowski
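The shape of a genetic algorithm for VM-to-host placement can be sketched quickly: a chromosome assigns each VM to a host, and the fitness penalizes overloaded hosts. A minimal Python sketch under invented demands and capacities (not the paper's algorithm or model):

```python
import random

random.seed(4)

VMS = [2, 4, 1, 3, 2, 5, 1, 2]          # VM CPU demands (illustrative)
HOSTS = [8, 8, 8]                        # host CPU capacities (illustrative)

def fitness(assign):
    """Penalize overloaded hosts heavily, reward using few hosts."""
    load = [0] * len(HOSTS)
    for vm, h in zip(VMS, assign):
        load[h] += vm
    over = sum(max(0, l - c) for l, c in zip(load, HOSTS))
    used = sum(l > 0 for l in load)
    return -(10 * over + used)

def ga(pop_size=40, gens=100):
    pop = [[random.randrange(len(HOSTS)) for _ in VMS]
           for _ in range(pop_size)]
    for _ in range(gens):
        pop.sort(key=fitness, reverse=True)
        parents = pop[: pop_size // 2]            # truncation selection
        children = []
        while len(children) < pop_size - len(parents):
            a, b = random.sample(parents, 2)
            cut = random.randrange(1, len(VMS))
            child = a[:cut] + b[cut:]             # one-point crossover
            if random.random() < 0.2:             # mutation
                child[random.randrange(len(VMS))] = random.randrange(len(HOSTS))
            children.append(child)
        pop = parents + children
    return max(pop, key=fitness)

best = ga()
print("assignment:", best, "fitness:", fitness(best))
```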

Computer Vision and Image Processing

Monocular Vision-Based Target Detection on Dynamic Transport Infrastructures

This paper describes a target detection system for transport infrastructures based on monocular vision. The goal is to detect and track vehicles and pedestrians while dealing with object variability, different illumination conditions, shadows, occlusions, and rotations. A background subtraction method based on GMM, together with shadow detection algorithms, is proposed for image segmentation. Finally, feature extraction, optical flow analysis, and clustering methods are used for the tracking step. The algorithm requires no object model or prior knowledge, and it is robust to illumination changes and shadows.

S. Álvarez, M. A. Sotelo, D. F. Llorca, R. Quintero, O. Marcos
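OpenCV ships a GMM background subtractor with built-in shadow detection (MOG2), which makes a reasonable stand-in for the segmentation stage described above; the paper's own pipeline may differ, and the video filename here is hypothetical:

```python
import cv2

cap = cv2.VideoCapture("traffic.mp4")              # hypothetical input video
subtractor = cv2.createBackgroundSubtractorMOG2(detectShadows=True)

while True:
    ok, frame = cap.read()
    if not ok:
        break
    mask = subtractor.apply(frame)                 # 255 = foreground, 127 = shadow
    fg = (mask == 255).astype("uint8") * 255       # drop shadow pixels
    cv2.imshow("foreground", fg)
    if cv2.waitKey(30) == 27:                      # Esc to quit
        break
cap.release()
```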
Precise Segmentation of the Optic Disc in Retinal Fundus Images

This paper presents a methodology for the precise extraction of the optic disc in retinal images. It combines an enhancement algorithm, a fast location method and, finally, a segmentation technique to fit the optic disc boundaries. At each stage, we analyze several techniques and present the results obtained on a public image data set.

A. Fraga, N. Barreira, M. Ortega, M. G. Penedo, M. J. Carreira
Speeding Up a Chaos-Based Image Encryption Algorithm Using GPGPU

Traditional ciphers like RSA, AES, etc. have proven to be slow at enciphering multimedia information compared to chaos-based ones. For this reason, and taking advantage of the availability of GPGPU, a chaotic image encryption algorithm has been ported to the GPU using the CUDA programming framework, in order to assess its performance and find out whether this kind of accelerator provides a more suitable platform than the CPU for this sort of problem.

Juan José Rodríguez-Vázquez, Sixto Romero-Sánchez, Miguel Cárdenas-Montes
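The essence of a chaos-based stream cipher is a keystream generated by a chaotic map. A toy CPU-side Python sketch using the logistic map follows; the paper's cipher and its CUDA port are more elaborate, and the parameters here are illustrative:

```python
import numpy as np

def logistic_keystream(n, x0=0.3141592, r=3.99):
    """Chaotic keystream from the logistic map x <- r*x*(1-x),
    with each iterate quantized to one byte (toy scheme)."""
    ks = np.empty(n, dtype=np.uint8)
    x = x0
    for i in range(n):
        x = r * x * (1.0 - x)
        ks[i] = int(x * 256) % 256
    return ks

image = np.random.default_rng(5).integers(0, 256, (64, 64), dtype=np.uint8)
ks = logistic_keystream(image.size).reshape(image.shape)
cipher = image ^ ks          # encryption: XOR with the chaotic keystream
plain = cipher ^ ks          # decryption is the same operation
assert (plain == image).all()
```

The inner loop is embarrassingly parallel across independent keystream segments, which is what makes a CUDA port attractive.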
Surface Classification for Road Distress Detection System Enhancement

This paper presents vision-based road surface classification in the context of infrastructure inspection and maintenance, proposed as a stage for improving the performance of a distress detection system. High-resolution road images are processed to distinguish among surfaces arranged according to the different materials used to build roads and their degree of granulation and striation. A multi-class Support Vector Machine (SVM) classification system using mainly Local Binary Pattern (LBP), Gray-Level Co-occurrence Matrix (GLCM), and Maximally Stable Extremal Regions (MSER) derived features is described. The different texture analysis methods are compared in terms of accuracy and computational load. Experiments with real application images show a significant improvement in the distress detection system's performance when several feature extraction methods are combined.

M. Gavilán, D. Balcones, M. A. Sotelo, D. F. Llorca, O. Marcos, C. Fernández, I. García, R. Quintero
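The LBP-plus-SVM portion of such a pipeline can be sketched with scikit-image and scikit-learn; the synthetic "smooth" and "coarse" patches below merely stand in for labelled road-surface images, and the feature setup is not the paper's exact configuration:

```python
import numpy as np
from skimage.feature import local_binary_pattern
from sklearn.svm import SVC

def lbp_histogram(img, P=8, R=1.0):
    """Uniform LBP histogram as a texture descriptor for a road patch."""
    lbp = local_binary_pattern(img, P, R, method="uniform")
    hist, _ = np.histogram(lbp, bins=P + 2, range=(0, P + 2), density=True)
    return hist

rng = np.random.default_rng(6)
# Stand-ins for labelled road-surface patches (low vs. high texture).
smooth = [rng.normal(128, 5, (64, 64)).clip(0, 255).astype(np.uint8)
          for _ in range(20)]
coarse = [rng.normal(128, 40, (64, 64)).clip(0, 255).astype(np.uint8)
          for _ in range(20)]
X = np.array([lbp_histogram(p) for p in smooth + coarse])
y = np.array([0] * 20 + [1] * 20)

clf = SVC(kernel="rbf").fit(X, y)
print("training accuracy:", clf.score(X, y))
```

A full system in the spirit of the paper would concatenate GLCM- and MSER-derived features and use a proper train/test split.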
Analysis of Recent Advances in Optical Flow Estimation Methods

One of the key problems in computer vision is the estimation of motion in image sequences. The apparent displacement of the pixels through the image sequence is generally called optical flow. This is a low-level task that forms the basis for many high-level applications, such as stereoscopic vision and 3D scene reconstruction, object tracking, ambient intelligence, video surveillance, medical image analysis, and meteorological prediction and analysis. After many years of intense research, the optical flow research field may still be considered not yet mature. The quality and quantity of recent publications, with many important contributions, reflect that this is a very active field; it is attracting many computer vision researchers and evolving steadily. In this paper we examine the latest contributions and the most important ideas about optical flow that have appeared over the last years.

Javier Sánchez
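For readers who want a baseline to compare the surveyed methods against, OpenCV's Farneback method computes dense optical flow between two frames. A minimal example (the frame filenames are hypothetical):

```python
import cv2
import numpy as np

# Dense optical flow between two consecutive grayscale frames.
prev = cv2.imread("frame0.png", cv2.IMREAD_GRAYSCALE)   # hypothetical frames
curr = cv2.imread("frame1.png", cv2.IMREAD_GRAYSCALE)

flow = cv2.calcOpticalFlowFarneback(prev, curr, None,
                                    pyr_scale=0.5, levels=3, winsize=15,
                                    iterations=3, poly_n=5, poly_sigma=1.2,
                                    flags=0)
# flow[..., 0] and flow[..., 1] hold per-pixel x and y displacements.
mag, ang = cv2.cartToPolar(flow[..., 0], flow[..., 1])
print("mean displacement:", float(np.mean(mag)), "pixels")
```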
Contextual and Skin Color Region Information for Face and Arms Location

Interest in intelligent human-computer interfaces has grown in recent years due to the possibilities they offer. For these systems, two of the most important sources of interaction are face and arm gestures. Different face detection approaches have been proposed to date, while arm detection remains a challenging task. This paper describes a methodology for locating faces and arms in color images by combining color information with region information and domain knowledge. The resulting method works very accurately regardless of race and skin color, pose, resolution, lighting conditions, and so on. It has been tested on a representative range of different arm positions, achieving encouraging results.

A. Fernandez, M. Ortega, B. Cancela, M. G. Penedo
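A common first stage for such systems is a skin-color mask followed by region filtering. The Python sketch below uses YCrCb thresholds plus connected-component area filtering; the threshold values are common rules of thumb and the image filename is hypothetical, not the paper's method:

```python
import cv2
import numpy as np

img = cv2.imread("person.jpg")                    # hypothetical input image
ycrcb = cv2.cvtColor(img, cv2.COLOR_BGR2YCrCb)

# Illustrative skin-tone bounds in the Cr/Cb chrominance planes.
lower = np.array([0, 133, 77], dtype=np.uint8)
upper = np.array([255, 173, 127], dtype=np.uint8)
mask = cv2.inRange(ycrcb, lower, upper)

# Region information: keep only sufficiently large connected components.
n, labels, stats, _ = cv2.connectedComponentsWithStats(mask)
for i in range(1, n):
    if stats[i, cv2.CC_STAT_AREA] < 500:
        mask[labels == i] = 0
cv2.imwrite("skin_mask.png", mask)
```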
Stereo-Vision Algorithm Based on Bio-Inspired Silicon Retinas for Implementation in Hardware

In this paper, we propose a silicon-retina-based stereo vision system for pre-crash warning and side-impact detection applications in vehicles. The bio-inspired Silicon Retina sensor is a new kind of sensor with a high temporal resolution of 1 ms and a dynamic range of approx. 120 dB. This type of imager delivers data asynchronously and only when the intensity of the ambient light changes. Therefore, the amount of data that must be processed decreases significantly compared to standard CMOS or CCD imagers. The sensor uses an address-event representation (AER) protocol to transfer the event-triggered information. To handle these special output characteristics, a novel approach to the acquisition, storage, and stereo matching of the data was implemented. The concept of the algorithm is specifically targeted and optimized for implementation in hardware, e.g. on a Field Programmable Gate Array (FPGA).

Florian Eibensteiner, Jürgen Kogler, Christoph Sulzbachner, Josef Scharinger
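To give a feel for event-based stereo data, the toy Python sketch below pairs left/right AER events that agree in row and polarity within a small time window and reads the disparity from the column offset. This is a drastically simplified illustration under invented events, not the hardware-oriented matching algorithm of the paper:

```python
# Address-event representation: each event is (timestamp_us, x, y, polarity).
def match_events(left, right, dt_us=500):
    """Pair temporally close left/right events on the same row with the
    same polarity; disparity is the x offset of each matched pair."""
    disparities = []
    j = 0
    for t, x, y, p in left:
        while j < len(right) and right[j][0] < t - dt_us:
            j += 1                      # discard right events too old to match
        for tr, xr, yr, pr in right[j:]:
            if tr > t + dt_us:
                break                   # remaining right events are too new
            if yr == y and pr == p:
                disparities.append(x - xr)
                break
    return disparities

left = [(100, 40, 10, 1), (180, 41, 10, 1), (300, 42, 11, 1)]
right = [(120, 35, 10, 1), (200, 36, 10, 1), (310, 37, 11, 1)]
print("disparities:", match_events(left, right))   # roughly 5 pixels each
```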
Backmatter
Metadata
Title
Computer Aided Systems Theory – EUROCAST 2011
Edited by
Roberto Moreno-Díaz
Franz Pichler
Alexis Quesada-Arencibia
Copyright year
2012
Publisher
Springer Berlin Heidelberg
Electronic ISBN
978-3-642-27549-4
Print ISBN
978-3-642-27548-7
DOI
https://doi.org/10.1007/978-3-642-27549-4
