
2012 | Book

Applied Computational Intelligence in Engineering and Information Technology

Revised and Selected Papers from the 6th IEEE International Symposium on Applied Computational Intelligence and Informatics SACI 2011

Edited by: Radu-Emil Precup, Szilveszter Kovács, Stefan Preitl, Emil M. Petriu

Publisher: Springer Berlin Heidelberg

Book series: Topics in Intelligent Engineering and Informatics


About this book

This book highlights the benefits that can be gained from various applications of computational intelligence techniques. It comprises a set of selected and extended papers from the 6th IEEE International Symposium on Applied Computational Intelligence and Informatics, SACI 2011, held in Timisoara, Romania, from 19 to 21 May 2011. After a rigorous review performed by the Technical Program Committee, only 116 submissions were accepted, for a paper acceptance ratio of 65%. A further selection was made after the symposium, based also on the quality of the presentations. The book therefore includes the extended and revised versions of the very best papers of SACI 2011, together with a few invited papers authored by prominent specialists. Readers will gain knowledge of computational intelligence and of the problems it can solve in several areas, and will learn which approaches are advisable for solving these problems. A further important benefit is an understanding of the major difficulties involved and of cost-effective solutions for dealing with them. The book thus offers a convenient entry point for researchers and engineers who intend to work in the important fields of computational intelligence.

Table of contents

Frontmatter
Towards More Realistic Human Behaviour Simulation: Modelling Concept, Deriving Ontology and Semantic Framework
Abstract
This chapter argues in favour of semantic methods and approaches to human behaviour modelling. A semantic perspective can provide a seamless bridge between theoretical models and their software implementations, as well as contribute to an elegant and generic modular structure of the resulting simulation system. We describe our work in progress on highly realistic models of human behaviour and the impact of ontological reasoning on real-time simulations. We illustrate our approach in the context of the EDA project A-0938-RT-GC EUSAS, where we plan to implement it.
Ladislav Hluchý, Marcel Kvassay, Štefan Dlugolinský, Bernhard Schneider, Holger Bracker, Bartosz Kryza, Jacek Kitowski
2-DOF and Fuzzy Control Extensions of Symmetrical Optimum Design Method: Applications and Perspectives
Abstract
This chapter treats theoretical results concerning the Symmetrical Optimum method (SO-m) and its linear 2-DOF and fuzzy control extensions, together with perspectives and applications. The theoretical results are related to the Extended SO-m (ESO-m) and the double parameterization of the SO-m (2p-SO-m) introduced previously by the authors. Digital implementation aspects are given. The applications deal with speed and position control of rapid plants in mechatronic systems, with a focus on electrical drives with BLDC motors and variable moment of inertia.
Stefan Preitl, Alexandra-Iulia Stînean, Radu-Emil Precup, Claudia-Adina Dragoş, Mircea-Bogdan Rădac
Mixed Multidimensional Risk Aversion
Abstract
The topic treated in this chapter is the risk aversion of an agent faced with a situation of uncertainty with many risk parameters. We study a general model of risk aversion in which some parameters are described probabilistically (by random variables) and others possibilistically (by fuzzy numbers). To construct this model, we first introduce mixed expected utility, a notion which unifies the probabilistic and possibilistic aspects of expected utility theory. The notion of a mixed risk premium vector is then introduced as a measure of risk aversion with mixed parameters. The main result of the chapter is an approximate calculation formula for the mixed risk premium vector. Lastly, our model is applied to the evaluation of risk aversion in grid computing.
Irina Georgescu, Jani Kinnunen
Strong Berge and Strong Berge Pareto Equilibrium Detection Using an Evolutionary Approach
Abstract
Nash equilibrium is an important solution concept in Game Theory. Playing in the Nash sense means that no player can deviate from the equilibrium strategy in order to increase her/his payoff. Some games can have several Nash equilibria. Strong Berge and strong Berge Pareto equilibria are important refinements of the Nash equilibrium. An evolutionary technique based on non-domination is proposed to detect strong Berge and strong Berge Pareto equilibria. Numerical experiments are presented to illustrate the proposed method.
Noémi Gaskó, Rodica Ioana Lung, Dan Dumitrescu
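The evolutionary detection of equilibria rests on comparing a candidate strategy profile against all unilateral deviations. A minimal sketch of that check, for the plain Nash case in a two-player matrix game (the payoff tables and function below are illustrative, not the authors' implementation):

```python
def is_nash(payoffs, profile):
    """True if no player can improve her payoff by a unilateral deviation.
    payoffs[p][i][j] is player p's payoff when the row player plays i
    and the column player plays j."""
    i, j = profile
    # Can the row player gain by switching to some other row k?
    row_better = any(payoffs[0][k][j] > payoffs[0][i][j]
                     for k in range(len(payoffs[0])))
    # Can the column player gain by switching to some other column k?
    col_better = any(payoffs[1][i][k] > payoffs[1][i][j]
                     for k in range(len(payoffs[1][0])))
    return not (row_better or col_better)

# Prisoner's dilemma (0 = cooperate, 1 = defect): (1, 1) is the unique Nash point
pd = [[[3, 0], [5, 1]],     # row player's payoffs
      [[3, 5], [0, 1]]]     # column player's payoffs
```

The strong Berge refinements replace this single-player deviation test with stronger conditions over coalitions of the remaining players, but the same comparison skeleton carries over.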
A Proposal of the Information Retrieval System Based on the Generalized One-Sided Concept Lattices
Abstract
One of the important issues in information retrieval is to provide methods suitable for searching large textual datasets. The retrieval process can be improved by using conceptual models created automatically for the analysed documents. One possibility for creating such models is to use the well-established theory and methods of Formal Concept Analysis. In this work we propose conceptual models based on generalized one-sided concept lattices, which are created locally for subsets of documents represented by an object-attribute table (a document-term table in the case of a vector representation of text documents). These local concept lattices are then combined into one merged model using an agglomerative clustering algorithm based on the descriptive (keyword-based) representation of the particular lattices. Finally, we define the basic details and methods of an IR system that combines standard full-text search with conceptual search based on the extracted conceptual model.
Peter Butka, Jana Pócsová, Jozef Pócs
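At the core of Formal Concept Analysis is a pair of derivation operators over the object-attribute table. The chapter's generalized one-sided lattices allow graded attribute values, but the crisp case already shows the Galois connection; a sketch (the toy document-term table is hypothetical):

```python
def intent(table, objects):
    """Attributes shared by every object in the set (derivation operator)."""
    attrs = set.union(*table.values())
    for o in objects:
        attrs = attrs & table[o]
    return attrs

def extent(table, attrs):
    """Objects that possess every attribute in the set (dual operator)."""
    return {o for o, a in table.items() if attrs <= a}

# Toy document-term table; a formal concept is a pair fixed under both maps
docs = {"d1": {"fuzzy", "lattice"},
        "d2": {"fuzzy", "lattice", "search"},
        "d3": {"search"}}
concept = (extent(docs, intent(docs, {"d1", "d2"})),
           intent(docs, {"d1", "d2"}))
```

Enumerating all such fixed pairs and ordering them by extent inclusion yields the concept lattice that the proposed IR system builds per document subset.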
Visualization and Simulation Tool for Analyzing P-Graph Based Workflow Systems
Abstract
The operation of business and social organizations – on their own and also as part of a big network – is highly complex. Without the modeling of business processes, the efficient and cost-effective operation of such systems is impossible. Workflow systems have been introduced for this purpose. The optimization of workflows in administrative work-based systems poses a great challenge. An elaborated method is available for the P-graph based workflow modeling of administrative processes. This chapter supports this optimization by presenting the concept of a visualization and simulation tool for workflow analysis and examination. Such an analysis makes it possible to determine resource constraints, bottlenecks and redundancies, and enables more efficient operation.
József Tick
Benchmark Based Comparison of Two Fuzzy Rule Base Optimization Methods
Abstract
Parameter optimization is a key step in the creation of a fuzzy rule based system, and it has a determining effect on the resulting system's performance. In this chapter, we examine the performance of several fuzzy systems obtained by applying two different optimization methods. In each case we start from an initial rule base created using fuzzy c-means clustering of a sample data set. The first optimization approach examined is the cross-entropy method, while the second is a hill-climbing based technique. We compare them on four benchmark problems.
Zsolt Csaba Johanyák, Olga Papp
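The hill-climbing side of the comparison can be sketched generically: perturb the rule base's parameter vector and keep only improving moves. The cost function below is a stand-in quadratic, not the actual rule-base error used in the chapter:

```python
import random

def hill_climb(cost, params, step=0.1, iters=200, seed=0):
    """Greedy hill climbing: perturb one randomly chosen parameter at a
    time and keep the change only if it lowers the cost."""
    rng = random.Random(seed)
    best, best_cost = list(params), cost(params)
    for _ in range(iters):
        cand = list(best)
        i = rng.randrange(len(cand))
        cand[i] += rng.uniform(-step, step)
        c = cost(cand)
        if c < best_cost:
            best, best_cost = cand, c
    return best, best_cost

# Toy quadratic error standing in for the rule base's error on the samples
cost = lambda p: sum((x - t) ** 2 for x, t in zip(p, [1.0, -2.0, 0.5]))
params, err = hill_climb(cost, [0.0, 0.0, 0.0])
```

In the fuzzy setting, `params` would hold membership function positions and widths, and `cost` would evaluate the rule base against the training samples.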
Three Evolutionary Optimization Algorithms in PI Controller Tuning
Abstract
This chapter discusses three evolutionary optimization algorithms employed in the optimal tuning of PI controllers dedicated to a class of second-order processes with an integral component and variable parameters. The evolutionary algorithms used in this chapter are: Particle Swarm Optimization (PSO), Gravitational Search Algorithm (GSA) and Charged System Search (CSS). The PI controllers are tuned to ensure a reduced sensitivity with respect to the parametric variations of the small time constant of the process. The application of the algorithms is illustrated in a case study.
Radu-Codruţ David, Radu-Emil Precup, Stefan Preitl, József K. Tar, János Fodor
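Of the three algorithms, PSO is the simplest to sketch: each particle is pulled towards its personal best and the swarm's global best. The version below is a minimal generic PSO applied to a stand-in quadratic cost, not the chapter's sensitivity-based performance index; the inertia and acceleration coefficients are common textbook defaults:

```python
import random

def pso(cost, dim, n=20, iters=100, seed=1):
    """Minimal particle swarm minimization of cost over R^dim."""
    rng = random.Random(seed)
    pos = [[rng.uniform(-5, 5) for _ in range(dim)] for _ in range(n)]
    vel = [[0.0] * dim for _ in range(n)]
    pbest = [p[:] for p in pos]
    pcost = [cost(p) for p in pos]
    g = min(range(n), key=lambda i: pcost[i])
    gbest, gcost = pbest[g][:], pcost[g]
    for _ in range(iters):
        for i in range(n):
            for d in range(dim):
                vel[i][d] = (0.7 * vel[i][d]
                             + 1.5 * rng.random() * (pbest[i][d] - pos[i][d])
                             + 1.5 * rng.random() * (gbest[d] - pos[i][d]))
                pos[i][d] += vel[i][d]
            c = cost(pos[i])
            if c < pcost[i]:                 # update personal best
                pbest[i], pcost[i] = pos[i][:], c
                if c < gcost:                # and, possibly, the global best
                    gbest, gcost = pos[i][:], c
    return gbest, gcost

# Hypothetical cost with optimum at (kp, ki) = (2.0, 0.5)
kp_ki, j = pso(lambda p: (p[0] - 2.0) ** 2 + (p[1] - 0.5) ** 2, 2)
```

For PI tuning, the lambda would be replaced by a closed-loop simulation returning the chosen performance index for the candidate gains.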
Statistical Analysis of Next Generation Network Traffics Based on Wavelets and Transformation ON/(ON+OFF)
Abstract
The significant increase of trunk channel bandwidth makes it much easier to integrate different types of traffic on tier links without activating processing-intensive QoS (Quality of Service) mechanisms in the intermediate nodes. The self-similarity, long-range dependence and fractal characteristics of packet flows are strongly influenced by the QoS parameters in a congested network environment. Several models have been proposed for the qualitative and quantitative evaluation of the physical phenomena arising at different OSI layers in routers and switches. Most of these require relatively long traces for evaluating both scale independence and fractal characteristics. This chapter evaluates the highlights of the combined use of wavelet and ON/(ON+OFF) transformations in network traffic analysis. We consider the channel load and the channel intensity as complex time series for evaluating the statistical characteristics of changes over time in the nature of flows in packet switched networks. UDP and TCP traffic in tier and LAN networks is considered and statistically analyzed with the MRA (Multi Resolution Analysis) wavelet method. A fast detection algorithm for data and real-time traffic burstiness is presented for a QoS based packet switched network environment with congestion.
Zoltan Gal, Gyorgy Terdik
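One level of the wavelet MRA can be illustrated with the Haar transform, which splits a series into pairwise averages (the coarse approximation) and pairwise differences (the detail at that scale); the detail energies across scales are what wavelet-based self-similarity estimators examine. A minimal sketch, not the chapter's analysis pipeline:

```python
def haar_step(x):
    """One level of the Haar multiresolution analysis: pairwise averages
    (coarse approximation) and pairwise differences (detail coefficients)."""
    pairs = list(zip(x[::2], x[1::2]))
    approx = [(a + b) / 2 for a, b in pairs]
    detail = [(a - b) / 2 for a, b in pairs]
    return approx, detail

# A series that is smooth within pairs has all its energy in the approximation
approx, detail = haar_step([1, 1, 2, 2, 4, 4, 8, 8])
```

Applying `haar_step` repeatedly to the approximation yields the full dyadic decomposition of a traffic trace.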
Data Cleaning and Anomaly Detection for an Intelligent Greenhouse
Abstract
The effectiveness of greenhouse control can be improved by the application of model based intelligent control. Such control requires a good model of the greenhouse. For a large variety of industrial or recreational greenhouses the derivation of an analytical model is not feasible, so black-box modeling has to be applied. The identification of black-box models requires large amounts of data from real greenhouse environments. Measurement errors and missing values are common and must be eliminated to use the collected data efficiently as training samples. Rare weather conditions can temporarily lead to unusual thermal behavior around and within the greenhouse. Anomaly detection run on the measurement data can identify such unusual samples, and by excluding them from the model building, better models and higher validation accuracy can be achieved. This chapter discusses the problems of cleaning the measurement data collected in a well instrumented greenhouse, and introduces solutions for various kinds of missing-data and anomaly detection problems.
Peter Eredics, Tadeusz P. Dobrowiecki
Clustering of Interval Data Using Self-Organizing Maps – Application to Meteorological Data
Abstract
The self-organizing map is a kind of artificial neural network used to map high dimensional data into a low dimensional space. This chapter presents a self-organizing map for unsupervised clustering of interval data. The map uses an extension of the Euclidean distance to compute the proximity between two vectors of intervals, where each neuron represents a cluster. The performance of this approach is then illustrated and discussed for temperature interval data coming from Chinese meteorological stations. The bounds of each interval are the measured minimal and maximal values of the temperature. In the presented experiments, stations of similar climate regions are assigned to the same neuron or to a neighboring neuron on the map.
Chantal Hajjar, Hani Hamdan
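A natural extension of the squared Euclidean distance to interval vectors sums the squared differences of the lower and upper bounds, and the best-matching neuron is then found exactly as in a classical SOM. A sketch with hypothetical toy data (the exact distance and training rule in the chapter may differ):

```python
def interval_dist_sq(u, v):
    """Extension of the squared Euclidean distance to interval vectors:
    sum the squared differences of lower and of upper bounds."""
    return sum((a - c) ** 2 + (b - d) ** 2 for (a, b), (c, d) in zip(u, v))

def best_matching_unit(weights, x):
    """Index of the neuron whose interval weight vector is closest to x."""
    return min(range(len(weights)), key=lambda i: interval_dist_sq(weights[i], x))

# Two neurons holding (min, max) temperatures for two months; one toy station
neurons = [[(0.0, 10.0), (5.0, 15.0)],      # cold-climate prototype
           [(20.0, 30.0), (25.0, 35.0)]]    # warm-climate prototype
station = [(21.0, 29.0), (24.0, 36.0)]
bmu = best_matching_unit(neurons, station)  # → 1 (the warm-climate neuron)
```

Training then nudges the bound pairs of the winning neuron (and its map neighbors) towards the presented station, so nearby map positions end up representing similar climates.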
A Set of Java Metrics for Software Quality Tree Based on Static Code Analyzers
Abstract
Assessing software quality allows cost cuts from the early development stages. Software quality information helps in taking development decisions, checking the effect of fault corrections, and estimating maintenance effort. Our fault-density based quality model relies on static source code analyzers and on a set of language specific metrics. We compute the fault ratio for each static analyzer rule. By giving user-defined weights to the fault ratios, we can quantify quality as a single number. We identified, described informally and implemented in a prototype a set of Java metrics in order to complete our model and accomplish our quality assessment goal.
Ciprian-Bogdan Chirilă, Vladimir Creţu
VLearning, a New Direction for eLearning Challenges
Abstract
In the information society, exponentially growing knowledge poses a serious challenge to all participants in the teaching-learning process. The methodology of education and its technical background have developed and improved drastically, receiving a new spur after the proliferation of information technology and the internet. This has been especially true since the ever wider spread of web 2.0 and 3D based virtual environments. Compared to former eLearning solutions, this new virtual learning environment based on 3D information technology more vividly encourages improvements in the efficiency of learning management systems, draws a wider range of students into the learning process, and subtly fosters study motivation. This chapter outlines the development of this technology from eLearning to vLearning, and analyzes its most significant components, with which education strives to keep pace with the challenges of the present era.
Andrea Tick
A Receding Horizon Control Approach to Navigation in Virtual Corridors
Abstract
Applications in mobile robotics require safe and goal-oriented motion while navigating in an environment obstructed by obstacles. The dynamic window approach (DWA) to collision avoidance and its different variants provide safe motion among obstacles, but they share the same limitation, namely the use of an objective function consisting of weighted terms. Different situations require different weights; however, there is no algorithm for choosing them. The Global Dynamic Window Approach with Receding Horizon Control (GDWA/RHC) presented in this chapter is similar to DWA, but it uses a global navigation function (NF) and a receding horizon control scheme for guiding the robot. To make the calculation of the navigation function computationally tractable, it is constructed by interpolation from a discrete function. In addition, the domain of the navigation function is restricted to a virtual corridor between the start and goal positions of the robot.
Domokos Kiss, Gábor Tevesz
High Speed Stereo Vision Based Automotive Collision Warning System
Abstract
This chapter presents a high speed, low latency stereo vision based collision warning system for automotive applications. The system uses two high speed cameras running at 100 fps and achieves latency below 0.1 s by using an Nvidia Tesla C1060 GPU to accelerate computationally expensive algorithms. From each pair of captured stereo images a disparity map is computed using the block matching algorithm; the map is then segmented in order to detect the different objects in the scene. This segmentation uses a novel method based on the pixels' intensity values and their connectivity. For each detected object, its distance to the front of the vehicle is computed and the degree of danger is estimated by the collision warning module. Extensive experiments show that the presented system delivers reliable object detection as well as precise estimates of the distance to the detected objects.
Adrian Leu, Dorin Aiteanu, Axel Gräser
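The block matching step can be sketched as a sum-of-absolute-differences (SAD) search along the same image row: for each left-image block, the disparity is the horizontal shift that best matches it in the right image. This is a generic CPU illustration, not the GPU implementation described in the chapter:

```python
def sad(left, right, y, x, xr, w):
    """Sum of absolute differences between a w x w block at (y, x) in the
    left image and a block at (y, xr) in the right image."""
    return sum(abs(left[y + i][x + j] - right[y + i][xr + j])
               for i in range(w) for j in range(w))

def disparity(left, right, y, x, max_d, w=3):
    """Horizontal shift d in 0..max_d minimizing the SAD along the row."""
    return min(range(min(max_d, x) + 1),
               key=lambda d: sad(left, right, y, x, x - d, w))
```

Distance to a detected object then follows from the usual triangulation: depth is inversely proportional to disparity, scaled by the focal length and camera baseline.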
Multithreaded Peripheral Processor for a Multicore Embedded System
Abstract
Multithreaded and multicore architectures are a good solution for increasing the degree of parallelism exploited in modern computing systems, and they can reduce the power dissipated in the chip by using low-frequency clock signals. These advantages recommend such parallel architectures for integration in embedded systems, with restrictions imposed by the relatively small integration area. The particularities of embedded applications require hardware support able to handle peripheral interrupt requests in real time. The performance of this hardware influences the performance of the entire parallel system. The current trend is to integrate in one microcontroller several processors used for general-purpose applications and one processor used for peripheral-related tasks (such as interrupt services). In this chapter we present a peripheral processor architecture implemented with multithreading technology, able to handle several tasks concurrently. The parallel execution of these peripheral tasks, in conjunction with a multicore processor used for the general application (e.g. the operating system), leads to an increase in the overall performance of the embedded system.
Horia V. Caprita, Mircea Popa
Using Cycle-Approximate Simulation for Bus Based Multi-Processor System-On Chip Analysis
Abstract
In this chapter, a cycle-approximate simulator for multi-processor systems-on-chip is presented. The aim of this simulation tool is to enable enhanced software/hardware analysis for bus based systems. Its most important contributions are its high flexibility (easy configuration of a SoC using dedicated libraries for generic and specific components, and easy integration of other simulators and models), accurate modeling of features specific to multiprocessor systems (busses, inter-processor communication mechanisms, etc.), accurate implementation of a wide range of performance metrics and power consumption estimates (for processors that support this) and high simulation speed. In this way, the proposed simulator can be used for both hardware architecture design exploration and software development.
Alexandru Amaricai, Alin Dobre, Oana Boncalo, Andrei Tanase, Camelia Valuch
A Case Study on Hardware/Software Codesign in Embedded Artificial Neural Networks
Abstract
Software/hardware codesign is a complex research problem that has been slowly making headway into industry-ready system design products. Recent advances have shown the viability of this direction within the design space exploration scope, especially with regard to rapid development cycles. Here, we explore the hardware/software codesign landscape in the artificial neural network problem space. Automated tools requiring minimal technical expertise from Altera and Tensilica are examined along with newer advances solely within the hardware/software codesign research domain. The design space exploration options discussed here aim to achieve better software/hardware partitions using instruction-set extensions and coprocessors. As neural networks continue to find usage in embedded systems, it has become imperative to optimize their implementation efficiently within a short development cycle. Modest speedups can be easily achieved with these automated hardware/software codesign tools on the benchmarks examined.
Jonathan Parri, John-Marc Desmarais, Daniel Shapiro, Miodrag Bolic, Voicu Groza
Pragmatic Method to Obtain Optimal Control Laws for Small Windgenerators
Abstract
The chapter presents a pragmatic method to obtain optimal, implementable control laws for variable speed, fixed blade small windgenerators, depending on the available information about the control object (wind speed, rotation speed, air density, blade position in the air flow). The elaborated method presumes the existence of a laboratory model comprising: an electromechanical analog model of the considered turbine, in accordance with a pre-established theoretical mathematical model of the turbine, and the full generator-grid system, identical with the one implemented on site. The pre-established theoretical turbine model allows the calculation, for different wind speed values in the established working domain, of the optimum turbine values n_T,opt,k, M_T,opt,k, P_T,opt,k, k = 1, 2, .... Given these optimum turbine operation values, the optimum regimes may be reproduced experimentally on the laboratory model, and all operation values of the different elements of the conversion line "generator – power electronic interface – grid" may be measured. The obtained parameter pairs may be used to determine regression functions that serve as optimum wind control laws. Considering the hardware structure of the studied windgenerator, some optimal control laws were chosen. The obtained control laws were implemented and verified on a Matlab/Simulink model considering different wind speed variations. The simulation results confirm the opportunity and the quality of the adopted optimal control law.
Radu Boraci, Octavian Prostean, Nicolae Budisan, Cosmin Koch-Ciobotaru
Applicability of Asymptotic Tracking in Case of Type 1 Diabetes
Abstract
The alarming growth of the diabetes population attracts technological interest too. From an engineering point of view, the treatment of diabetes mellitus can be represented by an outer control loop that replaces the partially or totally deficient blood glucose control system of the human body. To realize this "artificial pancreas", a reliable glucose sensor and an insulin pump are needed as hardware, and a control algorithm ensuring proper blood glucose regulation is needed as software. The latter is a key point of the "closing the loop" problem in diabetes, and its primary prerequisite is a valid model able to describe the blood glucose system. In the current chapter one of the most widely used complex nonlinear models is investigated with a dual purpose. Specific control aspects are discussed in the literature only for linearized versions; differential geometric approaches, however, give a more general formalization. Our first aim is therefore to hide the nonlinearity of the physiological model by transforming the control input provided by a linear controller so that the response of the model mimics the behavior of a linear system. Hence, the validity of linear controllers can be extended from the neighborhood of a working point to a larger subset of the state space bounded by specific constraints. Secondly, the applicability of the nonlinear methodology is tested on a simple PID control based algorithm and compared with an LQG optimal method. Simulations are done under MATLAB on realistic input scenarios. Since the values of the state variables are needed, Kalman filtering is used for state estimation.
Péter Szalay, Levente Kovács
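The Kalman filter used for state estimation alternates a prediction from the model with a measurement update weighted by the Kalman gain. A scalar sketch of one predict/update cycle (the coefficients and noise variances below are illustrative assumptions, not the chapter's glucose model):

```python
def kalman_step(x, p, z, a=1.0, h=1.0, q=0.01, r=0.25):
    """One predict/update cycle of a scalar Kalman filter.
    x, p -- prior state estimate and its variance
    z    -- new noisy measurement (e.g. a glucose sensor reading)
    a, h -- state-transition and measurement coefficients
    q, r -- assumed process and measurement noise variances"""
    x_pred, p_pred = a * x, a * p * a + q                  # predict
    k = p_pred * h / (h * p_pred * h + r)                  # Kalman gain
    return x_pred + k * (z - h * x_pred), (1 - k * h) * p_pred   # update

# Filtering a noisy constant signal: the estimate converges towards 5.0
x, p = 0.0, 1.0
for z in [5.1, 4.9, 5.2, 4.8, 5.0]:
    x, p = kalman_step(x, p, z)
```

In the multivariable case the same two steps hold with matrices in place of the scalars, supplying the state estimates the nonlinear input transformation needs.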
Optimal Energetic Conditions for Cell Seeding of Scaffolds
Abstract
Tissue Engineering is a novel area of biomedical research that combines the principles and methods of engineering and biology in order to develop living tissues in vitro. The creation of functional tissue constructs presumes that living cells are seeded on different types and structures of biomaterials. Since laboratory experiments are expensive and hard to perform, computational approaches to tissue engineering are a cost-efficient alternative for predicting the growth of tissue constructs in vitro. This study develops computational models of two biological systems: a cellular aggregate located on the plane surface of a biomaterial, and a cellular aggregate located on a porous scaffold. Based on the Metropolis Monte Carlo method, we simulate the evolution of a cellular aggregate on the biomaterial's surface and identify the energetic conditions that lead to uniform and rapid cell spreading. We monitored the evolution of the centre of mass of the cells in the system and the number of cells attached to the substrate after running a certain number of Monte Carlo steps, and found that cell-cell interactions disfavor cell spreading, while cell-biomaterial interactions favor it. We also simulated the distribution of cells in a porous scaffold, analyzing the energetic conditions that lead to successful cell seeding.
Andreea Robu, Lacramioara Stoicu-Tivadar, Adrian Neagu
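The Metropolis Monte Carlo method drives such simulations with a single acceptance rule: moves that lower the total interaction energy are always accepted, and uphill moves are accepted with probability exp(-dE/T). The toy model below (a single cell attracted to a substrate in 1-D) is only a stand-in for the chapter's aggregate-on-scaffold model:

```python
import math, random

def metropolis_accept(delta_e, t, rng):
    """Metropolis rule: accept a trial move changing the energy by delta_e."""
    return delta_e <= 0 or rng.random() < math.exp(-delta_e / t)

def run(steps=5000, t=1.0, seed=0):
    """Toy 1-D version: a single cell at height h above a substrate at h = 0.
    Energy grows with h, so accepted moves drive the cell towards the
    substrate -- a stand-in for energetically favourable cell spreading."""
    rng = random.Random(seed)
    h = 50
    for _ in range(steps):
        trial = h + rng.choice((-1, 1))
        if trial >= 0 and metropolis_accept(trial - h, t, rng):
            h = trial
    return h
```

In the full model the energy sums cell-cell and cell-biomaterial adhesion terms over the whole lattice, and the relative strength of those terms is what the study varies to find the optimal seeding conditions.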
Ideas on a Pattern of Human Knowledge
Abstract
This chapter presents some ideas that concern a pattern of human knowledge. This pattern is based on the experimentation of causal relations. The cultural origin of the patterns is analyzed in terms of philosophical, psychological and linguistic points of view. An application scenario related to a robot integrated in a cognitive system is described. The definitions of signatures and of signature classes are given as useful steps in an alternative modeling approach to the observation process.
Claudiu Pozna, Radu-Emil Precup
Structural Improvements of the OpenRTM-aist Robot Middleware
Abstract
The robot middleware is a key concept in developing complex robot systems, even in geographically distributed environments. Providing a handy API, reusable standardized components and communication channels together with some automatism, the robot middleware helps the user build easily reconfigurable systems. The behavior of the components and the manner of interaction among them are standardized by the Robotic Technology Component (RTC) specification. One implementation of the RTC specification is OpenRTM-aist, a well written and convenient modular system built upon the Common Object Request Broker Architecture (CORBA). The first version of CORBA was released in 1991, and nowadays CORBA is increasingly out of date. Our ultimate goal is to develop a new robot middleware in which a modern distributed framework is applied instead of CORBA. As a first step of the substitution, we suggest some practical extensions of OpenRTM-aist: the adaptation of the Internet Communication Engine (ICE) for component communication, and the introduction of the web application concept for system editing and control. With these structural modifications, the resulting system is more powerful than the original one. The chapter introduces some structural and implementation details of OpenRTM-aist, together with the results of experiments performed to compare the performance of the original and extended systems.
Zoltán Krizsán, Szilveszter Kovács
Decision Support at a New Global Level Definition of Products in PLM Systems
Abstract
Knowledge based support of decision making in product definition using a modeling system offers the advantage of coordinating different opinions, attempts, and intents. To achieve this, the authors have proposed a special interaction between the engineer and the product object definition processes in order to communicate content for decisions in the form of expertise- and experience-based knowledge. This knowledge is required to represent the engineer in the course of the coordinated definition of product objects. This chapter introduces a new method to establish a global level of decision making on product object parameters. Global level control organizes product object generation processes on the local level of the currently applied product entity definition. Local level processes must be modified to accept control from the global level; this control replaces the direct human definition of product object parameters. Another new methodological element of the proposed modeling is that the only allowed way of defining a product object is through the definition of function and quality based behaviors of the product. The main contribution of this chapter is the introduction of the contextual chains along which communication proceeds from human thinking to the generation of product model entity parameters. The proposed method is intended to serve product model definition across the lifecycle in a product lifecycle management (PLM) system.
László Horváth, Imre J. Rudas
Autocollimator Calibration Using a Tangent Bar
Abstract
The chapter describes a new low cost angle generator intended for the calibration of autocollimators. The description of the working principle is followed by the detailed calibration procedure, which is based on the comparison principle. The small angles are derived from relatively large displacements. Following the presentation of the operation of the small angle generator, the various error sources are discussed. Finally, the chapter discusses the calculation of the extended calibration uncertainty in detail.
Gyula Hermann, Kálmán Tomanyiczka
Gesture Control: A New and Intelligent Man-Machine Interface
Abstract
Although investigated from the early days of research on human-computer interfaces, gesture-based control of computer applications entered the everyday life of computer users with the advent of 3D infrared cameras. The use of real-time depth-mapping cameras and of robust image processing applied to the flow of depth-map streams triggered the production of a plethora of applications, ranging from controlling electronic games and electronic devices such as TV sets or set-top boxes, to e-learning, and to sterile environments such as operating rooms. Gesture and motion-based control of computer applications has received increased attention from both academic and industrial research groups for the unique interactive experiences it offers. Of particular research interest has been the control of games through gesture-based interfaces. In this chapter, after a brief survey of the methods and technologies used in the gesture control area, a new and intelligent user interface is introduced and described, based on a sequence of gestures linked into a gesture language through a sign grammar. The gesture recognition language is based on image processing functions grouped in a combination of algorithmic and learning approaches. Applications of the gesture language in gaming, TV and set-top box control, e-learning and virtual reality-reality interaction illustrate the validity of the approach.
Dan Ionescu, Bogdan Ionescu, Cristian Gadea, Shahid Islam
Backmatter
Metadata
Title
Applied Computational Intelligence in Engineering and Information Technology
Edited by
Radu-Emil Precup
Szilveszter Kovács
Stefan Preitl
Emil M. Petriu
Copyright year
2012
Publisher
Springer Berlin Heidelberg
Electronic ISBN
978-3-642-28305-5
Print ISBN
978-3-642-28304-8
DOI
https://doi.org/10.1007/978-3-642-28305-5
