
2016 | Book

Computational Collective Intelligence

8th International Conference, ICCCI 2016, Halkidiki, Greece, September 28-30, 2016. Proceedings, Part I

Editors: Ngoc-Thanh Nguyen, Lazaros Iliadis, Yannis Manolopoulos, Bogdan Trawiński

Publisher: Springer International Publishing

Book Series: Lecture Notes in Computer Science


About this book

This two-volume set (LNAI 9875 and LNAI 9876) constitutes the refereed proceedings of the 8th International Conference on Computational Collective Intelligence, ICCCI 2016, held in Halkidiki, Greece, in September 2016.

The 108 full papers presented were carefully reviewed and selected from 277 submissions.

The aim of this conference is to provide an internationally respected forum for scientific research in the computer-based methods of collective intelligence and their applications in (but not limited to) such fields as group decision making, consensus computing, knowledge integration, semantic web, social networks and multi-agent systems.

Table of Contents

Frontmatter

Multi-agent Systems

Frontmatter
A Novel Interaction Protocol of a Multiagent System for the Study of Alternative Decisions

The process of decision making is one of the core components of a cognitive system. In this paper, a simple, deterministic protocol for agent interaction in a multiagent system is proposed, which is based on passive stigmergy and can exhibit complex interactions, although in the end it stabilizes. This system can be used to study the effect of alternative decisions. A statistical analysis of several scenarios is provided, in which a perturbation, i.e. a single or a small set of alternative decisions, can change the final utility of an agent from minimum to maximum.

Florin Leon
Modeling Internalizing and Externalizing Behaviour in Autism Spectrum Disorders

This paper presents a neurologically inspired computational model for Autism Spectrum Disorders addressing internalizing and externalizing behaviour. The model has been verified by mathematical analysis, and it is shown how, by parameter tuning, the model can identify the characteristics of a person based on empirical data.

Laura M. van der Lubbe, Jan Treur, Willeke van Vught
Analysis and Refinement of a Temporal-Causal Network Model for Absorption of Emotions

An earlier proposed temporal-causal network model for mutual absorption of emotions aims to model emotion contagion in networks using characteristics such as the openness and expressiveness traits of the members of the network, and the strengths of the connections between them. The speed factors describing how fast emotional states change were modeled based on these characteristics according to a fixed dependence relation. In this paper, particular implications of this choice are analyzed. Based on this analysis, a refinement of the model is proposed, offering alternative ways of modeling speed factors. This refinement is also analyzed and evaluated.

Eric Fernandes de Mello Araújo, Jan Treur
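
The paper's refined speed factors are not reproduced here, but the general shape of a Treur-style temporal-causal update, in which each member's emotion level moves toward the aggregated impact of its neighbours at a rate given by a speed factor, can be sketched as follows. This is a simplified illustration: the function name `step`, the scaled-sum aggregation and all parameter values are assumptions, not the authors' exact model.

```python
def step(states, weights, speed, dt=0.1):
    """One Euler step of a simple emotion-contagion network.

    states:  current emotion level per member (list of floats in [0, 1])
    weights: weights[i][j] is the connection strength from member j to i
    speed:   speed factor per member (how fast its state adapts)
    """
    n = len(states)
    new = []
    for i, x in enumerate(states):
        # Scaled-sum aggregation of the incoming impacts from the others.
        impact = sum(weights[i][j] * states[j] for j in range(n) if j != i)
        total_w = sum(weights[i][j] for j in range(n) if j != i)
        agg = impact / total_w if total_w else x
        # The speed factor controls the rate of absorption per time unit.
        new.append(x + speed[i] * (agg - x) * dt)
    return new
```

With two fully connected members at levels 0 and 1 and equal speed factors 0.5, one step with dt = 0.1 moves the states to 0.05 and 0.95, i.e. both drift toward each other, which is the qualitative behaviour the model family is built around.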
Cognitive Modelling of Emotion Contagion in a Crowd of Soccer Supporter Agents

This paper introduces a cognitive computational model of emotion contagion in a crowd of soccer supporters. It is useful for: (1) better understanding of emotion contagion processes and (2) further development into a predictive and advising application for soccer stadium managers to enhance and improve the ambiance during the soccer game for safety or economic reasons. The model is neurologically grounded and focuses on the emotions “pleasure” and “sadness”. Structured simulations showed the following four emergent patterns of emotion contagion: (1) hooligans are very impulsive and are not fully open to other emotions, (2) fanatic supporters are very impulsive and open to other emotions, (3) family members are very easily influenced and not very extraverted, (4) the media are less sensitive to the ambiance in the stadium. For validation, the model outcomes were compared to the heart rates of 100 supporters and their reported emotions. The model produced similar heart rate and emotional patterns. Further implications of the model are discussed.

Berend Jutte, C. Natalie van der Wal
The Role of Mood on Emotional Agents Behaviour

Formal modelling of emotions in agents is a challenging task. This is mainly due to the absence of a widely accepted theory of emotions as well as the two-way interaction of emotions with mood, perception, personality, communication, etc. In this work, we use the widely accepted dimensional theory of emotions according to which mood is a significant integrated factor for emotional state change and therefore behaviour. The theory is formally modelled as part of a state-based specification which naturally leads towards simulation of multi-agent systems. We demonstrate how moods of individual agents affect the overall behaviour of the crowd in a well-known example, that of the El Farol problem.

Petros Kefalas, Ilias Sakellariou, Suzie Savvidou, Ioanna Stamatopoulou, Marina Ntika
Modelling a Mutual Support Network for Coping with Stress

The emotional state of an individual is continuously affected by daily events. Stressful periods can be coped with through support from a person’s social environment; support can, for example, reduce stress and social disengagement. Before improvements to the support process are made, however, it is essential to understand the actual real-world process. In this paper a computational model of a network for mutual support is presented. The dynamic model quantifies the change in the network of stressors and support over time. The model predicts that more support is provided when more stress is experienced and when more people are capable of support. Moreover, the model is able to distinguish personal characteristics. The model behaves according to predictions and is evaluated by simulation experiments and mathematical analysis. The proposed model can be important in the development of a software agent which aims to improve coping with stress through social connections.

Lenin Medeiros, Ruben Sikkes, Jan Treur

Knowledge Engineering and Semantic Web

Frontmatter
Knowledge Integration Method for Supply Chain Management Module in a Cognitive Integrated Management Information System

The diversity of the criteria and methods of analysis used in Supply Chain Management (SCM) systems leads to a situation in which the system generates many variants of solutions, whereas the user expects a single final variant from the system. Therefore, integration of knowledge is necessary. To resolve this problem, a consensus method is proposed in this paper. The aim of this paper is to develop and verify a method for knowledge integration in the SCM module of the Cognitive Integrated Management Information System (CIMIS). The first part characterizes the SCM module in CIMIS. Next, a method for knowledge integration is described. The last part of the paper presents the results of verifying the developed method.

Marcin Hernes
An Asymmetric Approach to Discover the Complex Matching Between Ontologies

This paper introduces an extensional and asymmetric alignment approach capable of identifying complex mappings between OWL ontologies. The approach employs association rules to detect implicative and conjunctive mappings containing complex correspondences. A method for extracting the complex mappings is presented, and the results of experiments carried out on the large biomedical ontologies and the anatomy track available in the test library of the Ontology Alignment Evaluation Initiative show the efficiency of the proposed approach.

Fatma Kaabi, Faiez Gargouri
An Evidential Approach for Managing Temporal Relations Uncertainty

Temporal information is often perceived in a vague way, affected by imprecision and uncertainty. Therefore, we need some way of handling such invaluable temporal information. This paper presents a belief-functions-based approach to representing the uncertainty of temporal relations in the point algebra context. We show that the concept of a mass function is suitable for modeling uncertain knowledge about the possible relations between dates. The temporal uncertainty can also be expressed by a vector of belief measures. A set of rules that allows reasoning about evidential temporal relations is then established.

Nessrine El Hadj Salem, Allel Hadjali, Aymen Gammoudi, Boutheina Ben Yaghlane
An Improvement of the Two-Stage Consensus-Based Approach for Determining the Knowledge of a Collective

Generally the knowledge of a collective, which is considered a representative of the knowledge states in the collective, is determined using a single-stage approach. For big data, however, a collective is often very large and a multi-stage approach can be used instead. In this paper we present an improvement of the two-stage consensus-based approach for determining the knowledge of a large collective. To this end, clustering methods are used to partition a large collective into smaller ones. The first stage of consensus choice determines the representatives of these smaller collectives. These representatives are then treated as the knowledge states of a new collective, which is the subject of the second stage of consensus choice. In addition, all the collectives are checked for susceptibility to consensus in both stages of the consensus choice process. Experimental analysis shows that the improved method is useful in minimizing the difference between the single-stage and two-stage consensus choice approaches in determining the knowledge of a large collective.

Van Du Nguyen, Ngoc Thanh Nguyen, Dosam Hwang
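
Under strong simplifications, the two-stage scheme the abstract describes can be illustrated in a few lines. The sketch below assumes numeric knowledge states, takes the mean as the consensus (it minimises the sum of squared distances to the members), and takes the clustering as a given partition rather than computing it; none of this is the authors' actual formulation.

```python
def consensus(states):
    # Mean minimises the sum of squared distances to the member states.
    return sum(states) / len(states)

def two_stage_consensus(collective, clusters):
    """collective: list of numeric knowledge states.
    clusters: list of index lists partitioning the collective.

    Stage 1: compute a representative (consensus) per cluster.
    Stage 2: compute the consensus of the representatives.
    """
    reps = [consensus([collective[i] for i in idx]) for idx in clusters]
    return consensus(reps)
```

For equal-sized clusters the two-stage result coincides with the single-stage one (e.g. [1, 2, 3, 4] split as {1, 2} and {3, 4} gives 2.5 either way); the paper's improvement concerns minimising the gap in the general case.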

Natural Language and Text Processing

Frontmatter
An Approach to Subjectivity Detection on Twitter Using the Structured Information

In this paper, we propose an approach to subjectivity detection in Twitter micro-texts that explores the use of the structured information of the social network framework. Sentiment analysis on Twitter has usually been performed through automatic processing of the texts. However, the established limit of 140 characters and the particular characteristics of the texts drastically reduce the accuracy of Natural Language Processing (NLP) techniques. Under these circumstances, it becomes necessary to study new data sources that allow us to extract new useful knowledge to represent and classify the texts. The structured information, also called meta-information or meta-data, provides us with alternative features of the texts that can improve classification tasks. In this study we have analysed the use of features extracted from the structured information in the subjectivity detection task, as a first step of the polarity detection task, and their integration with classical features.

Juan Sixto, Aitor Almeida, Diego López-de-Ipiña
WSD-TIC: Word Sense Disambiguation Using Taxonomic Information Content

Word sense disambiguation (WSD) is the task of identifying the meaning of words in context in a computational manner. WSD is considered an AI-complete problem, that is, a task whose solution is at least as hard as the most difficult problems in artificial intelligence. Because of the semantic understanding it provides, it is used in applications such as information retrieval, machine translation and information extraction. This paper describes the proposed approach (WSD-TIC), which is based on the words surrounding the polysemous word in a context. Each meaning of these words is represented by a vector composed of nouns weighted using taxonomic information content. The main emphasis of this paper is feature selection for disambiguation purposes. The assessment of WSD systems is discussed in the context of the Senseval campaign, aiming at an objective comparison of our proposal with the systems participating in several different disambiguation tasks.

Mohamed Ben Aouicha, Mohamed Ali Hadj Taieb, Hania Ibn Marai
What Makes Your Writing Style Unique? Significant Differences Between Two Famous Romanian Orators

This paper introduces a novel, in-depth approach to analyzing the differences in writing style between two famous Romanian orators, based on automated textual complexity indices for the Romanian language. The considered authors are: (a) Mihai Eminescu, Romania’s national poet and a remarkable journalist of his time, and (b) Ion C. Brătianu, one of the most important Romanian politicians of the middle of the 19th century. Both orators shared a journalistic interest consisting in their desire to spread the word about political issues in Romania via the printing press, the most important public voice at that time. In addition, both authors exhibit writing style particularities, and our aim is to explore these differences through our ReaderBench framework, which computes a wide range of lexical and semantic textual complexity indices for Romanian and other languages. The corpus used contains two collections of speeches, one for each orator, covering the period 1857–1880. The results of this study highlight the lexical and cohesive textual complexity indices that best reflect the differences in writing style, including measures relying on Latent Semantic Analysis (LSA) and Latent Dirichlet Allocation (LDA) semantic models.

Mihai Dascalu, Daniela Gîfu, Stefan Trausan-Matu
A Novel Approach to Identify Factor Posing Pronunciation Disorders

The literature is rich in approaches based on features of the speech signal and natural language processing techniques for detecting vocal pathologies in human speech. The literature also notes that several factors (vocal pathology, non-native speaker, psychological state, age …) can cause pronunciation disorders [10]. To our knowledge, however, no work has analyzed pathological speech to identify the factor causing the pronunciation disorders. The current work introduces an original approach based on the forced alignment score [8] to identify the factor causing mispronunciations in Arabic speech. We distinguish two main factors: pronunciation disorders can come from native speakers with a vocal pathology or from non-native speakers who do not master Arabic phoneme pronunciation. The results are encouraging; we attain an identification rate of 95 %. Biologists and computer scientists can benefit from the proposed approach to design high-performance systems for vocal pathology diagnosis.

Naim Terbeh, Mounir Zrigui
Towards an Automatic Intention Recognition from Client Request

Nowadays, the relentless growth of the IT (Information Technology) market and the evolution of Service-Oriented Architectures (SOA) make the establishment of Service Level Agreements (SLAs) between providers and clients a complex task. In fact, clients face many IT offers with complex terms, especially if they do not share the same technical knowledge as providers. The latter have to understand clients’ requirements well in order to properly address their needs. In this context, ontologies can help bridge the gap between providers’ offers and clients’ needs. In this paper, we first define an ontology structure that models clients’ intentions. Furthermore, we propose an approach for intention recognition from textual requests written in English to automatically populate the intention ontology structure. An illustrative case is finally presented to demonstrate the accuracy of the proposed approach.

Noura Labidi, Tarak Chaari, Rafik Bouaziz

Data Mining Methods and Applications

Frontmatter
Fuzzy Cognitive Maps for Long-Term Prognosis of the Evolution of Atmospheric Pollution, Based on Climate Change Scenarios: The Case of Athens

Air pollution is related to the concentration of harmful substances in the lower layers of the atmosphere and is one of the most serious problems threatening the modern way of life. Determining the conditions that maximize the problem and assessing the catalytic effect of relative humidity and temperature are important research subjects in the evaluation of environmental risk. This research effort describes an innovative model for forecasting both primary and secondary air pollutants in the center of Athens, employing Soft Computing Techniques. More specifically, Fuzzy Cognitive Maps are used to analyze the conditions and to correlate the factors contributing to air pollution. According to the climate change scenarios up to 2100, there is going to be a serious fluctuation of the average temperature and rainfall on a global scale. This modeling effort aims at forecasting the evolution of air pollutant concentrations in Athens as a consequence of the upcoming climate change.

Vardis-Dimitris Anezakis, Konstantinos Dermetzis, Lazaros Iliadis, Stefanos Spartalis
The Results of a Complex Analysis of the Modified Pratt-Yaskorskiy Performance Metrics Based on the Two-Dimensional Markov-Renewal-Process

The paper presents the results of a quantitative estimation of edge detection quality using a modified Pratt-Yaskorskiy criterion, as well as a generalization and adaptation of both approaches based on a generalized quality criterion as part of the «CS sF» stochastic simulation software package. The reference images are approximated by a two-dimensional high-rise renewal stream with the properties of stationarity, absence of aftereffects and ordinariness. The efficiency of the proposed metrics is considered for three edge detection algorithms (Marr-Hildreth, ISEF and Canny) at different levels of additive normal noise. The estimated errors of the first and second kind are given, which allow assessing the efficiency of the proposed generalized quality criterion.

Viktor Geringer, Dmitry Dubinin, Alexander Kochegurov
Generic Ensemble-Based Representation of Global Cardiovascular Dynamics for Personalized Treatment Discovery and Optimization

Accurate and timely diagnostics do not guarantee a successful treatment outcome due to subtle personal differences, especially in the case of complex or rare cardiac abnormalities. A proper representation of global cardiac dynamics could be used for quick and objective matching of the current patient to former cases with known treatment plans and outcomes. Previously we have proposed an approach to heart rate variability (HRV) analysis based on ensembles of different measures discovered by boosting algorithms. Unlike original HRV techniques, ensemble-based metrics can be much more accurate in early detection of short-lived or emerging abnormal regimes and of slow changes in long-range dynamic patterns. Here we demonstrate that the same metrics, applied to long HRV time series collected by Holter monitors or other means, can provide an effective characterization of global cardiovascular dynamics for decision support in the discovery and optimization of personalized treatments.

Olga Senyukova, Valeriy Gavrishchaka, Maria Sasonko, Yuri Gurfinkel, Svetlana Gorokhova, Nikolay Antsygin
Avoiding the Curse of Dimensionality in Local Binary Patterns

Local Binary Patterns is a popular grayscale texture operator used in computer vision for classifying textures. The output of the operator is a bit string of a defined length, usually 8, 16 or 24 bits, describing local texture features. We focus on the problem of succinctly representing the patterns using alternative means and compressing them to reduce the number of dimensions. These reductions lead to simpler connections of Local Binary Patterns with machine learning algorithms such as neural networks or support vector machines, improve computation speed and simplify information retrieval from images. We study the distribution of Local Binary Patterns in 100000 natural images and show the advantages of our reduction technique by comparing it to existing algorithms developed by Ojala et al. We have also confirmed Ojala’s findings about the uniform LBP proportions.

Karel Petranek, Jan Vanek, Eva Milkova
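
For readers unfamiliar with the operator, the basic 8-bit LBP code and the “uniform pattern” criterion of Ojala et al. mentioned above can be sketched as follows. Function names and the neighbour ordering are illustrative; the paper's own reduction technique differs in detail.

```python
def lbp8(img, x, y):
    """8-bit LBP code of pixel (x, y): threshold the 8 neighbours
    against the centre pixel and pack the results into one byte."""
    c = img[y][x]
    # Neighbour offsets, clockwise from the top-left corner.
    offsets = [(-1, -1), (0, -1), (1, -1), (1, 0),
               (1, 1), (0, 1), (-1, 1), (-1, 0)]
    code = 0
    for bit, (dx, dy) in enumerate(offsets):
        if img[y + dy][x + dx] >= c:
            code |= 1 << bit
    return code

def is_uniform(code, bits=8):
    """A pattern is 'uniform' if its circular bit string contains
    at most two 0/1 transitions (Ojala et al.)."""
    transitions = 0
    for i in range(bits):
        if ((code >> i) & 1) != ((code >> ((i + 1) % bits)) & 1):
            transitions += 1
    return transitions <= 2
```

Of the 256 possible 8-bit codes, only 58 are uniform, which is the basis of the classic dimensionality reduction that the paper's alternative technique is compared against.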
On the Foundations of Multinomial Sequence Based Estimation

This paper deals with the relatively new field of sequence-based estimation which involves utilizing both the information in the observations and in their sequence of appearance. Our intention is to obtain Maximum Likelihood estimates by “extracting” the information contained in the observations when perceived as a sequence rather than as a set. The results of [15] introduced the concepts of Sequence Based Estimation (SBE) for the Binomial distribution. This current paper generalizes these results for the multinomial “two-at-a-time” scenario. We invoke a novel phenomenon called “Occlusion” that can be described as follows: By “concealing” certain observations, we map the estimation problem onto a lower-dimensional binomial space. Once these occluded SBEs have been computed, we demonstrate how the overall Multinomial SBE (MSBE) can be obtained by mapping several lower-dimensional estimates onto the original higher-dimensional space. We formally prove and experimentally demonstrate the convergence of the corresponding estimates.

B. John Oommen, Sang-Woon Kim
Global Solar Radiation Prediction Using Backward Propagation Artificial Neural Network for the City of Addis Ababa, Ethiopia

Ethiopia is located close to the equatorial belt and receives abundant solar energy. For Ethiopia to achieve optimum utilization of solar energy, it is necessary to evaluate the incident solar radiation over the regions of interest. Sophisticated equipment for such measurements is available but costly, and therefore very limited in developing countries like Ethiopia. This paper therefore explores the use of an artificial neural network method for predicting the daily global solar radiation on a horizontal surface using secondary data for the city of Addis Ababa. For this purpose, meteorological data covering 1195 days from one station in Addis Ababa over the years 1985–1987 were used for training, testing and validating the model. All independent variables (minimum and maximum temperature, humidity, sunshine hours and wind speed) were normalized and added to the model. Then, a back-propagation (BP) Artificial Neural Network (ANN) was applied for training and prediction to determine the most suitable independent (input) variables. The results obtained by the ANN model were validated against the actual data and the error values were found to be within acceptable limits. The findings of the study show a Root Mean Square Error (RMSE) of 0.11 and a correlation coefficient (R) of 0.901 during prediction.

Younas Worki, Eshetie Berhan, Ondrej Krejcar
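
The reported validation metrics can be reproduced for any pair of actual and predicted series by the conventional definitions of RMSE and Pearson's correlation coefficient R (a generic sketch, not the authors' code; the data in the test is invented).

```python
import math

def rmse(actual, predicted):
    """Root Mean Square Error between two equal-length series."""
    n = len(actual)
    return math.sqrt(sum((a - p) ** 2 for a, p in zip(actual, predicted)) / n)

def pearson_r(xs, ys):
    """Pearson correlation coefficient of two equal-length series."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)
```

With these definitions, the paper's RMSE of 0.11 and R of 0.901 would be computed on the normalized predicted and observed radiation values of the validation set.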
Determining the Criteria for Setting Input Parameters of the Fuzzy Inference Model of P&R Car Parks Locating

The paper presents a fuzzy inference model for evaluating P&R facility locations. In such a system there are car parks where travellers can change from car to public transport, and recognising proper locations for the P&R facilities is a key aspect of the system. On the basis of previous studies it was found that it is necessary to develop algorithms supporting the determination of the input parameters of the model. The authors created a recurrent algorithm for determining the road quality parameter. For the parameters of the number of public transport means and the distance to the city centre, an algorithm based on an exponential function was developed. For determining the parameter describing the connection to the main road, a fuzzy inference model was built. Thanks to the developed methodology, the model of P&R locating has become more universal and usable by a broader group of experts.

Michał Lower, Anna Lower
On Systematic Approach to Discovering Periodic Patterns in Event Logs

Discovering periodic patterns from historical information is a computationally hard problem due to the large amounts of historical data to be analyzed and the high complexity of the patterns. This work shows how derivation rules for periodic patterns can be applied to discover complex patterns in logs of events. The paper defines the concept of a periodic pattern and its validation in a workload trace created from logs of events. A system of derivation rules that transforms periodic patterns into logically equivalent ones is proposed. The paper presents a systematic approach, based on this system of derivation rules, to the discovery of periodic patterns in logs of events.

Marcin Zimniak, Janusz R. Getta
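
As a toy illustration of what validating a periodic pattern against a workload trace can mean: the sketch below checks that an event recurs at every expected tick of a period, up to a tolerance. The trace format, the tolerance and the validation rule are assumptions for illustration, not the paper's formal definition.

```python
def is_periodic(timestamps, start, period, cycles, tol=0):
    """Validate a simple periodic pattern against a trace.

    timestamps: event occurrence times extracted from the log
    start, period, cycles: the pattern claims an occurrence near
        start + k * period for every k in 0 .. cycles - 1
    tol: maximum allowed deviation from each expected time
    """
    for k in range(cycles):
        expected = start + k * period
        if not any(abs(t - expected) <= tol for t in timestamps):
            return False
    return True
```

A trace [0, 10, 20, 30] validates the pattern (start 0, period 10, 4 cycles), while [0, 10, 25] fails it unless the tolerance is widened, which is the kind of distinction the paper's derivation rules operate over.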
The Fuzzy Approach to Assessment of ANOVA Results

Typically, the analysis of variance (ANOVA) is used to compare means in the subsets obtained through the division of a large numerical dataset by assigning categorical variable labels to the dataset’s values. The test criterion for the decision on ‘all equal’ vs. ‘not all equal’ is a comparison of the significance level described by the well-known p-value with the a priori assigned critical significance level, α, usually 0.05. This comparison is treated very strictly, being based on a crisp value; however, it should not be, especially if the p-value is near α, because the certainty of the decision varies rather smoothly from ‘strongly not’ through ‘no opinion’ to ‘strongly yes’. It is very interesting to analyze such results on the basis of fuzzy arithmetic theory, using a modified Buckley’s fuzzy approach to statistics combined with the bootstrap approach, because it may be adapted to cases where subjective assessments are introduced as quasi-measurements.

Jacek Pietraszek, Maciej Kołomycki, Agnieszka Szczotok, Renata Dwornicka
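
The contrast between the crisp decision rule and a smoothly varying certainty near α can be illustrated with a simple linear membership function. This is purely illustrative: the paper uses Buckley's fuzzy statistics combined with bootstrapping, not this toy rule, and the band width `delta` is an assumption.

```python
def crisp_reject(p, alpha=0.05):
    """Classical rule: reject 'all equal' iff p < alpha (degree 0 or 1)."""
    return 1.0 if p < alpha else 0.0

def fuzzy_reject(p, alpha=0.05, delta=0.02):
    """Degree of support for 'not all equal' in [0, 1]:
    1 well below alpha, 0 well above, and linearly decreasing
    inside a band of half-width delta around alpha."""
    lo, hi = alpha - delta, alpha + delta
    if p <= lo:
        return 1.0
    if p >= hi:
        return 0.0
    return (hi - p) / (hi - lo)
```

A p-value of exactly 0.05 yields a crisp rejection degree of 0 but a fuzzy degree of 0.5, i.e. ‘no opinion’, which captures the smooth transition the abstract argues for.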
Virtual Road Condition Prediction Through License Plates in 3D Simulation

Predicting road conditions like curves, slopes and hills helps drivers react faster to avoid possible collisions during hypovigilance; moreover, this kind of driver assistance system is even more crucial for intelligent vehicles. Even though there are many radar, wifi and infrared systems and devices, what we propose in this paper is monocular license plate segmentation to foresee the road ahead while cruising behind a view-blocking vehicle. License plates in precalibrated images from a 3D simulation are segmented and analyzed to identify the front car’s angle of repose. The angles of the road are thus estimated frame by frame, with calculated distances, for prediction of the virtual road.

Orcan Alpar, Ondrej Krejcar

Decision Support and Control Systems

Frontmatter
Creating a Knowledge Base to Support the Concept of Lean Administration Using Expert System NEST

The concept of lean administration is part of the comprehensive concept of a lean company. Its purpose is to identify, classify and minimize waste in administrative processes. This paper discusses the following topics: lean administration principles, a basic model of lean administration, and creating a lean administration knowledge base using IF-THEN rules. The case study focuses on the process of creating a knowledge base that will support an implementation of the lean administration concept. The main tasks of the knowledge base are the following: identification of the level of waste in administrative processes and recommendation of appropriate methods and tools of industrial engineering to reduce that waste. The knowledge base for lean administration was created in the expert system NEST and testing was done on the basis of fictitious data about administrative processes.

Radim Dolak
Fuzzy Bionic Hand Control in Real-Time Based on Electromyography Signal Analysis

In this paper a fuzzy model for real-time control of a bionic hand is proposed. The control process involves interpretation and analysis of the surface electromyography signal (sEMG) acquired from amputee patients. The work considers the use of a force sensing resistor to achieve better control of the artificial hand. The classical type-1 Mamdani fuzzy control model is considered for this application. The conducted experiments show comparable results with respect to the applied assumptions, giving confidence to implement the proposed concept in a real-time control process.

Martin Tabakov, Krzysztof Fonal, Raed A. Abd-Alhameed, Rami Qahwaji
Controllability of Positive Discrete-Time Switched Fractional Order Systems for Fixed Switching Sequence

In this article the unconstrained controllability problem of positive discrete-time switched fractional order systems is addressed. A solution of discrete-time switched fractional order systems is presented. Additionally, a transition matrix of the considered dynamical systems is given. A sufficient condition for unconstrained controllability in a given number of steps is formulated and proved using the general formula for the solution of the difference state equation. Finally, illustrative examples are also presented.

Artur Babiarz, Adrian Łęgowski, Michał Niezabitowski
A Trace Clustering Solution Based on Using the Distance Graph Model

Process discovery is the most important task in process mining. Because of the complexity of event logs (i.e., activities of several different processes are written into the same log), the discovered process models may be diffuse and unintelligible. That is why the input event logs should be clustered into simpler event sub-logs. This work provides a trace clustering solution based on the idea of using the distance graph model for trace representation. Experimental results show the effect of the proposed solution on the Fitness and Precision measures, especially on Precision.

Quang-Thuy Ha, Hong-Nhung Bui, Tri-Thanh Nguyen
On Expressiveness of TCTL for Model Checking Distributed Systems

Systems analysis is becoming more and more important in fields such as network applications, communication protocols and client-server applications. Its importance stems from the fact that these systems face specific errors like deadlocks and livelocks, which in major cases may cause disasters. In this context, model checking is a promising formal method which permits system analysis at an early stage, thus preventing errors from occurring. In this paper, we propose an extension of the timed temporal logic TCTL with more powerful modalities, called $$TCTL^{\varDelta }_{h}$$. This logic makes it possible to combine, in the same property, clock quantifiers as well as features for transient states. We formally define the syntax and the semantics of the proposed quantitative logic and we show its expressive power via examples.

Naima Jbeli, Zohra Sbaï, Rahma Ben Ayed

Innovations in Intelligent Systems

Frontmatter
Cost Optimizing Methods for Deterministic Queuing Systems

The paper discusses two proposed methods for cost optimization of a deterministic queuing system based on controlling the queue lengths. The first method evaluates the actual states at the particular service places according to their development; the decision is then based on a comparison of the productivity criteria and the expended costs, and the suggested change of the system setting with the highest priority is carried out. The second method is based on simulation of future states, on the basis of which the appropriate time and type of modification of the system setup is suggested.

Martin Gavalec, Zuzana Němcová
User Authentication Through Keystroke Dynamics as the Protection Against Keylogger Attacks

This paper addresses an authentication scheme based on a behavioural biometric method, namely users’ keystroke style. During an initial interaction with the system, the proposed method identifies a user’s typing pattern and creates a search template that is further used during authentication. It is based on encrypting random characters into a typed password by injecting a set of emulated keystrokes while the actual typing occurs. In the decoding phase, the algorithm searches for the characters for which the user’s typing time fits best within the assigned typing template. The article contains an overview of the developed method along with an analysis of its usability and an experimental evaluation based on the assumed criteria of the false acceptance rate (FAR), the false rejection rate (FRR) and the equal error rate (EER). The obtained results have been compared with an existing method.

Adrianna Kozierkiewicz-Hetmańska, Aleksander Marciniak, Marcin Pietranik
Data Warehouses Federation as a Single Data Warehouse

In this paper the author presents an experiment which shows that it is possible to form a federation of data warehouses that effectively simulates a single “super” data warehouse. There is no need to create a complete ETL tool to load data from the source data warehouses into one dedicated data warehouse. Good mappings between the global schema and the local schemas, extracted during schema integration, are indispensable for creating an effective federation.

Rafał Kern
A Solution for Automatically Malicious Web Shell and Web Application Vulnerability Detection

According to Internet Live Stats, it is evident that organizations and developers are underestimating security issues in their systems. In this paper, we propose a protective and extensible solution for automatically detecting both Web application vulnerabilities and malicious Web shells. Building on the original THAPS, we propose E-THAPS, which has a new detection mechanism and improved capabilities for detecting SQLi, XSS and vulnerable functions. For malicious Web shell detection, taint analysis and pattern matching are selected as the main approaches. The broad experiments that we performed show outstanding results in comparison with other solutions for detecting Web application vulnerabilities and malicious Web shells.

Van-Giap Le, Huu-Tung Nguyen, Dang-Nhac Lu, Ngoc-Hoa Nguyen
Data Evolution Method in the Procedure of User Authentication Using Keystroke Dynamics

Due to the rapid development of the Internet and web-based applications, the number of systems with which an ordinary user needs to interact grows almost proportionally. People are expected to make bank transfers, send emails using multiple mailboxes, file tax declarations, and send birthday wishes solely online; what is more, sometimes only this channel is available. The sensitivity of information created using online tools is unquestionable, and the highest possible level of data security is therefore expected not only at the corporate level, but should also be guaranteed to ordinary users. That is why a convenient solution that does not require any additional expensive equipment (e.g. RFID cards, fingerprint readers, retinal scanners), yet can assure such security, is highly desirable. Therefore, a number of publications have been devoted to methods of user authentication based on biometric characteristics (which are obviously individual and can easily be used to encrypt users’ credentials), and one of the most accessible groups of methods is built on top of the analysis of users’ personal typing styles. This paper presents a data evolution method used in our novel biometric authentication procedure and contains a statistical analysis of the conducted experimental verification.

Adrianna Kozierkiewicz-Hetmanska, Aleksander Marciniak, Marcin Pietranik

Cooperative Strategies for Decision Making and Optimization

Frontmatter
Multiagent Cooperation for Decision-Making in the Car-Following Behavior

This paper presents a decision-making model for determining the velocity and safety distance values based on anticipation of the simulation parameters. The paper is composed of two parts. In the first, we use bi-level bi-objective modeling to address the decision-making problem with two objectives, namely maximizing the safety distance and maximizing the velocity, in order to define a link between the increase of velocity and road safety in the car-following behavior. In the second part, we solve our model using a multi-agent cooperation approach by applying the Tabu search algorithm. The simulation results show the advantages of our approach: the multi-agent cooperation approach allows a high number of solutions to be tested in a very short search time, which guarantees the high quality of the solution selected at each simulation step.

Anouer Bennajeh, Fahem Kebair, Lamjed Ben Said, Samir Aknine
GRASP Applied to Multi–Skill Resource–Constrained Project Scheduling Problem

The paper describes an application of the Greedy Randomized Adaptive Search Procedure (GRASP) to solving the Multi-Skill Resource-Constrained Project Scheduling Problem (MS-RCPSP). The proposed work introduces a specific greedy-based local search and a schedule constructor specialised for MS-RCPSP. GRASP is presented as a better option than classical heuristics, and also as a faster, successful alternative to other metaheuristics. To compare the results of GRASP with other approaches, various methods are used: schedule construction based on a greedy algorithm, a randomized greedy approach, and HAntCO. The research was performed using all instances of the iMOPSE benchmark dataset, and the results were compared with the best-known methods.

Paweł B. Myszkowski, Jȩdrzej J. Siemieński
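The GRASP template the paper builds on has a compact generic core: a randomized greedy construction phase followed by local search, repeated while keeping the best solution. The sketch below shows that cycle on a toy resource-assignment problem; the task set, the RCL parameter `alpha`, and all function names are illustrative, not taken from the paper's MS-RCPSP implementation.

```python
import random

def makespan(assign, tasks):
    """Maximum total load over all resources."""
    loads = {}
    for t, r in assign.items():
        loads[r] = loads.get(r, 0) + tasks[t]
    return max(loads.values())

def construct(tasks, n_resources, alpha, rng):
    """Greedy randomized construction: longest tasks first, each placed
    on a resource drawn from a restricted candidate list (RCL)."""
    assign, loads = {}, [0] * n_resources
    for t in sorted(tasks, key=tasks.get, reverse=True):
        lo, hi = min(loads), max(loads)
        rcl = [r for r in range(n_resources) if loads[r] <= lo + alpha * (hi - lo)]
        r = rng.choice(rcl)
        assign[t] = r
        loads[r] += tasks[t]
    return assign

def local_search(assign, tasks, n_resources):
    """First-improvement moves: relocate one task to another resource."""
    improved = True
    while improved:
        improved = False
        base = makespan(assign, tasks)
        for t in list(assign):
            for r in range(n_resources):
                trial = {**assign, t: r}
                if makespan(trial, tasks) < base:
                    assign, base = trial, makespan(trial, tasks)
                    improved = True
    return assign

def grasp(tasks, n_resources, iterations=50, alpha=0.3, seed=1):
    """GRASP: repeat (randomized greedy construction -> local search),
    keeping the best schedule found."""
    rng = random.Random(seed)
    best, best_cost = None, float("inf")
    for _ in range(iterations):
        sol = local_search(construct(tasks, n_resources, alpha, rng),
                           tasks, n_resources)
        cost = makespan(sol, tasks)
        if cost < best_cost:
            best, best_cost = sol, cost
    return best, best_cost
```

The RCL is what distinguishes GRASP from a plain greedy heuristic: instead of always taking the locally best placement, each step samples among the near-best candidates, so repeated restarts explore different schedules.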
Biological Regulation and Psychological Mechanisms Models of Adaptive Decision-Making Behaviors: Drives, Emotions, and Personality

The aim of this paper is to suggest a framework for modeling the biological regulation and psychological mechanisms of adaptive agent decision-making. For this purpose, first, a perception-action cycle scheme for agent-environment interactions and the deduced framework for adaptive agent decision-making modeling are developed. Second, the motivation systems are developed: drives (homeostatic regulation), personality traits (five-factor model), and emotions (basic emotions). Third, a neural architecture implementing the framework is suggested. Then, first tests related to a stimulation-drive (from a moving object), for two different agent personalities, and the activation level of emotions are presented and analyzed. The obtained results demonstrate how the personality and emotions of the agent can be used to regulate the intensity of the interaction, suggesting a promising direction for future work: to demonstrate how the nature of the interaction (stimulation-drive, social-drive, …) influences the agent behavior, which could be very interesting for cooperative agents.

Amine Chohra, Kurosh Madani
A Constraint-Based Approach to Modeling and Solving Resource-Constrained Scheduling Problems

Constrained scheduling problems are common in manufacturing, project management, transportation, supply chain management, software engineering, computer networks, etc. Multiple binary and integer decision variables representing the allocation of resources to activities, and numerous specific constraints on these variables, are typical components of constraint scheduling problem models. With their increased computational complexity, the models are more demanding, particularly when methods of operations research (mathematical programming, network programming, dynamic programming) are used. By contrast, most resource-constrained scheduling problems can be easily modeled as instances of constraint satisfaction problems (CSPs) and solved using constraint programming (CP) or other methods. In a CP-based environment the problem definition is separated from the methods and algorithms used to solve the problem. Therefore, a constraint-based approach to resource-constrained scheduling problems is proposed, combining an OR-based approach for problem solving with a CP-based approach for problem modeling. To evaluate the applicability and efficiency of this approach and its implementation framework, illustrative examples of resource-constrained scheduling problems are implemented separately in different environments.

Paweł Sitek, Jarosław Wikarek
Some Remarks on the Mean-Based Prioritization Methods in AHP

EVM (the eigenvector method) and GMM (the geometric mean method) are probably the two most popular priority deriving techniques for AHP (the Analytic Hierarchy Process). Although much has already been written about these methods, one frequently repeated question is: what do they have in common? In this paper we show that both methods can be constructed from the same principle, namely that the priority of one alternative should correspond to the weighted mean of the priorities of the other alternatives. We also show how this principle can be used to construct priority deriving methods for generalized (non-reciprocal) pairwise comparison matrices.

Konrad Kułakowski, Anna Kȩdzior
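The geometric mean method itself is easy to state in code: each alternative's priority is the normalized geometric mean of its row of the pairwise comparison matrix, and for a fully consistent matrix the result also satisfies the eigenvector relation Mw = nw used by EVM, which is one face of the shared principle the paper discusses. The 3×3 matrix below is an illustrative consistent example, not one from the paper.

```python
import math

def gmm_priorities(M):
    """Geometric Mean Method: the priority of alternative i is the
    geometric mean of row i of the pairwise comparison matrix,
    normalized so that the priorities sum to 1."""
    n = len(M)
    means = [math.prod(row) ** (1.0 / n) for row in M]
    total = sum(means)
    return [m / total for m in means]

# Consistent reciprocal matrix: a1 is twice a2 and four times a3.
M = [[1.0, 2.0, 4.0],
     [0.5, 1.0, 2.0],
     [0.25, 0.5, 1.0]]
w = gmm_priorities(M)  # -> [4/7, 2/7, 1/7]
```

Because this M is consistent (each entry a_ik equals a_ij * a_jk), GMM and EVM return the same priority vector; the two methods differ only on inconsistent matrices.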
Bi-criteria Data Reduction for Instance-Based Classification

One of the approaches to dealing with the big data problem is training data reduction, which may improve generalization quality and decrease the complexity of the data mining algorithm. In this paper we view the instance reduction problem as a multiple-objective one and propose criteria which allow generating a Pareto-optimal set of ‘typical’ instances. Next, the reduced dataset is used to construct a classification function or to induce a classifier. The approach is validated experimentally.

Ireneusz Czarnowski, Joanna Jȩdrzejowicz, Piotr Jȩdrzejowicz
Dynamic Cooperative Interaction Strategy for Solving RCPSP by a Team of Agents

In this paper a dynamic cooperative interaction strategy for an A-Team solving the Resource-Constrained Project Scheduling Problem (RCPSP) is proposed and experimentally validated. The RCPSP belongs to the class of NP-hard optimization problems. To solve it, a team of asynchronous agents (A-Team) has been implemented in a multiagent environment. An A-Team consists of a set of objects, including multiple optimization agents, manager agents and a common memory, which through interactions produce solutions to hard optimization problems. The proposed dynamic cooperative interaction strategy supervises the cooperation between the agents and the common memory. To validate the approach, a preliminary computational experiment has been carried out.

Piotr Jędrzejowicz, Ewa Ratajczak-Ropel
Influence of the Waiting Strategy on the Performance of the Multi-Agent Approach to the DVRPTW

A multi-agent approach to the Dynamic Vehicle Routing Problem with Time Windows is proposed in the paper. The process of solving instances of the problem is performed by a set of software agents. They are responsible for managing the sets of dynamic requests, allocating them to the available vehicles, and optimizing the routes covered by the vehicles in order to satisfy several request and vehicle constraints. The paper focuses on waiting strategies, which aim at deciding whether a vehicle should wait after servicing a request before heading toward the next customer. The influence of the proposed waiting strategy on the performance of the approach has been investigated via a computational experiment, which confirmed the positive impact of the strategy on the obtained results.

Dariusz Barbucha
Local Termination Criteria for Stochastic Diffusion Search: A Comparison with the Behaviour of Ant Nest-Site Selection

Population-based decision mechanisms employed by many Swarm Intelligence methods can suffer poor convergence, resulting in ill-defined halting criteria and loss of the best solution. Conversely, as a result of its resource allocation mechanism, the solutions found by Stochastic Diffusion Search (SDS) enjoy excellent stability. Previous implementations of SDS have deployed complex stopping criteria derived from global properties of the agent population; this paper examines two new local SDS halting criteria and compares their performance with ‘quorum sensing’, a natural termination criterion deployed by some species of tandem-running ants. We empirically demonstrate that the local termination criteria are almost as robust as the classical SDS termination criteria, whilst the average time taken to reach a decision is around three times faster.

J. Mark Bishop, Andrew O. Martin, Elva J. H. Robinson
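The contrast between global and local halting is easy to see in a minimal SDS sketch. Below, agents search for a model string inside a larger text, and the whole search stops as soon as any single agent has remained active at the same hypothesis for a fixed number of iterations; this illustrative criterion stands in for the paper's local criteria, which are not reproduced here, and all names and parameters are placeholders.

```python
import random

def sds_search(model, space, n_agents=50, stable_rounds=20,
               max_iter=2000, seed=0):
    """Stochastic Diffusion Search locating `model` inside `space`.
    Each agent holds a hypothesis (a start position) and partially
    evaluates it by testing a single random character per iteration."""
    rng = random.Random(seed)
    positions = range(len(space) - len(model) + 1)
    hyps = [rng.choice(positions) for _ in range(n_agents)]
    active = [False] * n_agents
    streak = [0] * n_agents  # consecutive active rounds at the same hypothesis
    for _ in range(max_iter):
        # Test phase: partial evaluation of each agent's hypothesis.
        for i, h in enumerate(hyps):
            j = rng.randrange(len(model))
            was_active = active[i]
            active[i] = (space[h + j] == model[j])
            streak[i] = streak[i] + 1 if active[i] and was_active else int(active[i])
            if streak[i] >= stable_rounds:
                return h  # local halting: one agent alone decides to stop
        # Diffusion phase: inactive agents copy a randomly polled agent's
        # hypothesis if that agent is active, otherwise re-seed at random.
        for i in range(n_agents):
            if not active[i]:
                k = rng.randrange(n_agents)
                hyps[i] = hyps[k] if active[k] else rng.choice(positions)
    return max(set(hyps), key=hyps.count)  # fallback: modal hypothesis
```

A classical global criterion would instead monitor the whole population (for example, the size of the largest cluster of agents sharing one hypothesis); the local variant above needs no population-wide bookkeeping, which is the efficiency the paper investigates.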

Meta-Heuristics Techniques and Applications

Frontmatter
Optimizing Urban Public Transportation with Ant Colony Algorithm

The transport system in most cities has problems and should be optimized; in particular, the timetables of city public transportation need to be changed. Metaheuristic methods are considered among the most efficient for timetabling, and the ant algorithm was chosen as one of them. It was adapted for the optimization of an urban public transport timetable. A timetable for one bus route in the city of Tomsk, Russia was created on the basis of the developed software. Different combinations of parameters in the ant algorithm allow obtaining new variants of the timetable that better fit passengers’ needs.

Elena Kochegurova, Ekaterina Gorokhova
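An ant algorithm of the kind adapted in the paper rests on two ingredients: probabilistic solution construction biased by pheromone trails, and pheromone evaporation plus quality-proportional deposit. The skeleton below shows both on a small shortest-path graph; the graph, parameter values and function name are illustrative, not the authors' timetable model.

```python
import random

def ant_shortest_path(graph, src, dst, n_ants=20, n_iter=50,
                      alpha=1.0, beta=2.0, rho=0.5, q=1.0, seed=0):
    """Generic Ant Colony Optimization skeleton on a weighted digraph.
    graph: {node: {neighbor: edge_length}}. Returns the best path found."""
    rng = random.Random(seed)
    tau = {(u, v): 1.0 for u in graph for v in graph[u]}  # pheromone per edge
    best_path, best_len = None, float("inf")
    for _ in range(n_iter):
        tours = []
        for _ in range(n_ants):
            path, node, visited = [src], src, {src}
            while node != dst:
                moves = [v for v in graph[node] if v not in visited]
                if not moves:
                    break  # dead end: abandon this ant
                # choice probability ~ pheromone^alpha * (1/length)^beta
                weights = [tau[(node, v)] ** alpha * (1 / graph[node][v]) ** beta
                           for v in moves]
                node = rng.choices(moves, weights=weights)[0]
                path.append(node)
                visited.add(node)
            if path[-1] == dst:
                length = sum(graph[a][b] for a, b in zip(path, path[1:]))
                tours.append((path, length))
                if length < best_len:
                    best_path, best_len = path, length
        # Evaporation, then deposit proportional to solution quality.
        for e in tau:
            tau[e] *= (1 - rho)
        for path, length in tours:
            for e in zip(path, path[1:]):
                tau[e] += q / length
    return best_path, best_len
```

For timetabling, the "edges" would instead encode timetable decisions (e.g. candidate departure times) and the deposit rule would reward low passenger waiting cost, but the construct/evaporate/deposit cycle is the same.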
A Hybrid Approach Based on Particle Swarm Optimization and Random Forests for E-Mail Spam Filtering

The Internet is flooded every day with a huge number of spam emails. This leads internet users to spend a lot of time and effort managing their mailboxes to distinguish between legitimate and spam emails, which can considerably reduce their productivity. Therefore, in the last decade, many researchers and practitioners have proposed different approaches to increase the effectiveness and safety of spam filtering models. In this paper, we propose a spam filtering approach consisting of two main stages: feature selection and email classification. In the first stage, a Particle Swarm Optimization (PSO) based wrapper feature selection is used to select the best representative set of features and reduce the large number of measured features. In the second stage, a Random Forest spam filtering model is developed using the features selected in the first stage. Experimental results on a real-world spam dataset show the better performance of the proposed method over five traditional machine learning approaches from the literature. Furthermore, four cost functions are used to evaluate the proposed spam filtering method. The results reveal that the PSO-based wrapper with Random Forest can effectively be used for spam detection.

Hossam Faris, Ibrahim Aljarah, Bashar Al-Shboul
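The wrapper idea in the first stage can be sketched independently of the spam corpus: a binary PSO searches over feature masks, and each mask is scored by the accuracy of a classifier trained on only the selected features. The sketch below uses a 1-NN leave-one-out scorer and a sigmoid transfer function to binarize velocities; the toy data, the classifier and all parameter values are placeholders, not the paper's setup (which wraps a Random Forest).

```python
import math
import random

def accuracy(X, y, mask):
    """Leave-one-out 1-NN accuracy using only the features in `mask`."""
    feats = [j for j, m in enumerate(mask) if m]
    if not feats:
        return 0.0
    correct = 0
    for i in range(len(X)):
        best, pred = float("inf"), None
        for k in range(len(X)):
            if k == i:
                continue
            d = sum((X[i][j] - X[k][j]) ** 2 for j in feats)
            if d < best:
                best, pred = d, y[k]
        correct += (pred == y[i])
    return correct / len(X)

def pso_select(X, y, n_particles=10, n_iter=30, w=0.7, c1=1.5, c2=1.5, seed=0):
    """Binary PSO wrapper: each particle is a feature mask; its fitness
    is the wrapped classifier's accuracy on the reduced feature set."""
    rng = random.Random(seed)
    d = len(X[0])
    pos = [[rng.random() < 0.5 for _ in range(d)] for _ in range(n_particles)]
    vel = [[0.0] * d for _ in range(n_particles)]
    pbest = [p[:] for p in pos]
    pfit = [accuracy(X, y, p) for p in pos]
    g = max(range(n_particles), key=lambda i: pfit[i])
    gbest, gfit = pbest[g][:], pfit[g]
    for _ in range(n_iter):
        for i in range(n_particles):
            for j in range(d):
                vel[i][j] = (w * vel[i][j]
                             + c1 * rng.random() * (pbest[i][j] - pos[i][j])
                             + c2 * rng.random() * (gbest[j] - pos[i][j]))
                # sigmoid transfer turns the velocity into a bit probability
                pos[i][j] = rng.random() < 1 / (1 + math.exp(-vel[i][j]))
            f = accuracy(X, y, pos[i])
            if f > pfit[i]:
                pbest[i], pfit[i] = pos[i][:], f
                if f > gfit:
                    gbest, gfit = pos[i][:], f
    return gbest, gfit
```

The wrapper is expensive (one classifier evaluation per particle per iteration), which is exactly why the paper performs it once, offline, before training the final Random Forest on the reduced feature set.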
Fuzzy Logic and PD Control Strategies of a Three-Phase Electric Arc Furnace

This paper presents a fuzzy control strategy and a conventional proportional-derivative (PD) control strategy for the electrode positioning system of a three-phase electric arc furnace. Generally, it is necessary to maintain constant arc lengths in these kinds of furnaces. The two control strategies proposed in this paper regulate the current of the electric arc, because the arc length depends on the electric arc current. To do this, a new model of the electric arc developed by the authors of this paper was used. The paper illustrates a comparative performance analysis of a conventional PD controller and a fuzzy-based intelligent controller. The fuzzy intelligent controller has two inputs and one output. Both systems are simulated using Matlab/Simulink software and are tested by applying a disturbance to the process. The responses of the closed-loop systems illustrate that the proposed fuzzy controller has better dynamic performance, rapidity and robustness compared with the proposed PD controller.

Loredana Ghiormez, Octavian Prostean, Manuela Panoiu, Caius Panoiu
Hybrid Harmony Search Combined with Variable Neighborhood Search for the Traveling Tournament Problem

In this paper, we are interested in the mirrored version of the traveling tournament problem (mTTP) with reversed venues. We propose a new enhanced harmony search combined with a variable neighborhood search (V-HS) for the mTTP. We use a largest-order-value rule to transform harmonies from real vectors into abstract schedules. We also use variable neighborhood search (VNS) as an improvement strategy to enhance the quality of solutions and strengthen the intensification mechanism of harmony search. The overall method is evaluated on benchmarks and compared with other techniques for the mTTP. The numerical results are encouraging and demonstrate the benefits of our approach: the proposed V-HS method succeeds in finding high-quality solutions for several of the considered instances of the mTTP.

Meriem Khelifa, Dalila Boughaci
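The largest-order-value rule mentioned above has a one-line core: positions are ranked by descending component value, turning a real-valued harmony into a permutation that can be read as a schedule. The example vector below is made up for illustration.

```python
def lov_rule(harmony):
    """Largest-Order-Value rule: rank positions by descending value;
    the position holding the largest value comes first in the schedule."""
    return sorted(range(len(harmony)), key=lambda i: -harmony[i])

# The real vector [0.6, 0.1, 0.9, 0.3] decodes to the order [2, 0, 3, 1]:
# position 2 holds the largest value, then 0, then 3, then 1.
order = lov_rule([0.6, 0.1, 0.9, 0.3])
```

This decoding lets harmony search operate in continuous space (where its pitch-adjustment operators are natural) while the objective is evaluated on discrete schedules.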

Web Systems and Human-Computer Interaction

Frontmatter
Utilizing Linked Open Data for Web Service Selection and Composition to Support e-Commerce Transactions

Web Services (WS) have emerged during the past decades as a means for loosely coupled distributed systems to interact and communicate. Nevertheless, the abundance of services that can be retrieved online, often providing similar functionalities, raises questions regarding the selection of the optimal service to be included in a value-added composition. We propose a framework for the selection and composition of WS utilizing Linked Open Data (LOD). The proposed method is based on RDF triples describing the functional and non-functional characteristics of WS. We aim at the optimal composition of services as the result of specific SPARQL queries and personalized weights for QoS criteria. Finally, we utilize an approach based on the particle swarm optimization (PSO) method for ranking the returned services.

Nikolaos Vesyropoulos, Christos K. Georgiadis, Elias Pimenidis
Improved Method of Detecting Replay Logo in Sports Videos Based on Contrast Feature and Histogram Difference

Sports videos are probably the most popular videos and the most frequently searched on the Web. Similarly to text documents, for which abstracts can be automatically generated, we need to automatically summarize long videos presenting, for example, whole soccer matches. The obvious way to summarize a sports video is to detect and extract replay segments as highlights, which usually present the most interesting and exciting parts of a video containing important players and actions. Replays can be detected if replay shots are separated by special logo animations or if the replays are in slow motion. One automatic method of logo transition detection is based on contrast features and histogram differences. The paper presents an improved method of replay logo detection and the results of tests demonstrating the benefits of the proposed improvements.

Kazimierz Choroś, Adam Gogol
Dynamic MCDA Approach to Multilevel Decision Support in Online Environment

Effective online marketing requires technologies supporting campaign planning and execution at the operational level. Performance changing over time and varying audience characteristics require appropriate processing for multilevel decisions. The paper presents the concept of adapting Multi-Criteria Decision Analysis (MCDA) methods to the needs of multilevel decision support in an online environment, for the planning and monitoring of advertising activity. The evaluation shows how to integrate data related to economic efficiency criteria and negative impact on the recipient toward balanced solutions with limited intrusiveness within multi-period data.

Jarosław Jankowski, Jarosław Wątróbski, Paweł Ziemba
Usability Testing of a Mobile Friendly Web Conference Service

The results of a usability study of a conference website developed using the responsive web design approach are presented in the paper. Two variants of the application architecture were examined using laptops and smartphones. The testing sessions with users took place under laboratory conditions, whereas three expert inspection methods, including cognitive walkthrough, heuristic evaluation, and checklists, were conducted remotely. A list of 111 recommendations for improving the website was formulated. In consequence, a new version of the website was developed and a second round of usability testing was planned.

Ida Błażejczyk, Bogdan Trawiński, Agnieszka Indyka-Piasecka, Marek Kopel, Elżbieta Kukla, Jarosław Bernacki
Latent Email Communication Patterns

The paper introduces a framework to analyze companies’ internal email communication. In some cases, users tend to use email for purposes where other tools would be more appropriate. Such behavior patterns cause a high workload for other users, who have to keep doing the same. A sample from the Enron email corpus is used to discover such latent patterns using the Latent Class Analysis method, and strategies for improving communication are proposed.

Miloš Vacek
Spectral Saliency-Based Video Deinterlacing

A spectral saliency-based motion-compensated deinterlacing method is proposed. We present a block-based deinterlacing method wherein the interpolation strategy is decided upon both field texture and the viewer’s region of interest, ensuring high-quality frame interpolation. The proposed deinterlacer surpasses classical interpolation approaches in both objective and subjective quality results and has low complexity in comparison with state-of-the-art deinterlacers.

Umang Aggarwal, Maria Trocan, Francois-Xavier Coudoux
Backmatter
Metadata
Title
Computational Collective Intelligence
Editors
Ngoc-Thanh Nguyen
Lazaros Iliadis
Yannis Manolopoulos
Bogdan Trawiński
Copyright Year
2016
Electronic ISBN
978-3-319-45243-2
Print ISBN
978-3-319-45242-5
DOI
https://doi.org/10.1007/978-3-319-45243-2
