
About this book

This book contains a selection of articles from the 2015 World Conference on Information Systems and Technologies (WorldCIST'15), held between the 1st and 3rd of April in Funchal, Madeira, Portugal. WorldCIST is a global forum for researchers and practitioners to present and discuss recent results and innovations, current trends, professional experiences and challenges of modern Information Systems and Technologies research, technological development and applications.

The main topics covered are: Information and Knowledge Management; Organizational Models and Information Systems; Intelligent and Decision Support Systems; Big Data Analytics and Applications; Software Systems, Architectures, Applications and Tools; Multimedia Systems and Applications; Computer Networks, Mobility and Pervasive Systems; Human-Computer Interaction; Health Informatics; Information Technologies in Education; Information Technologies in Radiocommunications.



Information and Knowledge Management


Projects as Knowledge Swirls in Technological Innovation: Romania's Situation

The present paper builds on a new way of thinking about the relation between innovation and knowledge: the Knowledge Flow Percolation Model (KFPM). At the center of this model, human beings are seen as thinking electrons, both consuming and generating knowledge flows. Through the interdependent actions of individuals, knowledge circulates inside organizations, allowing them to innovate in order to obtain competitive advantages. There is, however, a wide range of barriers which impede the creation and movement of flows in the model grid and, consequently, hinder their transformation into innovation. The solution proposed by this paper, as one of the most adequate instruments for making the KFPM more spreadable, is the project. On this basis, in an empirical study, we try to demonstrate the hypothesis that projects, as knowledge swirls, positively influence the development of innovative skills which help to solve problems in the organization, to create and widen knowledge, and to reduce the barriers in knowledge transfer.

Daniela Popescul, Mircea Georgescu, Juan Miguel Alcántara Pilar

Question Answering Track Evaluation in TREC, CLEF and NTCIR

Question Answering (QA) systems are put forward as a real alternative to Information Retrieval systems, as they provide the user with a fast and comprehensible answer to his or her information need. It has been 15 years since TREC introduced the first QA track. The principal campaigns in the evaluation of Information Retrieval have included specific tracks focusing on the development and evaluation of this type of system. This study is a brief review of the TREC, CLEF and NTCIR conferences from the QA perspective. We present a historical overview of 15 years of QA evaluation tracks using the method of systematic review. We have identified the different tasks or specific labs created in each QA track, the types of evaluation question used, and the evaluation measures used in the different competitions analyzed. Of the conferences, it is CLEF that has applied the greatest variety of types of test question (factoid, definition, list, causal, yes/no, amongst others). NTCIR, held on 13 occasions, is the conference which has made use of the greatest number of different evaluation measures. Accuracy, precision and recall have been the three most used evaluation measures in the three campaigns.
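As a minimal illustration (not taken from the surveyed campaigns themselves), the three most used measures named in the abstract can be computed as follows:

```python
def precision(relevant, retrieved):
    """Fraction of retrieved answers that are actually relevant."""
    return len(relevant & retrieved) / len(retrieved)

def recall(relevant, retrieved):
    """Fraction of relevant answers that were retrieved."""
    return len(relevant & retrieved) / len(relevant)

def accuracy(gold, predicted):
    """Fraction of questions answered correctly."""
    return sum(g == p for g, p in zip(gold, predicted)) / len(gold)
```

Here `relevant` and `retrieved` are sets of answer identifiers for one question, while `accuracy` compares gold and system answers across a question set.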

María-Dolores Olvera-Lobo, Juncal Gutiérrez-Artacho

Concurrency Detection on Finish-to-Start Activity Precedence Networks

We explore the finish-to-start precedence relations of project activities used in scheduling problems. From these relations, we devise a method to identify groups of activities that could execute concurrently, i.e. activities in the same group can all execute in parallel.

The method derives a new set of relations to describe the concurrency. These relations are then represented as an undirected graph, in which solving the maximal cliques problem identifies the groups.

We provide a running example with a project from our previous studies in resource-constrained project cost minimization, together with an example application of the concurrency detection method: the evaluation of the resource stress.
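A minimal sketch of the pipeline described above (the function and data-structure names are ours, not the authors'): derive the concurrency relation from finish-to-start precedences via a transitive closure, represent it as an undirected graph, and enumerate maximal cliques with a plain Bron-Kerbosch recursion:

```python
from itertools import combinations

def concurrency_graph(n_activities, precedence):
    """Two activities may run concurrently iff neither (transitively)
    precedes the other under the finish-to-start relations."""
    reach = {a: set(precedence.get(a, ())) for a in range(n_activities)}
    changed = True
    while changed:  # transitive closure by fixpoint iteration
        changed = False
        for a in reach:
            new = set().union(*[reach[b] for b in reach[a]])
            if not new <= reach[a]:
                reach[a] |= new
                changed = True
    return {frozenset((a, b)) for a, b in combinations(range(n_activities), 2)
            if b not in reach[a] and a not in reach[b]}

def maximal_cliques(nodes, edges):
    """Bron-Kerbosch enumeration of maximal cliques: each clique is a
    maximal group of mutually concurrent activities."""
    adj = {v: set() for v in nodes}
    for e in edges:
        a, b = tuple(e)
        adj[a].add(b)
        adj[b].add(a)
    cliques = []
    def bk(r, p, x):
        if not p and not x:
            cliques.append(r)
        for v in list(p):
            bk(r | {v}, p & adj[v], x & adj[v])
            p.remove(v)
            x.add(v)
    bk(set(), set(nodes), set())
    return cliques
```

For activities 0→1→3 and 2→3, the concurrency groups come out as {0, 2}, {1, 2} and the singleton {3}.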

Rui Moutinho, Anabela Tereso

Refactoring Rules for Graph Databases

The information generated nowadays is growing in volume and complexity, representing a technological challenge which demands more than the relational database model can currently offer. This situation stimulates the use of different forms of storage, such as Graph Databases. Current Graph Databases allow automatic database evolution, but do not provide adequate resources for organizing the information. This is mostly left to the applications which access the database, compromising data integrity and reliability. The goal of this work is the definition of refactoring rules to support the management of Graph Database evolution, adapting and extending existing refactoring rules for relational databases to meet the requirements of Graph Database features. These refactoring rules can be used by developers of graph database management tools to guarantee the integrity of database evolution operations.
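As a toy illustration of what such a rule might look like (the in-memory graph representation and the rule below are our own simplification, not the authors' rule catalogue), here is a "rename property" refactoring, the graph analogue of the relational "rename column" rule:

```python
def rename_property(nodes, label, old_key, new_key):
    """Rename a property on every node carrying a given label, preserving
    values so the schema evolution keeps data integrity.
    Hypothetical in-memory property-graph representation:
    each node is {"labels": set_of_labels, "props": dict_of_properties}."""
    for node in nodes:
        if label in node["labels"] and old_key in node["props"]:
            node["props"][new_key] = node["props"].pop(old_key)
    return nodes
```

A real tool would additionally rewrite queries and indexes that reference the old property name, which is exactly the kind of integrity concern the refactoring rules address.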

Adriane M. Fonseca, Luiz Camolesi

Knowledge Asset Management Pertinent to Information Systems Outsourcing

Organisations have over time realised that leveraging their already-accumulated knowledge assets is the most cost-effective way to increase their competitive standing and to harness innovation. In choosing to outsource their information systems (IS), they may unintentionally fragment their knowledge assets by missing critical learning opportunities, with a resulting loss of ensuing business gains. Organisations should manage knowledge exploitation effectively, especially in the context of IS outsourcing arrangements, where the planning, management and operation of all or a part of the IS function are handed over to an independent third party. There is, however, currently a lack of applied research to explain how knowledge asset dynamics happen in organisational value creation mechanisms, especially in the context of IS outsourcing. This paper analyses and describes knowledge asset management relevant in such an IS outsourcing arrangement. By understanding the requirements to manage knowledge assets, an organisation may optimise the relationships among critical knowledge assets as well as the knowledge sharing mechanisms required to meet knowledge demands in the context of IS outsourcing.

Hanlie Smuts, Paula Kotzé, Alta van der Merwe, Marianne Loock

Meta-model of Information Visualization Based on Treemap

The interpretation and understanding of large quantities of data is a challenge for current information visualization methods. The visualization of information is important as it makes the appropriate acquisition of the information possible. The choice of the most appropriate information visualization method, before commencing with the resolution of a given visual problem, is paramount to obtaining an efficient solution. This article aims to describe an information visualization classification approach based on Treemap, which is able to identify the best information visualization model for a given problem. This is achieved through the construction of an adequate information visualization meta-model. Firstly, the current state of the visualization field is described; then the rules and criteria used in our research are shown, with the aim of presenting a meta-model proposal based on treemap visualization methods. In addition, the authors present a case study with the information contained in the periodic table visualization meta-model, along with an analysis of the information search time complexity in each of the two meta-models. Finally, an evaluation of the results is presented through experiments conducted with users and a comparative analysis of the methods based on the Treemap and the Periodic Table.
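For context, a minimal sketch of the classic slice-and-dice treemap layout on which treemap-based methods build (a single flat level is shown; the function name and signature are illustrative, not from the paper):

```python
def slice_and_dice(weights, x, y, w, h, vertical=True):
    """Split the rectangle (x, y, w, h) into strips whose areas are
    proportional to the given weights; a hierarchical treemap applies
    this recursively, alternating the split direction per level."""
    total = sum(weights)
    rects = []
    offset = 0.0
    for wt in weights:
        frac = wt / total
        if vertical:  # side-by-side vertical strips
            rects.append((x + offset, y, w * frac, h))
            offset += w * frac
        else:  # stacked horizontal strips
            rects.append((x, y + offset, w, h * frac))
            offset += h * frac
    return rects
```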

Eduardo C. Oliveira, Luciene C. Oliveira, Alexandre Cardoso, Leandro Mattioli, Edgard A. Lamounier

Project Management: Evaluation of the Problems in the Portuguese Construction Industry

During recent decades it has been possible to identify several problems in construction industry project management, related to systematic failures in fulfilling schedule, cost and quality targets, which highlight the need for an evaluation of the factors that may cause these failures. Therefore, it is important to understand how project managers plan projects, so that performance and results can be improved. It is also important to understand whether areas beyond cost and time management, which several studies mention as the most critical areas, receive the necessary attention from construction project managers. Although cost and time are the most sensitive areas, there are several other factors that may lead to project failure. This study aims to understand the reasons that may cause deviations in terms of cost, time and quality, from the project management point of view, looking at the knowledge areas defined by the PMI (Project Management Institute).

Leonel Rocha, Anabela Tereso, João Pedro Couto

A Structural Prototype for Planning and Controlling a Manufacturing System

The purpose of the meta-model and solutions presented here is to structure the information and knowledge needed to implement a factory, as well as to control and manage it by means of computer systems.

This meta-model includes: an object-oriented model to structure the needed information and knowledge; a model for manufacturing systems; and a prototype that integrates the two aforementioned models with real-time control, in order to plan and manage a factory.

A first step of this work has been accomplished at the Ephestia kuehniella bio-factory, controlled by computer in the Laboratory of the Biology Department at the University of the Azores.

In this paper we present a meta-model for factory planning, control and management, with updated conceptual and technological options, preparing the way for a new industrial implementation of the above-mentioned bio-factory.

Paulo Enes Silveira

Towards a Multidimensional Information Retrieval

Information retrieval is a field that has been extensively covered by the research community in recent years, mainly because of the urgent need for information. Indeed, with the expansion and availability of the internet, users have become more exacting regarding the results they want in terms of quality and speed. However, although information retrieval systems are able to produce a response to user needs, they cannot guarantee precise answers. The reason is that existing information retrieval systems do not take the information into account in its various aspects; rather, they consider only one possible interpretation. Added to this is the ongoing growth of information, which makes interpretation tasks even more complex. To face these challenges we propose a multidimensional approach that aims to capture all aspects of the generated information according to different points of view, in order to improve the quality of responses and search time.

Hadia Mosteghanemi, Habiba Drias

A Comparative Study of Platforms for Research Data Management: Interoperability, Metadata Capabilities and Integration Potential

Research data management is acknowledged as an important concern for institutions and several platforms to support data deposits have emerged. In this paper we start by overviewing the current practices in the data management workflow and identifying the stakeholders in this process. We then compare four recently proposed data repository platforms—DSpace, CKAN, Zenodo and Figshare—considering their architecture, support for metadata, API completeness, as well as their search mechanisms and community acceptance. To evaluate these features, we take into consideration the identified stakeholders' requirements. In the end, we argue that, depending on local requirements, different data repositories can meet some of the stakeholders' requirements. Nevertheless, there is still room for improvement, mainly regarding compatibility with the description of data from different research domains, to further improve data reuse.

Ricardo Carvalho Amorim, João Aguiar Castro, João Rocha da Silva, Cristina Ribeiro

Automatic Upload of Professional Profiles Directly from Sources

Globalization of education and employment is a fact [1]. Also, according to an International Labor Organization report [2], 6% of the world's work force was without a job in 2012. Finally, a high development of information and communications technology (ICT) until 2020 is possible [3]. Why, then, should we go on using the conventional, complex, irksome, bureaucratic process of gathering our CV from often remote Sources, and sharing it with remote Recruiters who often do not trust it? Alternatively, it should be possible simply to supply our national identification to an


, and get our CV, expeditiously and directly from those Sources, through national platforms or otherwise; and to use it to apply to the Recruiting Entities, or on internet platforms (LinkedIn, Naukri, Times Jobs, Monster India, EURES, EUROPASS, the Guardian's, and so on). We propose a methodology, and an ICT system, to turn CV delivery into an expeditious and cheaper process.

Rui Barbosa, Domingos Martinho, Rui Neto

Prototype of Knowledge Management in Social Networks: Case LinkedIn

This paper describes how social networks have contributed to the evolution of companies. It explains how collaboration between people has driven this growth, and demonstrates the importance of extracting the tacit knowledge embedded in these relationships, taking into account the difficulties encountered in trying to codify such knowledge. To this end, a system model for the analysis of conversations, based on knowledge management, is proposed, aimed at extracting the relevant features of the information processed or transferred in order to generate knowledge for the users interacting in a network, taking the LinkedIn network as a case. A software prototype was implemented to test the model, providing an assessment of the key indicators of such management.

Elsa Paola Mora Holguín, Víctor Hugo Medina García

Strength Pareto Fitness Assignment for Generating Expansion Features

Owing to the increasing use of ambiguous and imprecise words in expressing the user’s information need, it has become necessary to expand the original query with additional terms that best capture the actual user intent. Selecting the appropriate words to be used as additional terms is mainly dependent on the degree of relatedness between a candidate expansion term and the query terms. In this paper, we propose two criteria to assess the degree of relatedness: (1) attribute more importance to terms occurring in the largest possible number of documents where the query keywords appear, (2) assign more importance to terms having a short distance with the query terms within documents. We employ the strength Pareto fitness assignment in order to satisfy both criteria simultaneously. Our computational experiments on OHSUMED test collection show that our approach significantly improves the retrieval performance compared to the baseline.
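A sketch of the idea under simplifying assumptions: each candidate term is scored by a pair (document co-occurrence count with the query, to be maximized; average within-document distance to the query terms, to be minimized), and an SPEA-style strength/raw-fitness computation ranks the candidates. The representation below is the standard strength Pareto scheme, not necessarily the authors' exact variant:

```python
def dominates(a, b):
    """a dominates b: no worse on both criteria, strictly better on one.
    Criterion 1 (co-occurrence count) is maximized; criterion 2 (distance)
    is minimized."""
    cooc_a, dist_a = a
    cooc_b, dist_b = b
    return (cooc_a >= cooc_b and dist_a <= dist_b) and \
           (cooc_a > cooc_b or dist_a < dist_b)

def strength_pareto_fitness(candidates):
    """SPEA-style fitness over {term: (cooc, dist)}: a term's strength is the
    number of terms it dominates; its raw fitness is the summed strength of
    its dominators (0 means non-dominated, i.e. best)."""
    terms = list(candidates)
    strength = {t: sum(dominates(candidates[t], candidates[u])
                       for u in terms if u != t) for t in terms}
    return {t: sum(strength[u] for u in terms
                   if u != t and dominates(candidates[u], candidates[t]))
            for t in terms}
```

Sorting terms by ascending fitness then yields the expansion terms that best satisfy both relatedness criteria simultaneously.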

Ilyes Khennak, Habiba Drias

Towards Reusing Data Cleaning Knowledge

The organizations' demand to integrate several disparate data sources, and an ever-increasing amount of data, is intensifying the occurrence of data quality problems. Currently, data cleaning approaches are tailored for data sources having different schemas but sharing the same data model (e.g. the relational model), and are highly dependent on a domain expert to specify data cleaning operations. This paper presents a novel and generic data cleaning methodology aiming to assist the domain expert during the specification of data cleaning operations by reusing knowledge previously expressed for other data sources, even if those sources have different data models and/or schemas. This is achieved by abstracting data source models and schemas to a level closer to the human one, and by using a vocabulary to describe the structure and semantics of data cleaning operations.

Ricardo Almeida, Paulo Maio, Paulo Oliveira, João Barroso

Big Data, Internet of Things and Cloud Convergence for E-Health Applications

Big data storage and processing are considered among the main applications of cloud computing systems. Furthermore, the development of the Internet of Things (IoT) paradigm has advanced the research on Machine to Machine (M2M) communications and enabled novel tele-monitoring architectures for E-Health applications. However, there is a need for converging current decentralized cloud systems, general software for processing big data, and IoT systems. The purpose of this paper is to analyze existing components and methods for integrating big data processing with cloud M2M systems based on Remote Telemetry Units (RTUs), and to propose a converged E-Health architecture built on Exalead CloudView, a search-based application. Finally, we discuss the main findings of the proposed implementation and future directions.

George Suciu, Victor Suciu, Simona Halunga, Octavian Fratu

User-Driven Methodology for Data Quality Assessment in the Context of Robbery Events

Situation assessment is a significant process for improving Situation Awareness (SAW), since SAW can be degraded if it is obtained by means of low-quality data, which undermines the decision-making process. The identification of situations in military systems is of utmost importance, since the information acquired is used to answer emergency calls, in which a poor understanding of situations, and of the further allocation of resources to answer them, impacts the victims involved. In order to assist the first level of SAW formation in the context of military situation assessment of robbery, the present work presents a methodology to support the development of military decision-making systems intended to represent and evaluate data quality. The methodology comprises the gathering of requirements for the robbery domain, the establishment of quantitative metrics for quality evaluation, such as completeness and timeliness, the development of an ontology to capture and represent the semantic knowledge, and the enforcement of evaluation functions according to scores set by means of the quality metrics. Finally, a case study on the response to robbery reports is addressed to illustrate the applicability of the methodology.
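The two metrics named above can be given simple quantitative forms. The sketch below uses the usual field-completeness ratio and a common "1 - age/volatility" timeliness formulation; the exact functions adopted by the authors may differ:

```python
def completeness(record, required_fields):
    """Share of required fields that are actually filled in a report."""
    filled = sum(1 for f in required_fields if record.get(f) not in (None, ""))
    return filled / len(required_fields)

def timeliness(age_seconds, volatility_seconds):
    """One common formulation: 1 - age/volatility, floored at 0, where
    volatility is how long the datum stays valid."""
    return max(0.0, 1.0 - age_seconds / volatility_seconds)
```

Both scores fall in [0, 1], so they can feed directly into evaluation functions that aggregate quality per robbery report.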

Jessica Souza, Leonardo Botega, José Eduardo Santarém Segundo, Claudia Berti

TIGER-Based Assessment of Nursing Informatics Competencies (TANIC)

All nurses must have a minimum set of competencies in nursing informatics (NI). Online, reliable, and valid tools for self-assessment of one's skill level on a set of NI competencies were lacking. The research team followed a systematic process to design, develop, and pilot test an online tool based on the TIGER Initiative competencies. The tool, known as TANIC, addresses 3 competency areas: basic computer, information literacy, and clinical information management. A pilot test was conducted by inviting participants in a nursing informatics online forum to become respondents. The content validity index (CVI) approach was used to estimate content validity. Values ranged from .9 to 1.0 with an average of .98, demonstrating acceptable content validity. Internal consistency reliability results were: basic computer (.948), information literacy (.980), and clinical information management (.944). TANIC was determined to be a reliable, valid, and usable online tool.
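For reference, the item- and scale-level CVI computations follow a standard pattern (illustrative code, not the authors' tooling): an item's CVI is the proportion of experts rating it relevant (3 or 4 on a 4-point scale), and the scale average is the mean of the item CVIs.

```python
def item_cvi(ratings):
    """I-CVI: proportion of expert ratings of 3 or 4 on a 4-point scale."""
    return sum(r >= 3 for r in ratings) / len(ratings)

def scale_cvi_avg(items):
    """S-CVI/Ave: mean of the item-level CVIs across all items."""
    cvis = [item_cvi(r) for r in items]
    return sum(cvis) / len(cvis)
```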

Kathleen Hunter, Dee McGonigle, Toni Hebda, Carolyn Sipes, Taryn Hill, Jean Lamblin

An Assessment of Chronic Kidney Diseases

Once kidney disease is detected, the presence or degree of kidney dysfunction and its progression are assessed, and the underlying syndrome may be diagnosed. Although the patient's history and physical examination may be useful, some key information is obtained from the evaluation of the Glomerular Filtration Rate and the analysis of the urinary sediment. On the one hand, Chronic Kidney Diseases (CKDs) denote abnormal kidney function and/or structure. On the other hand, there is evidence that treatment may avoid or delay the progression of CKDs, either by reducing or preventing the development of complications, or by reducing the risk of cardiovascular illnesses. Acute Renal Failure (ARF) can occur over hours to days, depending on the underlying mechanism of injury and the relative health of the individual. ARF is often reversible if it is recognized early and treated promptly. This motivates the present work, which aims at the development of an early diagnosis system to monitor the occurrence of the disease, and therefore to allow one to act proactively.
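For orientation only, one classical bedside estimate of the renal function mentioned above is the Cockcroft-Gault creatinine clearance (the abstract does not state which GFR estimate the system uses, so this is purely illustrative):

```python
def cockcroft_gault(age_years, weight_kg, serum_creatinine_mg_dl, female=False):
    """Cockcroft-Gault creatinine clearance estimate in mL/min, a common
    proxy for the Glomerular Filtration Rate."""
    crcl = ((140 - age_years) * weight_kg) / (72.0 * serum_creatinine_mg_dl)
    return crcl * 0.85 if female else crcl  # 0.85 correction for women
```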

José Neves, M. Rosário Martins, Henrique Vicente, João Neves, António Abelha, José Machado

A Guideline for Using Knowledge Management in Telemedicine Systems Dedicated for Diabetes Patients in Saudi Arabia

Research on Knowledge Management (KM) has grown rapidly in the health care sector recently. This research combines Knowledge Management and Telemedicine, focusing on how Knowledge Management can be used to enhance the efficiency and effectiveness of telemedicine. Particular attention is given to diabetes patients in Saudi Arabia. As a result, this study sets out guidelines for using Knowledge Management in telemedicine systems dedicated to diabetes patients in Saudi Arabia.

Abeer AlSanad, Nessrine Zemirli

Collaboration with AORN to Develop the Technology in the OR (TOR) Tool: How Does Technology Affect the Nature of Surgeries for PeriOperative Nurses – A Research Study

The Association of periOperative Registered Nurses (AORN) approached the Nursing Informatics faculty of a major university with the request to collaborate on the development of a new computer-based informatics competency tool as well as a survey instrument for determining utilization and relevancy of current perioperative technology. The goal of this collaboration was to develop a survey tool that can be used to assess the technology use of perioperative nurses and surgical outcomes.


A collaboration proposal by AORN and the Chamberlain Nursing Informatics group created a timeline for the research and tool development. Through meetings with AORN editors and SMEs, the research process and potential outcomes were revised, as was the research question: "How does technology affect the nature of surgeries for perioperative nurses?" The outcomes of this ongoing collaboration will be a reliable and valid survey tool, used annually to assess the utilization of perioperative technology, and three peer-reviewed articles published in the AORN Journal.

Carolyn Sipes, Dee McGonigle, Kathleen Hunter, Toni Hebda, Taryn Hill, Jean Lamblin

Nursing Informatics Competencies Assessment Level 3 and Level 4 (NICA L3/L4)

Advanced nursing informatics practitioners must have a minimum set of competencies in nursing informatics (NI). Online, reliable, and valid tools for self-assessment of one's skill level on a set of NI competencies were lacking. The research team followed a systematic process to design, develop, and pilot test an online tool through a synthesis of both seminal work and current literature, using a Delphi approach (Dalkey & Helmer, 1963). The tool, known as NICA L3/L4, addresses 3 advanced NI competency areas: computer skills, informatics knowledge, and informatics skills. A pilot test was conducted, following IRB approval, using a purposeful convenience sample from the NI community. The content validity index (CVI) approach was used to estimate content validity. Values ranged from .9 to 1.0 with an average of .98, demonstrating acceptable content validity. Internal consistency reliability results were: computer skills (.909), informatics knowledge (.982), and informatics skills (.992). NICA L3/L4 was determined to be a reliable, valid, and usable online tool.

Dee McGonigle, Kathleen Hunter, Toni Hebda, Carolyn Sipes, Taryn Hill, Jean Lamblin

A Comparison of Two Oversampling Techniques (SMOTE vs MTDF) for Handling Class Imbalance Problem: A Case Study of Customer Churn Prediction

Predicting customer behavior is of great importance for a project manager. Data-driven industries such as the telecommunications industry can take advantage of various data mining techniques to extract meaningful information regarding customers' future behavior. However, the prediction accuracy of these data mining techniques is significantly affected if the real-world data is highly imbalanced. In this study, we investigate and compare the predictive performance of two well-known oversampling techniques, the Synthetic Minority Oversampling Technique (SMOTE) and the Megatrend Diffusion Function (MTDF), and four different rule-generation algorithms (Exhaustive, Genetic, Covering, and LEM2) based on rough set classification, using publicly available data sets. Useful feature extraction can play a vital role not only in improving classification performance, but also in reducing computational cost and complexity by eliminating unnecessary features from the dataset. The Minimum Redundancy Maximum Relevance (mRMR) technique has therefore been used in this study for feature extraction, as it not only selects the best feature subset but also reduces the feature space. The results clearly demonstrate the predictive performance of both oversampling techniques and of the rule-generation algorithms, which will help decision makers and researchers to select the most suitable one.
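The SMOTE step can be sketched as follows (a minimal stdlib-only version with Euclidean neighbours; real implementations add per-class handling and feature scaling):

```python
import math
import random

def smote(minority, k=2, n_new=4, rng=None):
    """Minimal SMOTE sketch: each synthetic sample is interpolated at a
    random position on the segment between a random minority point and
    one of its k nearest minority neighbours."""
    rng = rng or random.Random(0)
    synthetic = []
    for _ in range(n_new):
        x = rng.choice(minority)
        neighbours = sorted((p for p in minority if p != x),
                            key=lambda p: math.dist(x, p))[:k]
        nb = rng.choice(neighbours)
        gap = rng.random()  # interpolation factor in [0, 1)
        synthetic.append(tuple(xi + gap * (ni - xi) for xi, ni in zip(x, nb)))
    return synthetic
```

Because every synthetic point lies on a segment between two minority points, the oversampled class stays inside the convex hull of the original minority samples.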

Adnan Amin, Faisal Rahim, Imtiaz Ali, Changez Khan, Sajid Anwar

A Supervised Approach to Arabic Text Summarization Using AdaBoost

In recent years, research in text summarization has become very active for many languages. Unfortunately, much less attention has been devoted to Arabic text summarization. This paper presents a Machine Learning-based approach to Arabic text summarization which uses AdaBoost. This technique is employed to predict whether a new sentence is likely to be included in the summary or not. In order to evaluate the approach, we used a corpus of Arabic articles. The approach was compared against other Machine Learning approaches, and the results obtained show that our AdaBoost-based approach outperforms the existing ones.
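To make the classification idea concrete, here is a minimal discrete-AdaBoost sketch over threshold stumps. The sentence features it would consume (position, length, overlap with the title, and so on) are hypothetical placeholders, not the paper's actual feature set:

```python
import math

def adaboost(X, y, stump_candidates, rounds=10):
    """Discrete AdaBoost over threshold stumps h(x) = s if x[f] > t else -s.
    X holds per-sentence feature vectors; y is +1 (in summary) / -1 (out)."""
    n = len(X)
    w = [1.0 / n] * n
    ensemble = []
    for _ in range(rounds):
        best = None
        for f, t, s in stump_candidates:  # pick the lowest weighted error
            preds = [s if X[i][f] > t else -s for i in range(n)]
            err = sum(w[i] for i in range(n) if preds[i] != y[i])
            if best is None or err < best[0]:
                best = (err, (f, t, s), preds)
        err, stump, preds = best
        err = min(max(err, 1e-10), 1 - 1e-10)  # avoid log(0)
        alpha = 0.5 * math.log((1 - err) / err)
        ensemble.append((alpha, stump))
        w = [w[i] * math.exp(-alpha * y[i] * preds[i]) for i in range(n)]
        z = sum(w)
        w = [wi / z for wi in w]  # re-normalize sample weights
    return ensemble

def predict(ensemble, x):
    score = sum(a * (s if x[f] > t else -s) for a, (f, t, s) in ensemble)
    return 1 if score >= 0 else -1
```

Sentences predicted +1 by the weighted ensemble are the ones selected for the summary.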

Riadh Belkebir, Ahmed Guessoum

Commercial Business Intelligence Suites Comparison

Business Intelligence (BI) can be a powerful ally for organizations in decision-making processes. BI is not yet a reality in some companies; in fact, it is even unknown in many of them. It is essential that the information and functionalities of BI suites are presented clearly to organizations: only a correct choice will allow data to result in gains and business success. In this paper we present a comparative analysis of the capabilities of some BI tools, in order to aid organizations in selecting the most suitable BI platform. We also test the use of collaborative technologies to integrate data from social networks, because this can be determinant in the design of future business strategies.

Joaquim Lapa, Jorge Bernardino, Ana Almeida

A Multi-agent Framework for Web Information Foraging: Application to MedlinePlus

A multi-agent architecture is proposed for an intelligent system capable of foraging information from the Web in an efficient and rapid way. We distinguish three important phases. The first one is a learning process for localizing the most relevant pages that might interest the user. The second consists of incremental learning, taking into account the changes the Web undergoes. Finally, the last step simulates the information foraging process once the authorities have been learned. Experiments were conducted on MedlinePlus, a public site dedicated to research in the health domain. The results are promising, both for those related to Web regularities and for surfing time.

Yassine Drias, Samir Kechid

The Length of Hospital Stay in Acute Myocardial Infarction: A Predictive Model with Laboratory and Administrative Data

Introduction: The length of hospital stay (LOS) is an important measure of efficiency in the use of hospital resources. Acute Myocardial Infarction (AMI), one of the diseases with higher mortality and LOS variability in the OECD countries, has been studied predominantly using administrative data (AD). This paper presents preliminary results of a predictive model for LOS using both AD and laboratory data (LD), in order to develop a decision support system. Methods: LD and AD of a Portuguese hospital were included, using logistic regression to develop a predictive model for extended LOS. Three individual examples are presented to show how the model works. Results: A model with three LD and seven AD variables, with excellent discriminative ability and good calibration, was obtained. Age >= 69, the presence of comorbidities and abnormal LD predict a higher probability of extended LOS.
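In logistic-regression terms, the fitted model turns a patient's indicators into a probability of extended stay. The coefficients below are illustrative placeholders, not the paper's estimates:

```python
import math

def extended_los_probability(coefs, intercept, features):
    """Logistic model sketch: probability of an extended length of stay
    from binary indicators (e.g. age >= 69, comorbidity present,
    abnormal laboratory result -- hypothetical variable names)."""
    z = intercept + sum(coefs[name] * x for name, x in features.items())
    return 1.0 / (1.0 + math.exp(-z))  # sigmoid of the linear predictor
```

With positive coefficients on these indicators, the predicted probability rises as risk factors accumulate, matching the reported result.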

Teresa Magalhães, Sílvia Lopes, João Gomes, Filipe Seixo

Managing Academic Profiles on Scientific Social Networks

This article investigates the way in which communication science researchers use and manage academic social network sites. A representative sample of 244 Spanish scholars was selected from the Spanish Association of Communication Researchers, which consists of more than 600 members. This article explores the presence of Spanish communication scholars in two academic networks, ResearchGate and Academia.edu. Furthermore, it examines the way in which researchers are using these platforms of scientific communication, managing their academic profiles, and searching and interacting through these networks. The purpose is to assess the introduction, use, management, popularity, influence, and impact of these social academic networks. An additional aim is to analyze the new metrics generated by the media and digital social networks, studied as systems for disseminating science and managing communication and interaction among scholars.

Francisco Campos, Andrea Valencia

The Influence of the Use of Mobile Devices and Cloud Computing in Organizations

In the last decade, companies have tended to become virtualized. This virtualization is due in large part to developments in the use of mobile platforms and their impact on infrastructure. The adoption of the BYOD (Bring Your Own Device) concept and the use of cloud computing are having profound implications for how these technologies are being used, as well as for the interaction between individuals and these technologies. In this context, we conducted a questionnaire-based study on the impact of the use of mobile devices and the cloud in organizations, taking into consideration their use in performing professional and personal tasks. From the questionnaire and its subsequent analysis, it was found that organizations lack understanding of almost all of the implications and consequences of the use of mobile devices and the cloud.

Fernando Moreira, Manuel Pérez Cota, Ramiro Gonçalves

QUALITUS: An Integrated Information Architecture for the Quality Management System of Hospitals

In the present research work we planned the Information Architecture (IA) for the Quality Management System (QMS) of a hospital, which allowed us to develop and implement the QUALITUS computer application. This solution translated into significant gains for the hospital unit under study, accelerating the quality management process and reducing the tasks, the number of documents, the information to be filled in, and information errors, amongst others.

Jorge Freixo, Álvaro Rocha

Digital Preservation and Criminal Investigation: A Pending Subject

Digital preservation is considered a vital process for keeping information available in the long term; it has a short history but a fairly coherent and well-justified foundation. In this article, a comparison of digital preservation models and standards is developed, together with a review of the current situation of digital preservation and an analysis of the most outstanding projects that can serve as a basis for applying an existing preservation model in a specific setting, or for proposing a new one.

The existing models are compared with respect to their application in the specific environment of criminal investigation institutions, determining their applicability to the relevant aspects required in this environment.

It is important to bear in mind that for every application domain and its reality, there is a very high probability of requiring a particular model which can be based on the existing ones.

Fernando Molina Granja, Glen Rodríguez

Cyber Resilience – Fundamentals for a Definition

This short paper examines the concept of cyber resilience from an organizational perspective. Cyber resilience is defined as “the ability to continuously deliver the intended outcome despite adverse cyber events”, and this definition is systematically described and justified. The fundamental building blocks of cyber resilience are identified and analyzed through the contrasting of cyber resilience against cybersecurity with regards to five central characteristics.

Fredrik Björck, Martin Henkel, Janis Stirna, Jelena Zdravkovic

The Project of Electronic Monitoring of Insolvency Proceeding Results in the Czech Republic: Utilization of IT Reduces Information Asymmetry among Economic Subjects

The study provides information regarding the results of comprehensive statistical surveys of insolvency proceedings and especially the real results thereof. These surveys have been conducted in the Czech Republic over the last three years. The study first shows the significance of such statistics and the possibility of further analyses of the data gained. It then devotes attention to the gradual implementation of a unique project of systematic acquisition of information on insolvency proceedings and about the mechanisms by which this information will be made public in the near future. As a result, it will in essence be possible for anyone to compile, without any expense, data files online regarding results of those proceedings that are of interest to them from the perspective of their decision-making processes. This will be an entirely unique system which does not yet exist in any developed country. Its main contribution will be the reduction of the information asymmetry which exists in every national economy, where debtors and, among creditors, system creditors (especially banks) have a primary advantage in terms of information. Implementation of a freely accessible database, however, will enable all subjects without difference to acquire better documentation for decision-making processes.

Luboš Smrčka

Influence of Users’ Privacy Risks Literacy on the Intention to Install a Mobile Application

While users worldwide are increasingly embracing smartphones, these devices, which store our most personal information, are repeatedly found to be leaking users’ data. Despite various attempts to inform users of these privacy issues, they continue to adopt risky behaviors. In order to better protect users of these devices, we need to understand the antecedents of the decision to install potentially privacy-threatening applications on them. We ran a large experiment to determine the influence of privacy risks literacy on the intent to install a potentially privacy-threatening application on a smartphone. Using partial least squares (PLS), we found that it is coping behavior literacy, more than privacy risks literacy, that influences the user’s decision to install a mobile application. These findings are relevant for determining a course of action to increase users’ privacy protection when using such devices.

Alessio De Santo, Cédric Gaspoz

Types of Linkages between Business Processes and Regulations

One of the essential issues in compliance management is to ensure that business processes function according to the regulations relevant in a particular business context. Intuitively, this requires establishing explicit relationships, or linkages, between regulations and business process model elements. The analysis of compliance handling approaches shows that this linkage is currently considered mainly indirectly rather than directly. This suggests that, for more efficient compliance management, there is a need to extend existing approaches, or to design new ones, that can effectively handle direct linkages between business processes and regulations.

Andrejs Gaidukovs, Marite Kirikova

Evolution of Methodological Proposals for the Development of Enterprise Architecture

An Enterprise is any collection of organizations that has a common set of goals and/or a single bottom line. Enterprise Architecture is constantly advancing: there is always something new to be invented or old techniques to be improved. Many tools have been created for enterprises to use; there are frameworks, which help maintain and manage the architecture, and methodologies, which are used to develop it. The main subject of this article is the evolution of these methods throughout the years. Four methods were selected for this survey: the Enterprise Unified Process (EUP), the UN/CEFACT Modelling Methodology (UMM), the TOGAF Architecture Development Method (ADM) and, finally, the Federal Enterprise Architecture Framework (FEAF). After the analysis of the chosen models, a brief comparison between them is presented.

Adriana Ferrugento, Álvaro Rocha

Steps towards a Homemade Business Intelligence Solution – A Case Study in the Textile Industry

Nowadays, organizations, mainly private companies, are increasingly looking to invest in business intelligence solutions in order to gain advantage over their competitors; however, they often do not know what is necessary. Business intelligence allows the analysis of consolidated information in order to obtain more specific insights and indicators to support the decision-making process. The right decision can then be taken based on data collected from the different information systems present inside and outside the organization. The textile sector is one where the concept of Business Intelligence has not been much explored yet; in fact, few textile companies have a BI platform. Thus, the objective of this article is to present an architecture and show all the steps companies need to take to implement a successful free homemade Business Intelligence system. As a result, the proposed approach was validated using real data in order to assess the steps defined.

Sandrina Carvalho, Filipe Portela, Manuel Filipe Santos, António Abelha, José Machado

A RESTful Approach for Classifying Pollutants in Water Using Neural Networks

E-nose systems are composed of different types of sensors with the capacity to detect almost any compound, or combination of compounds, in an odor. In this regard, software approaches can provide learning capabilities for classifying odor information. The aim of this paper is to propose a novel approach for the on-line classification of pollutants in water. To extract information about different types of pollutants, a hand-held, lightweight and battery-powered instrument with wireless communications has been designed. Artificial Neural Networks (ANNs) are proposed to classify different types of pollutants in water, and two different types of ANNs have been developed: feedforward networks with a backpropagation learning algorithm, and Radial Basis Function based networks. Furthermore, to support online classification services, a novel framework for developing high-performance web applications is also proposed. Two different approaches have been integrated in this framework: a component-based approach is applied to increase the degree of reusability and modularity, while RESTful web services provide the architectural style to connect with remote resources. Following this proposal, a web-based application has been developed to detect pollutants in water.

José Luis Herrero, Jesús Lozano, José Pedro Santos

Decentralized Human Resource Management in Public Service – The Italian National Research Council Approach

The Italian National Research Council is a public company with multiple locations and many decentralized human resources departments. With decentralized human resources management, each location controls its own individual personnel issues such as attendances and leaves. One of the goals of decentralized human resources departments is to give autonomy to different locations, allowing them to adapt to the specific scientific climate of the individual site. However, the actual result of this form of human resources management is often disorganization, failure to follow unified standards or an ineffective application of those standards.

This article describes a distributed and federated approach to Human Resource Management that is being adopted by the Italian National Research Council (CNR). In particular, it describes the time and attendance software implemented by the Institute of Informatics and Telematics (IIT). The aim of the article is to describe the architecture of the system, the design and implementation choices adopted as part of the project, and the main software components and the ways in which they interact and communicate.

Marco Andreini, Cristian Lucchesi, Alessandro Martelli, Maurizio Martinelli, Dario Tagliaferri

An Integrated Service Solution for Digital Discount Coupons Processing

This paper describes a digital coupon management system that aims to dematerialize the couponing process from its origin (the Issuer) to its final state (the Consumer), passing through the Retailer of the product, making it possible to reduce costs, prevent fraud, improve procedural efficiency and gain knowledge about patterns of coupon usage by consumer profiles. The system provides quality control of dematerialized coupons and mobile access for Consumers, allowing connection and integration with the information systems of Issuers and Retailers, which makes it an innovative system with a high degree of portability of its components.

Jorge Araújo, Gonçalo Paiva Dias, Hélder Gomes, Jorge Gonçalves, Daniel Magueta, Fábio Marques, Ciro Martins, Mário Rodrigues

Storing Archival Emigration Documents to Create Virtual Exhibition Rooms

This paper is concerned with an information system built in the context of a municipal archive, aiming at supporting the preservation of physical documents and at facilitating information extraction and dissemination. Saving individual records, such as emigration documentation, is important for end-users and for History or Social Sciences researchers, as they can learn more about each individual and about the society. The emergence of the Internet gives anyone access to the desired information anywhere and at any time. As a result, many information sources, such as libraries, museums and other similar institutions, began storing documents digitally. The information stored in digital format may then be presented on Web pages, enabling end-users to learn from and interact with the available information. This paper presents the resulting web-based information system, built to assist in the retrieval of the emigrants’ documents.

Ricardo Martini, Mónica Guimarães, Giovani Librelotto, Pedro Henriques

Agent-Oriented Based Enterprise Architecture Implementation Methodology

Enterprise Architecture (EA) is managed, developed, and maintained through an Enterprise Architecture Implementation Methodology (EAIM). Existing EAIMs are often ineffective because of complexities arising from their processes, models, methods, and strategies. Consequently, EA projects may face a lack of support in the following parts of EA: requirements analysis, governance and evaluation, guidelines for implementation, and continual improvement of EA implementation. This research presents an Agent-Oriented EAIM, evaluated by means of a case study. The results show that the proposed EAIM can directly improve the effectiveness of EA implementation in the following respects: reducing the mismatch between business and IT, defining reachable goals for the enterprise, employing easy implementation practices and an easy learning procedure, using efficient documentation, supporting appropriate communication among project team members, providing an effective environment for the alignment of business and IT, and using an effective plan for governance and migration. This research extends the application of Agent Technology, opening a new area of research for academics, and provides an effective EAIM that can be employed by practitioners in EA projects.

Babak Darvish Rouhani, Mohd Nazri Mahrin, Fatemeh Nikpay, Pourya Nikfard, Bita Darvish Rouhani

The Future of Mind Reading Computer

Many findings in both the psychology and affective computing areas have reshaped the scientific understanding of human mental states. Mental states have been found to be one of the key factors that contribute to the quality of HCI. Currently, a number of experiments in the affective computing and HCI areas are being conducted; the studies mainly focus on the development of a novel computer system called the mind reading computer. The mind reading computer can genuinely read people’s minds and react to different emotions or commands. Researchers believe that, in the future, the mind reading computer can be applied to many disciplines in the real world and can improve the quality of HCI. This exploratory research provides comprehensive insights into the background, models and implementation of this system. To accomplish that, the research is based on secondary data sources and mainly focuses on its effect on individuals as the main end-users.

Sofia Marwah Zainal

Organizational Models and Information Systems


Maturity, Benefits and Project Management Shaping Project Success

Organisations are under constant pressure. Externally, they face a scenario of intense competition, coupled with a changing environment which is full of uncertainty. Internally, organisations have to deal with limited resources, whilst at the same time comply with increasing requirements and strategic demands. A key to success is the successful management of organisational projects. According to worldwide studies, information systems and information technology (IS/IT) projects have a relatively low success rate. To face these various business challenges, the authors suggest that emphasis should be put on the integration of various and disperse management tools. By combining project management maturity models with benefits management approaches, we expect to reinforce support for the drive to use organisational projects to fulfill organisations’ strategic plans that will enhance the control techniques of project management, whilst recognising the need for organisational change and for ensuring the interpersonal skills necessary to orchestrate the successful completion of a project.

Jorge Gomes, Mário Romão

‘Bricolage’ in the Implementation and the Use of IS by Micro-firms: An Empirical Study

The purpose of this research is to explore the implementation and use of IS (information systems) by micro-firms. Micro-firms are characterized by simple IS and low formalization. Recent research shows that bricolage occurs in small business IS. Bricolage as a theoretical framework provides a new perspective for the small business and IS literature. In this research, we focus on micro-businesses (fewer than five employees) and explore whether and how bricolage occurs in the implementation and use of IS. To do so, we conducted 64 semi-structured interviews with French owner-managers. Results show distinct patterns in both the implementation process and the use of IS.

Annabelle Jaouen, Walid A. Nakara

Implementation of Information Systems Security Policies: A Survey in Small and Medium Sized Enterprises

Information has become organizations’ most valuable asset, thus being a potential target to threats intending to explore their vulnerabilities and cause considerable damage. Therefore, there is a need to implement policies regarding information systems security (ISS) in an attempt to reduce the chances of fraud or information loss. Thus, it is important to find the critical success factors to the implementation of a security policy as well as to assess the level of importance of each one of them. This paper contributes to the identification of such factors by presenting the results of a survey regarding information systems security policies in small and medium sized enterprises (SME). We discuss the results in the light of a literature framework and identify future works aiming to enhance information security in organizations.

Isabel Lopes, Pedro Oliveira

OPM3® Portugal Project – Information Systems and Technologies Organizations – Outcome Analysis

Increasing the maturity in Project Management (PM) has become a goal for many organizations, leading them to adopt maturity models to assess the current state of their PM practices and compare them with the best practices in the industry in which the organization operates. One of the main PM maturity models is the Organizational Project Management Maturity Model (OPM3®), developed by the Project Management Institute. This paper presents the outcome analysis for Information Systems and Technologies organizations, based on the assessments made by the OPM3® Portugal Project, identifying the PM processes that are “best” implemented in this particular industry and those in which improvement is most urgent. Additionally, a comparison between the different organization sizes analyzed is presented.

Keywords: Project Management, OPM3®, Maturity Models, Best Practices, Processes

David Silva, Anabela Tereso, Gabriela Fernandes, Isabel Loureiro, José Ângelo Pinto

Three New Dimensions to People, Process, Technology Improvement Model

People Process Technology is a holistic model for process improvement known and used across industries. Introduced on a large scale about thirty years ago, the model is still used as is, without major changes. There have been several attempts to update the triangle to the new realities of the market, with rather limited success. Through this article, the authors aim to bring additional dimensions to the model, ensuring stronger business results through an increased focus on customer requirements and innovation.

Mircea Prodan, Adriana Prodan, Anca Alecandra Purcarea

The Associating Factors and Outcomes of Green Supply Chain Management Implementation – From the Technological and Non-technological Perspectives

The ISO 14001 manufacturing firms in Malaysia contribute to environmental sustainability problems as a result of the significant amounts of energy consumed and waste generated. ISO 14001 is essentially a certification of best practices adopted for environmental management. Despite being certified with environment-related accreditation, the implementation of green practices in the supply chain among these firms is inadequate. Given these challenges, this study investigates Green Supply Chain Management (Green SCM) in relation to eco-friendly information technology solutions. The study establishes the driving factors and performance outcomes associated with the implementation of Green SCM. The significant relationships between the constructs explain the inputs, which are represented with a TOE framework. The measurement of the theoretical framework shows that Green SCM positively influences both the environmental and technological performance outcomes of the respective firms. As a practical implication, the study contributes a guideline for firms intending to adopt green practices.

Savita K. Sugathan, Dhanapal Durai Dominic, T. Ramayah, Kalai Anand Ratnam

Letting Organizations Find the Correct Way to Start the Implementation of Software Process Improvements

Software Development Small and Medium Enterprises (SMEs) are an important part of the worldwide software development industry. In this context it is important to guarantee the quality of their processes to help them obtain quality products. Unfortunately, implementing improvements in SMEs often becomes a path so full of obstacles that it is almost impossible to complete, most of the time because of a lack of knowledge of where to start. To provide a solution, this paper presents a framework that allows organizations to analyze their needs, features and the way they work, so that it is possible to identify their main problems and to provide a guide for starting an improvement.

Mirna Muñoz, Jezreel Mejia

Applying Action Research in the Formulation of Information Security Policies

Information Systems Security (ISS) is crucial in each and every one of the services provided by organizations. This paper focuses on Small and Medium Sized Enterprises (SMEs): although all organizations have their own requirements as far as information security is concerned, SMEs offer one of the most interesting cases for studying the issue of information security policies. Within the organizational universe, SMEs assume a unique relevance due to their high number, which makes information security efficiency a crucial issue. There are several measures which can be implemented in order to ensure the effective protection of information assets, among which the adoption of ISS policies stands out. This article constitutes an empirical study on the applicability of the Action Research (AR) method in information systems, more specifically through the formulation of an ISS policy in SMEs. The research question is to what extent this research method is adequate to reach the proposed goal. The results of the study suggest that AR is a promising means for the formulation of ISS policies.

Isabel Lopes, Pedro Oliveira

Sorting with One-Dimensional Cellular Automata Using Odd-Even Transposition

Cellular automata are discrete, complex and fully distributed computational systems with arbitrarily simple local processing. One of the computations that can be performed with cellular automata is the sorting of numerical sequences. Sorting is a relevant topic and occurs widely as a fundamental computing problem. This paper presents the development of improved sorting algorithms based upon one-dimensional cellular automata. Two new versions, with radii of 1 and 3, are proposed, inspired by the characteristics of the existing methods. These algorithms are presented, and their advantages and performance with respect to the existing approaches are discussed.
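For orientation, the classic odd-even transposition scheme underlying such approaches can be sketched as follows (a generic sketch of the textbook algorithm, not the paper's radius-1 or radius-3 rules):

```python
def odd_even_transposition_sort(cells):
    """Sort a sequence by alternating compare-exchange passes over
    even-indexed and odd-indexed neighbour pairs. Each cell only ever
    inspects its immediate neighbours, which is what makes the scheme
    realizable by a one-dimensional cellular automaton of radius 1."""
    cells = list(cells)
    n = len(cells)
    for step in range(n):
        # even phase on even steps, odd phase on odd steps
        for i in range(step % 2, n - 1, 2):
            if cells[i] > cells[i + 1]:
                cells[i], cells[i + 1] = cells[i + 1], cells[i]
    return cells
```

After at most n phases the sequence is sorted, since each value can move at most one position per phase.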

Carlos E. P. de Carvalho, Pedro P. B. de Oliveira

Bayesian and Graph Theory Approaches to Develop Strategic Early Warning Systems for the Milk Market

This paper presents frameworks for developing a Strategic Early Warning System that allows the estimation of the future state of the milk market. This research is thus in line with the recent call from the EU Commission for tools which help to better address such a highly volatile market. We applied different multivariate time series regressions and Bayesian networks to a pre-determined map of relations between macroeconomic indicators. The evaluation of our findings with the root mean square error (RMSE) performance score supports the robustness of the constructed prediction model. Finally, we construct a graph to represent the major factors that affect the milk industry and their relationships, and use graph-theoretical analysis to derive several network measures for this network, such as centrality and density.
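As a reminder, the RMSE score used for such evaluations is simply the square root of the mean squared residual (a generic sketch of the standard formula, not the authors' code):

```python
import math

def rmse(predictions, observations):
    """Root mean square error between two equally long sequences:
    sqrt(mean of squared prediction-observation residuals)."""
    if len(predictions) != len(observations):
        raise ValueError("sequences must have the same length")
    squared_residuals = ((p - o) ** 2 for p, o in zip(predictions, observations))
    return math.sqrt(sum(squared_residuals) / len(predictions))
```

A lower RMSE indicates forecasts that lie closer to the observed series, in the same units as the data.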

Furkan Gürpınar, Christophe Bisson, Öznur Yaşar Diner

Supporting Workflow and Adaptive Case Management with Language Technologies

Public organizations handle many requests from citizens; some are routine, while others are more complex. Workflow management (WfM) systems and Adaptive Case Management (ACM) systems may be used to support the handling of requests. However, in order to be more efficient, both WfM and ACM systems may be augmented with other technologies. In this paper, we examine how language technologies can support WfM and ACM in the public sector. Based on a set of case studies from Swedish public organizations, we have identified a set of language technology use cases. Each use case describes a potential application of language technologies in public organizations, and how these technologies can support both WfM and ACM approaches in these organizations.

Martin Henkel, Erik Perjons, Eriks Sneiders

Decision Support System for the Agri-food Sector – The Sousacamp Group Case

The evolution of agri-food organizations, and their inherent need for well-supported decision making, has led to increased integration of IT/IS into their business and operational processes. With this in mind, an analysis of the use of systems designed to support business decisions was performed, which revealed that none answered the needs inherent to organizations from the mushroom production sector. This led to the definition and implementation of a DSS with the necessary features and qualities to address the needs of this activity sector. The DSS was designed and tested in a real environment through a case study within an edible mushroom production company, the “Sousacamp Group”. With the implementation of the proposed DSS, this business group has reported an increase in performance levels and decision-making accuracy.

Frederico Branco, Ramiro Gonçalves, José Martins, Manuel Pérez Cota

Advanced System for Garden Irrigation Management

Water is a limited resource, so it is crucial to promote water efficiency strategies that help decrease the amount of water wasted. This study describes a system (iRain), developed by the authors, which aims to ensure autonomous and efficient irrigation in urban gardens. It consists of a network of sensors and actuators with ZigBee communication, together with control software that stores the monitoring data and makes it available on a web portal in real time. In manual mode, irrigation management is the responsibility of the irrigation manager, who controls the system from the portal. In automatic mode, the system prevents watering if precipitation is forecast (up to 3 days ahead) and if the humidity of the soil has not fallen below a critical value; in this mode the decision is made at the local station. The system also gives preference to irrigation at night and during periods of light wind in order to obtain more accurate and efficient irrigation.
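The automatic-mode decision described above can be sketched as a simple predicate (a hypothetical simplification for illustration; the function and parameter names are ours, not the iRain implementation, and the night/wind preference is omitted):

```python
def should_irrigate(soil_humidity, critical_humidity, rain_expected_within_3_days):
    """Simplified automatic-mode rule: water only when the soil humidity
    has fallen below the critical value and no precipitation is forecast
    within the next three days."""
    soil_is_dry = soil_humidity < critical_humidity
    return soil_is_dry and not rain_expected_within_3_days
```

In the real system this decision would be evaluated at the local station, using the soil sensor readings and the downloaded precipitation forecast.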

Filipe Caetano, Rui Pitarma, Pedro Reis

Media Brokerage: Agent-Based SLA Negotiation

Media content personalisation is a major challenge involving viewers as well as media content producer and distributor businesses. The goal is to provide viewers with media items aligned with their interests. Producers and distributors engage in item negotiations to establish the corresponding service level agreements (SLA). In order to address automated partner lookup and item SLA negotiation, this paper proposes the MultiMedia Brokerage (MMB) platform, a multiagent system that negotiates SLA regarding media items on behalf of media content producer and distributor businesses. The MMB platform is structured in four service layers: interface, agreement management, business modelling and market. In this context, there are: (i) brokerage SLA (bSLA), which are established between individual businesses and the platform regarding the provision of brokerage services; and (ii) item SLA (iSLA), which are established between producer and distributor businesses regarding the provision of media items. In particular, this paper describes the negotiation, establishment and enforcement of bSLA and iSLA, which occur at the agreement and negotiation layers, respectively. The platform adopts a pay-per-use business model in which the bSLA define the general conditions that apply to the related iSLA. To illustrate this process, we present a case study describing the negotiation of a bSLA instance and several related iSLA instances; the latter correspond to the negotiation of the Electronic Program Guide (EPG) for a specific end viewer.

Bruno Veloso, Benedita Malheiro, Juan C. Burguillo

An Information System for Management and Optimization of Materials in the Wood Industry

Considering the new paradigm in the industry, where clients request small amounts of a growing range of products, the times of crisis and the fierce competition for business survival, it is crucial for organizations to be able to reduce the waste of raw materials. This waste can be minimized by introducing decision support systems into the process. In this paper we propose an information system that enables the optimization of material purchases and consumption, promoting waste reduction and boosting profitability.

Jaime Mendes, Óscar Oliveira, Carla Sofia Pereira, Pedro Fernandes

Aligning Business Requirements with Services Quality Characteristics by Using Logical Architectures

Derivation of logical architectures has been largely focused on the elicitation of functional requirements, disregarding the non-functional ones. Consequently, relevant business requirements content is not reflected in the information system architectural solution, so lowering its quality. Although research has recently been approaching this issue, much is left to do, especially regarding the alignment of business requirements with the logical architecture components. Following our proposed metamodel for relating processes, goals and rules (PGR), elicited from business requirements, and the 4SRS-SoaML method for the derivation of a logical architecture in SOA environments, we now aim to extend our work by generating the quality information associated to architectural services from business requirements. By extending our PGR metamodel to include the architectural services and associated quality characteristics, we hope to contribute to the improved alignment and traceability between the use cases problem-set and the logical architecture’s components solution.

Carlos E. Salgado, Ricardo J. Machado, Rita S. P. Maciel

Maximize System Reliability for Long Lasting and Continuous Applications

In this paper, we use software rejuvenation as a preventive and proactive fault-tolerance technique to maximize the level of reliability for continuous and safety-critical systems. We take into consideration both transient faults caused by software aging effects and network transmission faults, and mathematically analyze the optimal software rejuvenation period that maximizes the system’s reliability. The theoretical result is verified through empirical studies.

Chunhui Guo, Hao Wu, Xiayu Hua, Shangping Ren, Jerzy M. Nogiec

Am I Really Who I Claim to Be?

Faced with both identity theft and the theft of means of authentication, users of digital services are starting to look rather suspiciously at online systems. The behavior is made up of a series of observable actions of an Internet user and, taken as a whole, the most frequent of these actions amount to habit. Habit and reputation offer ways of recognizing the user. The introduction of an implicit means of authentication based upon the users’ behavior allows websites and businesses to rationalize the risks they take when authorizing access to critical functionalities. This rationalization brings with it user trust as a result of increased security.

Olivier Coupelon, Diyé Dia, Yannick Loiseau, Olivier Raynaud

Intelligent and Decision Support Systems


An Overview of Decision Support Benchmarks: TPC-DS, TPC-H and SSB

Database management systems form the core of all business intelligence solutions, with information being the key to success in today’s businesses. Performance evaluation of database systems is a non-trivial activity, due to different architectures and functionalities tuned for specific requirements. In this paper we provide an overview and analyze the main properties of three online analytical processing benchmarks, TPC-DS, TPC-H and SSB, used to validate the performance of Decision Support Systems.

Melyssa Barata, Jorge Bernardino, Pedro Furtado

The Yantai Urban Economic Development Assessment Simulator

Economic development simulation is an important component of policy crafting. The possibility of testing potential solutions and selecting the most applicable is the basis of successful development. The article deals with the Yantai Urban Economic Development Assessment simulator, which is based on a multi-agent-based (ABM/MAS) simulation model that incorporates a resource-consumption function minimization algorithm. The aim of the simulation is to rank industries according to resource consumption, influence on the environment and income paid into the budget of the city. The simulator allows the administration of the city to make justified decisions about the promotion of successful enterprises and the closing of unsatisfactory ones.

Egils Ginters, Artis Aizstrauts, Inita Sakne, Miquel Angel Piera Eroles, Roman Buil, Boyong Wang

Multi Criteria Decision Support System: Preference Information and Robustness

Robustness analysis is a substantial and debatable issue in multi-criteria decision support systems. Such systems always depend on preference information from a decision maker. Although it is evident that elicited preference information has a great impact on the result, research on preference robustness in multi-criteria decision support systems is rather narrow, in contrast to the well-established research on robust decision making. This paper focuses on multi-criteria decision support systems based on multiple criteria sorting methods, and proposes an analysis approach based on the concept of preference robustness. Incorporated into a multi-criteria decision support system, the approach can provide valuable insight to a decision maker on how preferences influence the choice among alternatives. Moreover, the approach can be combined with other decision aiding methods as an additional technique for deriving a consensus when there are multiple preferences.

Maria Kalinina

An Automated System for Offline Signature Verification and Identification Using Delaunay Triangulation

Offline signatures are used in a wide range of applications. Problems arise when a fraudster replicates a signature and assumes another person’s identity. Signature verification has been identified as a main contender in the search for a secure personal verification system. An offline signature system is based on the scanned image of the signature. The proposed methodology consists of a signature database, preprocessing, feature extraction and recognition, and functions based on the Delaunay triangulation of a given signature sample. The false acceptance rate (FAR) and false rejection rate (FRR) in the proposed method are relatively low: signatures of genuine signers are accepted while forged signatures are rejected.
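
The core idea of triangulation-based matching can be sketched as follows (a hedged illustration, not the authors’ method: the naive empty-circumcircle Delaunay construction, the area-based descriptor, and the sample point sets are all invented for demonstration; a real system would extract feature points from the scanned signature image).

```python
import itertools, math

def circumcircle(a, b, c):
    # Circumcenter and radius of the triangle (a, b, c).
    ax, ay = a; bx, by = b; cx, cy = c
    d = 2 * (ax * (by - cy) + bx * (cy - ay) + cx * (ay - by))
    if abs(d) < 1e-12:
        return None  # collinear triple
    ux = ((ax**2 + ay**2) * (by - cy) + (bx**2 + by**2) * (cy - ay)
          + (cx**2 + cy**2) * (ay - by)) / d
    uy = ((ax**2 + ay**2) * (cx - bx) + (bx**2 + by**2) * (ax - cx)
          + (cx**2 + cy**2) * (bx - ax)) / d
    return (ux, uy), math.hypot(ax - ux, ay - uy)

def delaunay_triangles(points):
    # Naive Delaunay: a triple is kept iff its circumcircle is empty.
    tris = []
    for a, b, c in itertools.combinations(points, 3):
        cc = circumcircle(a, b, c)
        if cc is None:
            continue
        (ux, uy), r = cc
        if all(math.hypot(px - ux, py - uy) >= r - 1e-9
               for (px, py) in points if (px, py) not in (a, b, c)):
            tris.append((a, b, c))
    return tris

def descriptor(points):
    # Scale-invariant signature: normalized, sorted triangle areas.
    def area(a, b, c):
        return abs((b[0]-a[0])*(c[1]-a[1]) - (c[0]-a[0])*(b[1]-a[1])) / 2
    areas = sorted(area(*t) for t in delaunay_triangles(points))
    total = sum(areas) or 1.0
    return [x / total for x in areas]

def dissimilarity(p1, p2):
    # Low values suggest a genuine signature, high values a forgery.
    return sum(abs(x - y) for x, y in zip(descriptor(p1), descriptor(p2)))
```

Because the descriptor is normalized, a uniformly rescaled copy of the same signature scores a dissimilarity of zero, while a distorted set of feature points does not.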

Zahoor Jan, Hayat Muhammad, Muhammad Rafiq, Noor Zada

Decision-Making Based System for Environmental Preservation

This document presents technological solutions that help identify and simulate problems the planet is facing, such as CO2 emissions, deforestation, floods and hurricanes. To reverse their effects, a combination of solutions is used that joins tools such as geoprocessing, meteorology and pollutant emission data (e.g., CO2), demonstrating that it is possible to quantify the behavior of forests and how they can help recover an environment contaminated by pollutants. Another relevant factor in identifying forest behavior is the distribution of water into the continent, avoiding droughts, and its relation with the sea, avoiding hurricanes. The purpose of this article is to simulate how those factors, combined, can form a large complex network to be analyzed, assisting in decision making on environmental preservation issues.

André Fabiano de Moraes, Daiani Lara de Assis, Líbia Raquel Gomes

Prototyping the One-Dimensional Cutting Stock Problem with Usable Leftovers for the Furniture Industry

The goal of this project is to provide an information system that enables the optimization of purchases and consumption of materials in the wood industry, which is mainly composed of small and medium-sized organizations that in most cases have no information systems to support their daily activities. In this industry, the typical decision maker relies heavily on experienced operators and knowledge of the industry that goes back many generations. To assess the acceptance of this information system in their routine, we have implemented five constructive heuristics and a generator of instances to simulate real problems in the furniture industry. The obtained results allowed us to demonstrate the benefits of this type of information system to decision makers.
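
A typical constructive heuristic for this problem class can be sketched in a few lines (a hedged illustration, not one of the paper’s five heuristics: the first-fit-decreasing strategy and the `min_leftover` threshold for what counts as a usable leftover are assumptions for demonstration).

```python
def ffd_cutting(demands, stock_len, min_leftover):
    # First-fit decreasing for the 1D cutting stock problem: cut the
    # largest pieces first, each from the first opened bar that fits.
    # Leftovers of at least min_leftover are kept as reusable stock
    # rather than counted as waste.
    bars = []  # remaining length of each opened stock bar
    for piece in sorted(demands, reverse=True):
        for i, remaining in enumerate(bars):
            if remaining >= piece:
                bars[i] = remaining - piece
                break
        else:
            bars.append(stock_len - piece)
    usable = [r for r in bars if r >= min_leftover]
    waste = sum(r for r in bars if 0 < r < min_leftover)
    return len(bars), usable, waste

n_bars, usable, waste = ffd_cutting([6, 5, 4, 3, 2], stock_len=10, min_leftover=3)
```

For the demand list above, two 10-unit bars suffice with no waste; a demand mix like `[7, 7, 4]` instead leaves three usable leftovers for future orders.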

Óscar Oliveira, Dorabela Gamboa, Pedro Fernandes

Big Data for Stock Market by Means of Mining Techniques

Predicting and preventing future events are major advantages for any company. Big Data brings enormous power, not only through the ability to process large amounts and varieties of data at high velocity, but also through the capability to create value for organizations. This paper presents an approach to Big Data based decision making in the stock market context. The correlation between news articles and stock variations has already been established, but it can be enriched with other indicators. In this use case, news articles were collected from three different websites, together with stock history from the New York Stock Exchange. To apply data mining classification algorithms, the articles were labeled by their sentiment, their direct relation to a specific company and their geographic market influence. With the proposed model it is possible to identify patterns between these indicators and predict stock price variations with accuracies of 100 percent. Moreover, the model showed that the stock market can be sensitive to news on generic topics, such as government and society, but may also depend on the geographic coverage.

Luciana Lima, Filipe Portela, Manuel Filipe Santos, António Abelha, José Machado

Using Data Mining Techniques to Support Breast Cancer Diagnosis

More than ever in breast cancer research, many computer-aided diagnostic systems have been developed in order to reduce false-positive diagnoses. In this work, we present a data mining based approach which may support oncologists in the process of breast cancer classification and diagnosis. A reliable database with 410 images was used, containing microcalcifications, masses and also normal tissue findings. We applied two feature extraction techniques, specifically the gray level co-occurrence matrix and the gray level run length matrix, and for classification purposes several data mining classifiers were also used. The results revealed good positive predictive values (approximately 70%) and very good accuracy both in the distinction of mammographic findings (>65%) and in the classification on the BI-RADS scale (>75%). The best predictive method, with the best performance on the distinction of microcalcifications, was the Random Forest classifier.
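
The gray level co-occurrence matrix (GLCM) mentioned above is easy to compute directly (a minimal sketch with a tiny 4-level toy image, not the paper’s implementation; real mammograms would be quantized to more gray levels and several offsets and Haralick features would be extracted).

```python
def glcm(image, dx, dy, levels):
    # Count co-occurrences of gray levels at pixel offset (dx, dy).
    m = [[0] * levels for _ in range(levels)]
    h, w = len(image), len(image[0])
    for y in range(h):
        for x in range(w):
            ny, nx = y + dy, x + dx
            if 0 <= ny < h and 0 <= nx < w:
                m[image[y][x]][image[ny][nx]] += 1
    return m

def contrast(m):
    # Classic Haralick contrast: large when neighboring pixels often
    # differ strongly in gray level.
    total = sum(sum(row) for row in m) or 1
    return sum(m[i][j] * (i - j) ** 2
               for i in range(len(m)) for j in range(len(m))) / total

img = [[0, 0, 1, 1],
       [0, 0, 1, 1],
       [0, 2, 2, 2],
       [2, 2, 3, 3]]
horizontal_contrast = contrast(glcm(img, 1, 0, 4))
```

Feature vectors built from such statistics (contrast, homogeneity, energy, …) are what the classifiers in the paper consume.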

Joana Diz, Goreti Marreiros, Alberto Freitas

BeSmart2: A Multicriteria Decision Aid Application

This paper presents an improved version of an application whose goal is to provide a simple and intuitive way to use multicriteria decision methods in day-to-day decision problems. The application allows comparisons between several alternatives with several criteria, always keeping a permanent backup of both model and results, and provides a framework to incorporate new methods in the future. Developed in C#, the application implements the AHP, SMART and Value Functions methods.

Anabela Tereso, João Amorim

Big Data Analytics and Applications


Ubiquitous Self-Organizing Map: Learning Concept-Drifting Data Streams

The Internet of Things promises a continuous flow of data where traditional database and data-mining methods cannot be applied. This paper presents a novel variant of the well-known Self-Organizing Map (SOM), called Ubiquitous SOM (UbiSOM), that is tailored for streaming environments. This approach allows ambient intelligence solutions using multidimensional clustering over a continuous data stream to provide continuous exploratory data analysis. The average quantization error over time is used for estimating the learning parameters, allowing the model to retain an indefinite plasticity and to cope with concept drift within a multidimensional stream.

Our experiments show that UbiSOM outperforms other SOM proposals in continuously modeling concept-drifting data streams, converging faster to stable models when the underlying distribution is stationary and reacting accordingly to the nature of the concept-drift in continuous real world data-streams.
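
The idea of driving the learning parameters from a running quantization error can be sketched with a minimal online SOM (a hedged illustration, not the actual UbiSOM algorithm: the `StreamingSOM` class, the exponential smoothing factor `beta`, and the error-driven learning rate are simplifying assumptions).

```python
import math, random

class StreamingSOM:
    # Minimal online SOM keeping a running average quantization error,
    # in the spirit of (but much simpler than) the UbiSOM estimator.

    def __init__(self, rows, cols, dim, seed=0):
        rng = random.Random(seed)
        self.grid = [[[rng.random() for _ in range(dim)]
                      for _ in range(cols)] for _ in range(rows)]
        self.avg_qe = None  # exponentially weighted quantization error

    def _bmu(self, x):
        # Best matching unit and its distance to the sample.
        best, best_d = None, float("inf")
        for r, row in enumerate(self.grid):
            for c, w in enumerate(row):
                d = math.dist(w, x)
                if d < best_d:
                    best, best_d = (r, c), d
        return best, best_d

    def learn(self, x, beta=0.05):
        (br, bc), qe = self._bmu(x)
        self.avg_qe = qe if self.avg_qe is None else \
            (1 - beta) * self.avg_qe + beta * qe
        # Error-driven learning rate: the map stays plastic whenever the
        # stream drifts and the average quantization error rises.
        eta = min(1.0, self.avg_qe)
        for r, row in enumerate(self.grid):
            for c, w in enumerate(row):
                h = math.exp(-((r - br) ** 2 + (c - bc) ** 2))
                self.grid[r][c] = [wi + eta * h * (xi - wi)
                                   for wi, xi in zip(w, x)]
```

On a stationary stream the error estimate (and hence the learning rate) decays, stabilizing the map; a drifting sample pushes the estimate back up, reactivating learning.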

Bruno Silva, Nuno Marques

Big Data in Cloud: A Data Architecture

Nowadays, organizations have at their disposal a large volume of data of a wide variety of types. Technology-driven organizations want to capture, process and analyze this data at high velocity, in order to better understand and manage their customers, their operations and their business processes. As data volume and variety increase and faster analytic results are needed, the demands on the data architecture grow. This data architecture should enable collecting, storing and analyzing Big Data in a Cloud environment; Cloud Computing ensures timeliness, ubiquity and easy access by users. This paper proposes a data architecture to support Big Data in the Cloud and, finally, validates the architecture with a proof of concept.

Jorge Oliveira e Sá, César Martins, Paulo Simões

Developing Health Analytics Design Artifact for Improved Patient Activation: An On-going Case Study

Health organizations have the opportunity to incorporate big data analytics to improve decision making related to healthcare. Yet, adoption of analytics techniques in healthcare lags behind other industries. Patient activation refers to patients playing an active role in managing their own health and health care, and to their perceived confidence in their ability to do so. Little research exists to guide the design of big data analytics that identify, monitor and improve a healthcare organization’s efforts in patient activation. The overall goal of this research is to fill this important gap. The focus is to collect and analyze data, develop our case-research based framework, and apply it in the development of a Health Analytics Design Artifact for Improved Patient Activation.

Mohammad Daneshvar Kakhki, Rahul Singh, Kathy White Loyd

NoSQL Databases: A Software Engineering Perspective

For over forty years, relational databases have been the leading model for data storage, retrieval and management. However, due to increasing needs for scalability and performance, alternative systems have started being developed, namely NoSQL technology. With increased interest in NoSQL technology, as well as more use case scenarios, these databases have been more frequently evaluated and compared over the last few years. It is necessary to determine whether all the possibilities and characteristics of non-relational technology have been disclosed. While most papers perform mostly performance evaluation using standard benchmarks, it is important to notice that real world scenarios, with real enterprise data, do not function solely based on performance. In this paper, by surveying the currently available literature, we have gathered a concise and up-to-date comparison of NoSQL engines, their most beneficial use case scenarios from the software engineering viewpoint, and their advantages and drawbacks.

João Ricardo Lourenço, Veronika Abramova, Marco Vieira, Bruno Cabral, Jorge Bernardino

Software Systems, Architectures, Applications and Tools


VVSHOP - The Vinho Verde Electronic Retailers Directory Revisited

VVSHOP is an Information System based on a unified system of different wine brands with Denomination of Origin from the Vinho Verde Region. The VVSHOP system was developed and implemented in accordance with a set of concepts associated with the integration of different electronic retailers, e-tailers, located in different countries, in a directory available through the Internet, with different forms of interface, particularly through the maps provided by Google (Google Maps). The system allows for automatic processing of information associated with the management of wine producers and their brands, linking them to others existing in the back-office system, and is articulated by a set of applications that facilitate interaction among system components such as the producers, the products, the entity that certifies Vinho Verde and their worldwide electronic retailers.

José Luís Reis, Paulo Martins

Scalability of Facebook Architecture

Scalability is the ability of a system, network or process to handle a growing amount of work in a capable manner, or its ability to be enlarged to accommodate that growth. There are many different elements behind the success of Facebook. It is mostly about vision, timing and the right people, but also about readily available technology and an excellent technical architecture. This paper addresses and compares both scalable and non-scalable systems. The purpose of this paper is to present a scalable web interaction benchmarking setup, using as a case study a simulation of Facebook, and to explain Facebook’s main architecture.

Hugo Barrigas, Daniel Barrigas, Melyssa Barata, Jorge Bernardino, Pedro Furtado

Creating Smart Tests from Recorded Automated Test Cases

In order to shorten time to market, many software development teams have adopted continuous integration and automated testing. Although user interface test automation is a suitable solution for Agile development, the resulting frequently changing application gives rise to a challenging task, especially from the point of view of maintenance. In this paper, we present an approach that bypasses those drawbacks through test recording enhanced by post-processing, creating smart tests that are easy to maintain. We have analyzed recorded tests and created step signatures that we then use to find sequences of common steps. Based on this, we identify reusable parts, which we consequently optimize using algorithms introduced in this paper to remove inefficient duplication in tests.
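
Finding a sequence of common steps between recorded tests is essentially a subsequence-matching problem; a hedged sketch using a longest-common-subsequence computation follows (this is not the paper’s algorithm, and the step names are invented for illustration).

```python
def common_steps(a, b):
    # Longest common subsequence of two recorded step-signature lists:
    # a simple way to surface candidate reusable test fragments.
    m, n = len(a), len(b)
    dp = [[[] for _ in range(n + 1)] for _ in range(m + 1)]
    for i in range(m):
        for j in range(n):
            if a[i] == b[j]:
                dp[i + 1][j + 1] = dp[i][j] + [a[i]]
            else:
                dp[i + 1][j + 1] = max(dp[i][j + 1], dp[i + 1][j], key=len)
    return dp[m][n]

recorded_1 = ["login", "open_cart", "add_item", "checkout", "logout"]
recorded_2 = ["login", "search", "add_item", "checkout", "logout"]
shared = common_steps(recorded_1, recorded_2)
```

Steps appearing in the shared sequence are candidates for extraction into a single maintained component, with the remaining steps kept test-specific.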

Martin Filipsky, Miroslav Bures, Ivan Jelinek

Model for Evaluation and Cost Estimations of the Automated Testing Architecture

In automated testing, finding an optimal architecture of the scripts and the right level of structuring into reusable objects from an economic point of view can become a challenging task. The optimality of the architecture of automated test scripts is context dependent, and many factors play a role in the final economics of a test automation project. To support this task with an exact method, we propose a model for the evaluation of automated testing architectures. This model is based on principal structural elements that can be identified in automated test scripts. These elements compose the particular architecture, and for each of them a set of properties and metrics is defined. Using the proposed metrics, more accurate estimations of implementation and maintenance costs can be performed for a particular architecture and test automation case, especially reflecting the duplication and reuse ratio of the code.
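
The effect of the reuse ratio on total cost can be illustrated with a back-of-the-envelope model (a hedged sketch, not the paper’s evaluation model: the linear cost formula, the unit costs, and the number of maintenance cycles are all assumed values).

```python
def automation_cost(n_steps, reuse_ratio, unit_impl, unit_maint, cycles):
    # Steps covered by reusable objects are implemented and maintained
    # once; everything else is paid per duplicated occurrence, in every
    # maintenance cycle.
    unique_steps = n_steps * (1 - reuse_ratio)
    implementation = unique_steps * unit_impl
    maintenance = unique_steps * unit_maint * cycles
    return implementation + maintenance

# 400 recorded steps, 10 maintenance cycles over the project lifetime.
fully_duplicated = automation_cost(400, 0.0, unit_impl=1.0, unit_maint=0.3, cycles=10)
well_structured = automation_cost(400, 0.6, unit_impl=1.0, unit_maint=0.3, cycles=10)
```

Even this crude model shows why maintenance, not initial implementation, dominates the economics once the number of change cycles grows.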

Miroslav Bures

PCTgen: Automated Generation of Test Cases for Application Workflows

Functional testing of application workflows is one of the most frequently used testing methods. To reduce test design effort and decrease the possibility of human mistakes, it is suitable to support the process with an automated method. In this paper we present PCTgen, a solution which is open and flexible, available for free to the test designer community, and does not assume the existence of UML design documentation of certain quality and consistency on the project. PCTgen brings several innovative features, as well as the possibility to interchange data with design and test management tools. Together with this solution, an algorithm for the generation of workflow test cases is introduced.

Miroslav Bures

Gamification Proposal for a Software Engineering Risk Management Course

Risk Management in Software Engineering is a discipline that deals with the identification and analysis of risks that may occur during the development of a software project. A Risk Management course is currently part of the Master of Software Engineering at the Ecuadorian Army University (ESPE) in Latacunga; it is offered in a traditional way, with a theoretical approach based on case studies. In order to increase student motivation, we propose the use of gamification techniques in the delivery of the course, that is, game strategies applied in non-game environments. The proposal has been built on the syllabus outline of the subject and on Werbach’s 6D framework, and is outlined on two levels, for risk identification and risk analysis in the software development process.

Fernando Uyaguari Uyaguari, Monserrate Intriago, Elizabeth Salazar Jácome

The Impact of Software Requirement Change – A Review

One of the main recommended practices to enhance the software development process is dealing with requirement change, which represents a risk to the success and completion of a project. In this paper, a comprehensive review of the impact of software requirement change has been conducted. The literature was reviewed chronologically and divided into four periods. The results of this review show that most work on this topic studied the impact of requirement change on the time (schedule) and cost of software projects. Furthermore, open research directions in this field are proposed.

Abeer AlSanad, Azeddine Chikh

A Data Mining Framework for Primary Biodiversity Data Analysis

Analysis based on primary biodiversity data is essential to understand the impact of climate change on biodiversity. Two challenges are involved in the use of such data. The first is the identification of essential aspects of occurrence records, such as localization and the responsible institution, which is important to measure the suitability of such data for the analysis to be carried out. In this sense, we propose a framework to perform data mining analysis in order to obtain such information without previous knowledge about the database, which can be integrated with different data portals using web services. We performed an evaluation with a subset of primary data available at the GBIF Data Portal, which showed trends in information about occurrence location and the institution responsible for each occurrence.

Suelane Garcia Fontes, Silvio Luiz Stanzani, Pedro Luiz Pizzigatti Correa

Open Source Backup Systems for SMEs

Information and data are the dominant values in today’s day-to-day corporate business, and their importance is still growing in this digital era and with the explosion in the use of social media in recent years. However, ensuring effective and efficient data security is largely neglected, even though most companies assume that the corruption or loss of their data would be disastrous and a “life-threatening” situation, pointing to the lack of financial resources as the reason for not adopting more proactive security measures. Backup systems are a precious resource, since they aim to recover data after loss (deletion or corruption) and to restore data from an earlier time. Open source represents a unique opportunity to save precious financial resources, especially for SMEs. In this paper, we compare the features of open source backup systems, evaluating whether these tools are a viable alternative to proprietary software. The conclusions are positive: Bacula and Amanda, two of the three software systems reviewed, proved to be very comprehensive backup systems and real options for small companies and organizations.

Diogo Sampaio, Jorge Bernardino

Semi-automatic Generation of Virtual Reality Environments for Electric Power Substations

The use of Virtual Reality environments in power substations offers a new paradigm for supervisory control. The existence of a tridimensional model, geometrically compatible with the real substation, minimizes the difference between the mental model constructed by field operators and the control room, improving communication. Additionally, such virtual environments may be used as simulators and training environments. Nevertheless, the development process related to these applications is quite complex, including activities such as computer programming, 3D modeling and usability studies, with significant managing tasks. This paper presents some new software layers to make the development of 3D virtual power substations easier. By providing tight integration between traditional CAD software and 3D engines, the tools presented here were successfully applied in the construction of several virtual electric power substations.

Leandro Mattioli, Alexandre Cardoso, Edgard A. Lamounier, Paulo do Prado

iEcoSys – An Intelligent Waste Management System

At present, only a few small cities have implemented procedures for collecting rubbish in an innovative way. Thus, it is urgent to implement measures that foster sustainable behavior, with the active participation of citizens, ensuring the conservation of resources through the reduction and recovery of waste. This paper describes iEcoSys (Intelligent Ecologic System), an intelligent waste management system developed by the authors. It is a technological tool that identifies the waste produced individually, using RFID tags embedded in rubbish bags, the iBags. When waste is deposited, the recycling center identifies and weighs each bag, and the collected data is sent to a server system using the ZigBee communication standard. When the information reaches the server system, it is inserted into the database management system, making it possible to consult the deposited waste in the iEcoSys internet portal, and even order new iBags. Making cities smarter and promoting sustainability by changing the paradigm from paying for the waste produced to being paid for recycled rubbish is the contribution of this study.

Pedro Reis, Filipe Caetano, Rui Pitarma, Celestino Gonçalves

Dynamic Replication and Deployment of Services in Mobile Environments

Currently, mobile environments are gaining importance, and new promising paradigms, like the Mobile Cloud, are arising. However, these environments pose new challenges and are mainly characterized, among other things, by frequent changes in their execution context. This is particularly challenging for software architects in the design and implementation of this kind of system. For instance, the Service Oriented Architecture (SOA) paradigm has not been devised to operate in dynamic network environments, where centralized services or static deployments must be avoided in order to provide higher availability of services. Therefore, in order to provide a reliable SOA implementation in mobile environments, it must be complemented with techniques and methods from Autonomic Computing. In this work, a self-adaptive SOA approach is proposed. This architecture provides support for the dynamic replication and deployment of services in mobile environments. The approach is based on context information modelling and management as the key to achieving the required architecture adaptation.

Gabriel Guerrero-Contreras, Carlos Rodríguez-Domínguez, Sara Balderas-Díaz, José Luis Garrido

Towards an Augmented Assistance Dog

Despite the recent progress in robotics, autonomous robots still have too many limitations to reliably help people with disabilities. On the other hand, animals, and especially dogs, have already demonstrated great skills in assisting people in many daily situations. However, dogs also have their own set of limitations. For example, they need to rest periodically, to be healthy (physically and psychologically), and it is difficult to control them remotely. This project aims to “augment” the service dog, by developing a system that compensates for some of the dog’s weaknesses through a robotic device mounted on the dog’s harness. The present article shows that the dog’s activity and some indexes of the animal’s emotional state can be successfully identified by the wearable device.

Yves Rybarczyk, Jérémie de Seabra, Didier Vernay, Pierre Rybarczyk, Marie-Claude Lebret

Exploring the Distance, Location, and Style of “Selfies” with the Self-Pano Mobile Application

Taking self-portraits, also known as “selfies”, has become one of the biggest activities among social media users. The advances in camera technologies and online services in mobile phones have given rise to far greater opportunities for mobile posting and sharing. The selfie is often used as a user’s profile picture in social media sites. The selfie is not just a photographic portrait of the taker; an equally important aspect of the selfie is the background image, which often includes a scene of special interest or a group of friends, in a so-called “wefie”. With the popularity of social media sites continuing to grow rapidly, mobile devices have become an almost real-time tool for building and maintaining relationships with distant family and friends through the use of selfies and other postings. Self-Pano is a mobile application developed to maximize the information content of a selfie. This study analyzes the selfie images collected with the Self-Pano app: the distance, location and style of selfies were explored. The outcome of this project can be used as a reference for future selfie stick development.

Teh Phoey Lee, William Kosasih

Metrics Selection for Testability of Object-Oriented Systems

Software tests are the most important quality assurance measure in the development of software systems. It is a well-known fact that the later an error is discovered, the more costly it is to fix. It is therefore important to think about the testability of the software system first. Testability is the extent to which a software artifact can be tested. This paper investigates the testability of object-oriented systems. We selected a set of metrics used in object-oriented systems and grouped them under quality factors based on their interdependence. We have used the Analytic Hierarchy Process (AHP) method to determine which metric is most used and best for testability. In AHP the criteria are considered independently; the criteria involved in this study are the quality factors. Pairwise comparison matrices are designed in questionnaires for determining the degree of importance between criteria and metrics.
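
Deriving priority weights from an AHP pairwise comparison matrix can be sketched with the row geometric mean, a standard approximation to the principal eigenvector (the three-criteria matrix below uses hypothetical judgments, not the questionnaire data from the paper).

```python
import math

def ahp_weights(matrix):
    # Row geometric mean: a standard approximation to the principal
    # eigenvector used to derive AHP priority weights.
    n = len(matrix)
    gm = [math.prod(row) ** (1 / n) for row in matrix]
    total = sum(gm)
    return [g / total for g in gm]

# Three quality-factor criteria compared pairwise on Saaty's 1-9 scale;
# matrix[i][j] says how much more important criterion i is than j, and
# matrix[j][i] is its reciprocal. Judgments here are hypothetical.
pairwise = [
    [1,     3,     5],
    [1 / 3, 1,     3],
    [1 / 5, 1 / 3, 1],
]
weights = ahp_weights(pairwise)
```

The resulting weights sum to one and preserve the ranking implied by the judgments; a full AHP study would also check the consistency ratio of each matrix before trusting the weights.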

Priyanksha Khanna

STTP – A Supporting Tool for the Teaching of Physics Using Android Platform

This paper presents a methodology for teaching Physics formulas through a mobile application developed on the Android platform. In general, the methodologies used today by teachers and instructors, in all different teaching circumstances, are increasingly assisted by emerging technologies. In line with this trend, pedagogues, Psychology practitioners and Education researchers seek ways to facilitate education through new tools capable of meeting teaching demands. The junction between teaching demands and emerging technologies constitutes an interesting research objective: a technological solution that can facilitate the appropriation of content and motivate the student to become acquainted with modern education tools.

A. L. T. Oliveira (PQ), M. P. S. Dantas (IC), P. V. S. Dantas (IC), A. de P.D. Queiroz (PQ), A. S. S. Neto (PQ)

Communication Technologies in the Analysis of Online Reputation and Job Search

This paper by the New Media Group at the University of Santiago de Compostela presents research into the online reputation of job websites in Germany, France, the United Kingdom and Switzerland, and the development of communication management strategies to assist access by unemployed youth in the Mediterranean countries to the labour market. The project consisted of analyzing the online reputation of the websites and professional offers in these four European countries using the group’s own methodological model, in order to design a virtual tutorial tool. The project was developed from a case study on unemployed Spanish youths who migrate to the main European countries in search of jobs, and on the use of these new communication technologies to improve access to the labour market.

Xosé Pereira, Francisco Campos, Natalia Alonso

MoCaS: Mobile Carpooling System

Carpooling is a car sharing practice first adopted in the United States of America during the fuel crisis in the 1970s. Since then, and after some ups and downs, this practice has been growing in recent years, being currently used throughout the world. With the evolution of mobile technologies, carpooling has had the opportunity to expand, especially through mobile applications and web pages. With these technologies, it is possible for anyone in any part of the globe to search for others that wish to go to the same place and want to share their car. With this practice, people intend to save money, help preserve the environment, reduce congestion in cities, increase the number of places available to park and meet new people. This paper introduces MoCaS – Mobile Carpooling System –, a carpool service offered to registered users. In this system, each user can enter his travels and make appointments, assign ratings, register vehicles and add travel preferences. All this is possible via a web interface and also via a mobile application that together give greater support to those seeking such services. MoCaS distinguishes itself from other systems by offering innovative services, namely in the mobile component, which through location services allows for the booking of trips in real time; in other words, not only trips that have not started, but also trips that are already underway and that end up intersecting the user’s position. Besides this novelty, the system provides a real-time map, where all trip stops are visible, as well as the location of carpoolers who are currently traveling. Both the web and the mobile applications were successfully developed, achieving good results in the performed tests, and are currently being prepared for deployment.

Álvaro Ribeiro, Daniel Castro Silva, Pedro Henriques Abreu

Multimedia Systems and Applications


An Efficient and Adaptive Threshold of Volumetric Segmentation

Image segmentation plays a crucial role in the effective understanding of digital images. Among the many approaches to image segmentation, the graph-based approach is gaining popularity, primarily due to its ability to reflect global image properties. Volumetric image segmentation can simply result in an image partition composed of relevant regions, but the most fundamental challenge for a segmentation algorithm is to precisely define the volumetric extent of an object, which may be represented by the union of multiple regions. The aim of this paper is to present a new and efficient method to detect visual objects in color spatial images. The presented method uses a general-purpose volumetric segmentation algorithm, based on a unified framework that uses a virtual tree-hexagonal structure defined on the set of image voxels.

Dumitru Dan Burdescu, Marius Brezovan, Liana Stanescu, Cosmin Stoica Spahiu, Florin Slabu

Transparent Handover Using WiFi Network Prediction for Mobile Video Streaming

Loosely controlled Wi-Fi networks, such as those deployed at festivals and events, typically do not provide full coverage of the terrain. As a result, the experience of a mobile user watching a video stream is rather erratic because of frequent disconnections and re-connections. However, these periods of disconnection are generally rather short (10-50 seconds for a coverage area of around 70%). To ensure a continuous experience, one needs to gracefully handle these “disconnection gaps” until the client connects to the subsequent access point. In this paper we introduce a disconnection prediction (DP) mechanism and a pre-fetcher (PF) to successfully initiate a transparent handover, ensuring a continuous user experience for video streaming.

The DP predicts disconnections and re-connections based on a collaboratively generated connectivity map. This information is used to activate the pre-fetcher, which starts downloading to the user’s device the video material that will be needed during the disconnected period. When the mobile user physically disconnects from the network, the video continues to play, as the requested data has been buffered on the mobile device. When the user re-enters a connected area, the wireless network connection is restored and normal operation is resumed.
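The pre-fetcher described above must buffer enough video before the predicted disconnection to bridge the gap. A back-of-the-envelope sizing, assuming a constant stream bitrate and a safety margin for prediction error (the function name and the margin are hypothetical, not taken from the paper), could look like this:

```python
def prefetch_bytes(gap_seconds, bitrate_kbps, safety=1.2):
    # Video data (in bytes) to buffer before the predicted disconnection
    # so playback survives the gap; 'safety' pads the gap-length
    # prediction error reported by the connectivity map.
    return int(gap_seconds * bitrate_kbps * 1000 / 8 * safety)
```

For a 30-second gap at 800 kbps this yields roughly 3.6 MB with the default 20% margin — small enough to fetch well before the predicted disconnection point.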

Joost Geurts, Zhe Li, Yaning Liu, Jean-Charles Point

Computer Networks, Mobility and Pervasive Systems


Hybrid Distributed-Hierarchical Identity Based Cryptographic Scheme for Wireless Sensor Networks

There is a major concern in the field of Wireless Sensor Networks regarding information security in very large infrastructures, where classical security models cannot be applied due to the very strict resource constraints of network nodes. Identity-based cryptographic schemes, distributed key generators and hierarchical security infrastructures have been intensively discussed for this purpose but, to our knowledge, no proposal also covers two very important issues for this type of security system – secure key issuing and key escrow. We propose a security scheme that benefits from both hierarchical and distributed approaches, with special attention to key escrow and key issuing. We also envision some applications that would benefit from such a security scheme.

Constantin Grumăzescu, Mihai-Lică Pura, Victor-Valeriu Patriciu

SPORANGIUM: Exploiting a Virtualization Layer to Support the Concept of Sporadic Cloud Computing with Users on the Move

In this paper we deploy Sporadic Ad-hoc Networks (SANs) over the devices of a group of always-on users who happen to be in the same place. The goal is to develop tailor-made services that exploit the particular context of the users, the possible similarities among their preferences and the technological capabilities of their terminals to establish ad-hoc communications. In order to overcome the intrinsic limitations of mobile devices, we explore the new concept of Sporadic Cloud Computing (SCC), which aims at providing each terminal with extra resources by exploiting the capabilities of the rest of the devices connected to each SAN. To abstract the complexity stemming from the mobility scenarios, SCC works with a virtualization layer that deals with a few static virtual nodes instead of a higher number of mobile real nodes. This allows us to turn our SANs into reliable and stable communication environments that promote interactions among potentially like-minded strangers.

Esteban F. Ordóñez-Morales, Yolanda Blanco-Fernández, Martín López-Nores, Jack F. Bravo-Torres, José J. Pazos-Arias, Manuel Ramos-Cabrer

An Approach for Resilient Hierarchical Routing in Mobile Networks

We present a cluster-based routing protocol; to improve the convergence of the clusters and of the network, we propose the use of a backup cluster head. A discrete event simulator is used for the implementation and simulation of a hierarchical routing protocol called the Backup Cluster Head Protocol (BCHP). Finally, it is shown through a comparative analysis with the Ad Hoc On Demand Distance Vector (AODV) routing protocol and the Cluster Based Routing Protocol (CBRP) that BCHP improves the convergence, availability and resiliency of the network.

Rommel Torres, Samanta Cueva

Improving Lifetime and Availability for Ad Hoc Networks to Emergency and Rescue Scenarios

A mobile ad hoc network is a temporary, self-configuring network composed of nodes that communicate through wireless interfaces and can change their geographic location randomly. It is ideal for emergency and rescue scenarios in which immediate action is necessary and when it may be the first form of communication available. It is therefore necessary for the network to stay active as long as possible, and communication between network nodes must be efficient. The present research presents EBCHP, a proactive, hierarchical routing protocol in which, through association and support strategies for the main nodes, the performance of the mobile ad hoc network is improved and its lifetime extended compared with BCHP.

Rommel Torres, Francisco A. Sandoval, Liliana Enciso, Samanta Cueva

Black Virus Decontamination in Arbitrary Networks

In this paper we investigate the problem of decontaminating a network from the presence of a black virus (BV), a harmful entity capable both of destroying any agent arriving at the site where it resides and of moving to all the neighbouring sites. This problem combines in its definition the harmful aspects of the classical static black hole search problem with the mobility aspects of the classical intruder capture (or network decontamination) problem. The initial location of the BV is unknown; the critical objective for a team of system agents is to locate and eliminate the BV with the minimum number of network infections and agent casualties. The decontamination process is clearly dangerous for the system agents and for the nodes infected by the spreading of the BV. The problem of black virus decontamination (BVD) has so far been investigated only for special classes of highly regular network topologies.

In this paper, we consider the BVD problem and show how to solve it in networks of arbitrary topology. The solution protocol is provably correct, deterministic, generic (i.e., it works irrespective of the network topology), and worst-case optimal. In fact, we prove that our protocol always correctly decontaminates the network with the minimum number of agent casualties and network infections. Furthermore, we show that the total number of system agents is asymptotically optimal.

Jie Cai, Paola Flocchini, Nicola Santoro

Analytical and Simulation Modeling to Analyze Reliability State of Wireless Sensor Networks

Wireless sensor networks have gained abundant interest due to their potential wide range of applications. The reliability of a deployed wireless sensor network is defined by the area covered by alive nodes and by the redundancy of data; redundancy in the data occurs because of overlapping sensed areas. An initially reliable wireless sensor network switches to an unreliable state as nodes perish randomly in the field, and consequently the quality of the data starts diminishing. It is imperative to know when the network will switch from the reliable to the unreliable state so that proper action can be taken in the field. This paper analyzes and compares analytical and simulation modeling of the reliability state of a wireless sensor network. A multi-objective genetic algorithm based method is used for the analytical modeling; it determines the minimum number of (randomly placed) nodes that covers almost the complete area while keeping the overlapped area at the required minimum. The clustering algorithm LEACH is implemented in NS-2 for the simulation modeling. The comparative results of analytical and simulation modeling differ because of their different natures, but both highlight that the reliability of the wireless sensor network is a salient concern.
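The notion of reliability used above — the area covered by alive nodes — can be illustrated with a simple Monte Carlo coverage estimate. This is a generic sketch under assumed disk-shaped sensing ranges and a square field, not the authors’ genetic-algorithm method; the names and the 90% threshold are hypothetical:

```python
import random

def coverage_fraction(nodes, sense_r, field, samples=20000, seed=1):
    # Monte Carlo estimate of the fraction of a square field covered
    # by the sensing disks of the alive nodes.
    rng = random.Random(seed)
    hits = 0
    for _ in range(samples):
        x, y = rng.uniform(0, field), rng.uniform(0, field)
        if any((x - nx) ** 2 + (y - ny) ** 2 <= sense_r ** 2
               for nx, ny in nodes):
            hits += 1
    return hits / samples

def is_reliable(nodes, sense_r, field, threshold=0.9):
    # The network is "reliable" while the alive nodes still cover at
    # least the required fraction of the deployment area.
    return coverage_fraction(nodes, sense_r, field) >= threshold
```

Re-evaluating `is_reliable` as nodes die gives the switch point from the reliable to the unreliable state that the paper models analytically and in simulation.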

Vipin Pal, Yogita, Girdhari Singh, R P. Yadav

Interacting and Making Personalized Recommendations of Places of Interest to Tourists

Nowadays, applications developed to support tourists should go much further than simply providing information about places or recommending places or routes based on the user’s location. They should provide users with simple mechanisms to interact with places of interest, along with relevant information and recommendations about new places of interest or tours according to their preferences and the preferences of other tourists with similar interests. In this work we describe a system that explores information about tourists’ interactions with places of interest and their opinions about each place to recommend new places of interest and pedestrian tours, and to promote products and services that are in accordance with their expectations. First experiments show that the system can help tourists interact with places of interest, supporting them in their visits, and also promote shops and services.

Fábio Fernandes, Fernando Reinaldo Ribeiro

Use of Social Network Site’s Profile for the Employment Seeking Process

This paper investigates the use of social network site profiles to present individuals to potential employers. Prestige is an important aspect of social capital theory: social network site members present their knowledge and skills to other members in order to gain a benefit. The results of a survey of 440 people about their behaviour in the employment seeking process on social network sites are presented. The paper analyses the reasons individuals present themselves on social network sites and defines the risks for organisations that use social network site profiles to evaluate candidates. The purpose of the paper is to evaluate and explain the behaviour of individuals on social network sites during the employment seeking process and to test whether the theoretical mechanism of social capital theory can explain the operation of social network sites.

Tom Sander, Phoey Lee Teh, Biruta Sloka

Performance Evaluation of Long-Distance 802.11n Networks

The adaptation of WiFi technologies to long distances, known as WiLD (WiFi based Long Distance), is currently a topic of interest. WiLD networks are well suited to suburban and rural areas of developing countries; however, they present several complexities that limit their performance and range, including inefficiencies in the timing parameters of the MAC layer, mainly in medium access and waiting for acknowledgments. Given these problems, there is a clear need to evaluate new technologies to improve capacity under such conditions. IEEE 802.11n-2009 is among the most promising technologies. In this article we address this task with an assessment of the 802.11n features expected to enhance communications over long distances, and with performance testing of the frame aggregation technique in these scenarios, using NS-3 as the simulation platform. The simulation results demonstrate the effectiveness of the 802.11n PHY and MAC layer enhancements over long distances compared with 802.11a.

Patricia Ludeña, Javier Simó, Katty Rohoden, Marco Morocho

Architecture for Wireless Grids

Evolving consumer expectations will require changes to the existing access network – next generation access networks (NGNs). Emerging services lead to a great increase in bandwidth demand. Another great challenge to access networks is mobility. At the same time, wireless mobile devices have become an indispensable tool for households and businesses. The increase in wireless devices, motivated by rapidly decreasing cost and ease of installation, leads to a redesign of the way applications and services are delivered. The integration of wireless grids with NGNs is therefore extremely important. This paper presents a new architecture to integrate wireless grids into access networks.

João Paulo Ribeiro Pereira

Human-Computer Interaction


Focus Group Foci: Employing Participants’ HCI and Application Domain Expertise in Interaction Design

The paper reports a study in which two aspects of interaction design − human interaction with technology and acting through technology − were analyzed in the context of participatory sessions in the focus group format. The sessions were conducted as part of redesigning a novel digital artifact, a web-based project management tool. An initial prototype of the artifact was introduced to two different groups of participants possessing expertise in, respectively, human-computer interaction (HCI) and teaching and learning, a key target application domain for the tool. Re-design suggestions provided by each of the focus groups were found to address issues with both the user interface and the functionality of the prototype. The main difference between the groups was whether they primarily focused on interaction efficiency or on the artifact’s integration into a larger social and technological context. Implications of the study for the further development of participatory methods in interaction design are discussed.

Patrik Björnfot, Victor Kaptelinin

Understanding the Effect of Techno-interruptions in the Workplace

Technostress research focuses on individuals’ struggle to deal with the cognitive and social demands derived from the use of information and communication technologies (ICT). Researchers have explored this phenomenon by studying the demands imposed by the use of ICTs on workers (e.g. fear of losing one’s job due to ICT). However, researchers in this domain have not investigated how the stress generated by technology-mediated interruptions (henceforth referred to as “technorruptions”) can affect individuals at the workplace. This work-in-progress study aims to understand the mechanisms through which technorruptions affect individuals’ experiences with ICTs within the work context. In doing so, the study introduces the concept of Perceived Technorruption Severity (PTS) to measure individuals’ appraisal of technorruptions. In addition, it proposes a research model to understand the impacts of PTS on individuals using ICTs during a work meeting, which will be validated using a survey-based study and structural equation modeling techniques.

Sonia Camacho, Khaled Hassanein, Milena Head

New Solutions for Old Problems: Use of Interfaces Human/Computer to Assist People with Visual and/or Motor Impairment in the Use of DOSVOX and microFênix

The article describes two solutions to improve the interfaces of software tools developed at the Tércio Pacitti Institute – Federal University of Rio de Janeiro – for people with disabilities. The first is an interface that gives people with visual impairments who are also unable to use their hands access to the DOSVOX system, one of the most used Assistive Technologies in Brazil. DOSVOX enables interaction between the visually impaired and the computer; originally operated solely via the keyboard, it can now be operated via a gyroscope coupled to the user’s head, whose movements are translated into a series of keystrokes. The other interface is a BCI – Brain-Computer Interface – for microFênix, software designed to help people with severe motor impairment interact with a microcomputer.

Adriano Joaquim O. Cruz, Henrique Serdeira, João Sergio S. Assis, José Antonio S. Borges, José Fabio M. Araújo, Márcia Cristina A. Soeiro, Marcos F. Carvalho, Mário Afonso S. Barbosa

Integrating Usability in the Software Development Process in the Main Development Companies of Ecuador (Quito, Guayaquil and Cuenca)

Usability is now recognized as a key quality attribute for the success of a software product at the development stage. Techniques from the field of HCI (Human-Computer Interaction) are responsible for providing an adequate level of usability in software products, yet they are not frequently applied in the usual software development process. The problem addressed in this paper is the proper application of usability techniques in software development companies in Ecuador, which would allow them to interact directly with the user during the development stage to ensure the product’s permanence in the market. With the implementation of these techniques, developers can satisfy users with an effective and efficient website where users find what they need in the shortest possible time.

Elizabeth Salazar Jácome, Fernando Uyaguari Uyaguari, Monserrate Intriago

Human Evaluation of Online Machine Translation Services for English/Russian-Croatian

This paper presents the results of a human evaluation of machine translated texts for one non-closely related language pair, English-Croatian, and one closely related language pair, Russian-Croatian. 400 sentences from the domain of tourist guides were analysed, i.e. 100 sentences for each language pair and for each of two online machine translation services, Google Translate and Yandex.Translate. The human evaluation is made with regard to the criteria of fluency and adequacy. In order to measure internal consistency, Cronbach’s alpha is calculated. Error analysis is made for several categories: untranslated/omitted words, surplus words, morphological errors/wrong word endings, lexical errors/wrong translations, syntactic errors/wrong word order, and punctuation errors. At the end of the paper, conclusions and suggestions for further research are given.
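Cronbach’s alpha, used above to measure the internal consistency of the evaluators’ fluency and adequacy ratings, is computed as α = k/(k−1) · (1 − Σσᵢ²/σₜ²), where k is the number of items (raters), σᵢ² the per-item variances and σₜ² the variance of the total scores. A minimal sketch of the computation (the function name is hypothetical, not the authors’ tooling):

```python
def cronbach_alpha(scores):
    # scores: one list of ratings per item/rater, aligned by sentence.
    # alpha = k/(k-1) * (1 - sum(item variances) / variance(total score))
    k = len(scores)
    n = len(scores[0])

    def var(xs):  # population variance
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / len(xs)

    totals = [sum(item[j] for item in scores) for j in range(n)]
    return k / (k - 1) * (1 - sum(var(item) for item in scores) / var(totals))
```

Perfectly correlated items yield α = 1, while items whose variation is unrelated to the total drive α toward 0, which is why the measure is read as inter-rater consistency.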

Sanja Seljan, Marko Tucaković, Ivan Dunđer

Controlled Vacuum Packed Particles as a Part of Machine Interface System

New opportunities of Vacuum Packed Particles (VPP) and their application as part of a human-machine interface system are investigated. VPP are conglomerates composed of grains encapsulated in a tight, flexible envelope and subjected to an internal underpressure. The underpressure value has a great impact on the behavior and properties of the medium, a property that may be used in various novel applications. Experimental results related to the stiffness, vibration analysis and controllable mechanical properties of VPP are given. The possibilities of numerical simulation of VPP behavior as a function of internal pressure are shown. The potential and perspectives for novel applications of VPP as part of human-machine interaction are pointed out.

Mariusz Pyrz

On the Boundaries of Telemedicine

For several years, telemedicine practitioners have struggled to establish their field as an autonomous scientific discipline, which is often reflected in diverging definitions of what telemedicine is. However, the telemedicine community depends on and draws heavily from established scientific areas such as medicine, economics, informatics and social science, each of which has its own criteria for good scientific work. In addition, new technological innovations appear to pave the way for new applications within the field, thus challenging perceptions of what the field encompasses. The aim of this paper is to contribute to this debate by conceptualizing telemedicine as something people do, rather than through yet more definitions of what the field is.

Gunnar Ellingsen

Can Transparency Enhancing Tools Support Patient’s Accessing Electronic Health Records?

Patients who access their health records take more care of their health and, when in therapy, commit more seriously to improving their condition. This leads to more effective and more efficient healthcare management, and is also in agreement with European directives on data protection. However, accessing medical data can be risky. Security should be assured, and it should be evident to the patients who has access to what data; any violation of a patient’s privacy requirements should be reported. We call this property transparency. Precisely, this work looks into the Transparency Enhancing Tools that have been proposed to increase people’s awareness about security and privacy on the Internet, and discusses to what extent these tools can empower transparency in healthcare.

Ana Ferreira, Gabriele Lenzini

A Kinect-Based Virtual Reality System for Parkinson Disease Rehabilitation

This work brings together emerging virtual reality techniques and natural user interfaces to offer new possibilities in the field of rehabilitation. The tool has been developed on a low-cost device (Microsoft Kinect) connected to a personal computer. It provides patients with Parkinson’s Disease (PD) a motivating way to perform several motor rehabilitation exercises. Preliminary results obtained so far lead us to believe that the tool is suitable for rehabilitation purposes. Feedback from participants corroborates the hypothesis that the system can be applied not only in clinical rehabilitation centers but also at home. The tool should help rehabilitators carry out a more individualized and personalized follow-up of PD patients, resulting in a more enriching rehabilitation process where motivation is highly encouraged.

Guillermo Palacios-Navarro, Sergio Albiol-Pérez, Iván García-Magariño

Evaluating the Usability of Library Websites Using an Heuristic Analysis Approach on Smart Mobile Phones: Preliminary Findings of a Study in Saudi Universities

The internet is expected to lead the next wave in the growth of data consumption and media content in Saudi Arabia in the coming decades. Reasons include the growth of a young and tech-centric population, the growing popularity of smart devices, and the improved technical capabilities of national telecom operators. Many public and private organizations in Saudi Arabia have advocated the increased use of smart mobile phones to facilitate access to website services via the internet, and universities are no exception. However, developing websites that can be accessed through different technologies and platforms without usability assurance does not result in user satisfaction and high-quality services. This research evaluates the usability of library websites on smart mobile phones using heuristic analysis across the public universities in Saudi Arabia, in order to provide a comprehensive national view of users’ interaction and satisfaction. For this preliminary study, only the library websites of two public universities located in the western region of Saudi Arabia were evaluated. The initial results showed usability weaknesses in terms of links and navigation, helping users, data entry forms, visual design, and accessibility for visually impaired users. In this regard, the Higher Education Ministry in Saudi Arabia, alongside the universities, first needs to define clearly the websites’ current and future objectives, and then develop national standardized web metrics for the universities. A further important step is to work closely with Yesser (the Saudi e-Government Program) to develop national web metrics and to understand the potential development of the smart mobile phone market over the next few years.

Abdulhadi Eidaroos, Abdullah Alkraiji

Using Serious Games to Train Children and Elicit Fire Safety Behaviour

Serious Games are being increasingly used as a tool for various applications, beyond their traditional entertainment purpose. One of their application domains is fire safety. Possible injuries from fires are a dangerous safety concern for children, for instance. Another important issue is the elicitation of behavioural knowledge to design and feed simulation models; the lack of human behaviour data is often referred to as a drawback for evacuation simulation designers. This paper addresses the aforementioned matters with respect to: i) acquiring valuable knowledge about children’s behaviour when facing the urgent need for evacuation; and ii) devising an educational tool. A group of 19 children from an elementary school played two different role plays using a Serious Game, and data on their behaviour was collected. The results were analysed and are presented here. Future work is two-fold: to expand and refine data collection with other groups, such as the elderly; and to use this data for crowd synthesis, particularly for evacuation simulators.

João E. Almeida, Rosaldo J. F. Rossetti, Brígida Mónica Faria, António Leça Coelho

Health Informatics


Medical Data Integration with SNOMED-CT and HL7

Data integration of medical information to be stored in EMRs, EHRs and/or PHRs is a challenging issue, especially given the sensitive nature of such data. Here we present a simple tool for medical data integration from semi-structured sources (such as healthcare provider websites); to cope with semantics, the SNOMED-CT clinical terminology is exploited. The final results are formatted according to the HL7 standard to facilitate further integration.

Alessandro Longheu, Vincenza Carchiolo, Michele Malgeri

A Data Preparation Methodology in Data Mining Applied to Mortality Population Databases

It is known that data preparation is the most time-consuming phase of the data mining process: it takes between 50% and 70% of the total project time, and its results directly affect the quality of the whole process. Current data mining methodologies are general-purpose; one of their limitations is that they do not provide guidance about which particular tasks to perform in a particular domain. This paper presents a new data preparation methodology oriented to the epidemiological domain, in which we have identified two sets of tasks: General Data Preparation and Specific Data Preparation. For both sets, the Cross-Industry Standard Process for Data Mining (CRISP-DM) is adopted as a guideline. The main contribution of our methodology is fourteen specialized tasks concerning this domain. To validate the proposed methodology, we developed a data mining system and applied the entire process to real mortality databases. The results were encouraging: on the one hand, we observed that the use of the methodology reduced some of the time-consuming tasks; on the other hand, the data mining system revealed unknown and potentially useful patterns for the public health services in Mexico.

Joaquín Pérez, Emmanuel Iturbide, Victor Olivares, Miguel Hidalgo, Nelva Almanza, Alicia Martínez

LabsMóveis: Innovation in the University Classroom through Mobile Devices

Considering the significant increase in the presence of mobile devices in the classroom and the opportunity to use these resources to improve and streamline the process of teaching and learning, this paper proposes a pedagogical use of mobile technologies, specifically tablets, in class. The proposal was developed under the LabsMóveis Project and was applied in undergraduate classes in the Health area. The use of tablets in the university classroom contributed to the expansion of general and specific cognitive abilities, stimulating creativity and leadership and enhancing the initiative to solve problems.

Márcia Cristina Moraes, Leticia Lopes Leite, Raquel da Luz Dias, Ana Elizabeth, P. L. Figueiredo, Rosana Maria Gessinger, Cristina Moreira Nunes

Student Participation in Researching Projects: A Practical Experience

The relation between Computer Science and Information Systems lies in the fact that, when specific technology and tools are needed to solve problems, computer scientists design the tools, whereas Information Systems professionals then apply the tools to achieve the maximum benefit for their particular organizations or tasks. In this work we have studied how students’ participation in research projects can help to develop their cross-cutting competencies. We show an example where technology is adapted to solve health problems. The experience presented has to do with a software development process.

Guillermo Palacios-Navarro, Sergio Albiol-Pérez, Iván García-Magariño

Information Technologies in Education


A Methodology to Use Market Simulation Software in the Teaching of Business Management

This work aims at presenting and analyzing the use of the Enterprise Resource Planning Simulation Game software as a success case of applying the Problem-Based Learning strategy in business management classes. Problem-Based Learning is a strategy focused on the process of self-learning through problem solving. The ERP Simulation Game is market-simulation software developed by HEC Montréal. To achieve the desired educational goals, it was necessary to define a methodology for the use of the software that includes all the essential characteristics of Problem-Based Learning. It was observed during the application of this methodology that using the software increases the level of involvement and the pursuit of knowledge by students, motivated by the competition that the software provides.

Fabio Correa Xavier, Beatriz Camasmie Curiati Salione, Solange N. Alves de Souza

Contemplation on Today’s Digital Education

This paper addresses the effects and side-effects of social media on people’s lives from a purely contemplative perspective. Social networks are reviewed, and issues regarding their impact on education are not forgotten, such as the teacher’s role in the digital classroom, formal versus informal learning, and the use of Web 2.0 tools. Since Moodle is the leading Learning Management System and Facebook the leading social network in the world, a survey was carried out with two independent classes of e-business students at the University of Saint Joseph, Macao, China, on their attitudes toward both online services in a learning framework. In general, the results confirm to a certain extent previous studies on the question of whether using Facebook as an educational tool is more effective than Moodle.

João Negreiros, Maria de Fatima Oliveira

New Algorithms for Smart Assessment of Math Exercises

This paper deals with the field of mathematics education, where the aim is to generate exercises with randomized data. The process of exercise generation involves, first, the identification of common errors that may be made when solving the exercise; then, the modeling of these errors by appropriate functions; and, finally, the recognition and distinction of errors through algorithms. Thus, although randomized, the exercise parameters must be chosen in such a way that it is possible to understand which error the student committed and thereby guarantee proper feedback on the answer. We call this process smart assessment. In this paper we present two algorithms for exercise generation admitting smart assessment.
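The core idea of smart assessment — choosing randomized parameters so that each modelled error produces a distinct answer — can be illustrated on a toy exercise. This is a hypothetical example (the exercise, error models and function names are invented for illustration, not one of the paper’s two algorithms):

```python
def diagnose(a, b, c, student_answer):
    # Correct solution and a catalogue of modelled common errors for
    # the toy exercise "compute (a + b) * c".
    models = {
        "correct": (a + b) * c,
        "precedence error (a + b*c)": a + b * c,
        "added instead of multiplied ((a+b)+c)": (a + b) + c,
    }
    for label, value in models.items():
        if student_answer == value:
            return label
    return "unrecognized error"

def parameters_admit_assessment(a, b, c):
    # Randomized parameters are only acceptable when every modelled
    # outcome is distinct, so each answer identifies a unique error.
    vals = [(a + b) * c, a + b * c, (a + b) + c]
    return len(set(vals)) == len(vals)
```

For example, (a, b, c) = (2, 3, 1) must be rejected, because the correct answer and the precedence error both evaluate to 5 and the feedback would be ambiguous.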

Isabel Araújo, Irene Brito, Gaspar J. Machado, Rui M. S. Pereira, José João Almeida, Georgi Smirnov

Distance Education Evaluation: An Analysis of the β Factor from LV Model Subjectivity

This paper aims to present a pedagogical metric, the β Factor of the Learning Vectors Model (LV Model) of procedural and formative evaluation, as a pedagogical mediation mechanism in a Virtual Learning Environment (VLE), highlighting its contribution to handling the subjectivities inherent in the act of evaluation. The research is characterized as a case study, as it encompasses situations experienced during an online learning course. It hopes to contribute to the improvement of tutoring by showing new routes and evaluative practices in online learning processes, and to showcase strategies and pedagogical resources conducive to teachers’ professional development.

Eliana A. Moreira Leite, Adriano Machado, Gilvandenys Leite Sales, Herik Zednik, Silvania Maria Maia

Massive Open Online Course in Teacher Training: Between Limitations and Possibilities

This paper presents a preliminary discussion of the possibilities and limitations of Massive Open Online Courses (MOOCs), a model of online course whose contents are open and freely offered over the internet to anyone, anywhere. The main characteristic of this model is to allow the active engagement of tens or hundreds of thousands of students who self-organize their own participation according to their goals, background, abilities and common interests. Advances in studies on Web Semantics and Adaptive Learning Systems are examples of research results in some of the various research areas of Computing in Education which are still incipient in MOOCs.

João Melo, Elda Melo

