
About this Book

The five-volume set LNCS 9155-9159 constitutes the refereed proceedings of the 15th International Conference on Computational Science and Its Applications, ICCSA 2015, held in Banff, AB, Canada, in June 2015. The 232 revised full papers presented in 22 workshops and a general track were carefully reviewed and selected from 780 initial submissions for inclusion in this volume. They cover various areas in computational science ranging from computational science technologies to specific areas of computational science such as computational geometry and security.

Table of Contents

Frontmatter

Workshop on Land Use Monitoring for Soil Consumption Reduction (LUMS 2015)

Frontmatter

Demographic Changes and Urban Sprawl in Two Middle-Sized Cities of Campania Region (Italy)

A Measurement in the Cities of Benevento and Avellino

This paper presents a measurement of the spatial expansion of built-up areas in the inland areas of Campania and, through it, an analysis of the more complex phenomenon of urban sprawl. The work is a pilot study designed to test a research methodology; at this stage, the area of investigation was therefore restricted to two medium-sized cities, Benevento and Avellino. The author investigates whether sprawl occurs in this particular context, in line with the European trend, and proposes a physical-anthropic correlation index between changes in built-up areas, taken as a measure of land take, and demographic changes, in order to analyze urban sprawl in relation to housing demand. For the examined urban areas, the author analyzes the correlation between the change in population density between 2001 and 2011, extracted from the population census at the census-fraction scale, and the change in the building coverage ratio extracted from the RTC (Regional Technical Cartography) of 1998 and 2005.

Massimiliano Bencardino
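The correlation analysis described in the abstract above can be sketched as a Pearson correlation between per-tract density change and coverage-ratio change. The numbers and variable names below are purely illustrative, not the paper's data:

```python
import statistics

# Hypothetical per-census-tract values: change in population density
# (inhabitants/km^2, 2001-2011) and change in building coverage ratio
# (%, 1998-2005).  Illustrative values only.
density_change = [-120.0, -35.5, 14.2, 88.0, 41.3, -10.9, 66.7, 120.4]
coverage_change = [0.2, 0.8, 1.1, 3.5, 2.0, 0.9, 2.6, 4.1]

def pearson_r(xs, ys):
    """Pearson correlation coefficient between two equal-length samples."""
    mx, my = statistics.fmean(xs), statistics.fmean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

# A value near +1 would indicate building growth tracking population
# growth; sprawl shows up as coverage growing while density falls.
r = pearson_r(density_change, coverage_change)
```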

Coastal Monitoring: A Methodological Proposal for New Generation Coastal Planning in Apulia

This paper aims to provide a methodological framework for the dynamic management of ecological processes affecting the coastal strip, providing tools and guidelines for drafting Municipal Coastal Plans in the Apulia Region, and more specifically in the Barletta-Andria-Trani Province. In detail, the objective is to develop value-added products and services for the analysis of soil consumption and of all the environmental and landscape parameters tailored to coastal landscapes. This will be done on the basis of the structure, management, control and monitoring of the coastal area, in order to guarantee the right to access and enjoy the public patrimony both within and outside the state-owned area. The proposed model includes the implementation of a semi-automatic multi-scale, multi-time and multi-source tool, built on a set of indicators, which will be useful for a new generation of coastal planning based on the key principles of sustainability, territorial vocation and local specificities.

Marco Lucafò, Giovanna Mangialardi, Nicola Martinelli, Silvana Milella

Ecosystem Services Assessment Using InVEST as a Tool to Support Decision Making Process: Critical Issues and Opportunities

Awareness of the Ecosystem Services concept has gained prominence in the decision-making process. Its inclusion depends strictly on how policy makers incorporate it into the development strategies of a region. The paper tests one of the most widely used models for quantifying ecosystem services with spatially distributed output, in order to identify the critical issues and the opportunities of using it as a decision-support tool. The InVEST model was tested for the Habitat Quality and Carbon Sequestration functions. The survey area is the Municipality of Lodi in the southern part of the Lombardy Region (northern Italy), chosen for the high accessibility of its database information and in order to assess the software's ability to produce reliable output at the micro-scale.

Andrea Arcidiacono, Silvia Ronchi, Stefano Salata

Climate Change and Transformability Scenario Evaluation for Venice (Italy) Port-City Through ANP Method

The paper explores the consequences of resilience loss in some social-ecological components of the Venice port-city in Italy and suggests integrated sustainable strategies for increasing their stress-response capability. The urban model of Venice depends on two influential factors: the natural balance between land and water, and the social-economic interdependence between mainland and islands. The authors adopt a broad framework of urban spaces and water environment, considering Venice as a complex regional unit, highly dynamic and sensitive. The adaptive balance historically reached between population and nature in Venice has been altered by unsustainable economic expansion and tourism policy. The paper adopts a multi-dimensional approach, integrating the cognitive and evaluative dimensions with the technical and economic ones, in order to define possible strategies of action through Multi-Criteria Analysis and the ANP method, able to play a strategic role in enhancing resilience at various scales.

Maria Cerreta, Daniele Cannatella, Giuliano Poli, Sabrina Sposito

The Evaluation of Landscape Services: A New Paradigm for Sustainable Development and City Planning

The paper provides a methodological framework to investigate, map and evaluate the landscape, understood as a multi-dimensional complex system. A possible landscape knowledge-related approach seeks to recognize the spatial distribution of landscape services on the territory. The management of multi-dimensional geospatial data, indeed, allows contemporary environmental matters to be faced, regarding continuous biodiversity loss, ecological fragmentation and the acceleration of climate change. GIS provides helpful tools for the systematization, management and analysis of spatial indicators, which describe and quantify the types of ecosystem and anthropic services. The paper proposes the development of a Spatial Decision-Making Support System (SDSS), in sync with a GIS data-set and the multi-criteria AHP method. The SDSS is useful for developing a landscape-services thickness map and for defining possible territorial transformation scenarios. The recognition of landscape services can support decision-making processes concerning sustainable management and planning, which are still an open field of investigation.

Roberta Mele, Giuliano Poli

Workshop on Mobile Communications (MC 2015)

Frontmatter

A Quorum-Based Adaptive Hybrid Location Service for VCNs Urban Environments

Location information services, or location management systems, in Vehicular Communication Networks (VCNs) provide location information about vehicles, such as current location, speed and direction, and report it to other vehicles or network entities that require it. This paper first designs an Extended Dynamic Quorum System (EDQS), a logical structure for location management in VCNs. Second, we propose a Quorum-based Adaptive Hybrid Location Service (QAHLS) on the basis of the EDQS, which mixes a direct location scheme with an indirect one according to each vehicle's location preference and location mobility. The performance of QAHLS is evaluated by an analytical model and compared with that of existing GQS (Grid Quorum System) and DQS (Diamond Quorum System) based location services.

Ihn-Han Bae, Jeong-Ah Kim

Advanced Persistent Threat Mitigation Using Multi Level Security – Access Control Framework

The Bring Your Own Device (BYOD) concept has become popular among organizations. However, due to device portability and the information available through social networks, BYOD has become susceptible to information-stealing attacks such as the Advanced Persistent Threat (APT). APT attacks use stealthy methods to gain access to the target's machine and are mostly motivated by, and pose a threat to, political, corporate, academic and even military interests. Various mitigation techniques have been proposed to tackle this attack, but most rely on information about known attacks and do not provide data protection; hence, protecting against APT attacks remains challenging. In this paper, we investigate the available mitigation techniques and their shortcomings by examining the root cause of the attack in a BYOD environment. Finally, based on the information obtained, we propose a new framework for reducing APT attacks.

Zakiah Zulkefli, Manmeet Mahinderjit Singh, Nurul Hashimah Ahamed Hassain Malim

Design of Disaster Collection and Analysis System Using Crowd Sensing and Beacon Based on Hadoop Framework

Currently, disaster data is collected through site-based, regionally limited collection. In this study, a system that collects the location information of users carrying mobile devices is proposed. The proposed system collects real-time disaster data by using crowd sensing, a user-involved sensing technology. In order to process a large amount of unstructured data quickly and accurately, the Hadoop framework was applied among the available big-data frameworks, as it efficiently sorts large amounts of data. Also, to enable fast local evacuation alerts for users, a beacon-based ad-hoc routing interface was designed. As an integrated interface for the proposed system, a hybrid app based on HTML5, which uses JSON syntax, was developed.

Eun-Su Mo, Jae-Pil Lee, Jae-Gwang Lee, Jun-Hyeon Lee, Young-Hyuk Kim, Jae-Kwang Lee

Network Traffic Prediction Model Based on Training Data

Real-time audio and video services have gained much popularity in the last decade and now occupy a large portion of total Internet traffic. As real-time services become mainstream, the demand for Quality of Service (QoS) is greater than ever. To satisfy this increasing demand, it is necessary to use network resources to the fullest, and in this regard available-bandwidth-based routing is a promising solution. Unfortunately, the instantaneous available bandwidth of a network is not enough, as it may change the next moment in highly dynamic networks. To solve this issue, we present a prediction model for network traffic, on the basis of which the available network bandwidth can be estimated. The paper adapts efforts made in road traffic prediction to formulate this model.

Jinwoo Park, Syed M Raza, Pankaj Thorat, Dongsoo S. Kim, Hyunseung Choo

Clustering Wireless Sensor Networks Based on Bird Flocking Behavior

One of the most important issues in Wireless Sensor Networks (WSNs) is the efficient use of limited energy resources, and a popular approach to efficient energy consumption is clustering. In this paper, we propose an energy-efficient clustering algorithm called Bird Flocking Behavior Clustering (BFBC). By adopting bird flocking behavior, our clustering algorithm forms clusters through simple local interactions. Improving on an existing bio-inspired clustering algorithm that forms a cluster using several messages, BFBC forms a cluster with only one message. Simulation results show that BFBC significantly decreases the number of messages needed for cluster-head election and also reduces the energy consumed by communication between cluster members and their dedicated cluster head.

Soon-Gyo Jung, Sanggil Yeom, Min Han Shon, Dongsoo Stephen Kim, Hyunseung Choo

Design and Implementation of an Easy-Setup Framework for Personalized Cloud Device

With the popularization of public cloud services, requirements for personal cloud services with enhanced security and independent personalized storage are also increasing. To provide personal cloud services, a variety of home cloud devices have been introduced. Based on an analysis of these devices and previous research, we identified critical restrictions in the initial configuration, such as complex procedures, limitations of the installation environment, and the large amount of time users waste on setup. We therefore propose a novel wireless-network-based Easy-Setup framework using smartphones and Android-based personal cloud devices (HomeSync) to overcome these restrictions. In addition, we constructed an experimental test-bed to validate the effectiveness of the proposed framework. Finally, we present the results of comparative experiments against previous approaches in terms of setup time and the number of initial setup steps.

Bonhyun Koo, Taewon Ahn, Simon Kong, Hyejung Cho

Security Improvement of Portable Key Management Using a Mobile Phone

Users often store sensitive information on their laptops, but it can be easily exposed to others if a laptop is lost or stolen. File encryption is a common solution to prevent the leakage of data from lost or stolen devices. For the management of strategies like this, key management is very important to protect the decryption key from attacks. Huang et al. proposed a portable key management scheme, whereby a laptop shares secret values with a mobile phone. Their scheme is convenient as well as practical because it is not reliant on a special device or password input. However, we found that it is still vulnerable to an attack if a laptop is stolen. In this paper, we analyse the security of Huang et al.’s scheme and propose a solution to the outstanding vulnerability. Our proposed scheme exploits two types of keys including a one-time symmetric key to protect the file decryption key. Additionally, the security improvement does not compromise the convenience of the portable key management scheme.

Jiye Kim, Donghoon Lee, Younsung Choi, Youngsook Lee, Dongho Won
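The idea of protecting a file decryption key with a one-time symmetric key can be illustrated with a minimal sketch. This is not the scheme of Huang et al. or of this paper; the XOR one-time-pad wrapping and all function names are hypothetical, chosen only to show the key-wrapping pattern:

```python
import os
import hmac
import hashlib

def wrap_key(file_key: bytes):
    """Hide the file key under a fresh one-time key (XOR one-time pad)
    and attach an integrity tag so tampering is detectable."""
    one_time_key = os.urandom(len(file_key))      # used once, then discarded
    wrapped = bytes(a ^ b for a, b in zip(file_key, one_time_key))
    tag = hmac.new(one_time_key, wrapped, hashlib.sha256).digest()
    return wrapped, one_time_key, tag

def unwrap_key(wrapped: bytes, one_time_key: bytes, tag: bytes) -> bytes:
    """Verify the tag, then recover the file key by XORing again."""
    expected = hmac.new(one_time_key, wrapped, hashlib.sha256).digest()
    if not hmac.compare_digest(tag, expected):
        raise ValueError("wrapped key was tampered with")
    return bytes(a ^ b for a, b in zip(wrapped, one_time_key))

file_key = os.urandom(32)                         # e.g. an AES-256 file key
wrapped, otk, tag = wrap_key(file_key)
```

In a two-device setting such as the one the paper studies, the wrapped key and the one-time key would be held on different devices (laptop and phone), so a stolen laptop alone reveals nothing.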

Workshop on Quantum Mechanics: Computational Strategies and Applications (QMCSA 2015)

Frontmatter

States of a Non-relativistic Quantum Particle in a Spherical Hollow Box

In this article we derive the wave functions of a non-relativistic quantum particle confined in a spherical hollow box. Utilizing these states and deploying a Computer Algebra System (CAS) such as Mathematica [1], we display the three-dimensional radial wave functions. We compute the energy levels and the position expectation values.

Haiduke Sarafian
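The quantization the abstract refers to can be sketched numerically under a simplifying assumption: treating the box as an infinite spherical well of radius R, the radial solutions are spherical Bessel functions j_l(kr), the wall condition j_l(kR) = 0 quantizes k, and the energies are E_{n,l} = ħ²x_{n,l}²/(2mR²) with x_{n,l} the n-th positive zero of j_l. The root-finding approach below is a sketch, not the paper's CAS derivation:

```python
import math

def j0(x):
    """Spherical Bessel function of order 0: sin(x)/x."""
    return math.sin(x) / x

def j1(x):
    """Spherical Bessel function of order 1: sin(x)/x^2 - cos(x)/x."""
    return math.sin(x) / x**2 - math.cos(x) / x

def zeros(f, n, step=0.01, x=0.01):
    """First n positive zeros of f, found by scanning for sign changes
    and refining each bracket with bisection."""
    found = []
    while len(found) < n:
        a, b = x, x + step
        if f(a) * f(b) < 0:
            for _ in range(60):
                m = 0.5 * (a + b)
                if f(a) * f(m) <= 0:
                    b = m
                else:
                    a = m
            found.append(0.5 * (a + b))
        x += step
    return found

x0 = zeros(j0, 3)   # zeros of j_0 are exactly n*pi (s states)
x1 = zeros(j1, 1)   # first zero of j_1 is about 4.4934 (first p state)
```

The resulting x values plug directly into E_{n,l} = ħ²x_{n,l}²/(2mR²) for any chosen radius and mass.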

Workshop on Remote Sensing Data Analysis, Modeling, Interpretation and Applications: From a Global View to a Local Analysis (RS 2015)

Frontmatter

Adaption of a Self-Learning Algorithm for Dynamic Classification of Water Bodies to SPOT VEGETATION Data

Within the ESA CCI "Fire Disturbance" project, a dynamic self-learning water-masking approach originally developed for AATSR data was adapted first for MERIS-FR(S) and MERIS-RR data and now for SPOT VEGETATION (VGT) data. The primary goal of the development was to apply the same generic principles to all sensors by combining static water masks on a global scale with a self-learning algorithm. Our approach generates a dynamic water mask that helps distinguish dark burned-area objects from other types of dark areas (e.g. cloud or topographic shadows, coniferous forests). Static land-water masks have the disadvantage of representing only a temporal snapshot of the water bodies. Regional results demonstrate the quality of the dynamic water mask. In addition, the advantages over conventional water-masking algorithms are shown. Furthermore, the dynamic water masks of AATSR, MERIS and VGT for the same region are presented and discussed, together with the use of more detailed static water masks.

Bernd Fichtelmann, Kurt P. Guenther, Erik Borg

Assessment of MODIS-Based NDVI-Derived Index for Fire Susceptibility Estimation in Northern China

Some satellite-based indices are useful for fire susceptibility estimation, but the obtained results are to some extent region-dependent. The aim of this study is to assess the effectiveness of two NDVI-derived indices, the relative greenness index (RGI) and the vegetation danger index (VDI), applied to northern China. For this purpose, the Moderate Resolution Imaging Spectroradiometer (MODIS) 16-day composite product MYD13Q1 was used. The results indicated that RGI values were higher than 70% during the pre-fire period from spring to autumn, decreased sharply when fires occurred, and remained below normal levels for a period afterwards. VDI values were negative when fires occurred and for a short period afterwards, whereas pre-fire and post-fire values were positive. It can thus be concluded that the two MODIS-based NDVI-derived indices show good potential for fire susceptibility estimation, although they need to be combined with other fire-related parameters.

Xiaolian Li, Antonio Lanorte, Luciano Telesca, Weiguo Song, Rosa Lasaponara
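The relative greenness index discussed above rescales the current NDVI against the pixel's historical minimum and maximum. A minimal sketch, with purely illustrative values (the VDI would additionally combine NDVI with other fire-related parameters):

```python
def relative_greenness(ndvi_now, ndvi_min, ndvi_max):
    """RGI = (NDVI - NDVI_min) / (NDVI_max - NDVI_min) * 100,
    i.e. current greenness as a percentage of the historical range."""
    if ndvi_max == ndvi_min:
        raise ValueError("flat NDVI history, RGI undefined")
    return (ndvi_now - ndvi_min) / (ndvi_max - ndvi_min) * 100.0

# A pixel whose NDVI historically ranges 0.2-0.8 (hypothetical values):
before_fire = relative_greenness(0.65, 0.2, 0.8)  # lush vegetation, high RGI
after_fire = relative_greenness(0.25, 0.2, 0.8)   # sharp drop after burning
```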

On the Use of the Principal Component Analysis (PCA) for Evaluating Vegetation Anomalies from LANDSAT-TM NDVI Temporal Series in the Basilicata Region (Italy)

In this paper, we present and discuss the investigations we conducted in the context of the MITRA project, focused on the use of low-cost technologies (data and software) for pre-operational monitoring of land degradation in the Basilicata Region. The characterization of land surface conditions and their variations can be efficiently approached using satellite remotely sensed data, mainly because such data provide wide spatial coverage and internally consistent data sets. In particular, the Normalized Difference Vegetation Index (NDVI) is regarded as a reliable indicator of land cover conditions and variations, and over the years it has been widely used for vegetation monitoring. For the aim of our project, in order to detect and map vegetation anomalies occurring in the study test areas (selected in the Basilicata Region), we applied Principal Component Analysis to a Landsat Thematic Mapper (TM) time series spanning a period of 25 years (1985-2011).

Antonio Lanorte, Teresa Manzi, Gabriele Nolè, Rosa Lasaponara
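The PCA-based anomaly detection described above can be sketched on a tiny synthetic NDVI stack (one row per pixel, one column per date); the data, threshold and dimensions below are illustrative, not the Landsat TM analysis itself:

```python
import numpy as np

# Synthetic NDVI time series: 200 pixels x 12 dates; a group of 20
# pixels degrades halfway through the series (simulated anomaly).
rng = np.random.default_rng(0)
n_pixels, n_dates = 200, 12
ndvi = 0.5 + 0.1 * rng.standard_normal((n_pixels, n_dates))
ndvi[:20, 6:] -= 0.3

# Standardize each date, then eigen-decompose the date-by-date
# covariance matrix to obtain principal-component scores per pixel.
z = (ndvi - ndvi.mean(axis=0)) / ndvi.std(axis=0)
cov = np.cov(z, rowvar=False)
eigval, eigvec = np.linalg.eigh(cov)       # eigenvalues in ascending order
order = np.argsort(eigval)[::-1]           # reorder to descending variance
scores = z @ eigvec[:, order]

# Pixels with extreme scores on the leading component flag the
# coherent change signal against the stable background.
anomalous = np.abs(scores[:, 0]) > 2 * scores[:, 0].std()
```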

A New Approach of Geo-Rectification for Time Series Satellite Data Based on a Graph-Network

Earth observation is indisputably one of the most important sources of current synoptic geo-data in diverse environmental applications. Nevertheless, thematic information obtained by means of Earth observation is only as good as the quality of the pre-processed data. Important pre-processing steps include data usability assessment, geo-referencing and atmospheric correction. While data usability assessment and atmospheric correction expect radiometrically corrected data for thematic interpretation, the geo-location accuracy of interpreted data is ensured by geo-referencing. This applies especially to multi-temporal analyses of environmental processes, where precise spatial allocation of data is a prerequisite for a correct interpretation of process dynamics. This paper deals with a new geo-referencing algorithm: corresponding graph-networks in the reference data and in the remote sensing data, based on virtual object points (e.g. centroids), are used for geo-rectification.

Bernd Fichtelmann, Erik Borg

Semi-supervised Local Aggregation Methodology

In this paper we propose a novel approach to automatic mine detection in SONAR data. The proposed framework relies on a possibilistic fusion method to classify SONAR instances as mines or mine-like objects. The proposed semi-supervised algorithm minimizes an objective function that combines context identification, multi-algorithm fusion criteria and a semi-supervised learning term. The optimization learns contexts as compact clusters in subspaces of the high-dimensional feature space via possibilistic semi-supervised learning and feature discrimination. The semi-supervised clustering component assigns a degree of typicality to each data sample in order to identify and reduce the influence of noise points and outliers. The approach then yields optimal fusion parameters for each context. Experiments on synthetic datasets and a standard SONAR dataset show that our semi-supervised local fusion outperforms individual classifiers and unsupervised local fusion.

Marzieh Azimifar, Ali Heidarzadegan, Yasser Nemati, Sajad Manteghi, Hamid Parvin

Workshop on Scientific Computing Infrastructure (SCI 2015)

Frontmatter

Development of the Configuration Management System of the Computer Center and Evaluation of Its Impact on the Provision of IT Services

This paper discusses the development of a Configuration Management system and Configuration Management process in a computer center that provides services based on virtualization technologies. Given the strong integration of IT components in an infrastructure with a virtualized part, controlling such an infrastructure becomes difficult; IT service support in turn becomes more complicated, which entails higher costs and lower quality of the services provided. To address the configuration management problem, a system that combines multiple information systems around the service desk is proposed. Considerable attention is paid to resolving requirements specific to the data center: collective use of supercomputer resources and wide use of virtualization. In addition, the article analyzes the impact of the configuration management system on the computer center's business processes related to the provision and support of its services. The main focus, however, is on the development prospects of the system itself, as well as its role as part of the ITSM complex.

Nikolai Iuzhanin, Tatiana Ezhakova, Valery Zolotarev, Vladimir Gaiduchok

Novel Approaches for Distributing Workload on Commodity Computer Systems

Efficient management of a distributed system is a common problem for university and commercial computer centres, and handling node failures is a major aspect of it. Failures that are rare in a small commodity cluster become common at large scale, and there should be a way to overcome them without restarting all parallel processes of an application. The efficiency of existing methods can be improved by forming a hierarchy of distributed processes: only the lower levels of the hierarchy need to be restarted when a leaf node fails, and only the root node needs special treatment. The process hierarchy changes in real time, and the workload is dynamically rebalanced across online nodes. This approach makes it possible to implement efficient partial restart of a parallel application, and transactional behaviour for computer centre service tasks.

Ivan Gankevich, Yuri Tipikin, Alexander Degtyarev, Vladimir Korkhov

Managing Dynamical Distributed Applications with GridMD Library

The open source C++ class library GridMD for distributed computing is reviewed including its architecture, functionality and use cases. The library is intended to facilitate development of distributed applications that can be run at contemporary supercomputing clusters and standalone servers managed by Grid or cluster task scheduling middleware. The GridMD library used to be targeted at molecular dynamics and Monte-Carlo simulations but at present it can serve as a universal tool for developing distributed computing applications as well as for creating task management codes. In both cases the distributed application is represented by a single client-side executable built from a compact C++ code. In the first place the library is targeted at developing complex applications that contain many computation stages with possible data dependencies between them which can be run efficiently in the distributed environment.

Ilya A. Valuev, Igor V. Morozov

A Parallel Algorithm for Efficient Solution of Stochastic Collection Equation

A parallel algorithm is presented for the efficient numerical solution of the stochastic collection equation. It is based on the Bott flux method, chosen as one of the most popular algorithms for calculating the evolution of cloud-particle spectra. The optimized algorithm makes it possible to use multiple CPU cores for acceleration without significant accuracy loss and remains free from mass defect; the modest accuracy reduction arises because steps of the strictly sequential algorithm are executed in parallel. Tests showed a 3.5× speedup on a PC with four CPU cores. The results of the numerical tests show that the parallel algorithm is very promising for use in numerical models of convective clouds for calculating the spectra of cloud particles, such as water drops and various types of ice crystals. The stochastic collection equation is the most computationally expensive part of such models, which are aimed at forecasting thunderstorms, heavy rain and hail. The elaborated parallel algorithm can thus become an efficient instrument in hardware and software systems designed for the operational forecast of dangerous weather phenomena.

Elena N. Stankova, Ilya A. Karpov

Profiling Scheduler for Efficient Resource Utilization

Optimal resource utilization is one of the most important and most challenging tasks for computational centers. A typical contemporary center includes several clusters used by many clients, so administrators must set resource-sharing policies that meet the differing requirements of different groups of users: users want their tasks computed quickly, while organizations want their resources utilized efficiently. Traditional schedulers do not allow administrators to solve these problems efficiently. Dynamic resource reallocation can improve the efficiency of system utilization, while profiling running applications generates statistical data that can be used to optimize future application runs. These are the basic advantages of the new scheduler discussed in this paper.

Alexander Bogdanov, Vladimir Gaiduchok, Nabil Ahmed, Amissi Cubahiro, Ivan Gankevich

Storage Database System in the Cloud Data Processing on the Base of Consolidation Technology

In this article we study the types of architectures for cloud processing and storage of data, data consolidation, and enterprise storage. Special attention is given to the use of large data sets in the computational process. It is shown that, based on methods of theoretical analysis and experimental study of computer-system architectures (including heterogeneous ones), special data-processing techniques, models of the architectures relevant to large volumes of information, and methods of software optimization for heterogeneous systems, it is possible to ensure the integration of computer systems to support computations with very large data sets.

Alexander V. Bogdanov, Thurein Kyaw Lwin, Elena Stankova

Integrated Information System for Verification of the Models of Convective Clouds

The paper describes an information system that integrates the heterogeneous meteorological information necessary for verifying numerical models of convective clouds. Data integration is realized on the basis of consolidation technology. The first section of the article describes the implementation of the method of consolidating meteorological data from heterogeneous sources (using PHP). The design and realization of the relational database (using MySQL) are described in the second section. The third section concerns the development of Web-based applications for the verification of a 1.5-D convective cloud model [1] using HTML 5, CSS 3 and JavaScript.

Dmitry A. Petrov, Elena N. Stankova

Hybrid Approach to the Perturbed KdVB Equation

The solution of nonintegrable nonlinear equations is very difficult even numerically, and practically impossible by standard analytical techniques. The new possibilities offered by heterogeneous computational systems also require novel approaches for the numerical realization of the pertinent algorithms. We give some examples of such analysis based on the study of nonlinear wave evolution in multiphase media with chemical reactions.

Alexander V. Bogdanov, Vladimir V. Mareev, Elena N. Stankova

Flexible Configuration of Application-Centric Virtualized Computing Infrastructure

Virtualization technologies enable flexible ways to configure computing environment according to the needs of particular applications. Combined with software defined networking technologies (SDN), operating system-level virtualization of computing resources can be used to model and tune the computing infrastructure to optimize application performance and optimally distribute virtualized physical resources between concurrent applications. We investigate capabilities provided by several modern tools (Docker, Mesos, Mininet) to model and build virtualized computational infrastructure, investigate configuration management in the integrated environment and evaluate performance of the infrastructure tuned to a particular test application.

Vladimir Korkhov, Sergey Kobyshev, Artem Krosheninnikov

Distributed Collaboration Based on Mobile Infrastructure

There are several types of infrastructures that allow people to interact in a distributed way. Together with the development of information technologies, the use of such infrastructures is gaining momentum in many organizations and is opening a new trend in scientific research. At the same time, the rapid growth of mobile technologies allows individuals to apply solutions without the need for traditional office spaces and regardless of location. Hence, the realization of such infrastructures on mobile platforms could be useful not only for daily purposes but also for scientific work. Tools based on mobile infrastructures range from basic internet messengers to complex software for online collaboration in large-scale workgroups. Despite the growth of mobile infrastructures, applied distributed solutions in healthcare and group decision-making are not yet widespread. To increase mobility and improve on current solutions, we propose innovative tools for real-time collaboration on smart devices, which are described in this article.

Serob Balyan, Suren Abrahamyan, Harutyun Ter-Minasyan, Alfred Waizenauer, Vladimir Korkhov

Workshop on Software Engineering Processes and Applications (SEPA 2015)

Frontmatter

A Systematic Approach to Develop Mobile Applications from Existing Web Information Systems

Mobile computing is changing the way society accesses information. People increasingly use mobile devices such as smartphones and tablets to make transactions and consume data. Driven by this kind of use, many companies and institutions are developing mobile versions of their information systems. But what should they consider when doing so? It is important to note that a mobile version is not a redesign or a copy of all functionalities of the existing information system. Considering this scenario of mobile application development from existing web information systems, we propose a process named Metamorphosis. This process provides a set of activities subdivided into four phases (requirements, design, development and deployment) to assist in creating mobile applications from existing web information systems. This article presents a case study of the Metamorphosis process in the development of SIGEventos Mobile, a mobile version of the SIGEventos web information system, addressing the feasibility of the process.

Itamir de Morais Barroca Filho, Gibeon Soares de Aquino

An Ontological Support for Interactions with Experience in Designing the Software Intensive Systems

Nowadays, experience bases are widely used by project companies in designing software intensive systems (SIS). The efficiency of such informational sources is determined by the “nature” of the modeled experience units and the approaches applied to their systematization. An orientation toward a precedent model as the basic type of experience unit, and an ontological approach to their systematization, define the specificity of the study described in this paper. Models of precedents are constructed in accordance with a normative schema while the occupational work is carried out by a team of designers. In creating the necessary ontology, the team should use a reflection of solved tasks on a specialized memory intended for simulating applied reasoning of the question-answer type. The implemented approach helps increase the efficiency of designing SIS.

Petr Sosnin

Do We Need New Management Perspectives for Software Research Projects?

Creativity and originality are such important success factors that many researchers are not interested in formal or official processes. But uncertainty and risks should be managed for a successful research outcome. The main goal of our research is to initiate practical and effective software engineering techniques and tools for tracking, monitoring, and managing the software research life-cycle in R&D projects. Researchers are knowledge workers, and the research process is very similar to adaptive case management. Also, many product data management techniques are applicable to managing research artifacts. With a research descriptor configuration item and a flexible process management environment, software R&D researchers can define their own creative research processes and documents and keep track of progress at any time.

Jeong Ah Kim, Suntae Kim, Jae-Young Choi, JongWon Ko, YoungWha Cho

A Binary Fruit Fly Optimization Algorithm to Solve the Set Covering Problem

The Set Covering Problem (SCP) is a well known $$\mathcal {N} \mathcal {P}$$-hard problem with many practical applications. In this work binary fruit fly optimization algorithms (bFFOA) were used to solve this problem using different binarization methods.

The bFFOA is based on the food finding behavior of fruit flies using osphresis and vision. The experimental results show the effectiveness of our algorithms, producing competitive results when solving the SCP benchmarks from the OR-Library.

Broderick Crawford, Ricardo Soto, Claudio Torres-Rojas, Cristian Peña, Marco Riquelme-Leiva, Sanjay Misra, Franklin Johnson, Fernando Paredes
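The binarization step that the abstract mentions is commonly implemented as a transfer function applied to a continuous position vector, followed by a greedy repair to restore feasibility. A minimal sketch on a toy SCP instance (the instance, costs, and sigmoid transfer function are illustrative assumptions, not the paper's benchmarks or exact method):

```python
import math
import random

# Toy SCP instance (illustrative): column -> set of rows it covers
cover = {0: {0, 1}, 1: {1, 2}, 2: {0, 2}, 3: {3}}
costs = [2, 3, 1, 2]
rows = {0, 1, 2, 3}

def binarize(position, rng):
    """Map a continuous position vector to 0/1 via the sigmoid
    (S-shaped) transfer function, one common binarization method."""
    return [1 if rng.random() < 1.0 / (1.0 + math.exp(-x)) else 0
            for x in position]

def repair(solution):
    """Greedy repair: while rows remain uncovered, add the cheapest
    column that still covers something new."""
    covered = set().union(*(cover[j] for j, bit in enumerate(solution) if bit))
    while covered != rows:
        j = min((k for k in cover if cover[k] - covered),
                key=lambda k: costs[k])
        solution[j] = 1
        covered |= cover[j]
    return solution

sol = repair(binarize([-1.0, 0.5, 2.0, -0.3], random.Random(0)))
assert all(any(sol[j] and r in cover[j] for j in cover) for r in rows)  # feasible
```

The transfer function preserves the fly's continuous search dynamics while the repair guarantees every candidate evaluated is a valid cover.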

A Teaching-Learning-Based Optimization Algorithm for Solving Set Covering Problems

The Set Covering Problem (SCP) is a representative combinatorial optimization problem which has been applied to several real-world problems. In this work we used a binary version of the Teaching-Learning-Based Optimization (TLBO) algorithm to solve the SCP. It works in two phases, known as teacher and learner, emulating the behavior in a classroom. The proposed algorithm has been tested on 65 benchmark instances. The results show that it is able to produce competitive solutions.

Broderick Crawford, Ricardo Soto, Felipe Aballay, Sanjay Misra, Franklin Johnson, Fernando Paredes

A Comparison of Three Recent Nature-Inspired Metaheuristics for the Set Covering Problem

The Set Covering Problem (SCP) is a classic problem in combinatorial optimization. SCP has many applications in engineering, including problems involving routing, scheduling, stock cutting, electoral redistricting, and other important real-life situations. Because of its importance, SCP has attracted the attention of many researchers. However, SCP instances are complex and generally NP-hard. Due to the combinatorial nature of this problem, several metaheuristics have been applied over the last decades to obtain efficient solutions. This paper presents a comparison of metaheuristics for the SCP. Three recent nature-inspired metaheuristics are considered: the Shuffled Frog Leaping, Firefly, and Fruit Fly algorithms. The results show that they can obtain optimal or close to optimal solutions at low computational cost.

Broderick Crawford, Ricardo Soto, Cristian Peña, Marco Riquelme-Leiva, Claudio Torres-Rojas, Sanjay Misra, Franklin Johnson, Fernando Paredes

Bug Assignee Prediction Using Association Rule Mining

In open source software development we have a bug repository to which both developers and users can report bugs. Bug triage, deciding what to do with an incoming bug report, takes a large amount of developer resources and time. All newly arriving bug reports must be triaged to determine whether the report is valid and requires attention and, if it does, which suitably experienced developer/fixer should be assigned responsibility for resolving it. In this paper, we propose to apply association rule mining to assist in bug triage, using the Apriori algorithm to predict the developer who should work on a bug based on its severity, priority, and summary terms. We demonstrate our approach on a collection of 1,695 bug reports from the Thunderbird, AddOnSDK, and Bugzilla products of the Mozilla open source project. We have analyzed the association rules for the top five assignees of the three products. Association rules can support managers in improving their development process and saving time and resources.

Meera Sharma, Madhu Kumari, V. B. Singh
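The rule shape described above - bug attributes in the antecedent, an assignee in the consequent, filtered by support and confidence - can be sketched in a few lines. This is a simplified stand-in for a full Apriori run, and the bug data is invented for illustration:

```python
from collections import Counter

# Hypothetical bug reports: (severity, priority, assignee) — illustrative data only
reports = [
    ("critical", "P1", "alice"),
    ("critical", "P1", "alice"),
    ("critical", "P2", "bob"),
    ("minor",    "P3", "bob"),
    ("minor",    "P3", "bob"),
]

def assignee_rules(reports, min_support=2, min_conf=0.5):
    """Extract rules (severity, priority) -> assignee that meet simple
    support and confidence thresholds, in the spirit of Apriori."""
    antecedents = Counter((s, p) for s, p, _ in reports)
    pairs = Counter(((s, p), a) for s, p, a in reports)
    rules = {}
    for (ant, assignee), count in pairs.items():
        conf = count / antecedents[ant]
        if count >= min_support and conf >= min_conf:
            rules[ant] = (assignee, conf)
    return rules

rules = assignee_rules(reports)
print(rules[("critical", "P1")])  # ('alice', 1.0)
```

A new ("critical", "P1") report would then be routed to the assignee with the highest-confidence matching rule.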

Zernike Moment-Based Approach for Detecting Duplicated Image Regions by a Modified Method to Reduce Geometrical and Numerical Errors

In this paper, the approach focuses on a Zernike Moment-based model of the ROI (Region of Interest) of an image and its parameters for efficient image processing in forensics. By considering the factors affecting the identification of a duplicated image, the change of the ROI’s size is determined through the proposed algorithm. The proposed technique shows a good improvement, significantly reducing Geometrical Errors (G.E.) and Numerical Errors (N.E.) and performing better than the traditional Zernike-based technique. The duplicate detection program was written in C++ using the OpenCV and Boost libraries, which help to verify image authenticity.

Thuong Le-Tien, Tan Huynh-Ngoc, Tu Huynh-Kha, Luong Marie

Requirements Engineering in an Emerging Market

The growing importance of requirements engineering (RE) in software development cannot be overemphasized. A faulty requirements gathering exercise and the resulting requirements document could mislead the entire software development effort, yielding a software product that falls short of user expectations in terms of meeting needs and delivering within budget, time, and scope. Achieving a well articulated and coordinated requirements document is demanding even in an ideal economic environment, let alone in an emerging market characterized by macro-economic variables such as high cost of doing business, weak institutions, poor infrastructure, and a lack of skilled and competitive workforce, coupled with micro-economic (personal) tendencies like resistance to change, vested interests, technophobia, and insider abuse. This paper reports on the industrial experience of designing and implementing an n-tier enterprise application in an African university using a service oriented software engineering (SOSE) approach. The application is meant to facilitate the actualization of the institution’s 25-year strategic plan. We applied design and software engineering skills: the literature was examined, requirements were gathered, the n-tier enterprise solution was modeled using the Unified Modeling Language (UML), the implementation was achieved using Microsoft SharePoint, and the results were evaluated. Though success was recorded, the challenges encountered during the requirements engineering stage were quite reflective of the challenges of software project management in a relatively unstable macroeconomic environment. The outcome of this study is a compendium of lessons learnt and recommendations for successful RE in the context of an emerging economy such as those in Africa, in the hope that this will guide would-be software stakeholders in such a business landscape.

Emmanuel Okewu

Efficient Utilization of Various Network Coding Techniques in Different Wireless Scenarios

The nodes of a communication network use network coding to generate packets for output links by systematically processing the packets received on input links, such that the original packets can be recovered by the destination nodes. Under low traffic conditions, higher bandwidth and power efficiency can be achieved using network coding, but performance degrades as traffic in the network increases. The overall performance of the network can be improved if a node is able to decide whether or not to use network coding under the current condition of the network. This work presents a scheme for finding a threshold value below which network coding should be used and above which normal forwarding should be adopted by the nodes. The performance of the proposed algorithm on the Cross topology under different network coding schemes has been tested in a simulation environment, namely NS-3. Significant improvement has been observed compared to purely network coded systems as well as traditional store-and-forward systems.

Purnendu Shekhar Pandey, Neetesh Purohit, Sanjay Mishra, Broderick Crawford, Ricardo Soto
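The bandwidth saving behind this scheme comes from a relay XOR-ing two native packets into a single transmission; each destination, already holding one of the native packets, XORs again to recover the other. A minimal sketch follows - the threshold rule is a simplified stand-in for the paper's decision scheme, not its actual algorithm:

```python
def xor_encode(p1: bytes, p2: bytes) -> bytes:
    """Relay combines two equal-length packets into one coded packet."""
    return bytes(a ^ b for a, b in zip(p1, p2))

def choose_mode(offered_load: float, threshold: float) -> str:
    """Simplified decision rule: code below the threshold, fall back to
    plain store-and-forward above it."""
    return "network_coding" if offered_load < threshold else "store_and_forward"

coded = xor_encode(b"hello", b"world")
# A destination holding b"world" recovers b"hello", and vice versa:
assert xor_encode(coded, b"world") == b"hello"
assert xor_encode(coded, b"hello") == b"world"
assert choose_mode(0.3, 0.5) == "network_coding"
assert choose_mode(0.7, 0.5) == "store_and_forward"
```

One coded transmission thus replaces two forwarded ones, which is exactly the saving that erodes as traffic (and the chance of missing side information) grows.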

T-Tuple Reallocation: An Algorithm to Create Mixed-Level Covering Arrays to Support Software Test Case Generation

A fact known both by researchers and by industry professionals is that exhaustive software testing is impractical. Therefore, one of the most studied activities of the software testing process is the generation/selection of test cases. However, selecting test cases that reveal the greatest number of defects in a software product is a challenging task, due to the significantly high number of inputs that a system can receive, and due to the different characteristics of software products in various application domains. This work presents a new algorithm, called T-Tuple Reallocation (TTR), to generate Mixed-Level Covering Arrays (MCA), one of the combinatorial design techniques aimed at test case generation. After studying various algorithms/techniques to generate combinatorial designs, starting with pairwise design, TTR was proposed with the aim of decreasing the number of test cases produced to test a software product. The new algorithm was able to create smaller sets of test cases than classical algorithms/tools proposed in the literature. Although TTR, in general, demanded longer time to generate the sets of test cases, this rise in time can be compensated by the smaller number of test cases, so that less time is required for executing them. In the end, this may imply less time for accomplishing the testing process as a whole.

Juliana Marino Balera, Valdivino Alexandre de Santiago Júnior
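The coverage criterion underlying such arrays is easy to state: for strength t=2 (pairwise), every pair of values of every pair of factors must appear in at least one test. A small checker sketch, with invented mixed-level factors for illustration (this verifies coverage; it is not the TTR generation algorithm itself):

```python
from itertools import combinations, product

def pairwise_covered(tests, domains):
    """Check the t=2 (pairwise) criterion: every value pair of every
    factor pair must appear together in at least one test."""
    for (i, di), (j, dj) in combinations(enumerate(domains), 2):
        needed = set(product(di, dj))
        seen = {(t[i], t[j]) for t in tests}
        if not needed <= seen:
            return False
    return True

# Hypothetical mixed-level factors: OS (2 values) x browser (2) x locale (1)
domains = [["linux", "mac"], ["ff", "chrome"], ["en"]]
tests = [("linux", "ff", "en"), ("linux", "chrome", "en"),
         ("mac", "ff", "en"), ("mac", "chrome", "en")]
assert pairwise_covered(tests, domains)
```

Generation algorithms such as TTR aim to satisfy this predicate with as few rows as possible; exhaustive testing here would need the same four tests, but the gap widens quickly as factors and levels grow.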

An Analysis of Techniques and Tools for Requirements Elicitation in Model-Driven Web Engineering Methods

It is well known that Requirements Engineering (RE) is one of the critical factors for successful software. The current literature offers several reasons for this claim. One phase that is vital for developing any new software application is Requirements Elicitation; in spite of this, many new software developments fail because of a flawed elicitation phase. Several proposals exist for Requirements Elicitation in Software Engineering, but the current software development market is focused on the development of Web and mobile applications, especially using Model-Driven methods. For this reason, we assume that it is necessary to know the elicitation techniques applied in Model-Driven Web Engineering. To do this, we selected the most representative methods, namely NDT, UWE, and WebML. We reviewed 189 publications from ACM, IEEE, Science Direct, DBLP, and the World Wide Web. Publications from the RE literature were analyzed by means of strict consideration of the current techniques for Requirements Elicitation.

José Alfonso Aguilar, Aníbal Zaldívar-Colado, Carolina Tripp-Barba, Sanjay Misra, Roberto Bernal, Abraham Ocegueda

Deriving UML Logical Architectures of Traceability Business Processes Based on a GS1 Standard

A good traceability business process (BP) is a powerful tool for industrial and manufacturing organizations pursuing effective productivity. However, there is a lack of common understanding among its key stakeholders on how to implement a proper traceability BP. In this paper, we propose the use of software engineering approaches, namely the design of a process-level logical architecture for the traceability BP. This logical architecture captures the main activities, responsibilities, boundaries, and services involved in the traceability BP. The logical architecture was derived from a use case model that arose from the requirements elicitation of activities such as the ones proposed by the GS1 standard.

Rui Neiva, Nuno Santos, José C. C. Martins, Ricardo J. Machado

Native and Multiple Targeted Mobile Applications

Together with the expansion of the WWW, we are seeing the expansion of mobile devices, which are becoming ever more pervasive. Mobile application development is becoming more complex as users demand higher quality software. Our contribution is to frame the positive and negative aspects of native and multiple targeted mobile applications that should be considered by the involved stakeholders, particularly software organization decision-makers.

Euler Horta Marinho, Rodolfo Ferreira Resende

Crowdsourcing Based Fuzzy Information Enrichment of Tourist Spot Recommender Systems

Tourist Spot Recommender Systems (TSRS) help users find interesting locations/spots in their vicinity based on their preferences. Enriching the list of recommended spots with contextual information such as the right time to visit, weather conditions, traffic conditions, the right mode of transport, crowdedness, security alerts, etc. may further add value to these systems. This paper proposes the concept of information enrichment for a tourist spot recommender system. The proposed system works in collaboration with a TSRS: it takes the list of spots to be recommended to the current user and collects the current contextual information for those spots. A new score/rank is computed for each spot to be recommended based on the recommender’s rank and the current context, and is sent back to the user. Contextual information may be collected by several techniques such as sensors, collaborative tagging (folksonomy), and crowdsourcing. This paper proposes an approach to information enrichment using just-in-time location-aware crowdsourcing, which obtains current contextual information about a spot from the crowd currently present at that spot. Most contextual parameters, such as traffic conditions, weather conditions, and crowdedness, are fuzzy in nature; therefore, fuzzy inference is proposed to compute the new score/rank for each recommended spot. The proposed system may be used with any spot recommender system; however, in this work a personalized tourist spot recommender system is considered as a case study for evaluation. A prototype system has been implemented and evaluated by 104 real users.

Sunita Tiwari, Saroj Kaushik
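The re-scoring idea - blend the recommender's own score with crowd-sourced contextual factors - can be illustrated with a crisp weighted mean. This is only a simplified stand-in for the paper's fuzzy inference step, and all factor names and weights are invented for illustration:

```python
def enriched_score(base_score, context, weights):
    """Blend a recommender's score (in [0, 1]) with crowd-sourced
    contextual factors (each in [0, 1]) via a weighted mean — a crisp
    stand-in for fuzzy inference over the same inputs."""
    ctx = sum(weights[k] * context[k] for k in weights) / sum(weights.values())
    return 0.5 * base_score + 0.5 * ctx

# Hypothetical spot: strong base recommendation, mixed current context
s = enriched_score(0.8,
                   {"weather": 0.9, "crowdedness": 0.4, "traffic": 0.6},
                   {"weather": 1.0, "crowdedness": 1.0, "traffic": 1.0})
assert 0.0 <= s <= 1.0
```

A fuzzy version would replace the weighted mean with membership functions and inference rules over the same inputs, but the interface - spots in, enriched scores out - stays the same.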

XML Schema Reverse Transformation: A Case Study

Reversing XML Schema to a conceptual model has been a center of attention. Previous researchers used the hierarchical structure of XML Schema components to generate class diagrams. However, the hierarchical structure involves only primary XML Schema components. Our approach involves additional XML Schema components, namely groups of classes and attributes with references. These additional components enrich the resulting class diagram. In this paper, we construct the circulation function of a library system to show that our formalization method adds a dependency relationship component to the class diagram. This richer class diagram may be useful for other research, such as evolution, modification, and document adaptation in the future.

Hannani Aman, Rosziati Ibrahim

Realisation of Low-Cost Ammonia Breathalyzer for the Identification of Tooth Decay by Neural Simulation

The human mouth contains many kinds of substances in both liquid and gaseous form. The individual concentrations of each of these substances could provide useful insight into the health condition of the entire body. Ammonia is one such substance whose concentration in the mouth can reveal the presence or absence of diseases in the body. One of these is tooth decay (caries), which occurs when there is insufficient concentration of ammonia in the mouth. This paper proposes an affordable ammonia breathalyzer designed using a metal oxide sensor for the detection and prediction of tooth caries in humans, with an 87% overall success rate. The selection of an appropriate sensor was done via simulation using a feed-forward artificial neural network (ANN). The breathalyzer has been designed and constructed to be low-cost so that it can be used for early detection and prevention of tooth decay.

Ima O. Essiet

Novel Designs for Memory Checkers Using Semantics and Digital Sequential Circuits

Memory safety breaches have been the main tools in many of the latest security vulnerabilities. Therefore memory safety is a critical and attractive property for any piece of code. Separation logic can be used as a mathematical tool to reason about the memory safety of programs. An important technique for modern parallel programming is multithreading. For a multi-threaded model of programming (Core-Par-C), this paper introduces an accurate semantics which is employed to mathematically prove the undecidability of memory safety of Core-Par-C programs. The paper also proposes a hardware design that acts as an efficient memory checker against memory errors.

Mohamed A. El-Zawawy

Towards a Wide Acceptance of Formal Methods to the Design of Safety Critical Software: an Approach Based on UML and Model Checking

The Unified Modeling Language (UML) is widely used to model systems for object oriented and/or embedded software development, especially by means of its several behavioral diagrams, which can provide different points of view of the same software scenario. Model Checking is a formal verification method which has been receiving much attention from the academic community. However, in general, practitioners still avoid using Model Checking in their projects for several reasons. Based on these facts, we present in this paper a significant improvement of a tool we have developed to translate several UML behavioral diagrams (sequence, activity, and state machine) into Transition Systems to support software Model Checking. With all these changes, we have applied our tool to a real space software product under development for a stratospheric balloon project, to show how feasible our approach is in practice.

Eduardo Rohde Eras, Luciana Brasil Rebelo dos Santos, Valdivino Alexandre de Santiago Júnior, Nandamudi Lankalapalli Vijaykumar

A Scheduling Problem for Software Project Solved with ABC Metaheuristic

Scheduling problems are very common in any industry or organization, and software project management frequently faces them. We present the Resource-Constrained Project Scheduling Problem as a generic problem in which different resources must be assigned to different activities, so that the makespan is minimized and a set of precedence constraints between activities and resource allocations to these activities are met. This problem is an NP-hard combinatorial optimization problem. In this paper we model and solve the problem using the Artificial Bee Colony algorithm, a metaheuristic that mimics the foraging behavior of honey bees and is especially applied to combinatorial optimization. We present an Artificial Bee Colony algorithm able to solve the Resource-Constrained Project Scheduling Problem efficiently.

Broderick Crawford, Ricardo Soto, Franklin Johnson, Melissa Vargas, Sanjay Misra, Fernando Paredes

On the Use of a Multiple View Interactive Environment for MATLAB and Octave Program Comprehension

MATLAB and GNU/Octave programs can become very large and complex and therefore difficult to understand and maintain. The objective of this paper is to present an approach to mitigate this problem, based on a multiple view interactive environment (MVIE) called OctMiner. The latter provides visual resources to support program comprehension, namely the selection and configuration of several views to meet developers’ needs. For validation purposes, the authors conducted two case studies to characterize the use of OctMiner in the context of software comprehension activities. The results provided initial evidence of its effectiveness in supporting the comprehension of programs written in the aforementioned languages.

Ivan M. Lessa, Glauco de F. Carneiro, Miguel P. Monteiro, Fernando Brito e Abreu

Design Phase Consistency: A Tool for Reverse Engineering of UML Activity Diagrams to Their Original Scenarios in the Specification Phase

In this paper, we present a tool that preserves phase consistency from the specification to the design phase by reverse engineering UML activity diagrams, designed from scenario specifications, back to scenarios to ensure that all of the original scenarios can be recreated. We use a set of action and action-link rules to specify the activity and scenario diagrams in order to provide consistency and rigor. Given an activity diagram depicting a common telecentre process(es), we present an algorithm that follows this set of action and action-link rules to reverse engineer the activity diagram back to its set of scenarios. The algorithm is validated when, given a set of activity diagrams, it is able to recreate the original set of scenarios. Thus, all original specifications, in the form of scenarios, are ensured to be encapsulated within their activity diagram.

Jay Pancham, Richard Millham

Extracting Environmental Constraints in Reactive System Specifications

Reactive systems ideally never terminate and maintain some interaction with their environment. Temporal logic is one of the methods for formal specification of reactive systems. For a reactive system specification, we cannot always obtain a program that satisfies it, because a reactive system program must satisfy its specification no matter how the environment behaves. This problem is known as realizability or feasibility. The complexity of deciding realizability of specifications described in linear temporal logic is double or triple exponential in the length of the specification, making realizability decision impractical. Strong satisfiability is one of the necessary conditions for realizability of reactive system specifications. If a reactive system specification is not strongly satisfiable, it is necessary to revise the specification. This paper proposes a method of revising reactive system specifications that are not strongly satisfiable. The method extracts the environmental constraints that are included in the specifications.

Yuichi Fukaya, Noriaki Yoshiura

Implementation of Decision Procedure of Stepwise Satisfiability of Reactive System Specifications

Reactive systems ideally never terminate and maintain some interaction with their environment. Temporal logic is one of the methods for formal specification of reactive systems. For a reactive system specification, we cannot always obtain a program that satisfies it, because a reactive system program must satisfy its specification no matter how the environment behaves. This problem is known as realizability or feasibility. The complexity of deciding realizability of specifications described in linear temporal logic is double or triple exponential in the length of the specification, making realizability decision impractical. This paper implements a stepwise satisfiability decision procedure using the tableau method and a proof system. Stepwise satisfiability is one of the necessary conditions for realizability of reactive system specifications. The proposed procedure decides stepwise satisfiability of reactive system specifications.

Noriaki Yoshiura, Yuma Hirayanagi

Cryptic-Mining: Association Rules Extractions Using Session Log

The security of gargantuan-sized data has always posed a challenging issue, and this domain has witnessed a number of approaches to counter such issues. This paper first reviews approaches for investigating mining algorithms in the cryptography domain and sheds light on the application of mining techniques and machine learning algorithms in cryptography. The paper presents key computation using a parameters-only scheme for an automatic variable key (AVK) based symmetric key cryptosystem. A cryptanalysis based on association rule mining for key and parameter prediction is discussed using both an analytical method and the WEKA tool. The paper also presents some research questions regarding the design issues associated with the implementation of a parameter based symmetric AVK cryptosystem.

Shaligram Prajapat, Ramjeevan Singh Thakur

Feature Based Encryption Technique for Securing Digital Image Data Based on FCA-Image Attributes and Visual Cryptography

Losslessly pixel-value-encrypted images still maintain some properties of their respective original plain images. Most of these cryptographic approaches consist of visual cryptographic techniques and pixel displacement approaches. Such methods of cryptography are useful in cases such as medical image security, where pixel expansion must be avoided in both the encryption and decryption processes. In this paper we propose a hybrid cryptographic encryption approach using features generated from digital images based on Galois lattice theory and a visual cryptographic technique based on RGB pixel displacement. Features were extracted from a plain image and a lattice was generated, which was then used to generate a key to encrypt the plain image. At the end of the process, there was no pixel expansion, and the arithmetic mean, the entropy, and the Galois lattice of both the ciphered and the plain image remained the same. The features extracted from the plain image were the same as those of the ciphered image, irrespective of the pixel displacement that occurred; this makes our approach a suitable basis for image encryption and storage as well as encrypted image indexing and searching based on pixel values. The implementation was done using Galicia, Lattice Miner, and MATLAB.

Quist-Aphetsi Kester, Anca Christine Pascu, Laurent Nana, Sophie Gire, Jojo M. Eghan, Nii Narku Quaynor
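The property the abstract relies on - displacement changes pixel positions but not values, so the mean, histogram, and entropy are preserved - can be illustrated with a keyed permutation of a flat pixel list. This is a generic sketch of the displacement idea, not the paper's lattice-based key derivation:

```python
import random

def encrypt_pixels(pixels, key):
    """Permute pixel positions with a keyed shuffle: values (hence the
    histogram, mean, and entropy) are preserved, positions are not."""
    rng = random.Random(key)
    order = list(range(len(pixels)))
    rng.shuffle(order)
    return [pixels[i] for i in order], order

def decrypt_pixels(cipher, order):
    """Invert the permutation to restore the original pixel layout."""
    plain = [0] * len(cipher)
    for dst, src in enumerate(order):
        plain[src] = cipher[dst]
    return plain

plain = [12, 200, 200, 7, 55]
cipher, order = encrypt_pixels(plain, key=42)
assert sorted(cipher) == sorted(plain)        # value multiset preserved
assert decrypt_pixels(cipher, order) == plain  # lossless round trip
```

In the paper's scheme the key would come from the Galois-lattice features of the image itself rather than an integer seed, but the preserved-statistics argument is the same.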

Empirical Studies of Cloud Computing in Education: A Systematic Literature Review

The purpose of this paper is to present the evidence about the adoption of cloud computing in the education systems of universities and higher education institutions. We performed a systematic literature review (SLR) of empirical studies that investigated the current level of adoption of cloud computing in education systems and the motivations for using cloud computing in institutions. Seven papers were included in our synthesis of evidence. It has been found that several universities are interested in using cloud computing in their education systems, and they have utilized different types of cloud computing service models (IaaS, PaaS, SaaS). The results of this SLR show a clear gap in this research field: a lack of empirical studies focusing on utilizing cloud computing within educational institutions.

Mohamud Sheikh Ibrahim, Norsaremah Salleh, Sanjay Misra

A Review of Student Attendance System Using Near-Field Communication (NFC) Technology

The rapid growth of system development is no longer subtle and continuously improves today’s systems. In the education sector, a student attendance system can be implemented with Near-Field Communication (NFC) technology. NFC refers to a device that can detect information and/or commands from a tag by bringing the two into close proximity or even by touching them together. Traditionally, the manual attendance system requires a lecturer to pass around an attendance sheet for students to sign beside their names, or to call out the students’ names one by one and register their attendance. An attendance system based on NFC is meant to improve on the manual attendance system, and therefore the aim of this paper is to review the existing research.

Mohd Ameer Hakim bin Mohd Nasir, Muhammad Hazimuddin bin Asmuni, Norsaremah Salleh, Sanjay Misra

A Decision Support Map for Security Patterns Application

In software engineering, security concerns should be addressed at every phase of the development process. To that end, a pattern-based security engineering approach has been proposed and investigated, becoming a very active area of research. Security patterns capture the experience of experts in order to solve a security problem in a more structured and reusable way. With the proliferation of security patterns, it is becoming harder to select which ones should be applied and in which case. In this paper, our main contribution is the proposition of a map of layered security patterns. This map allows software engineers to select and apply patterns in a systematic manner in order to guide security decisions.

Rahma Bouaziz, Slim Kammoun

Backmatter
