2016 | Book

Information Technology: New Generations

13th International Conference on Information Technology

About this book

This book collects articles presented at the 13th International Conference on Information Technology - New Generations, held in April 2016 in Las Vegas, NV, USA. It includes over 100 chapters on critical areas of IT, including Web Technology, Communications, Security, and Data Mining.

Table of Contents

Frontmatter

Networking and Wireless Communications

Frontmatter
Understanding User’s Acceptance of Personal Cloud Computing: Using the Technology Acceptance Model

Personal Cloud Computing (PCC) is a rapidly growing technology, addressing the market demand of individual users for access to available and reliable resources. But as with other new technologies, concerns and issues have surfaced with the adoption of PCC. Users deciding whether to adopt PCC may be concerned about the ease of use, usefulness, or security risks in the cloud. Negative attitudes toward using a technology have been found to negatively impact the success of that technology. The purpose of this study was to understand users’ acceptance of PCC. The population sample consisted of individual users within the United States between 18 and 80 years of age. The theoretical framework utilized in this study was based on the technology acceptance model (TAM). A web survey was conducted to assess the measurement and understanding of patterns demonstrated by participants. Our results show that in spite of the potential benefits of PCC, security and privacy risks are deterring many users from moving towards PCC.

Mohamed Eltayeb, Maurice Dawson
Cognitive Spectrum Decision via Machine Learning in CRN

In this research, we propose a cognitive spectrum decision model comprising spectrum adaptation (via Raptor codes) and spectrum handoff (via transfer learning) in Cognitive Radio Networks (CRNs), in order to enhance spectrum efficiency in multimedia communications. Raptor codes enable the Secondary User (SU) to adapt to dynamic channel conditions and maintain Quality of Service (QoS) by prioritizing the data packets and learning a symbol transmission strategy, called decoding-CDF, from the history of symbol transmissions. Our scheme optimizes the acknowledgement (ACK) reception strategy in multimedia communications, eventually increasing spectrum decision accuracy and allowing the SUs to adapt to channel variations. Moreover, to enhance spectrum decision as a long-term process, we use the Transfer Actor-Critic Learning (TACT) model to allow a newly joined SU in a network to learn spectrum decision strategies from the historical spectrum decisions of existing ‘expert’ SUs. Experimental results show that our proposed model works better than the myopic spectrum decision scheme, which chooses spectrum decision actions based only on the short-term maximum immediate reward.

A. M. Koushik, Fei Hu, Ji Qi, Sunil Kumar
Elastic Edge-Overlay Methods Using OpenFlow for Cloud Networks

The virtualization of cloud networks requires flexible and effective techniques to accommodate rapid changes in network configurations and updates. The OpenFlow protocol has attracted attention for cloud networks since it facilitates managing and sharing network resources, and it can be utilized to create an overlay abstraction on top of the network infrastructure for flow setup in the cloud. However, the traditional reactive flow setup of OpenFlow introduces higher flow latency and overhead on the network controller. This paper discusses the issues of the reactive flow setup and presents two optimized overlay network virtualization methods that leverage OpenFlow to control and forward the tenants’ traffic. The proposed methods enable tenants to use their own MAC/IP addresses in the cloud. We have implemented and evaluated the proposed overlay methods, and the experimental results show that our methods have lower flow latency than the traditional reactive approach, and higher performance than popular overlay tunneling protocols such as VXLAN, STT, and NVGRE.

Amer Aljaedi, C. Edward Chow, Jia Rao
An Approach to Generate Automatic Variable Key to Assure Perfect Security in Cryptosystem

Due to the advancement and worldwide rapid deployment of computer networks, security has become a major issue. Many mechanisms have been devised, and more new ideas will be proposed in the coming era. But all the existing techniques share one important criterion: the key must be sustained and protected under all circumstances. In this regard, the Automatic Variable Key (AVK) is, as experimented so far, one of the novel and unbreakable approaches to fulfill this criterion. In this paper, we propose a new technique to generate AVKs which can provide higher security by enhancing the randomness among the generated successive keys. To support our claim, comparative studies with existing techniques are also cited at the end.

Subhasish Banerjee, Manash P. Dutta, C. T. Bhunia
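The AVK idea above can be illustrated with a toy sketch. The update rule below (next key = previous key XOR previous plaintext block) follows the classic AVK construction from the literature; the single-byte block width, seed-key exchange, and XOR cipher are illustrative simplifications, not this paper's exact generation technique.

```python
def avk_encrypt(blocks, seed_key):
    """Toy Automatic Variable Key stream: the key changes after every block.

    Illustrative classic AVK rule: key_{i+1} = key_i XOR plaintext_i,
    so even identical plaintext blocks encrypt differently over time.
    """
    key, out = seed_key, []
    for p in blocks:
        out.append(p ^ key)   # encrypt the current block with the current key
        key ^= p              # evolve the key from the plaintext just sent
    return out

def avk_decrypt(cipher_blocks, seed_key):
    key, out = seed_key, []
    for c in cipher_blocks:
        p = c ^ key           # recover plaintext with the same evolving key
        out.append(p)
        key ^= p              # the receiver evolves the key identically
    return out

msg = [0x3A, 0x3A, 0x7F]      # identical first two blocks...
ct = avk_encrypt(msg, seed_key=0x55)
assert ct[0] != ct[1]         # ...yet different ciphertexts
assert avk_decrypt(ct, 0x55) == msg
```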
Airborne Networks with Multi-beam Smart Antennas: Towards a QOS-Supported, Mobility-Predictive MAC

Airborne networks require a throughput-efficient MAC for mission-oriented communications. The use of multi-beam smart antennas (MBSAs) could provide the network with better throughput performance since all beams can send out data concurrently. In this paper, we propose a two-layer MAC design for MBSA-based airborne networks. In the upper layer, we use TDMA-like schedule control to separate the packet collision domains in multi-beam data transmissions. In the lower layer, we use a CSMA/CA based scheme that is compatible with conventional 802.11 protocols. Such a two-layer scheme significantly reduces packet contention and thus improves throughput. In addition, in order to support mission priorities in the network, we introduce QoS-oriented MAC control in both the upper and lower layers. Furthermore, a mesh network time synchronization method is proposed to guarantee beam synchronization. A Hierarchical Dirichlet Process (HDP) enhanced Hidden Markov Model (HMM) is used for mobility prediction in each beam (direction) of a node. These approaches better exploit the benefits of MBSAs. The simulation results show that our proposed MAC protocol outperforms the standard 802.11 DCF and general MBSA-based MAC designs. The QoS control, synchronization, and prediction schemes are also validated. It turns out that these schemes can greatly improve the overall performance of airborne networks.

Xin Li, Fei Hu, Lei Hu, Sunil Kumar
CR Based Video Communication Testbed with Robust Spectrum Sensing / Handoff

As the radio spectrum is becoming congested, wireless communications require more efficient spectrum usage. Recently, cognitive radio (CR) techniques have become attractive, as they can utilize any unused spectrum. In this paper, we build a CR video communication testbed, which implements spectrum sensing and spectrum handoff functionalities on USRP boards. We have implemented compressive spectrum sensing, intelligent spectrum handoff, and multi-video-flow transmission, under TDMA scheduling and with Raptor codes for reliable video transmissions. By using compressive sensing in the spectrum sensing method, spectrum detection accuracy is improved without much added algorithmic complexity, and the method is also robust to noise uncertainty due to the use of cyclostationary features. To realize intelligent spectrum handoff, we have designed a real-time jamming detection scheme, as well as a synchronized spectrum switching method. We have also implemented a multi-point TDMA-based communication system, which enables any node to send out multiple video flows to different neighbors with pipelined and scheduled data transmissions. We also propose special rateless codes, called prioritized Raptor codes, for more reliable video transmission, and implement them in GNU Radio applications. The proposed CR video transmission testbed can be used for new protocol testing purposes.

Ji Qi, Fei Hu, Xin Li, A. M. Koushik, Lei Hu, Sunil Kumar
A Control-Message Quenching Algorithm in Openflow-Based Wireless Mesh Networks with Dynamic Spectrum Access

In recent years, OpenFlow [1] has become a popular network architecture. As more and more users start to join the conventional Internet, the drawbacks of conventional networks are gradually appearing. The OpenFlow network provides a brand new perspective on the development of networks. This architecture physically separates the control plane from the data plane at the hardware level. OpenFlow has plenty of benefits compared to other network structures. First, the control plane and data plane are decoupled, which means more flexible networks can be devised with customized rules. Second, OpenFlow provides a new platform to design and test network protocols: researchers can test new protocols in a real network environment. Third, in OpenFlow networks we can monitor flow traffic statistics; this fine-grained monitoring of flows enables a better understanding of the network protocols and schemes we apply.

Xin Li, Xiaoyan Hong, Yu Lu, Fei Hu, Ke Bao, Sunil Kumar
SINR Maximization in Relay-Assisted Multi-user Wireless Networks

This paper considers throughput maximization in relay-based multi-user wireless networks by enhancing the worst signal-to-interference-plus-noise ratio (SINR) among multiple users. Unlike the existing approaches that use semidefinite relaxation coupled with Gaussian randomization (SDR-G), we utilize the d.c. (difference of two convex functions) structure of the resulting objective function to develop efficient iterative algorithms of low complexity. Numerical results demonstrate that the proposed algorithm locates solutions that are close to the upper bound within a few iterations and hence shows better performance than the other methods.

Umar Rashid, Faheem Gohar Awan, Muhammad Kamran
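The d.c. machinery named in the abstract can be summarized in two lines. The following is the generic d.c.-algorithm (DCA) iteration, shown here as a hedged illustration of the approach rather than the paper's exact beamforming subproblem:

```latex
% Generic d.c. program: minimize a difference of convex functions
\min_{x}\; f(x) = g(x) - h(x), \qquad g,\, h \ \text{convex}
% At iteration t, replace h by its affine minorant at x^{(t)}:
x^{(t+1)} \in \arg\min_{x}\; g(x) - h\!\left(x^{(t)}\right)
          - \nabla h\!\left(x^{(t)}\right)^{\!\top}\!\left(x - x^{(t)}\right)
% Each subproblem is convex, and f(x^{(t+1)}) \le f(x^{(t)}), so the
% objective decreases monotonically toward a stationary point.
```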
A Hybrid MAC for Long-Distance Mesh Network with Multi-beam Antennas

In recent years, with the development of multi-beam smart antennas (MBSAs), directional mesh networks are becoming more and more popular. With the popularity of UAVs and environmental surveillance applications, airborne networks (ANs) have become important platforms for wireless transmissions in the sky. In this work we propose a hybrid MAC scheme for a hierarchical airborne network, which consists of high-speed, long-link, multi-beam aircraft nodes (at the higher level) and short-distance, high-density UAVs (at the lower level). Simulation results show that, compared to existing 802.11 MAC schemes, our MAC has better performance in terms of network throughput and packet delay.

Xin Li, Fei Hu, Ji Qi, Sunil Kumar
Future Approach to Find Business Model Orientation for Technological Businesses

Nowadays, competition is not only among technological businesses but among business models as well. Previous research has pursued the concept of the future business model through futures methodologies with respect to specific cases, but the literature does not suggest the successful orientation of business model components under plausible future scenarios at the macro analysis level. This paper suggests an approach for clarifying the best orientation of new technological firms’ business model components for the future. In this regard, after reviewing the related literature, the industry ecosystem uncertainties are explored in order to introduce four scenarios based on expert ideas and secondary data. Subsequently, the priority of the elements of business model components is analyzed for each scenario. Business model components are reduced to the value proposition and revenue model in this paper, since the core part of a business model is proposing value and capturing it via the revenue model. With regard to its data collection method, this study is quantitative research supplemented with quantitative measures. The successful orientation of the future business model is induced through a prioritizing method (GAHP) to attain the priorities of business model attributes for each element of the business model components in each scenario. Entrepreneurs can map the priorities of the business model elements when designing the specific revenue model and value proposition of their own company.

Sepehr Ghazinoory, Fatemeh Saghafi, Maryam Mirzaei, Mohammadali Baradaran Ghahfarokhi, Parvin Baradaran Ghahfarokhi
Social Media Coverage of Public Health Issues in China: A Content Analysis of Weibo News Posts

Content analysis is a useful tool for better understanding how different portrayals of public health issues may affect news dissemination. In this research, we conducted a content analysis of health-related messages published by opinion-leading news outlets. We examined generic content attributes, including the number of words, sentences, images, and links, and the publication time, as well as content valence features. Our findings show that there are no significant differences in the number of words, sentences, and hyperlinks used between highly forwarded and lowly forwarded news posts covering a public health issue, whereas the topic, title length, and content valence of highly forwarded news posts differ from those of lowly forwarded ones. We also discuss ways to improve the popularity of health-related news.

Jiayin Pei, Guang Yu, Peng Shan
Advertisement Through the Most Popular Twitter Users Based on Followers in Saudi Arabia

Advertising is a major commercial activity on the Internet. Nowadays, online social networks present a new way of disseminating advertisements, considered the fastest way to reach large groups of customers. Even so, there has been little research focused on advertising through users in social networks. In this research paper, we focus on one of today’s most popular social networks, Twitter. Our intention is to assess the extent of the utilization of the most-followed Saudi Arabian Twitter users in making advertisements. This study employs a survey method to measure this goal.

Abeer A. AlSanad, Abdulrahman A. Mirza
Sentiment Analysis for Arabic Reviews in Social Networks Using Machine Learning

In this emerging age of social media, social networks have become growing resources of user-generated material on the Internet. These information resources, which form an expansive platform of human emotions, opinions, feedback, and reviews, are considered powerful informants for big industries, markets, news, and many more. The great importance of these platforms, in conjunction with the increasingly high number of users generating content in the Arabic language, makes mining the Arabic reviews in social networks necessary. This paper applies four automatic classification techniques: Support Vector Machine (SVM), Back-Propagation Neural Networks (BPNN), Naïve Bayes, and Decision Tree. The main goal of this paper is to find a lightweight sentiment analysis approach for social network reviews written in the Arabic language. Results show that the SVM classifier achieves the highest accuracy rate, 96.06%, compared with the other classifiers.

Mustafa Hammad, Mouhammd Al-awadi
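A minimal sketch of the kind of SVM pipeline the abstract compares, using scikit-learn. The character n-gram features and the tiny stand-in reviews are illustrative assumptions, not the authors' feature set or data set:

```python
from sklearn.pipeline import make_pipeline
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.svm import LinearSVC

# Tiny stand-in corpus of Arabic reviews (hypothetical data).
reviews = ["خدمة ممتازة وسريعة", "تجربة سيئة جدا", "منتج رائع", "لن أشتري مرة أخرى"]
labels  = ["pos", "neg", "pos", "neg"]

# Character n-grams cope with Arabic's rich morphology without a stemmer.
clf = make_pipeline(
    TfidfVectorizer(analyzer="char_wb", ngram_range=(2, 4)),
    LinearSVC(),
)
clf.fit(reviews, labels)
print(clf.predict(["خدمة رائعة"]))  # expected: ['pos']
```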

Security and Privacy in Next Generation Networks

Frontmatter
An Interactive Model for Creating Awareness and Consequences of Cyber-crime in People with Limited Technology Skills

Technology is used every day by people irrespective of their technology skills. People with limited technology skills are not aware of which actions are legal and which are illegal according to the law. In this paper, an interactive model for creating awareness of cyber-crime and its consequences in people with limited technology skills is proposed.

Sheeraz Akram, Muhammad Ramzan, Muhammad Haneef, Muhammad Ishtiaq
Power Analysis Attack and Its Countermeasure for a Lightweight Block Cipher Simon

This study proposes a power analysis attack and a countermeasure for the lightweight cipher Simon. Simon can be embedded in the smallest area among lightweight block ciphers. In the proposed power analysis method, an analysis based on conventional power analysis attacks is applied to Simon. In the proposed countermeasure, random masks are applied to data registers. Experiments revealed the vulnerability of the normal implementation method and verified the validity of the proposed countermeasure.

Masaya Yoshikawa, Yusuke Nozaki
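The register-masking countermeasure can be sketched abstractly: split every sensitive value into random shares so that no single register holds a power-correlated intermediate. This is generic first-order Boolean masking for illustration, not the authors' exact Simon datapath:

```python
import secrets

def to_shares(value, width=16):
    """Split a sensitive word into (masked value, mask)."""
    m = secrets.randbits(width)
    return value ^ m, m

def unmask(shares):
    a, m = shares
    return a ^ m

# Linear operations (XOR, rotation) can be applied share-wise, so the
# real value never appears in any register:
x = to_shares(0xBEEF)
y = to_shares(0x1234)
z = (x[0] ^ y[0], x[1] ^ y[1])          # masked XOR
assert unmask(z) == 0xBEEF ^ 0x1234

# Simon's nonlinear AND step needs fresh randomness (a masked-AND
# gadget), which is omitted here for brevity.
```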
Perpetuating Biometrics for Authentication
Introducing the Durable True-Neighbor Template

The number of biometrically-enhanced authentication applications has outstripped our limited number of biometric features and controversy surrounds the actual availability of biometric information, fueling skepticism regarding their effective use for authentication. These concerns emphasize the imperative of addressing the singular nature of biometrics, motivating the need for durable biometric credentials that allow perpetual use of biometric information for authentication, even in the face of compromise. Toward this end, this paper introduces the Durable True-Neighbor Template (DTNT), a novel enhancement of our existing True-Neighbor Template (TNT) approach to fingerprint-based authentication that overcomes the singular nature of fingerprint biometrics through diversification using standard encryption. The results of a standard benchmark experiment, conducted using the FVC protocol with the FVC2006 and FVC2002 databases, are used to evaluate the effectiveness of DTNT for authentication. DTNT shows respectable authentication performance compared with TNT with a generally-negligible loss in fingerprint distinguishability.

Fawaz E. Alsaadi, Terrance E. Boult
Certificate-Based IP Multimedia Subsystem Authentication and Key Agreement

With more and more mobile devices making requests to access the IP multimedia subsystem (IMS), authenticating these devices places a heavy load on the IMS entities. Thus, how to shorten the IMS authentication procedure is an important issue. This paper proposes a certificate-based IMS authentication and key agreement scheme, abbreviated as C-IMS AKA. We modify the E-UTRAN attach procedure to obtain the certificate of a UE, and enable the home subscriber server (HSS) and serving call session control function (S-CSCF) to authenticate the UE by verifying the certificate, acting as a certification authority (CA), in the registration procedure. Therefore, the UE can be authenticated without the challenge-response procedure; moreover, the number of message exchanges is reduced. In addition, we analyze and compare C-IMS AKA with other existing schemes in terms of message propagation delay and queuing delay. Analytical results show that the proposed scheme makes IMS registration more efficient than current alternatives.

Wei-Kuo Chiang, Ping-Chun Lin
Software Optimizations of NTRUEncrypt for Modern Processor Architectures

This paper describes software optimizations for the post-quantum public-key encryption scheme NTRUEncrypt. We build upon what is, to the best of our knowledge, the fastest open-source NTRUEncrypt library and optimize it by taking advantage of AVX2 and AVX512 SIMD instructions as well as the AES-NI built-in encryption functions. We show that, on modern processors, using AVX2 yields performance gains of 23% for encryption and 37% for the decryption operation. For the future AVX512 we use a publicly available emulator, since no supporting processor is on the market yet, and report that for decryption only about half as many instructions need to be executed compared to the current code. Furthermore, we propose replacing the SHA hash functions with pipelined AES-NI for faster randomness generation. With both optimizations enabled, we achieve performance improvements of 1.82x for encryption and 1.74x for decryption with a parameter set that provides 256 bits of security.

Shay Gueron, Fabian Schlieker
Vulnerabilities and Mitigation Methods in the NextGen Air Traffic Control System

Air traffic control (ATC) systems have been modernizing and standardizing their automation platforms in recent years in order to control an increased number of flights. In 2004, the FAA started transforming the nation’s ground-based ATC system into a system that uses satellite-based navigation and other advanced technology, called NextGen. The NextGen system deploys an Internet Protocol based network to communicate and relies heavily on computerized information systems and digital data, which may introduce new vulnerabilities for exploitation. Many vulnerabilities of NextGen stem from the increased interconnection of systems through wireless networks. For instance, a critical part of NextGen, Automatic Dependent Surveillance – Broadcast, which transfers essential information via a wireless network without encryption, is an easy target for attackers. Some security measures have been deployed, but they are still lacking in critical systems. In this study, we present the potential vulnerabilities of the NextGen ATC systems and their possible solutions.

Sachiko Sueki, Yoohwan Kim
Pushing the Limits of Cyber Threat Intelligence: Extending STIX to Support Complex Patterns

Nowadays, attacks against single computer systems or whole infrastructures pose a significant risk. Although deployed security systems are often able to prevent and detect standard attacks in a reliable way, it is not uncommon that more sophisticated attackers are capable of bypassing these systems and staying undetected. To support the prevention and detection of attacks, the sharing of cyber threat intelligence information becomes increasingly important. Unfortunately, the currently available threat intelligence formats, such as YARA or STIX (Structured Threat Information eXpression), cannot be used to describe the complex patterns that are needed to share relevant attack details about more sophisticated attacks. In this paper, we propose an extension for the standardized STIX format that allows the description of complex patterns. With this extension it is possible to tag attributes of an object and use these attributes to describe precise relations between different objects. To evaluate the proposed STIX extension, we analyzed the API calls of the credential dumping tool Mimikatz and created a pattern based on these calls. This pattern precisely describes the API calls Mimikatz performs to access the LSASS (Local Security Authority Subsystem Service) process, which is responsible for authentication procedures in Windows. Due to the specified relations, it is possible to detect the execution of Mimikatz in a reliable way.

Martin Ussath, David Jaeger, Feng Cheng, Christoph Meinel
Analyzing Packet Forwarding Schemes for Selfish Behavior in MANETs

The selfish behavior of nodes in Mobile Ad hoc Networks (MANETs) is a serious issue, which negatively impacts the packet forwarding service, causes availability problems, and degrades the quality of service in networks. Selfish nodes try to conserve their energy and bandwidth by avoiding participation in routing and packet forwarding. To address this problem, the research community has proposed many solutions. We have studied these solutions from two perspectives, i.e., context-based solutions and context-free solutions. In this paper, we present an analysis of some of the well-known protocols that offer solutions to mitigate the selfish behavior of nodes.

Asad Raza, Jamal Nazzal Al-Karaki, Haider Abbas
Speed Records for Multi-prime RSA Using AVX2 Architectures

RSA is a popular public key algorithm. Its private key operation is modular exponentiation with a composite 2k-bit modulus that is the product of two k-bit primes. Computing the 2k-bit modular exponentiation can be sped up fourfold with the Chinese Remainder Theorem (CRT), requiring two k-bit modular exponentiations (plus recombination). Multi-prime RSA is the generalization to the case where the modulus is a product of r ≥ 3 primes of (roughly) equal bit-length, 2k/r. Here, CRT trades the 2k-bit modular exponentiation for r modular exponentiations with 2k/r-bit moduli (plus recombination). This paper discusses multi-prime RSA with key lengths (=2k) of 2048/3072/4096 bits, and r = 3 or r = 4 primes. With these parameters, the security of multi-prime RSA is comparable to that of classical RSA. We show how to optimize multi-prime RSA on modern processors by parallelizing the r modular exponentiations and leveraging “vector” instructions, achieving performance gains of up to 5.07x.

Shay Gueron, Vlad Krasnov
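The CRT decomposition the abstract describes can be stated concretely. The sketch below (tiny, insecure primes purely for illustration; requires Python 3.8+ for modular inverses via pow) performs one short exponentiation per prime and recombines with the CRT; the paper's contribution, vectorizing the r exponentiations with AVX2, is not shown:

```python
from math import prod

def multiprime_decrypt(c, primes, d):
    """RSA private operation via CRT: r small exponentiations + recombination."""
    n = prod(primes)
    m = 0
    for p in primes:
        r_p = pow(c % p, d % (p - 1), p)             # exponent reduced mod p-1 (Fermat)
        n_p = n // p
        m = (m + r_p * n_p * pow(n_p, -1, p)) % n    # CRT recombination
    return m

primes = [1009, 1013, 1019]                          # r = 3 (toy sizes!)
n = prod(primes)
phi = prod(p - 1 for p in primes)
e = 65537
d = pow(e, -1, phi)
msg = 42
assert multiprime_decrypt(pow(msg, e, n), primes, d) == msg
```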
Privacy Preservation of Source Location Using Phantom Nodes

Sensor networks are widely used for subject monitoring and tracking. In this modern era of wireless technology, privacy has become one of the essential concerns of wireless sensor networks. The location information of sensor nodes has to be hidden from an adversary for the sake of privacy. An adversary may trace the traffic and try to figure out the location of the source node. This can expose a significant amount of information about the subject being monitored, which may be further misused by the adversary. This paper proposes the 2-Phantom Angle-based Routing Scheme (2PARS), designed to improve Source Location Privacy. The proposed scheme considers a triplet for selecting the phantom nodes. A triplet is a group of three nodes formed on the basis of their distance from the sink, their location information, and the inclination angle between them. For every single packet transmission, phantom selection is performed, thereby creating alternative paths. As the path changes dynamically, the safety period increases without any significant increase in packet latency. The analysis shows that this scheme performs better in terms of safety period compared to single-phantom and multi-phantom routing schemes.

Shruti Gupta, Prabhat Kumar, J. P. Singh, M. P. Singh
Implementing a Bitcoin Exchange with Enhanced Security

We design and implement a secure cryptocurrency exchange by combining secure web programming practices with Bitcoin. In this work, the Amazon EC2 cloud is used to store Bitcoin and host the web server. While building the system, we employ comprehensive coverage of security features, including symmetric key encryption, SSL certificates for digital signatures, SHA-512 for hashing, password revalidation against session hijacking attacks, and two-factor authentication. The implementation is tested in an experimental setup and is compared with similar applications currently available. The tests demonstrate that our system offers enhanced security, outperforming its peers.

Ebru Celikel Cankaya, Luke Daniel Carr
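One of the listed security features, SHA-512-based password handling, can be sketched with the Python standard library. The PBKDF2 construction, iteration count, and salt size below are illustrative choices, not necessarily what the paper's system uses:

```python
import hashlib
import hmac
import os

ROUNDS = 200_000  # illustrative work factor

def hash_password(password, salt=None):
    """Derive a storable digest with PBKDF2-HMAC-SHA512 and a random salt."""
    salt = salt or os.urandom(16)
    digest = hashlib.pbkdf2_hmac("sha512", password.encode(), salt, ROUNDS)
    return salt, digest

def verify_password(password, salt, expected):
    candidate = hashlib.pbkdf2_hmac("sha512", password.encode(), salt, ROUNDS)
    return hmac.compare_digest(candidate, expected)  # constant-time compare

salt, stored = hash_password("correct horse battery staple")
assert verify_password("correct horse battery staple", salt, stored)
assert not verify_password("guess", salt, stored)
```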

Information Systems and Internet Technology

Frontmatter
REST-Based Semantic Annotation of Web Services

The advance in research conducted in the web services area has allowed organizations to integrate their applications, enabling the automation of business processes in several areas. However, it is necessary to provide solutions that favor the selection and integration of these web services at runtime, effectively and efficiently. In the current web architecture, many of these web services are designed according to the REST architectural style. Given this demand, this work presents a tool that aims to perform the semantic annotation of RESTful web services. An example of use is also discussed, in which a WSDL document is semantically described.

Cleber Lira, Paulo Caetano
Synthetical QoE-Driven Anomalous Cell Pattern Detection with a Hybrid Algorithm

Owing to increased attention to quality of service (QoS) for subscribers, mobile operators should shift their evaluation standard from QoS to Quality of Experience (QoE). However, most research in the field focuses on end-to-end metrics; few studies consider synthetical QoE across the whole network. For mobile carriers, it is more meaningful to improve overall system performance at the lowest cost. Therefore, a comprehensive evaluation of all users is more suitable for network optimization. As voice is still the basic service, we consider anomaly detection for the voice service in this paper. First, two synthetical QoE parameters, quality of voice (QoV) and successful rate of wireless access (WA), are considered to identify abnormal cells from the aspects of integrity and accessibility, respectively. Then, we use a hybrid algorithm combining self-organizing maps (SOM) and K-means to classify abnormal data points into several categories. After that, the data points for cells are treated as time series to compute the proportions in each anomaly model, which form anomalous cell patterns. To accurately locate where an exception happened, five other Key Performance Indicators (KPIs) are selected by association rules according to their correlation with the two synthetical QoE parameters; they are used to identify specific classes of faults. The experiments show that the proposed method is effective for visualizing and analyzing anomalous cell patterns, and it can serve as a guideline for operators to perform faster and more efficient troubleshooting.

Dandan Miao, Weijian Sun, Xiaowei Qin, Weidong Wang
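A compact sketch of the SOM-then-K-means hybrid named in the abstract, using the MiniSom library and scikit-learn. The 8x8 map size, four clusters, and random stand-in data are assumptions for illustration, not the paper's configuration:

```python
import numpy as np
from minisom import MiniSom          # pip install minisom
from sklearn.cluster import KMeans

# Stand-in for per-cell synthetical QoE samples, e.g. (QoV, WA) scaled to [0, 1].
X = np.random.rand(500, 2)

som = MiniSom(8, 8, X.shape[1], sigma=1.0, learning_rate=0.5, random_seed=1)
som.train_random(X, 5000)            # stage 1: SOM compresses data to 64 prototypes

codebook = som.get_weights().reshape(-1, X.shape[1])
unit_labels = KMeans(n_clusters=4, n_init=10, random_state=1).fit_predict(codebook)
                                     # stage 2: K-means groups the SOM prototypes

def anomaly_class(sample):
    """Each sample inherits the cluster of its best-matching SOM unit."""
    i, j = som.winner(sample)
    return unit_labels[i * 8 + j]

print(anomaly_class(X[0]))
```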
A Mobile Group Tour Tracking and Gathering System

This work proposes a group tour tracking and gathering system to let tour leaders and travel agencies keep tours organized, ensure tourists’ safety, and reassure the tourists. The system provides several important functions, including fast roll call, ease of congregation, and prevention of members’ involuntary separation from the group. We also design an algorithm for tourists to exchange and record their information. After they gain access to the Internet, they send their own information, as well as the exchanged information of other members, to the database. Tourists help each other update the information, so that travel agencies and tour leaders can perform the tracking function.

Chyi-Ren Dow, Yu-Yun Chang, Chiao-Wen Chen, Po-Yu Lai
Model Based Evaluation of Cybersecurity Implementations

Evaluation of cybersecurity implementations is an important issue that is increasingly being considered on the agenda of organisations. We present here a model for the evaluation of cybersecurity requirements. We start by establishing a set of security requirements in the form of a hierarchical structure to obtain a requirement tree, as prescribed by the Logic Score of Preference (LSP) evaluation method. The security requirements have been taken from the ISO/IEC 27002 standard. This requirement tree and an aggregation structure, built in a later step, form our cybersecurity evaluation model, which allows us to obtain a final numerical result for each system under evaluation. These final indicators, ranging over the interval 0..100, clearly show the degree of compliance of the systems under evaluation with respect to the desired requisites.

Aristides Dasso, Ana Funes, Germán Montejano, Daniel Riesco, Roberto Uzal, Narayan Debnath
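At the core of the LSP method is the Generalized Conjunction/Disjunction aggregator, a weighted power mean over elementary scores. The sketch below shows that operator with hypothetical weights and leaf scores, not the paper's actual ISO/IEC 27002 requirement tree:

```python
import math

def gcd_aggregate(scores, weights, r):
    """LSP Generalized Conjunction/Disjunction: weighted power mean.

    scores lie in 0..100; weights sum to 1; r tunes and-ness/or-ness
    (r < 1 demands simultaneity, r > 1 allows replaceability).
    """
    assert abs(sum(weights) - 1.0) < 1e-9
    if r == 0:  # limit case: weighted geometric mean
        return math.exp(sum(w * math.log(s) for w, s in zip(weights, scores)))
    return sum(w * s ** r for w, s in zip(weights, scores)) ** (1.0 / r)

# Three hypothetical leaf scores: access control, crypto, incident handling.
leaves = [80.0, 65.0, 90.0]
print(gcd_aggregate(leaves, [0.5, 0.3, 0.2], r=-0.72))  # conjunctive aggregation
```

A negative exponent pulls the result toward the worst leaf score, which is why conjunctive aggregators model mandatory requirements.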
Accelerating the Critical Line Algorithm for Portfolio Optimization Using GPUs

Since the introduction of Modern Portfolio Theory by Markowitz in the Journal of Finance in 1952, it has been the underlying theory in several portfolio optimization techniques. With the advancement of computers, most portfolio optimization is done on CPUs. Over the years, papers have introduced various optimization methods, including those introduced by Markowitz, implemented on CPUs. In recent years, GPUs have taken the front seat as a technology that takes computational speeds to new levels. However, very few published papers about portfolio optimization utilize GPUs. Our paper is about accelerating the open-source Critical Line Algorithm for portfolio optimization by using GPUs, more precisely NVIDIA’s GPUs and the CUDA API, for the time-consuming parts of the algorithm.

Raja H. Singh, Lee Barford, Frederick Harris Jr.
Maximum Clique Solver Using Bitsets on GPUs

Finding the maximum clique in a graph is useful for solving problems in many real-world applications. However, the problem is classified as NP-hard, making it very difficult to solve for large and dense graphs. This paper presents one of the only exact maximum clique solvers that takes advantage of the parallelism of Graphics Processing Units (GPUs). The algorithm makes use of bitsets to reduce the amount of storage space needed and takes advantage of bit-level parallelism in hardware to increase performance. The results show that the GPU implementation of the algorithm performs better than the corresponding sequential algorithm in almost all cases; performance gains tend to be more prominent on larger graphs that can be solved using more levels of parallelism.

Matthew VanCompernolle, Lee Barford, Frederick Harris Jr.
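The bitset idea carries over directly to a sequential sketch: represent each vertex's neighbourhood as one bitmask (here a Python int) so set intersection and candidate pruning become single bitwise operations. This is a plain branch-and-bound for illustration, not the paper's GPU kernel:

```python
def max_clique(adj):
    """adj[v] is an int bitmask of v's neighbours. Returns best clique size."""
    best = 0

    def expand(clique_size, cand):
        nonlocal best
        while cand:
            if clique_size + bin(cand).count("1") <= best:
                return                        # bound: cannot beat incumbent
            v = (cand & -cand).bit_length() - 1
            cand &= cand - 1                  # pop lowest set bit (vertex v)
            new_cand = cand & adj[v]          # set intersection = one AND
            if new_cand:
                expand(clique_size + 1, new_cand)
            else:
                best = max(best, clique_size + 1)

    expand(0, (1 << len(adj)) - 1)
    return best

# Triangle 0-1-2 plus a pendant vertex 3 attached to vertex 2:
adj = [0b0110, 0b0101, 0b1011, 0b0100]
assert max_clique(adj) == 3
```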
CUDA Implementation of Computer Go Game Tree Search

Go is a fascinating game that has yet to be played well by a computer program due to its large board size and exponential time complexity. This paper presents a GPU implementation of PV-Split, a parallel version of a widely used game tree search algorithm for two-player zero-sum games. With many game trees, it often takes too much time to traverse the entire tree, but in principle, the deeper the tree is traversed, the more accurate the best move found will be. By parallelizing the Go game tree search, we have successfully reduced the computation time, enabling deeper levels of the tree to be reached in less time. Results for the sequential and GPU implementations were compared, and the highest speedup achieved with the parallel algorithm was approximately 72x at 6 levels deep in the game tree. Although there has been related work on game tree searches on the GPU, no exact best-move search algorithms have been presented for Go, which uses significantly more memory due to its large board size. This paper also presents a technique for reducing the amount of memory required by previous game tree traversal methods while still allowing each processing element to play out games independently.

Christine Johnson, Lee Barford, Sergiu M. Dascalu, Frederick C. Harris Jr.
Negotiation and Collaboration Protocol Based on EbXML Intended to Optimize Port Processes

This paper proposes the replacement of EDI (Electronic Data Interchange) technology, used in ports for information exchange between ports and ships, by a more effective, flexible, and open alternative. The proposal is to design a negotiation protocol based on EbXML (Electronic Business using eXtensible Markup Language) to reduce the interaction among actors in the use case flows of the system. The methodology was based on systems analysis techniques and documents from the port of Santos (Brazil). The results were validated using use case flows, by reducing the number of interactions among the different entities or actors of the system. In conclusion, the study confirms that the substitution of technology is feasible and can be applied to other port information systems, with the advantage of optimizing port processes.

Vinícius Eduardo Ferreira dos Santos Silva, Nunzio Marco Torrisi, Rodrigo Palucci Pantoni
Reversible Data Hiding Scheme Using the Difference Between the Maximum and Minimum Values

Qu et al. proposed a PPVO-based reversible data hiding technique in 2015. Their scheme obtains the predicted error value by using context pixels, and uses the current pixel, the predicted error value, and the block pixels except for the current pixel. The PPVO scheme has a complex process because of its embedding and extraction methods. In this paper, we propose an improved reversible data hiding scheme that uses the difference between the maximum and minimum values. In our scheme, the difference values are used for secret message embedding and extraction. Our scheme also satisfies the reversibility property of reversible data hiding. In the experimental results, the embedding capacity and image quality of the proposed scheme are superior in comparison with Qu et al.’s PPVO scheme.

Pyung-Han Kim, Kwang-Yeol Jung, In-Soo Lee, Kee-Young Yoo
The Design, Data Flow Architecture, and Methodologies for a Newly Researched Comprehensive Hybrid Model for the Detection of DDoS Attacks on Cloud Computing Environment

As cloud computing services become more practical and popular owing to their convenience and economy, their security vulnerability has been a continuous threat for both cloud service providers and clients. As the financial benefits of these services become more attractive and the need for uninterrupted service grows, distributed denial of service (DDoS) attacks that degrade and bring down service availability have become the major security concern. While researchers try to address this security threat and come up with lasting solutions for the early detection of DDoS attacks, the scale of these attacks keeps growing and becoming more sophisticated. The changing and aggressive nature of the attacks makes them a severe threat for which it is difficult to find a remedy. In this research paper, we present the design and dataflow architectures for a solution model that could contribute to resolving one of the major cloud security threats, the early detection of DDoS attacks, using a hybrid approach.

Anteneh Girma, Kobi Abayomi, Moses Garuba
Educational Gaming: Improved Integration Using Standard Gaming Genres

A common problem seen in educational games is a lack of tight integration of educational content within the ‘fun’ portion of the game. We propose and test an educational game using the Shoot’em Up genre as a template for gameplay in which the educational content within the game is constantly presented to the user with the goal of ensuring that this content feels integral to the game. The goal is to enhance the appeal and fun factor of the game in the hope of increasing a learner’s motivation to play the game, thus increasing their exposure to the contained educational content.

Ben Brown, Sergiu Dascalu
Designing a Web-Based Graphical Interface for Virtual Machine Management

In any cloud computing architecture, virtualization is very important. We can accomplish this by creating virtual machines: guest operating systems that run on top of a host operating system by means of a hypervisor. There are many hypervisors (Xen, KVM, VirtualBox), each having advantages and disadvantages. The goal of this project is to use the KVM hypervisor, which comes standard with many Linux distributions, to host virtual machines that are deployed from a web-based graphical user interface.

Harinivesh Donepudi, Bindu Bhavineni, Michael Galloway
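A web GUI's backend would typically talk to KVM through an API such as libvirt. A minimal sketch of that interaction using the libvirt-python bindings is below; the connection URI is the standard local-KVM one, while the management layer shown is an assumption about how such a project could be wired, not the paper's actual interface:

```python
import libvirt  # pip install libvirt-python

def list_vms(uri="qemu:///system"):
    """Return (name, running?) for every VM known to the local KVM host."""
    conn = libvirt.open(uri)
    try:
        return [(dom.name(), bool(dom.isActive())) for dom in conn.listAllDomains()]
    finally:
        conn.close()

def start_vm(name, uri="qemu:///system"):
    """Boot a defined-but-stopped domain by name."""
    conn = libvirt.open(uri)
    try:
        dom = conn.lookupByName(name)   # raises libvirt.libvirtError if unknown
        if not dom.isActive():
            dom.create()                # start the domain
    finally:
        conn.close()

print(list_vms())
```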
Towards Providing Resource Management in a Local IaaS Cloud Architecture

Cloud computing is a rapidly growing branch of distributed computing. A vertical implementation of a cloud architecture could be used to replace a traditional computer lab in an educational setting. However, to do this the architecture requires a middleware that can communicate across nodes. This paper discusses a middleware developed in Python that uses sockets to communicate between compute nodes and the head node within such an architecture. Specifically, the middleware uses a socket connection between a client program installed on the head node and server programs installed on each compute node to poll the compute nodes for information. It then uses that information to carry out a load balancing algorithm that checks the available resources on each compute node and starts a virtual machine (VM) on the node with the most available resources. This paper discusses in detail how these functions are accomplished.

Travis Brummett, Michael Galloway
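A stripped-down sketch of the polling half of such a middleware, in the same Python-and-sockets style the abstract describes. The one-line JSON status protocol, port number, and resource field names are hypothetical stand-ins for the paper's client/server programs:

```python
import json
import socket

def poll_node(host, port=5001, timeout=2.0):
    """Ask a compute node's server program for its resource status."""
    with socket.create_connection((host, port), timeout=timeout) as sock:
        sock.sendall(b"STATUS\n")              # hypothetical request line
        reply = sock.makefile("r").readline()  # e.g. '{"free_mem_mb": 2048, "idle_cpus": 4}'
    return json.loads(reply)

def least_loaded(hosts):
    """Load balancing: pick the node with the most available resources."""
    stats = {h: poll_node(h) for h in hosts}
    return max(stats, key=lambda h: (stats[h]["free_mem_mb"],
                                     stats[h]["idle_cpus"]))

# The head node would then start the next VM on the chosen node:
# target = least_loaded(["compute-01", "compute-02", "compute-03"])
```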
Integration of Assistive Technologies into 3D Simulations: An Exploratory Study

Currently, there are few commercially available video games for people with disabilities. Sim-Assist is a software system that aims to allow people with disabilities to interface with a three-dimensional simulation game of Air Hockey. This is accomplished through various integrated assistive technologies, such as brain-computer interfacing, voice commands, and speech-to-text capabilities. With Sim-Assist, users are able to play Air Hockey without depending on sight, and in a hands-free manner. We conducted an exploratory study to provide the foundation for integrating assistive technologies in 3D simulations, including scientific simulations and serious games. In this paper, we introduce the research and development of assistive technologies, focusing on the BCI software component; outline our system design and implementation; provide a short walkthrough of the interface; and briefly discuss our preliminary results.

Angela T. Chan, Alexander Gamino, Frederick C. Harris Jr., Sergiu Dascalu
Data Profiling Technology of Data Governance Regarding Big Data: Review and Rethinking

Data profiling technology is very valuable for data governance and data quality control because people need it to verify and review the quality of structured, semi-structured, and unstructured data. In this paper, we first review relevant works and discuss their definitions of data profiling. Second, we offer a new definition and propose new classifications for data profiling tasks. Third, we present several free and commercial profiling tools. Fourth, we offer new data quality metrics and a data quality score calculation. Finally, we discuss a data profiling tool framework for big data.

Wei Dai, Isaac Wardlaw, Yu Cui, Kashif Mehdi, Yanyan Li, Jun Long
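For structured data, many single-column profiling tasks of the kind this paper classifies reduce to a few aggregate statistics. A small pandas sketch (the column names and the quality dimensions chosen here are illustrative, not the paper's metric set):

```python
import pandas as pd

def profile_columns(df: pd.DataFrame) -> pd.DataFrame:
    """Per-column profile: type, completeness, distinctness."""
    n = len(df)
    return pd.DataFrame({
        "dtype":        df.dtypes.astype(str),
        "nulls":        df.isna().sum(),
        "completeness": (1 - df.isna().mean()).round(3),  # quality dimension
        "distinct":     df.nunique(),
        "uniqueness":   (df.nunique() / n).round(3),
    })

df = pd.DataFrame({"id": [1, 2, 3, 4], "city": ["NY", "LA", None, "NY"]})
print(profile_columns(df))
```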
Understanding the Service Model of Mobile Social Network in Live 4G Network: A Case Study of WeChat Moments

Driven by the rapid advancement of both cellular network technologies and mobile devices, mobile social network (MSN) services significantly facilitate personal and business communications, and therefore inevitably consume substantial network resources and potentially affect network stability. Despite their huge user base and popularity, little work has been done to characterize the service model of MSN services. Thus, in light of the many works on MSN services focusing on their various aspects, this paper aims to provide a deep understanding of the service model of MSN services at different levels. In order to reach credible conclusions, our research is based on a large-scale data set collected inside a province-wide 4G cellular data network in China. Specifically, we choose WeChat Moments (WM) as a case study, since it is one of the most popular MSN services in China, with 600 million active users per month.

Weijian Sun, Dandan Miao, Xiaowei Qin, Guo Wei
Applying Scrum in an Interdisciplinary Project for Fraud Detection in Credit Card Transactions

This paper describes the use of the Scrum agile method for the development of an academic prototype system using Big Data, Internet of Things, and mobile devices as a solution for detecting and reducing fraud in credit card transactions. It highlights the involvement of graduate students at the Brazilian Aeronautics Institute of Technology (Instituto Tecnologico de Aeronautica - ITA) during the first semester of 2015. The major contribution is the development of a Proof of Concept (PoC) using Interdisciplinary Problem-Based Learning (IPBL), where geographically dispersed students from three different courses worked on a real problem and had to develop a solution within seventeen weeks. At the end of the project, it was possible to deliver an academic prototype system in which mobile devices were able to carry out transactions with good performance, using the Internet of Things (IoT), and enabling mass data analyses within a Big Data environment in a transparent way.

Mayara Valeria Morais dos Santos, Paulo Diego Barbosa da Silva, Andre Gomes Lamas Otero, Ramiro Tadeu Wisnieski, Gildarcio Sousa Goncalves, Rene Esteves Maria, Luiz Alberto Vieira Dias, Adilson Marques da Cunha
The Visual Representation of Numerical Solution for a Non-stationary Deformation in a Solid Body

One of the most interesting directions in the field of solid body deformation mechanics, of an applied nature, is the solution of half-plane non-stationary deformation problems, in which time-varying loads are superimposed on a rectangular base. Such problems arise, for example, in the study of the stress-strain state around a well, its design, and the development of pile-driving technology. The main idea of this paper is to show the use of numerical methods to visualize the results of solving problems of non-stationary half-plane deformation.

Zhanar Akhmetova, Seilkhan Boranbayev, Serik Zhuzbayev

Software Engineering

Frontmatter
Developing Usability Heuristics for Grid Computing Applications: Lessons Learned

Several methods allow measuring the degree of usability of a software product. Heuristic evaluation is one of the most popular, finding more usability problems in comparison to other methods. However, sets of generic usability heuristics may not evaluate (usability) aspects related to specific software. A set of usability heuristics for Grid Computing was developed using the methodology proposed by Rusu and others in 2011. The methodology facilitated the heuristics’ design and specification, with stages that support the development process and allow formally specifying the heuristics. The paper highlights certain deficiencies detected when applying this methodology and suggests some improvements.

Daniela Quiñones, Cristian Rusu, Silvana Roncagliolo, Virginica Rusu, César A. Collazos
Refining Timing Requirements in Extended Models of Legacy Vehicular Embedded Systems Using Early End-to-end Timing Analysis

Model- and component-based development approaches have emerged as an attractive option to deal with the complexity of vehicle software. Using these approaches, we provide a method to estimate and refine end-to-end timing requirements in vehicular distributed embedded systems that are developed by reusing the models of legacy systems. This method leverages the early end-to-end timing analysis that can be performed at the highest abstraction level during the development of these systems. As a proof of concept, we conduct a vehicular-application case study to show the process of estimating and refining the timing requirements early during development. The case study is modeled with the Rubus-ICE tool suite, which is used for the software development of vehicular embedded systems by several international companies.

Saad Mubeen, Thomas Nolte, John Lundbäck, Mattias Gålnander, Kurt-Lennart Lundbäck
Integrated Metrics Handling in Open Source Software Quality Management Platforms

Software quality is of vital importance in software development projects. It influences every aspect of the system, such as functionality, reliability, availability, maintainability, and safety. In critical software projects, quality assurance has to be considered at every level of the software engineering process, from the initial concept and specification to coding and integration. At the lowest, coding level, there are several tools that enable the monitoring and control of software quality. One of them is SonarQube, an open source quality management platform used to analyse and measure technical quality. It can be extended through plugins for customization and integration with other tools. The conception and development of these plugins is a significant design effort that ensures the correct handling of the different phases involved in the software quality process. We present an initial design and development of an integrated analyser component that extends the functionality of this open source framework for software quality management.

Julio Escribano-Barreno, Javier García-Muñoz, Marisol García-Valls
Supporting the Development of User-Driven Service Composition Applications

One of the software engineering challenges is the development of applications that can adapt to the heterogeneous needs of users. The technique of dynamic, user-driven composition of services is a solution for developing applications capable of overcoming this challenge. This type of application, which we will call a User-Driven Service Composition Application (UDSCA), allows services to be composed during execution, thus meeting the needs of users. But the lack of guidance on how to develop UDSCAs can make development a complex task, because it may involve solutions unknown to developers, thus harming the development team and the developed application. Therefore, the objective of this work is to guide developers during the development of this kind of application. To develop the approach, we defined which activities should be undertaken during development, as well as the concepts, techniques, artifacts, technologies, and tools needed to perform these activities. To evaluate the approach, we conducted a case study in which a UDSCA was developed in the field of building maintenance services. The resulting application was shown to be able to adapt to the heterogeneous needs of users, and the approach provided artifacts that promoted reuse. In conclusion, the approach guides the developer during UDSCA development and provides artifacts that reduce development effort.

Alex Roberto Guido, Antonio Francisco do Prado, Wanderley Lopes de Souza, Eduardo Gonçalves da Silva
Mining Source Code Clones in a Corporate Environment

Much research on code clone detection relies on Open Source Software (OSS) repositories. These cases do not reflect the corporate code development scenario. Large companies’ repositories are protected from public access, so their content and behavior remain a black box from the researchers’ viewpoint. This article presents an experiment performed on systems developed at a large private education company, to observe and compare the incidence of cloned code in proprietary software with other studies involving open source systems, using different similarity thresholds. The results indicate that the closed-source repository presents a clone incidence similar to the OSS ones.

Jose J. Torres, Methanias C. Junior, Francisco R. Santos
Fuzzy Resource-Constrained Time Workflow Nets

The underlying proposal of this work is to express in a more realistic way the resource allocation mechanisms when human behavior is considered in Workflow activities. To accomplish this, fuzzy sets delimited by possibility distributions are associated with the Petri net models that represent human-type resource allocation mechanisms. Additionally, the durations of activities that appear on the routes (control structure) of the Workflow process are represented by fuzzy time intervals produced through a kind of constraint propagation mechanism. New firing rules based on a joint possibility distribution are then defined. Finally, the model is built using the CPN (Colored Petri Net) language supported by CPN Tools.

Joslaine Cristina Jeske de Freitas, Stéphane Julia
Performance Indicators Analysis in Software Processes Using Semi-supervised Learning with Information Visualization

The software development process requires judicious quality control, using performance indicators to support decision-making in the different process chains. This paper recommends the use of machine learning with semi-supervised algorithms to analyze these indicators. In this context, the paper proposes the use of multidimensional information visualization techniques to support the labeling process of samples, increasing the reliability of the labeled indicators (group or individual). The experiments analyze real indicator data from a software development company and use the bio-inspired Particle Competition and Cooperation algorithm. The information visualization techniques used are Least Square Projection, Classical Multidimensional Scaling, and Parallel Coordinates. These techniques help to correct the labeling process performed by specialists (labelers), enabling the identification of mistakes in order to improve data accuracy for the application of the semi-supervised algorithm.

Leandro Bodo, Hilda Carvalho de Oliveira, Fabricio Aparecido Breve, Danilo Medeiros Eler
Techniques for the Identification of Crosscutting Concerns: A Systematic Literature Review

Modularization is a goal that is difficult to achieve in software development. Some requirements, named crosscutting concerns, cannot be clearly mapped into isolated source code units, and their implementations tend to cut across multiple units. Although several studies propose new approaches to identify crosscutting concerns, few works aim to provide analysis, synthesis, and documentation of the aspect mining literature. To address this research gap, we conducted a systematic literature review to provide researchers with the state of the art of existing aspect mining techniques. We point out challenges and open issues in most of the techniques analyzed that could be improved in further research.

Ingrid Marçal, Rogério Eduardo Garcia, Danilo Medeiros Eler, Celso Olivete Junior, Ronaldo C. M. Correia
ValiPar Service: Structural Testing of Concurrent Programs as a Web Service Composition

The testing of concurrent programs is essential to ensure the quality of such programs. One of the main challenges of this testing activity is to provide tools that enable it at a reasonable operational cost. The service orientation paradigm provides guidelines for the development of tools as services, addressing this requirement. The division of the structural testing of concurrent programs into services faces fundamental challenges. The objective of this paper is to provide structural testing of concurrent programs as a Web service composition. We divided this monolithic structure by defining the concepts, relations, and parameters of the structural testing of concurrent programs to support this activity using a service composition. Our main contributions are the definition of Web services for the structural testing of concurrent software, focusing on the service contracts and capabilities. The developed services present flexible contracts and capabilities that can be reused in different contexts.

Rafael R. Prado, Paulo S. L. Souza, Simone R. S. Souza, George G. M. Dourado, Raphael N. Batista
A Model-Driven Solution for Automatic Software Deployment in the Cloud

Through virtualization, cloud computing offers resources that reduce costs for institutions that use hardware and software resources. In this paper, we present a model-based approach to automatically deploy software in the cloud. To evaluate our approach, we conducted an experiment in an IT company in which software developers used our solution instead of manually deploying software in the cloud. Afterwards, they answered a survey, so we could investigate the following metrics: maintainability, learnability, and reduction of the developer’s workload to deploy software services. The results showed that our solution had a positive impact of at least 25% for all the metrics. Moreover, since our approach relies upon UML models, it requires less effort to deploy the services, and it can be used by any professional with basic UML skills.

Franklin Magalhães Ribeiro Jr., Tarcísio da Rocha, Joanna C. S. Santos, Edward David Moreno
Software Process Improvement in Small and Medium Enterprises: A Systematic Literature Review

Knowledge of the characteristics and profile of a company is key to planning its software process improvement. It helps focus efforts to promote alignment with organizational culture and to support the consolidation of best practices already implemented. This paper presents a systematic literature review to identify evidence in the literature related to the challenges and opportunities of adopting software process improvement in small and medium enterprises. The results of the study indicate that there are relevant issues that should be considered for the effective adoption of software engineering best practices in small and medium enterprises.

Gledston Carneiro da Silva, Glauco de Figueiredo Carneiro
Investigating Reputation in Collaborative Software Maintenance: A Study Based on Systematic Mapping

[Background] Reputation systems have attracted the attention of researchers when it comes to collaborative systems. In the context of collaborative software maintenance, systems of this type are employed to facilitate the collection, aggregation, and distribution of reputation information about a participant. GiveMe Infra is an infrastructure that supports collaborative software maintenance performed both by co-located and geographically distributed teams. In the latter case, reputation is one of the factors that influence collaboration. Despite this recognized relevance, to the best of our knowledge there is a shortage of tools providing reputation functionalities in the context of collaborative software maintenance. [Objective] GiveMe Infra needs to identify and correlate the metrics, measures, criteria, and factors (called parameters) that are used in defining the reputation value of an entity. These parameters, used to determine the degree of reputation, can provide evidence of parameters to be used in the context of software maintenance and evolution. [Method] In order to achieve this goal, a systematic mapping was performed. Both the established protocol and the process adopted during the mapping are shown in this article. The parameters identified, as well as how to apply them in the context of software maintenance, are demonstrated through an analysis scenario. [Results] Our goal has been achieved, since the systematic mapping allowed the identification of the parameters used in defining reputation, and even of parameters that do not allow a correlation with the context of collaborative software maintenance. The contribution of this research is threefold: the investigation carried out, the list of identified parameters, and their application in the collaborative software maintenance context.

Cláudio Augusto S. Lélis, Marco Antônio P. Araújo, José Maria N. David, Glauco de F. Carneiro
A 2-Layer Component-Based Architecture for Heterogeneous CPU-GPU Embedded Systems

Traditional embedded systems are evolving into heterogeneous systems in order to address new and more demanding software requirements. Modern embedded systems are constructed by combining different computation units, such as traditional CPUs with Graphics Processing Units (GPUs). Adding GPUs to conventional CPU-based embedded systems enhances the computation power but also increases the complexity of developing software applications. A method that can help tackle the software complexity of heterogeneous systems is component-based development. The allocation of the software application onto the appropriate computation node is greatly influenced by the system information load, and the allocation process becomes more difficult when complex CPU-GPU systems are used instead of common CPU-based systems. This paper presents a 2-layer component-based architecture for heterogeneous embedded systems whose purpose is to ease the software-to-hardware allocation process. The solution abstracts the detailed CPU-GPU component-based design into single software components in order to decrease the amount of information delivered to the allocator. The last part of the paper describes the activities of the allocation process using our proposed solution, applied to a real system demonstrator.

Gabriel Campeanu, Mehrdad Saadatmand

High-Performance Computing Architectures

Frontmatter
Block-Based Approach to 2-D Wavelet Transform on GPUs

This paper introduces a new approach to the computation of the 2-D discrete wavelet transform on modern GPUs. The proposed approach involves block-based processing, enabling a single seamless transform even for high-resolution input data. Inside the blocks, two distinct methods can be used: either a separable or a non-separable 2-D lifting scheme. Furthermore, the paper presents a comparison of the proposed approach under different conditions to the best existing methods, showing that our approach consistently outperforms them. Our methods are implemented using the OpenCL framework and tested on a wide range of GPUs.

Michal Kula, David Barina, Pavel Zemcik
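
For readers unfamiliar with the lifting scheme mentioned in this abstract, the sketch below shows a separable 2-D transform built from 1-D lifting steps, using the reversible CDF 5/3 filter. This is only an illustrative NumPy version, not the authors' GPU implementation, and it uses periodic extension at the edges for brevity where practical codecs use symmetric extension.

    import numpy as np

    def lift_rows(a):
        """One CDF 5/3 lifting step along the last axis (reversible, integer)."""
        s = a[..., 0::2].astype(np.int64).copy()    # even samples -> approximation
        d = a[..., 1::2].astype(np.int64).copy()    # odd samples  -> detail
        d -= (s + np.roll(s, -1, axis=-1)) >> 1     # predict: subtract even-neighbour mean
        s += (np.roll(d, 1, axis=-1) + d + 2) >> 2  # update: keep the approximation mean close
        return s, d

    def dwt2(img):
        """Separable 2-D transform: lift all rows, then the columns of each subband."""
        L, H = lift_rows(img)
        LL, LH = lift_rows(L.swapaxes(0, 1))
        HL, HH = lift_rows(H.swapaxes(0, 1))
        return (LL.swapaxes(0, 1), LH.swapaxes(0, 1),
                HL.swapaxes(0, 1), HH.swapaxes(0, 1))

    img = np.arange(64, dtype=np.int64).reshape(8, 8)
    LL, LH, HL, HH = dwt2(img)
    print(LL.shape, LH.shape, HL.shape, HH.shape)   # four 4x4 subbands
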
Deriving CGM Based-Parallel Algorithms for the Optimal Binary Search-Tree Problem

This paper presents a methodology to derive CGM (Coarse Grain Multicomputer) parallel algorithms for the cost of the Optimal Binary Search Tree problem (OBST problem). Depending on the parameter we want to optimize, we derive an algorithm accordingly; we thereby obtain a load-balanced algorithm, a time-efficient algorithm, and an algorithm with a minimum number of communication rounds. Our CGM algorithms use p processors, each with O(n²/p) local memory. The one with the best communication requires O(√p) communication rounds and O(n²/p) computations on each processor, while the one with the best time efficiency needs only O(n²/p) time steps. All local computations on processors are based on Knuth's sequential algorithm for the OBST problem.

Vianney Kengne Tchendji, Jean Frédéric Myoupo, Gilles Dequen
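
Since the local computations build on Knuth's sequential algorithm, a minimal sketch of that O(n²) dynamic program may help; the probabilities below are made up, and the CGM partitioning across processors is not shown.

    def obst_cost(p):
        """Knuth's O(n^2) dynamic program for the optimal BST cost.
        p[i] is the access probability of key i; dummy keys are omitted."""
        n = len(p)
        # w[i][j]: weight of keys i..j-1; c[i][j]: optimal cost; root[i][j]: best root
        w = [[0] * (n + 1) for _ in range(n + 1)]
        c = [[0] * (n + 1) for _ in range(n + 1)]
        root = [[0] * (n + 1) for _ in range(n + 1)]
        for i in range(n):
            w[i][i + 1] = c[i][i + 1] = p[i]
            root[i][i + 1] = i
        for length in range(2, n + 1):
            for i in range(n - length + 1):
                j = i + length
                w[i][j] = w[i][j - 1] + p[j - 1]
                best, best_r = float('inf'), -1
                # Knuth's speed-up: the optimal root is monotone in i and j,
                # which bounds the search range and yields O(n^2) overall.
                for r in range(root[i][j - 1], root[i + 1][j] + 1):
                    cost = c[i][r] + c[r + 1][j]
                    if cost < best:
                        best, best_r = cost, r
                c[i][j] = best + w[i][j]
                root[i][j] = best_r
        return c[0][n]

    print(obst_cost([0.25, 0.2, 0.05, 0.2, 0.3]))
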
Experimental Evaluations of MapReduce in Biomedical Text Mining

In this paper, we present our development of two biomedical text mining applications: biomedical literature search (BLS) and biomedical association mining (BAM). While the former requires fewer computations, the latter is more computationally intensive. Experimental studies were conducted using Amazon Elastic MapReduce (EMR) with an input of 33,960 biomedical articles from the TREC (Text REtrieval Conference) 2006 Genomics Track. Our experimental results indicated that neither application's scalability was linear in terms of the number of computing nodes. Meanwhile, BAM achieved better scalability than BLS, since BLS performs fewer computations and was primarily dominated by overheads such as JVM startup, scheduling, and disk I/O. These observations imply that the existing MapReduce framework may not be suitable for online systems, such as literature search, that need quick responses.

Yanqing Ji, Yun Tian, Fangyang Shen, John Tran
Algorithmic Approaches for a Dependable Smart Grid

We explore options for integrating sustainable and renewable energy into the existing power grid, or even creating a new power grid model. We present various theoretical concepts necessary to meet the challenges of a smart grid. We first present a supply-and-demand model of the smart grid to compute the average number of conventional power generators required to meet demand during high-consumption hours; the model is developed using a Fluid Stochastic Petri Net (FSPN) approach. We propose to model the situations that require decisions to throttle down the energy supplied by traditional power plants using game-theoretic online competitive models. We also present the power-down model, which has been shown to be competitive in worst-case scenarios, and we lay the groundwork for addressing the multi-state dynamic power management problem.

Wolfgang Bein, Bharat B. Madan, Doina Bein, Dara Nyknahad
Performance Evaluation of Data Migration Methods Between the Host and the Device in CUDA-Based Programming

The CUDA programming model is heterogeneous, composed of two components: the host (CPU) and the device (GPU). The two components have separate memory spaces and processing units, and a great challenge in increasing GPU-based application performance is the data migration between these memory spaces. Currently, the CUDA platform supports the following data migration methods: UMA, zero-copy, and pageable and pinned memory. In this paper, we compare the performance of the zero-copy method with the other methods by considering the overall application runtime. Additionally, we investigated aspects of the data migration process to explain the causes of the performance variations. The results demonstrate that in some cases zero-copy memory can provide average performance 19% higher than pinned-memory transfer; in the studied situation, this method was the second most efficient. Finally, we present limitations of zero-copy memory as a resource for improving the performance of CUDA applications.

Rafael Silva Santos, Danilo Medeiros Eler, Rogério Eduardo Garcia
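
A rough feel for the pinned versus zero-copy comparison can be had in Python with the numba package; this is not the authors' benchmark, it requires a CUDA-capable GPU, and the timings are only indicative. In numba, cuda.pinned_array allocates page-locked host memory that is copied to the device, while cuda.mapped_array gives a zero-copy buffer that the kernel reads directly from host RAM.

    import time
    import numpy as np
    from numba import cuda

    @cuda.jit
    def scale(a):
        i = cuda.grid(1)
        if i < a.size:
            a[i] *= 2.0

    n = 1 << 22

    def timed_run(host):
        host[:] = 1.0
        t0 = time.perf_counter()
        scale[(n + 255) // 256, 256](host)   # numba copies or maps the buffer as needed
        cuda.synchronize()
        return time.perf_counter() - t0

    pinned = cuda.pinned_array(n, dtype=np.float32)  # page-locked host memory (explicit copies)
    mapped = cuda.mapped_array(n, dtype=np.float32)  # zero-copy: kernel reads host RAM directly
    print("pinned:    %.4f s" % timed_run(pinned))
    print("zero-copy: %.4f s" % timed_run(mapped))
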
Design of a Deadlock-Free XY-YX Router for Network-on-Chip

With the increasing number of cores in multiprocessor Systems-on-Chip (SoC), the design of an efficient communication fabric is essential to satisfy the bandwidth requirements of multiprocessor systems. Scalable Networks-on-Chip (NoCs) are becoming the standard communication framework to replace bus-based networks. In this paper, we propose a deadlock-free XY-YX router for on-chip networks. In order to prevent deadlocks, we exploit additional physical channels in the horizontal direction and optimize the priority of output channel allocation. Experimental results show that the proposed deadlock-free router enhances the throughput of the NoC.

Sang Muk Lee, Eun Nu Ri Ko, Young Seob Jeong, Seung Eun Lee
An FPGA Based Compression Accelerator for Forex Trading System

In this paper, we propose an FPGA-based hardware accelerator for a forex trading system. In the forex trading market, the trading volume of currencies grows larger every year. In order to provide real-time processing of large volumes and a high-availability service, we focused on two types of workload where bottlenecks occur. The bottleneck between an application server and an internal hard disk is caused by the overhead of storing transaction logs, due to the bandwidth limitation of the hard disk. Our key idea is to suppress the overhead of transaction logging through high-throughput hardware compression. Compared to software compression, our hardware accelerator achieved 6x higher compression throughput.

Ji Hoon Jang, Seong Mo Lee, Oh Seong Gwon, Seung Eun Lee

Agile Software Testing and Development

Frontmatter
An Academic Case Study Using Scrum

This paper presents an academic experience in which a real-world software development problem was tackled by four classes of students. Within this context, the professors proposed the development of a project named SI-GAC, a Portuguese acronym that can be freely translated into English as "Integrated Accident and Crisis Management System". This project, with social community benefits, had as its main objective the development of an embedded software system for crisis management. The Scrum framework and agile methods were used, with the purpose of training the students in these technologies on a real-world problem. Four sprints were carried out, and it was possible to show that, with some adaptations, satisfactory results can be achieved even with geographically dispersed teams.

Luciana Rinaldi Fogaça, Luiz Alberto Vieira Dias, Adilson Marques da Cunha
Distributed Systems Performance for Big Data

This paper describes a methodology for working with distributed systems and achieving performance on Big Data through the Hadoop framework, the Python programming language, and the Apache Hive module. The efficiency of the proposed methodology is tested through a case study that addresses a real problem found in the supercomputing environment of the Center for Weather Forecasting and Climate Studies of the Brazilian Institute for Space Research (CPTEC/INPE), which provides society with work capable of predicting disasters and saving lives. Three experiments involving the problem were run on the Cray XT-6 supercomputer: (i) the first uses Python programming on a sequential, single-processor architecture; (ii) the second uses Python and the Hadoop framework on a parallel and distributed architecture; (iii) the third combines Hadoop and Hive on a parallel and distributed architecture. The main results of these experiments are compared and discussed, and topics beyond the scope of this research are presented as recommendations and suggestions for future work.

Marcelo Paiva Ramos, Paulo Marcelo Tasinaffo, Eugenio Sper de Almeida, Luis Marcelo Achite, Adilson Marques da Cunha, Luiz Alberto Vieira Dias
Towards Earlier Fault Detection by Value-Driven Prioritization of Test Cases Using Fuzzy TOPSIS

Software testing in industrial projects typically requires large test suites, and executing them is commonly expensive in terms of effort and wall-clock time. Indiscriminately executing all available test cases leads to sub-optimal exploitation of testing resources; selecting too few test cases for execution, on the other hand, might leave a large number of faults undiscovered. Limiting factors such as the allocated budget and time constraints for testing further emphasize the importance of test case prioritization for identifying test cases that enable earlier detection of faults while respecting such constraints. This paper introduces a novel method for prioritizing test cases for earlier fault detection. The method combines TOPSIS decision making with fuzzy principles and is based on multiple criteria such as fault detection probability, execution time, and complexity. Applying the method in an industrial context, for testing a train control management subsystem from Bombardier Transportation in Sweden, shows its practical benefit.

Sahar Tahvili, Wasif Afzal, Mehrdad Saadatmand, Markus Bohlin, Daniel Sundmark, Stig Larsson
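
As background, the sketch below shows the crisp (non-fuzzy) TOPSIS ranking step on a made-up decision matrix with the three criteria named in the abstract; the fuzzy extension used by the authors is not reproduced, and all weights and values here are illustrative assumptions.

    import numpy as np

    # Rows: test cases; columns: fault detection probability, execution time, complexity.
    X = np.array([[0.9, 30.0, 5.0],
                  [0.6, 10.0, 2.0],
                  [0.8, 25.0, 8.0]])
    weights = np.array([0.5, 0.3, 0.2])
    benefit = np.array([True, False, False])   # higher is better only for column 0

    V = weights * X / np.linalg.norm(X, axis=0)        # vector-normalise, then weight
    ideal = np.where(benefit, V.max(axis=0), V.min(axis=0))
    worst = np.where(benefit, V.min(axis=0), V.max(axis=0))
    d_pos = np.linalg.norm(V - ideal, axis=1)          # distance to the ideal solution
    d_neg = np.linalg.norm(V - worst, axis=1)          # distance to the anti-ideal solution
    closeness = d_neg / (d_pos + d_neg)                # rank by relative closeness
    print("execution order:", np.argsort(-closeness))  # highest closeness first
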

Model-Driven Engineering for Cyber-Physical Systems

Frontmatter
Experimenting with a Load-Aware Communication Middleware for CPS Domains

This paper describes the design of a middleware that is able to perform hybrid load balancing in a distributed server farm, ensuring timely load balancing and client service. The described middleware is based on a distributed, modular software architecture that allows interfacing with sensor nodes operating in a physical environment. The core entity is a balancer module that can run different algorithms in order to obtain different results for parameters such as client response times, response dispatching at the servers, or the interaction strategy with the sensor nodes. A software implementation over a specific object-oriented communication infrastructure is achieved, and a validation scenario is presented to show the suitability of our design. We test it in an actual multi-process and multi-threaded scenario, analyzing its latency under several workload conditions.

Luis Cappa-Banda, Marisol García-Valls
Adaptive Message Restructuring Using Model-Driven Engineering

Message exchange between distributed software components in cyber-physical systems is a frequent and resource-demanding activity. Existing data description languages simply map user-specified messages literally to the system implementation, creating the data stream that is exchanged between the software components; however, our research shows that the exchanged information is often redundant and would allow for runtime optimization. In this paper, we propose a model-based approach for adaptive message restructuring. Taking both design-time and runtime properties into account, we propose to dynamically restructure user-specified messages to achieve better resource usage (e.g., reduced latency). Our model-based workflow also includes formal verification of adaptive message restructuring in the presence of complex data flow. This is demonstrated on an automotive example.

Hang Yin, Federico Giaimo, Hugo Andrade, Christian Berger, Ivica Crnkovic
Towards Modular Language Design Using Language Fragments: The Hybrid Systems Case Study

Cyber-physical systems can be best represented using hybrid models that contain specifications of both continuous and discrete-event abstractions. The syntax and semantics of such hybrid languages should ideally be defined by reusing the syntax and semantics of each component formalism. In language composition, semantic adaptation is needed to ensure correct realization of the concepts that are part of the intricacies of the hybrid language. In this paper, we present a technique for the composition of heterogeneous languages by explicitly modelling the semantic adaptation between them. Each modelling language is represented as a language specification fragment (LSF): a modular representation of its syntax and semantics. The basis of our technique is to reuse the operational semantics as defined in existing simulators. Our approach is demonstrated by means of a hybrid language composed of timed finite state machines (TFSA) and causal block diagrams (CBDs).

Sadaf Mustafiz, Bruno Barroca, Claudio Gomes, Hans Vangheluwe

Data Mining

Frontmatter
Data Clustering Using Improved Fire Fly Algorithm

Clustering is considered one of the most important techniques in data mining and is used for data analysis in areas such as text identification, image processing, economic science, and spatial data analysis. Several algorithms using different techniques have been proposed for solving the clustering problem. The firefly algorithm, inspired by the way this insect produces its flashing lights, is one of the algorithms designed around the collective behavior of insects. In this paper, an improved firefly evolutionary algorithm is used for solving the clustering problem. The proposed algorithm is compared with the standard firefly algorithm, the differential evolution algorithm, and the k-means algorithm on several important data sets from the UCI repository. According to the results, the proposed algorithm provides better clustering than the standard firefly algorithm.

Mehdi Sadeghzadeh
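
To make the mechanics concrete, here is a minimal firefly-style clustering sketch in which each firefly encodes a candidate set of centroids and brightness is the negative clustering error. The synthetic data and all parameter values are assumptions, and the paper's specific improvements are not reproduced.

    import numpy as np

    rng = np.random.default_rng(0)
    data = rng.normal(0, 1, (150, 2)) + rng.choice([-3, 0, 3], (150, 1))  # 3 loose clusters
    k, n_ff, iters = 3, 12, 60
    beta0, gamma, alpha = 1.0, 0.5, 0.2

    def sse(cents):
        # Clustering objective: sum of squared distances to the nearest centroid.
        d = ((data[:, None, :] - cents[None, :, :]) ** 2).sum(-1)
        return d.min(axis=1).sum()

    # Each firefly is a candidate set of k centroids; brightness = -SSE.
    ff = rng.uniform(data.min(), data.max(), (n_ff, k, 2))
    for _ in range(iters):
        cost = np.array([sse(f) for f in ff])
        for i in range(n_ff):
            for j in range(n_ff):
                if cost[j] < cost[i]:                   # firefly j is brighter than i
                    r2 = ((ff[i] - ff[j]) ** 2).sum()
                    beta = beta0 * np.exp(-gamma * r2)  # attractiveness decays with distance
                    ff[i] += beta * (ff[j] - ff[i]) + alpha * rng.normal(0, 0.1, (k, 2))
                    cost[i] = sse(ff[i])
    best = ff[np.argmin([sse(f) for f in ff])]
    print("best centroids:\n", best)
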
Intelligent Mobile App for You-Tube Video Selection

The usage of mobile devices has increased dramatically in recent years. These devices serve us in many practical ways and provide us with many services, many of them in real time and on demand. The delivery of streaming audio, streaming video, and internet content to these devices has become commonplace. One of the most popular sources of video/audio content is You-Tube, where billions of videos are uploaded and accessed each day. An increasing challenge is to locate the desired video from among many dozens of possibilities. This paper introduces an intelligent mobile application that utilizes the You-Tube Application Programming Interface (API) in a novel algorithm for selecting the most appropriate video associated with a song title and artist. The test case used in this application is the domain of all the top-40 popular songs (as provided by Billboard Inc.) from 1970 to the present, approximately 14,000 possible songs. The application invokes the You-Tube API with three different criteria (most popular video, most relevant video, and a keyword search of the video's comments) and then merges these criteria in a voting algorithm, thus providing the best possible video for a given song title and artist. The system, implemented for iOS 9 using Xcode and Swift, allows the user to choose from thousands of song titles and provides a "music on demand" service, playing the best possible video associated with the song title and artist. The app was evaluated by randomly selecting 50 songs from each year (1,867 total) and verifying that the most appropriate video was selected.

Mark Smith
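
The abstract does not spell out the voting rule, so as one plausible illustration the sketch below merges the three per-criterion rankings with a simple Borda count; the video IDs and result lists are hypothetical.

    from collections import defaultdict

    def merge_rankings(rankings):
        """Borda-style vote: a video earns (list_length - position) points per criterion."""
        scores = defaultdict(int)
        for ranked in rankings:
            for pos, video_id in enumerate(ranked):
                scores[video_id] += len(ranked) - pos
        return max(scores, key=scores.get)

    # Hypothetical result lists for one song, one list per criterion.
    by_popularity = ["vidA", "vidB", "vidC"]
    by_relevance  = ["vidB", "vidA", "vidD"]
    by_comments   = ["vidB", "vidC", "vidA"]
    print(merge_rankings([by_popularity, by_relevance, by_comments]))  # -> vidB
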
An Application of GEP Algorithm for Prime Time Detection in Online Social Network

Online social network services are web-based information sharing services that brands and organizations utilize for public relations and advertising. Since online social networks are constantly changing, how to influence as many members of a community as possible at the appropriate moment is a difficult problem, and a strategy for deciding when to publish articles is important to every kind of online social network community operator. In this paper, we focus on developing an innovative GEP algorithm, which we call Social network Gene Expression Programming (SGEP), to detect the prime time for information spreading on online social networks.

Hsiao-Wei Hu, Wen-Shiu Lin, I-Hsun Chen
Transport Logistic Application: Train’s Adherence Evolution and Prediction Based on Decision Tree and Markov Models

This paper presents a method for modelling adherence and predicting train evolution based on classification trees and Markov models. In the day-to-day operational management of railway traffic, deviations from the theoretical planned schedule at certain points in space and time are recurrent. When such a deviation (which we call non-adherence) happens, quick decision-making is needed to avoid, on the one hand, the primary non-adherence cascading into secondary non-adherence of other trains over the planned schedule and, on the other hand, events such as one train reaching another, which is a considerable issue. We present an effective mixed approach, combining a stochastic process (Markov models) and classification trees, for estimating and predicting the reliability of adherence between the planned and realized traffic schedule, not only for rail freight transport but also for public transport, considering parameters that may cause deviation such as weather and environmental conditions. Experiments on real data for trains in Brazilian regions show that our model is fairly realistic and delivers good results in short processing time. Moreover, the model may run in an online scenario where updated data are massively collected from monitoring sensors. The prediction accuracy, along with the evolution of the probability distributions of all events over time, was also evaluated. The approach thus proves suitable for predicting train traffic evolution based on historical data and on collected monitoring, weather, and environmental data. As a result, we obtain a prediction reliability of about 74%.

Steve Ataky T. Mpinda, Marilde T. P. Santos, Marcela X. Ribeiro
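
The Markov side of such a model reduces to estimating a transition matrix between adherence states from historical checkpoints and powering it for multi-step prediction. The sketch below uses a toy two-state history, which is an assumption and not the paper's data.

    import numpy as np

    # Hypothetical per-checkpoint adherence states: A = adherent, N = non-adherent.
    history = list("AAANNANAAANNNAAA")
    states = sorted(set(history))
    idx = {s: i for i, s in enumerate(states)}

    # Estimate the transition matrix by counting observed state changes.
    counts = np.zeros((len(states), len(states)))
    for a, b in zip(history, history[1:]):
        counts[idx[a], idx[b]] += 1
    P = counts / counts.sum(axis=1, keepdims=True)
    print("P =\n", P)

    # Predict the state distribution three checkpoints ahead, starting from 'A'.
    v = np.zeros(len(states)); v[idx["A"]] = 1.0
    print("3 steps ahead:", v @ np.linalg.matrix_power(P, 3))
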
A Method for Match Key Blocking in Probabilistic Matching
(Research-in-Progress)

The pair-wise nature of entity resolution makes it impractical to perform on large datasets without the use of blocking. Many blocking techniques have been researched and applied to effectively reduce pair-wise comparisons in Boolean rule-based systems while also providing 100% match recall. However, these approaches do not always work when applied to probabilistic matching. This paper discusses an approach to blocking for probabilistic scoring rules through the use of match key indexing.

Pei Wang, Daniel Pullen, John R. Talburt, Cheng Chen
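
A minimal illustration of match key blocking: records are indexed by a derived key, and only records sharing a key become candidate pairs for the probabilistic scorer. The key definition and the records below are invented for illustration; real systems derive keys from the scoring rule's fields.

    from collections import defaultdict
    from itertools import combinations

    def match_key(rec):
        """A toy match key: first letter of the name + first 3 digits of the phone."""
        return (rec["name"][0].upper(), rec["phone"][:3])

    records = [
        {"id": 1, "name": "John Smith", "phone": "5015550199"},
        {"id": 2, "name": "Jon Smith",  "phone": "5015550199"},
        {"id": 3, "name": "Mary Jones", "phone": "8705551234"},
    ]

    # Index records by match key; pair-wise comparison happens only within a block.
    blocks = defaultdict(list)
    for rec in records:
        blocks[match_key(rec)].append(rec)

    for key, block in blocks.items():
        for a, b in combinations(block, 2):
            print("candidate pair:", a["id"], b["id"])  # feed these to the probabilistic scorer
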
Accuracy Assessment on Prediction Models for Fetal Weight Based on Maternal Fundal Height
Applications in Indonesia

Delivery weight is a significant indicator of pregnancy outcomes. Both the low-birth-weight and macrosomia ranges have a negative impact on neonatal health, and low birth weight is the leading cause of neonatal mortality in most developing countries. Since fetal weight cannot be directly measured, its prediction has become increasingly important in routine antenatal care. Early birth weight estimation during pregnancy assists medical practitioners in making an informed decision on whether intervention is required prior to delivery. Several prediction models for fetal weight have been developed in Indonesia based on clinical assessment of fundal height as an alternative to ultrasound. However, most prediction models only provide weight estimates very close to delivery, at more than 35 weeks of gestational age. Some researchers have carried out comparison studies among these prediction models, but there has been little discussion of their forecast accuracy measures. This paper aims to evaluate and compare the accuracy of existing fundal-height-based prediction models for estimating neonatal delivery weight using fundal height measurements between 20 and 35 weeks of gestational age.

Dewi Anggraini, Mali Abdollahian, Kaye Marion
Ensemble Noise Filtering for Streaming Data Using Poisson Bootstrap Model Filtering

Ensemble filtering techniques filter noisy instances by combining the predictions of multiple base models, each of which is learned using a traditional algorithm. However, given the massive increase in the amount of online streaming data over the last decade, ensemble filtering methods, which largely operate in batch mode and require multiple passes over the data, incur substantial time and storage costs. In this paper, we present an ensemble bootstrap model filtering technique that runs multiple inductive learning algorithms on several small Poisson-bootstrapped samples of online data to filter noisy instances. We analyze three prior filtering techniques using Bayesian computational analysis to understand the underlying distribution of the model space. We implement our approach alongside the prior filtering approaches and show that ours is more accurate.

Ashwin Satyanarayana, Rosemary Chinchilla
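
The key trick is that presenting each streamed instance to a model Poisson(1)-many times approximates a bootstrap sample in a single pass. The sketch below pairs that with a deliberately toy incremental learner (an assumption; the paper uses real inductive algorithms) and flags an instance as noise when most ensemble members disagree with its label.

    import numpy as np

    rng = np.random.default_rng(1)

    class OnlinePoissonBagger:
        """m running models; each streamed instance is replayed Poisson(1) times per model."""
        def __init__(self, m, make_model):
            self.models = [make_model() for _ in range(m)]
        def partial_fit(self, x, y):
            for model in self.models:
                for _ in range(rng.poisson(1.0)):   # per-model bootstrap weight ~ Poisson(1)
                    model.update(x, y)
        def is_noisy(self, x, y):
            # Flag an instance as noise if most models disagree with its label.
            votes = sum(model.predict(x) == y for model in self.models)
            return votes < len(self.models) / 2

    class MeanThresholdModel:
        """Toy incremental learner: classify by the nearer running class mean."""
        def __init__(self): self.sum = [0.0, 0.0]; self.n = [0, 0]
        def update(self, x, y): self.sum[y] += x; self.n[y] += 1
        def predict(self, x):
            mu = [self.sum[c] / max(self.n[c], 1) for c in (0, 1)]
            return int(abs(x - mu[1]) < abs(x - mu[0]))

    bag = OnlinePoissonBagger(7, MeanThresholdModel)
    for x, y in [(0.1, 0), (0.2, 0), (0.9, 1), (1.0, 1), (0.15, 0), (0.95, 1)]:
        bag.partial_fit(x, y)
    print("mislabelled (0.9, 0) noisy?", bag.is_noisy(0.9, 0))
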
Mining Persistent and Dynamic Spatio-Temporal Change in Global Climate Data

The potential impacts of climate change on natural and man-made systems can have a drastic effect on life on Earth. The application of data mining algorithms to global climate data can result in a better understanding of the climate system. In particular, change detection has proven to be a very useful approach when mining climate data, and understanding spatio-temporal change can give insight into interesting patterns that can be used to predict climate events. This paper proposes a method to generate spatially homogeneous regions that uses a novel indexing structure for the analysis of spatial change, including homogeneous and heterogeneous change. The resulting regions are then used to analyze persistent and dynamic regions at longer time scales. The efficacy of the approach was demonstrated on a real-world climate dataset, and the results reveal interesting patterns that are explained by known climate phenomena.

Jie Lian, Michael P. McGuire
Open Source Data Quality Tools: Revisited

High data quality is defined as the reliability and application efficiency of the data present in a system, and maintaining high data quality has become a key concern for most organizations. Different data quality tools are used for extracting, cleaning, and matching data sources. In this paper, we first introduce state-of-the-art open source data quality tools, specifically Talend Open Studio, DataCleaner, WinPure, Data Preparator, Data Match, DataMartist, Pentaho Kettle, SQL Power Architect, SQL Power DQguru, and DQ Analyzer. Secondly, we compare these tools based on their key features and their performance in data profiling, integration, and cleaning. Overall, DataCleaner scores highest among the considered tools.

Venkata Sai Venkatesh Pulla, Cihan Varol, Murat Al

New Trends in Wavelet and Numerical Analysis

Frontmatter
Some Outflow Boundary Conditions for the Navier-Stokes Equations

In the numerical simulation of real-world flow problems, we often encounter issues concerning artificial boundary conditions. An important example is the blood flow problem in the large arteries, where the simulation is highly dependent on the choice of artificial boundary conditions posed on the outflow boundary. In the present paper, we examine some outflow boundary conditions, including our new condition, from the viewpoint of mathematical analysis.

Yoshiki Sugitani, Guanyu Zhou, Norikazu Saito
Successive Projection with B-spline

Space-time finite element methods often construct the approximate solution by time-discontinuous Galerkin methods. The successive projection technique (SPT) with B-spline functions allows us to convert this approximate solution into a time-continuous representation. We present the stability and error estimates of the SPT.

Yuki Ueda, Norikazu Saito
Computer-Aided Diagnosis Method for Detecting Early Esophageal Cancer from Endoscopic Image by Using Dyadic Wavelet Transform and Fractal Dimension

We propose a new computer-aided method for diagnosing early esophageal cancer from endoscopic images using the dyadic wavelet transform (DYWT) and the fractal dimension. In our method, an input image is converted into HSV color space, and a fusion image is made from the S (saturation) and V (value) components based on the DYWT. We apply contrast enhancement to produce a grayscale image in which the structure of abnormal regions is enhanced, and obtain binary images composed of multiple layers by low-gradation processing. We then compute the complexity of these images as fractal dimensions and visualize abnormal regions by summing them. We describe the process for enhancing, detecting, and visualizing abnormal regions in detail, and we present experimental results demonstrating that our method gives visualized images in which abnormal regions in endoscopic images can be located and that contain data useful for the actual diagnosis of early esophageal cancer.

Ryuji Ohura, Hajime Omura, Yasuhisa Sakata, Teruya Minamoto
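
The complexity measure referred to above is typically estimated by box counting: count the boxes that contain foreground at halving scales and fit the slope of log(count) against log(1/size). The sketch below shows that standard estimator on a synthetic binary image; the authors' full pipeline of HSV fusion, DYWT, and low-gradation layering is not reproduced.

    import numpy as np

    def box_counting_dimension(binary_img):
        """Box-counting fractal dimension of a square, power-of-two binary image."""
        n = binary_img.shape[0]
        sizes, counts = [], []
        size = n
        while size >= 2:
            blocks = binary_img.reshape(n // size, size, n // size, size)
            occupied = blocks.any(axis=(1, 3)).sum()   # boxes containing any foreground
            sizes.append(size); counts.append(max(occupied, 1))
            size //= 2
        slope, _ = np.polyfit(np.log(1.0 / np.array(sizes)), np.log(counts), 1)
        return slope

    # A straight diagonal line should come out at dimension 1.
    img = np.zeros((256, 256), dtype=bool)
    img[np.arange(256), np.arange(256)] = True
    print(round(box_counting_dimension(img), 2))   # ~1.0
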
Daubechies Wavelet-Based Method for Early Esophageal Cancer Detection from Flexible Spectral Imaging Color Enhancement Image

We propose a new method for detecting early esophageal cancer in Flexible spectral Imaging Color Enhancement (FICE) mode images based on the Daubechies wavelet transform. In our method, we convert an image into CIEL*a*b* color space and use the a* components. Next, we divide the a* components into small blocks, apply two types of Daubechies wavelet transforms to each block, and obtain the low- and high-frequency components of each block. The histogram of the low-frequency components tends to be positioned to the right of a particular value for abnormal regions and to the left of it for normal regions, while the histogram of the high-frequency components for an abnormal region has a longer tail. We describe the detection procedure for abnormal regions in detail and present experimental results demonstrating that our method is able to detect early esophageal cancer in FICE images based on these features.

Hiroki Matsunaga, Hajime Omura, Ryuji Ohura, Teruya Minamoto
A Refined Method for Estimating the Global Hölder Exponent

In this paper, we recall basic results we have obtained about generalized Hölder spaces and present a wavelet characterization that holds under more general hypotheses than previously stated. This theoretical tool gives rise to a method for estimating the global Hölder exponent that seems to be more precise than other wavelet-based approaches. This work should prove helpful for estimating long-range correlations.

S. Nicolay, D. Kreit
A New Wavelet-Based Mode Decomposition for Oscillating Signals and Comparison with the Empirical Mode Decomposition

We introduce a new wavelet-based method (EWMD) for decomposing a signal into quasi-periodic oscillating components with smooth time-varying amplitudes. This method is inspired by both the "classic" wavelet-based decomposition and the empirical mode decomposition (EMD). We compare the reconstruction skill and the period detection ability of the method with the well-established EMD on toy examples and on the ENSO climate index. The EWMD accurately decomposes and reconstructs a given signal (with the same efficiency as the EMD), is better at detecting prescribed periods, and is less sensitive to noise. This work provides the first version of the EWMD; even though there is still room for improvement, the preliminary results are highly promising.

Adrien Deliège, Samuel Nicolay

Computer Vision, HCI and Image Processing/Analysis

Frontmatter
Shot Boundary Detection Using Spatial-Temporal Features

Shot boundary detection is a major and challenging task in any video processing solution. Different features are used for detecting shot boundaries in a video, and the choice of features depends on the nature of the processing. Features are mostly extracted from either the spatial domain (visual features) or the temporal domain (motion information). To obtain higher accuracy, high-level features are normally used, but these features need complex computations. On the other hand, low-level features like colour and brightness are easy to extract and compute, but have less tolerance towards small changes in the frame. Features in the temporal domain likewise have their limitations: they mostly ignore the spatial domain and hence give false results. In the proposed solution, features from both the spatial and temporal domains are combined. In the spatial domain, brightness information is used, which is easy to extract and fast to compute; from the temporal domain, motion information is used. The motion information is extracted from sub-blocks, which adds spatial information as well. Finally, both results are combined, yielding higher accuracy, sensitivity, and tolerance.

Muhammad Ali, Awais Adnan
A Set of Usability Heuristics and Design Recommendations for u-Learning Applications

Usability is one of the most relevant attributes when evaluating a software product, application, or website, as it is important to analyze how the interaction design facilitates or hinders the user in achieving a concrete objective. Moreover, the user experience expresses the user's positive or negative perception of a particular application through a set of factors and elements relating to user interaction. This paper presents a set of heuristics to detect usability problems in u-Learning applications. It also proposes a set of design recommendations focused on the user experience.

Fabiola Sanz, Raúl Galvez, Cristian Rusu, Silvana Roncagliolo, Virginica Rusu, César A. Collazos, Juan Pablo Cofré, Aníbal Campos, Daniela Quiñones
Computer Input Just by Sight and Its Applications in Particular for Disable Persons

Computer input just by sight for disabled persons is proposed, together with its applications. It is confirmed that communication aids, phoning aids, service robot control, information collection aids, and other services are available using the proposed computer input just by sight.

Kohei Arai
User Impressions About Distinct Approaches to Layout Design of Personalized Content

Nowadays, a single person may produce large quantities of pictures, text, and other digital content on a variety of devices, adding to them information collected from the Web. Usually, these information items are later organized to be presented or shared on more than one output device. Solutions for organizing information into personalized documents with a suitable design have been proposed for text and images, in either a fully automatic or an interactive fashion. To analyze the use of two distinct approaches, we implemented an application using two layout design algorithms and performed user studies to collect user impressions. We present the two algorithms as well as the results and analysis of the data collected in the user studies, showing users' preferences.

Anelise Schunk, Francine Bergmann, Ricardo Piccoli, Angelina Ziesemer, Isabel Manssour, João Oliveira, Milene Silveira
Video Compression Using Variable Block Size Motion Compensation with Selective Subpixel Accuracy in Redundant Wavelet Transform

This paper proposes a high-performance video coding system in the redundant wavelet domain utilizing a multihypothesis variable-size block motion compensation technique. The redundant wavelet transform (RDWT) provides several advantages over the traditional discrete wavelet transform (DWT): it retains all the phase information of the wavelet transform and provides multiple prediction possibilities for motion estimation and compensation (ME/MC) in the transform domain. The paper presents a new adaptive partitioning scheme and decision criteria that more effectively utilize the motion content of a frame in terms of the size and shape of the blocks. The experimental results show that our proposed MH-VSBMC performs better while using fewer partition blocks. In addition, the selective strategy for subpixel accuracy achieves similar PSNR and other quality values while using fewer computation steps.

Ahmed Suliman, Robert Li
PPMark: An Architecture to Generate Privacy Labels Using TF-IDF Techniques and the Rabin Karp Algorithm

Lay and expert users alike often have difficulty understanding privacy policy texts. The amount of time spent reading and comprehending a policy poses a challenge to the user, who rarely pays attention to what he or she is agreeing to. Given this scenario, this paper aims to facilitate the presentation of privacy policy terms regarding data collection and sharing by introducing a new format called the Privacy Label. Using natural language processing techniques, we built a model able to extract information about data collection from privacy policies and present it to the user in an automated and easy-to-understand way. To validate this model, we used a precision assessment method in which the accuracy of the extracted information was measured. The precision of our model was 0.685 (69%) when recovering information regarding data handling, making it possible for the end user to understand which data is being collected without reading the whole policy. The PPMark architecture can facilitate notice-and-choice by presenting privacy policy information in an alternative way to online users.

Diego Roberto Gonçalves de Pontes, Sergio Donizetti Zorzo
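
Of the two techniques in the title, Rabin-Karp is the easier to show compactly: a rolling hash lets every window of the policy text be re-hashed in O(1). A minimal sketch follows; the pattern and text are invented, and the TF-IDF weighting step is not shown.

    def rabin_karp(text, pattern, base=256, mod=1_000_003):
        """Rolling-hash substring search: each window is re-hashed in O(1)."""
        n, m = len(text), len(pattern)
        if m > n:
            return []
        high = pow(base, m - 1, mod)                 # weight of the outgoing character
        p_hash = t_hash = 0
        for i in range(m):
            p_hash = (p_hash * base + ord(pattern[i])) % mod
            t_hash = (t_hash * base + ord(text[i])) % mod
        hits = []
        for i in range(n - m + 1):
            if p_hash == t_hash and text[i:i+m] == pattern:  # verify to rule out collisions
                hits.append(i)
            if i < n - m:
                t_hash = ((t_hash - ord(text[i]) * high) * base + ord(text[i + m])) % mod
        return hits

    policy = "we collect your email address and may share your email address"
    print(rabin_karp(policy, "email address"))   # -> [16, 49]
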
RGB and Hue Color in Pornography Detection

Pornography and other obscene material are now easily available on the internet, on social media, and on video and document sharing sites. Tons of videos are uploaded and shared daily on the Internet, which has made it impossible to tag and filter obscene material manually. The only feasible solution is an automatic system that can detect pornography and similar material in multimedia formats. Unfortunately, most such solutions are complicated and computationally expensive, as they work with high-level features, either spatial or temporal. Low-level features, on the other hand, are easy to detect and fast to compute, but contain less contextual information. In this article, a set of low-level features is analyzed to determine whether they can be used for pornography detection. Experimental results show that most of these low-level features are not suitable in this domain on their own; however, if different ratios of the same features are combined intelligently, they can be successfully used for high-level content recognition such as nudity and pornography detection.

Awais Adnan, Muhammad Nawaz

Potpourri

Frontmatter
Sociology Study Using Email Data and Social Network Analysis

Nowadays, data mining and social network analysis techniques are broadly used to study the social structure of an underlying community. The Internet has crept into our lives through our growing dependency on online resources, and we can identify quite precisely how society behaves from its presence in cyberspace. People leave so many footprints in cyberspace that their online social structure can be extracted. In this research, we gathered the email data of 23 graduate-level students and applied social network analysis techniques. The social perspective of all the students was extracted, and thresholds were applied to the upper and lower bounds of the data. We extracted the roles people play in the social setting and identified how people behave while fulfilling those roles. We also gathered survey data for the same 23 students and extracted the social perspective in the same way as for the email data; the survey captures what role a person is performing, and we related the social roles to the extracted social perspectives. Finally, we performed a validation process by comparing our results with the survey data. Our results showed a strong correlation between the email data and the survey data: the social structure of the students in cyberspace and in real life was nearly the same. To the best of our knowledge, this is the first work in which both email data and actual survey data are used for validation purposes.

Wajid Rafiq, Shoab Ahmed Khan, Muhammad Sohail
Evaluation of Usability Heuristics for Transactional Web Sites: A Comparative Study

Nielsen’s usability heuristics are the most commonly used assessment tool to perform a heuristic evaluation. However, when they are employed to evaluate the usability of transactional Web sites, they fail to cover all aspects of usability that are currently present in this kind of software. For this reason, we have developed a new proposal that is capable of providing more accurate results in this context. Our approach includes fifteen usability heuristics for the design of usable graphical user interfaces. This paper presents a comparative study between the classical Nielsen’s proposal and the new set of heuristics for transactional Web sites. The experimental case study was conducted following the Method Adoption Model which establishes the analysis of three dimensions: perceived ease of use, perceived usefulness and intention to use. For this purpose, forty-six undergraduate students were asked to perform a heuristic evaluation, in which both proposals were employed. The results showed that the new heuristics are easier to use than the traditional approach. This study provides the validation of an effective tool that can be used easily to perform heuristic evaluations in the context of transactional Web applications.

Freddy Paz, Freddy A. Paz, José Antonio Pow-Sang
Algorithm for Gaussian Integer Exponentiation

In this paper, we introduce a novel algorithm for Gaussian integer exponentiation that significantly improves its performance. We compare the performance of Gaussian integer exponentiation (using the new and existing algorithms) to real integer exponentiation both analytically and experimentally. We demonstrate that the algorithms based on Gaussian integer exponentiation have significant advantages over the corresponding algorithms based on real integer exponentiation, and that the new algorithm is significantly faster. The new algorithm could therefore speed up public-key discrete-logarithm-based cryptographic algorithms by about 40%, with the possibility of further improvements.

Aleksey Koval
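
For orientation, the baseline that such an algorithm improves upon is ordinary square-and-multiply exponentiation carried out in Z[i] modulo a prime. The sketch below represents a Gaussian integer as an (x, y) pair for x + yi; the paper's actual speed-up is not reproduced here.

    def g_mul(a, b, p):
        """Multiply Gaussian integers a = (x, y) ~ x + yi, reducing coordinates mod p."""
        (ax, ay), (bx, by) = a, b
        return ((ax * bx - ay * by) % p, (ax * by + ay * bx) % p)

    def g_pow(g, e, p):
        """Right-to-left square-and-multiply exponentiation in Z[i] mod p."""
        result = (1, 0)                  # multiplicative identity 1 + 0i
        base = g
        while e:
            if e & 1:
                result = g_mul(result, base, p)
            base = g_mul(base, base, p)  # repeated squaring: O(log e) multiplications
            e >>= 1
        return result

    # (2 + 3i)^10 mod 1000003
    print(g_pow((2, 3), 10, 1000003))
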
Dynamic Simulation of the Flight Behavior of a Rotary-Wing Aircraft

In this paper, we present the mathematical model of the flight dynamics of a rotary-wing aircraft. The procedure starts by linearizing the translational and rotational dynamics and the rotational kinematic equations of motion based on perturbation theory; some procedures are simplified in the implementation due to the complexity of the modeling process. The second step is to obtain the linear form, which is fundamental for describing the stability and response of small movements of the helicopter around a specific attitude. Finally, the dynamic behavior of two helicopters (the AS355-F2 Squirrel and the BO105-S123) was simulated employing MATLAB Simulink v7.6 (2008) [1], using input data and the state and control derivatives of an existing helicopter. It is noteworthy that the linearized model developed here is valid for applications where a deeper representation of the aircraft is not required.

Sebastião Simões Cunha Jr., Marcelo Santiago de Sousa, Danilo Pereira Roque, Alexandre Carlos Brandão Ramos, Pedro Fernandes Jr.
Origami Guru: An Augmented Reality Application to Assist Paper Folding

Origami folders often find themselves unable to recall entire folding sequences, and origami drill books have been their only resource for learning to fold. However, many people have difficulty understanding the symbols and operations in drill books. The purpose of this research is to design and develop a mobile augmented reality application that acts as a technological aid for origami folders. Extensive user studies have been employed at every step of the implementation. The application recognizes the current state/shape of the paper and instantly overlays the diagrams and instructions of the next step(s) on the real paper through the smartphone's camera view. Evaluation results show that the application can successfully assist folders most of the time, and its qualitative mean score is at about the same level as most mobile applications. This work will benefit many origami practitioners, such as those in geometry classes and intellectual development classes.

Nuwee Wiwatwattana, Chayangkul Laphom, Sarocha Aggaitchaya, Sudarat Chattanon
Augmented Reality Approach for Knowledge Visualization and Production (ARAKVP) in Educational and Academic Management System for Courses Based on Active Learning Methodologies (EAMS–CBALM)

Education has long been configured as the transfer of the teacher's knowledge to the student, without proper criticism or reflection by the student. Thereby, the main objective of education, the student's learning, is not achieved; this has led to constant questioning of traditional education and has given space to Active Learning Methodologies (ALM). Such methodologies use questioning as the learning strategy: the student builds his or her own knowledge from the problems posed, becoming a reflective and critical individual. In recent decades, Information and Communication Technologies (ICT) have been used more and more in education, and are even more useful in an ALM environment. In this sense, the objective of this paper is to present an Augmented Reality Approach for Knowledge Visualization and Production (ARAKVP) in an Educational and Academic Management System for Courses based on Active Learning Methodologies (EAMS–CBALM).

Helen de Freitas Santos, Wanderley Lopes de Souza, Antonio Francisco do Prado, Sissi Marilia dos Santos Forghieri Pereira
Designing Schedulers for Hard Real-Time Tasks

While synthesizing real-time schedulers on single-processor systems using priority-based supervisory control of timed discrete-event systems (TDES), we came across the problem of state space explosion. In order to overcome it, a modified form of the symbolic modeling methodology, along with the pre-stable algorithm, is utilized in this work. The main contribution of this paper is the development of an informal procedure for uniprocessor scheduler design with reduced state space for hard real-time tasks.

Vasudevan Janarthanan
An Autonomous Stair Climbing Algorithm with EZ-Robots

Every day we witness greater advances in the robotic technologies appearing around us. From military robots built to travel across hostile territories to the domestic drones deployed by Amazon.com, robots continue to become a more important part of our lives, and the importance of investigating new methods to ensure safe travel from one location to another continues to grow as well. In that spirit, this research aims to develop the functionality for a robot to climb a set of stairs. The robot used in this project is manufactured by EZ Robot and is given its functionality through the execution of scripts and animations created in an integrated development environment. Using this development environment, we developed an autonomous climbing algorithm that allows the robot to successfully climb a set of stairs.

Jason Moix, Sheikh Faal, M. K. Shamburger, Chris Carney, Alex Williams, Zixin Ye, Yu Sun
Toward Indoor Autonomous Flight Using a Multi-rotor Vehicle

The objective of the paper is to detail progress toward an indoor autonomous micro aerial vehicle (MAV) that uses a multi-layered hardware and software control architecture. We build a quadrotor MAV with design considerations for indoor use and document the design process. Our design uses a three-layered control system in both the hardware and software components. We focus on using a modular control architecture to allow for individual development of layers. For initial progress toward autonomous flight, we create an automated altitude control program that enables the MAV to hover stably at a given height. Experiments include hovering and directed flight through hallways with obstacles. Initial results provide optimistic feedback for future experiments concerning traversing indoor environments using an autonomous MAV with a three-layered control architecture.

Connor Brooks, Christopher Goulet, Michael Galloway
Using Tweets for Rainfall Monitoring

In Brazil, the summer season is the wettest period, in which many disasters such as landslides and floods can happen. In recent years, even with the rainy season, the metropolitan region of São Paulo (Brazil) has suffered a severe water crisis. Given this scenario, monitoring rainfall is fundamental for taking preventive actions and for planning in various branches of business, and computer tools that assist rainfall monitoring can help extend the coverage of existing solutions. Moreover, the number of social media users increases every day, and with it the amount of content published on these media. The objective of this study is to analyze the content of the Twitter social media service, especially tweets related to rainfall events, in order to determine whether this information can contribute to the monitoring of rainfall events in Brazil. More than 1 million rainfall-related tweets published in Brazil were collected over a period of 30 days. The gathered tweets were analyzed and evaluated against the data collected by automatic weather stations (AWS or EMA). The results were satisfactory and indicate a relationship between geolocated tweets and AWS data.

Luiz Eduardo Guarino de Vasconcelos, Eder C. M. dos Santos, Mário L. F. Neto, Nelson Jesuz Ferreira, Leandro Guarino de Vasconcelos
Algorithms Performance Evaluation in Hybrid Systems

Over recent years, a new branch of parallel computing has emerged: hybrid architectures composed of CPUs (Central Processing Units) and GPUs (Graphics Processing Units). The objective of this work is to quantify the performance gains, comparing the sequential processing of well-known algorithms with the parallelized processing offered by a hybrid architecture. OpenCL (Open Computing Language) code generated by the AMD Aparapi framework was used to implement popular computational algorithms. Results show the performance obtained with the Aparapi framework, and the conclusions detail the expected scale of algorithm speedup compared with other works in the literature that use the pure OpenCL library.

Rafael Manochio, David Buzatto, Paulo Muniz de Ávila, Rodrigo Palucci Pantoni
An Efficient Method for the Open-Shop Scheduling Problem Using Simulated Annealing

This paper presents a simulated annealing algorithm for solving the nonpreemptive open-shop scheduling problem with the objective of minimizing the makespan. The method is based on a simulated annealing algorithm that efficiently explores the solution space. It was implemented and tested on various benchmark problems from the literature. Experimental results show that the algorithm performs well on the benchmarks and was able to find an optimal solution in many cases.

Haidar M. Harmanani, Steve Bou Ghosn
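
The core loop of such a method is small enough to sketch: encode a schedule as an operation order, decode it greedily into start times, and accept worse neighbours with probability e^(-delta/T) under geometric cooling. The instance, neighbourhood, and cooling parameters below are illustrative assumptions, not the paper's.

    import math, random

    random.seed(42)
    # times[j][m]: processing time of job j on machine m (open shop: any operation order).
    times = [[5, 7, 3], [4, 2, 6], [8, 5, 4]]
    jobs, machines = len(times), len(times[0])
    ops = [(j, m) for j in range(jobs) for m in range(machines)]

    def makespan(order):
        """Greedy decoding: each operation starts when both its job and machine are free."""
        job_free = [0] * jobs
        mach_free = [0] * machines
        for j, m in order:
            start = max(job_free[j], mach_free[m])
            job_free[j] = mach_free[m] = start + times[j][m]
        return max(mach_free)

    current = ops[:]
    random.shuffle(current)
    best, temp = current[:], 50.0
    while temp > 0.1:
        cand = current[:]
        a, b = random.sample(range(len(cand)), 2)      # neighbour: swap two operations
        cand[a], cand[b] = cand[b], cand[a]
        delta = makespan(cand) - makespan(current)
        if delta <= 0 or random.random() < math.exp(-delta / temp):
            current = cand                             # accept worse moves probabilistically
            if makespan(current) < makespan(best):
                best = current[:]
        temp *= 0.995                                  # geometric cooling schedule
    print("best makespan found:", makespan(best))
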
Schematizing Heidegger

Philosophy is typically discussed and taught by abstract discussion of texts; however, a considerable body of research suggests that this method is unsuited to professionals and students who struggle to navigate philosophical writings. Diagrams are used in diverse areas of study to depict knowledge and to assist in understanding of problems. This paper aims to utilize schematic representation to facilitate understanding of certain philosophical works. The paper employs schematization as an apparatus of specification for clarifying philosophical language by describing ideas in a form familiar to computer science. Specifically, it is an attempt, albeit tentative, to schematize Martin Heidegger’s philosophical approach. His writings are notoriously difficult to understand, and the high-level representation described here seems a viable tool for enhancing the relationship between philosophy and computer science, especially in computer science education.

Sabah Al-Fedaghi
Constrained Triangulation of 2D Shapes

Algorithms for triangulating two-dimensional shapes have been used as sub-problems in many application areas, including finite element analysis, geographic information systems, and geometric compression. We consider a constrained version of the triangulation problem in which the objective is to increase the proportion of even-degree vertices. We present an effective approach for generating triangulated polygons with an increased number of even-degree vertices, based on a convex decomposition of the polygon followed by a 'diagonal flipping' operation.

Laxmi P. Gewali, Roshan Gyawali
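
Two ingredients of the constraint are easy to demonstrate: counting vertex degrees in a triangulation, and flipping the diagonal shared by two triangles, which toggles the parity of exactly the four vertices involved. A small sketch on a quadrilateral follows; the convex decomposition step is not shown.

    from collections import Counter
    from itertools import combinations

    def degrees(triangles):
        """Vertex degrees of a triangulation given as vertex-index triples."""
        edges = {e for tri in triangles for e in combinations(sorted(tri), 2)}
        deg = Counter()
        for u, v in edges:
            deg[u] += 1; deg[v] += 1
        return dict(deg)

    def flip(tri_a, tri_b):
        """Flip the diagonal shared by two triangles: replace edge (u,v) by (x,y)."""
        shared = sorted(set(tri_a) & set(tri_b))   # the old diagonal (u, v)
        others = sorted(set(tri_a) ^ set(tri_b))   # the new diagonal (x, y)
        return (others[0], others[1], shared[0]), (others[0], others[1], shared[1])

    # A convex quad 0-1-2-3 triangulated with diagonal (0, 2): degrees 3, 2, 3, 2.
    tris = [(0, 1, 2), (0, 2, 3)]
    print(degrees(tris))
    a, b = flip(*tris)
    print(degrees([a, b]))   # after the flip: degrees 2, 3, 2, 3 (four parities toggled)
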
Software Project and Analysis of a Training Screen Based System for Healthcare Professionals Working in the NICU in Brazil

Objective: to develop a screen-based training system (SBTS) for healthcare professionals working in NICUs in Brazil. Materials and Methods: an Intelligent Tutoring System (ITS) and a Production Rule Based System (PRBS) are used to train these professionals to learn and recognize the various normal and abnormal situations of the incubator interface, as well as the associated alarms, which can contribute significantly to increasing patient safety. Results and Discussion: the system, developed by applying an ITS, assists a teacher in the task of training a new user, or even experienced users, for training and refresher purposes. The developed tool allows the modeling of new cases in the system. Conclusion: the approach adopted for the development of training systems allows new ITSs to be developed by reusing classes already developed, easing the implementation process and increasing the effectiveness of new training.

Daniel Rocha Gualberto, Renata Aparecida Ribeiro Custódio, Alessandro Rodrigo Pereira Dias, Gabriel Bueno da Silva, Clarissa Gonçalves Eboli, Alexandre Carlos Brandão Ramos

Short Papers

Frontmatter
A Self-configuration Web-API for the Internet of Things

Due to the fast growth of the Internet of Things (IoT), many concepts of distributed systems need to be adapted to this new paradigm, and because of the large size of these networks, it is important that such systems exhibit autonomic characteristics. In this context, this paper presents the implementation of an autonomic system emphasizing the self-configuration of devices, aided by cloud computing through a Web-API developed for this purpose. Initial results were obtained from tests performed on the Z1 device running ContikiOS inside the Cooja simulator. The results revealed that the developed code overflows the tested devices' memory, but deeper analysis and further development indicate that a good level of energy consumption can be reached and that the self-configuration of devices in the IoT environment is attainable.

Eric Bernardes C. Barros, Admilson de Ribamar L. Ribeiro, Edward David Moreno, Luiz Eduardo C. Neri
Automated Behavioral Malware Analysis System

Nowadays, with the spread of internet and network-based services, malware has become a major threat to computers and information systems. Different malware often share similar behaviours yet have different syntactic structures, due to the incorporation of obfuscation techniques such as polymorphism, oligomorphism, and metamorphism. The differing structures of behaviourally similar malware pose a serious problem for signature-based detection techniques. In this paper, we propose an automated prevention system based on malware behaviours. Our system has the ability to collect suspicious software from client computers and then automatically analyse the behaviour of the detected malware; an agent then sends an alarm to all network clients. The results from an implementation of the proposed system show that our approach is effective in analysing detected malware in automated security systems.

Saja Alqurashi, Omar Batarfi
A Message Efficient Group Membership Protocol in Synchronous Distributed Systems

In distributed systems, a group of computers must continue to cooperate in order to finish jobs. In such a system, a group membership protocol is an especially practical and important element, providing the processes in a group with consistent common knowledge about the membership of the group. Whenever a membership change occurs, processes should agree on which of them should complete an unfinished job or begin a new one. The problem of knowing a stable membership view is essentially the same as agreeing on a common predicate in a distributed system, as in the consensus problem. Based on the termination detection protocol that is traditional in asynchronous distributed systems, we present a new group membership protocol for arbitrary wired networks.

SungHoon Park, SuChang Yoo, YeongMok Kim, SangGwon Lee, DoWon Kim
Ontology-Driven Metamodel Validation in Cyber-Physical Systems

This paper describes the development and use of an ontology to validate model integration and metamodel description in the cyber-physical systems engineering domain. A primary goal of the Defense Advanced Research Projects Agency (DARPA) Adaptive Vehicle Make (AVM) program is to reduce product cost by reducing the design space early in the engineering life cycle, to enable a high-value focus on reasonable alternatives. The OpenMETA tool suite uses semantic constructs for model integration across engineering disciplines and facilitates trade analysis across those disciplines (such as electrical, mechanical, thermal, fluid, and cyber). Independently, through expert interviews, the authors developed a lightweight ontology to represent important trades in the early system engineering design process, and have been using that ontology as both a communication and a validation mechanism for the semantics embedded in the OpenMETA tools. The results are informing the tool development, providing an independent validation mechanism, and creating a layer of the ontology, not initially obvious, that is extremely useful to both people and machines. The ontology-based approach described in this paper facilitates the identification and encoding of the important relationships between design decisions and subsequent reasonable decision alternatives, and serves as an independent, very low cost validation of the semantics and relationships encoded in the OpenMETA tool suite.

Kevin Lynch, Randall Ramsey, George Ball, Matt Schmit, Kyle Collins
Developing Software in the Academic Environment
A Framework for Software Development at the University

This paper proposes a framework for real-world software development in the academic environment of Fairleigh Dickinson University (FDU). The framework has so far been used to establish the functional baseline for the Predictive Wastewater Management System (PWMS), a participatory sensing system, built for the Eastech Corporation, that monitors and predicts problems in a municipality's wastewater system. As software development becomes globally distributed, students need immersion in a realistic software development life cycle (SDLC) and experience with a framework targeted toward this emerging paradigm.

William Phillips, Shruthi Subramani, Anusha Gorantla, Victoria Phillips
Automatic Reverse Engineering of Classes’ Relationships

Classes are the core of object-oriented systems. Any maintenance activity includes making a code change to one or more classes, and a change to one class may affect other classes in the project. Developers therefore need to be aware of, and fully understand, the structure of and relationships between classes. This paper proposes a technique to automatically extract various types of class relationships from source code. The proposed technique extracts relationships among classes and measures each class's involvement in them; fan-in and fan-out metrics give developers a more comprehensive picture of the current coupling status of each class.
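
As a sketch of what such extraction can look like (our own Python illustration using the standard ast module, not the authors' tool; the example classes are hypothetical):

    # Minimal sketch: extract inheritance and usage relationships between
    # classes in one source file, then count fan-in/fan-out per class.
    import ast
    from collections import defaultdict

    source = """
    class Engine: ...
    class Wheel: ...
    class Car(Engine):
        def __init__(self):
            self.wheel = Wheel()
    """

    tree = ast.parse(source)
    classes = {n.name for n in ast.walk(tree) if isinstance(n, ast.ClassDef)}
    fan_out = defaultdict(set)   # classes each class depends on

    for node in ast.walk(tree):
        if isinstance(node, ast.ClassDef):
            # any reference to another known class (base or body) counts
            for inner in ast.walk(node):
                if (isinstance(inner, ast.Name) and inner.id in classes
                        and inner.id != node.name):
                    fan_out[node.name].add(inner.id)

    fan_in = defaultdict(set)    # classes depending on each class
    for src, targets in fan_out.items():
        for t in targets:
            fan_in[t].add(src)

    for c in sorted(classes):
        print(c, "fan-out:", len(fan_out[c]), "fan-in:", len(fan_in[c]))

Here Car has fan-out 2 (it inherits Engine and uses Wheel) and fan-in 0, while Engine and Wheel each have fan-in 1: exactly the per-class coupling picture the abstract describes.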

Maen Hammad, Rajaa Abu-Wandi, Haneen Aydeh
Developing Predictable Vehicular Distributed Embedded Systems on Multi-core

In this paper we address the challenges of supporting model- and component-based development of predictable software for vehicular distributed embedded systems that utilize multi-core platforms. We present a research plan for developing new methods and techniques to deal with these challenges. The techniques will support aspects such as modeling the software architecture; supporting multiple criticality levels; verifying the predictability of the system using end-to-end timing analysis; generating code; and providing predictable run-time support on multi-core platforms by virtualizing a certified single-core real-time operating system. As a proof of concept, we will implement the newly developed techniques in a commercial tool chain (Rubus-ICE). The efficacy of the new techniques and the extended tool chain will be demonstrated on industrial case studies.
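
For context, end-to-end timing analysis bounds delays over chains of communicating tasks. A classic, deliberately pessimistic textbook bound (not the authors' tighter analysis) for a register-communication chain of periodic tasks $$\tau_1 \rightarrow \dots \rightarrow \tau_n$$ with periods $$T_i$$ and worst-case response times $$R_i$$ is $$\delta_{e2e} \le \sum_{i=1}^{n} \left( T_i + R_i \right),$$ since each hop may, in the worst case, wait almost a full period for fresh data and then a full response time to process it.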

Saad Mubeen, Thomas Nolte, Kurt-Lennart Lundbäck
Formalizing the Process of Usability Heuristics Development

Heuristic evaluation is one of the most widely used methods for evaluating usability. Several authors have developed different sets of usability heuristics to evaluate the usability of specific applications. In this regard, it is important to know whether authors use a formal methodology to develop their heuristics. This paper presents several such methodologies, highlighting the importance of a formal usability-heuristics development process.

Daniela Quiñones, Cristian Rusu, Silvana Roncagliolo, Virginica Rusu, César A. Collazos
Analysis of a Training Platform for the Digital Battlefield, Based on Semiotics and Simulation

In system development, the perception of usability is vital; in this context, designers play a fundamental role in mediating the interaction between interfaces and users. Semiotics is therefore considered a key factor in expressing design ideas when building a proposal for a prototype digital battlefield training simulator, taking into account other state-of-the-art experiences in this area.

Cristian Barría, Cristian Rusu, Claudio Cubillos, César Collazos, Miguel Palma
Usability Heuristics and Design Recommendations for Driving Simulators

User eXperience (UX) is one of the most important aspects of developing a simulator: it ensures that the final product is not just functional, but that users find value in what they are getting. The paper presents a set of heuristics to assess usability in driving simulators. It also defines a set of design recommendations focused on UX.

Aníbal Campos, Cristian Rusu, Silvana Roncagliolo, Fabiola Sanz, Raúl Gálvez, Daniela Quiñones
Model for Describing Bioprinting Projects in STL

Bioprinting is an essential element of the biomanufacturing process and an emerging, promising area of multidisciplinary research that proposes the "manufacture" of tissues and organs by means of rapid prototyping techniques and capabilities such as additive manufacturing. A model and a framework are established for defining bioprinting projects for living tissues and organs.
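
For reference, STL, the rapid-prototyping interchange format the model targets, describes a surface as triangular facets. A minimal sketch of our own, emitting one ASCII STL facet from Python (the geometry and solid name are hypothetical):

    # Minimal sketch: one triangular facet in ASCII STL.
    facet = {
        "normal": (0.0, 0.0, 1.0),
        "vertices": [(0.0, 0.0, 0.0), (1.0, 0.0, 0.0), (0.0, 1.0, 0.0)],
    }
    lines = ["solid tissue_scaffold",
             "  facet normal %g %g %g" % facet["normal"],
             "    outer loop"]
    for v in facet["vertices"]:
        lines.append("      vertex %g %g %g" % v)
    lines += ["    endloop", "  endfacet", "endsolid tissue_scaffold"]
    print("\n".join(lines))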

Luiz Angelo Valota Francisco, Luis Carlos Trevelin
The Fractal Nature of Mars Topography Analyzed via the Wavelet Leaders Method

This work studies the scaling properties of Mars topography based on Mars Orbiter Laser Altimeter (MOLA) data, using the wavelet leaders method (WLM). This approach reveals a scale break at $$\approx 15$$ km: at small scales the topographic profiles display a monofractal behavior, while a multifractal nature is observed at large scales. The scaling exponents are greater at small scales; they also appear to be influenced by latitude and may indicate a slight anisotropy in the topography.
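
For context, these are the standard definitions of the wavelet leaders multifractal formalism, not results specific to this paper: with $$d_{\lambda}$$ the discrete wavelet coefficients and $$L_{\lambda} = \sup_{\lambda' \subset 3\lambda} |d_{\lambda'}|$$ the wavelet leaders, the structure functions at scale $$2^{j}$$ obey $$S(q, j) = \frac{1}{n_j} \sum_{\lambda \in \Lambda_j} L_{\lambda}^{\,q} \sim 2^{j\,\zeta(q)}.$$ A scaling function $$\zeta(q) = qH$$ that is linear in $$q$$ signals monofractal behavior, whereas a strictly concave $$\zeta(q)$$ signals multifractality, which is the distinction drawn between the two scale regimes above.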

Adrien Deliège, Thomas Kleyntssens, Samuel Nicolay
Privacy Enhancement in E-Mail Clients

Privacy, on top of security, is a desired feature in web mail clients and deserves more attention from the research community. This paper describes a system that provides improved privacy features in web mail clients, so that a user can open e-mails even in the presence of other people without disclosing the subject, the sender, or both, according to the privacy requirement. The work has been implemented by modifying the source code of the open-source SquirrelMail client.
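
To illustrate the masking idea (a minimal sketch of our own in Python; the paper's implementation modifies SquirrelMail's PHP source, and the message fields below are hypothetical):

    # Minimal sketch: replace privacy-flagged headers with placeholders
    # before the message list is rendered on screen.
    def mask_headers(message, privacy):
        # privacy: subset of {"sender", "subject"} to hide
        shown = dict(message)
        if "sender" in privacy:
            shown["From"] = "<hidden>"
        if "subject" in privacy:
            shown["Subject"] = "<hidden>"
        return shown

    msg = {"From": "alice@example.com", "Subject": "Salary review",
           "Body": "..."}
    print(mask_headers(msg, {"sender", "subject"}))
    # {'From': '<hidden>', 'Subject': '<hidden>', 'Body': '...'}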

Prabhat Kumar, Jyoti Prakash Singh, Rajni Kant Raman, Rohit Raj
Backmatter
Metadata
Title: Information Technology: New Generations
Edited by: Shahram Latifi
Copyright year: 2016
Electronic ISBN: 978-3-319-32467-8
Print ISBN: 978-3-319-32466-1
DOI: https://doi.org/10.1007/978-3-319-32467-8