
2015 | Book

Innovations and Advances in Computing, Informatics, Systems Sciences, Networking and Engineering


About this book

This book comprises a set of rigorously reviewed, world-class manuscripts addressing and detailing state-of-the-art research projects in the areas of Computer Science, Informatics, Systems Sciences, and Engineering. It includes selected papers from the proceedings of the Eighth, and some selected papers from the Ninth, International Joint Conferences on Computer, Information, and Systems Sciences, and Engineering (CISSE 2012 & CISSE 2013). Coverage includes topics in: Industrial Electronics, Technology & Automation; Telecommunications and Networking; Systems, Computing Sciences and Software Engineering; and Engineering Education, Instructional Technology, Assessment, and E-learning.

· Provides the latest in a series of books growing out of the International Joint Conferences on Computer, Information, and Systems Sciences, and Engineering;

· Includes chapters in the most advanced areas of Computing, Informatics, Systems Sciences, and Engineering;

· Accessible to a wide range of readership, including professors, researchers, practitioners and students.

Table of Contents

Frontmatter
Performance Improvement in Public Administrations by Organizational Learning

Due to increased pressure for cost reduction and performance in public administrations, and growing requirements for optimized service quality, citizen orientation, and effectiveness, public administrations must improve their services and organizations ever faster. They must elaborate or interpret, communicate, learn, use, and continuously change and improve a large number of legal requirements, regulations, procedures, directives, forms, and checklists. Thus, the management of appropriate regulations (relevant laws, directives, procedures, forms, and checklists) is a fundamental and key challenge for public management. Nevertheless, regulations are commonly distributed with IT support, and staff have great difficulty finding the appropriate, up-to-date regulations. They are hardly used as a reference for solving ad hoc learning needs. In addition, change proposals, new ideas, and questions are usually not related to the existing regulations. Consequently, new regulations are often created as add-ons and can come into contradiction with existing ones. Based on Foucault’s theory, we structure all regulations in accordance with the ISO 9001 standard for quality management, prepare them according to didactic principles, and publish them on an organizational learning system. This innovative, workplace-integrated organizational learning concept is best supported by a confidence-based, open corporate culture. The results of our case studies in different medium-sized administrations suggest that the concept was useful for promoting practice-oriented regulations, workplace-integrated need-oriented learning, and the continual performance improvement of public administrations.

Margareth Stoll, Dietmar Laner
From Information Security Management to Enterprise Risk Management

Organizations face increasing complexity, uncertainty, and heightened threats from a wide range of forces. Depending on how this situation is handled, it can become a risk or an opportunity, eroding or enhancing business value. In addition, organizations have to meet the most diverse stakeholder, legal, and regulatory risk management requirements. Thus, comprehensive enterprise risk management has become a key challenge and core competence for organizations’ sustainable success. Given the central role of information security management and its common goals with enterprise risk management, organizations need guidance on how to extend information security management in order to fulfill enterprise risk management requirements. Yet interdisciplinary security research at the organizational level is still missing. Accordingly, we propose a systemic framework that guides organizations in promoting enterprise risk management starting from information security management. The results of our case studies in different small and medium-sized organizations suggest that the framework was useful for promoting enterprise risk management in an effective, efficient, cost-effective, and sustainable way. New insights for practice and future research are offered.

Margareth Stoll
Information System Engineering Promotes Enterprise Risk Management

Organizations face increasing complexity, uncertainty, and heightened threats from a wide range of forces. Depending on how this situation is handled, it can become a risk or an opportunity, eroding or enhancing business value. In addition, organizations have to meet the most diverse stakeholder, legal, and regulatory risk management requirements. Thus, comprehensive enterprise risk management has become a key challenge and core competence for organizations’ sustainable success. Although several approaches for systematically securing information systems against information security breaches have been studied, we found no approach that guides organizations in promoting enterprise risk management through system engineering. Interdisciplinary information security research at the organizational level is still missing. Accordingly, we propose a systemic approach to system engineering requirement analysis in order to promote enterprise risk management. The results of our case studies suggest that the approach was useful for promoting enterprise risk management in an effective and sustainable way. Legal/regulatory compliance and risk awareness were enhanced. New insights for practice and future research are offered.

Margareth Stoll, Dietmar Laner
Simulation Study on the Performance of Reactive and Position-Based Routing Protocols in MANET

Recently, the Mobile Ad Hoc NETwork (MANET) has drawn the attention of the research community, particularly with regard to routing protocols, such as proactive, reactive, and position-based routing. Invariably, the primary objective of routing protocols is to transmit data packets from the source to the destination node. Therefore, these protocols can be distinguished by their processes for searching, maintaining, and recovering the routing path. A potential problem in MANET is identifying the best routing protocol. In this paper, we present a performance evaluation study of a reactive protocol, Ad Hoc On-Demand Distance Vector (AODV), and a position-based protocol, Location-Aided Routing (LAR1). The performance evaluation study was performed using the QualNet v5.1 simulator. Additionally, the performance of these routing protocols was investigated based on the throughput, delay, average jitter, and energy consumption metrics while varying the number of nodes. The results showed that AODV has better performance than LAR1 in terms of average jitter and throughput, while LAR1 performed better than AODV in terms of average end-to-end delay and energy consumption.

Khaldoon Al-Shouiliy, Raed Alsaqour, Mueen Uddin
A New Approach for Buffer Queueing Evaluation Under Network Flows with Multi-scale Characteristics

In this paper, we propose a new analytical expression for estimating the byte loss probability at a single-server queue with multi-scale traffic arrivals. In order to make the estimation procedure numerically tractable without losing accuracy, we assume and demonstrate that an exponential model is adequate for representing the relation between the mean square and variance of Pareto-distributed traffic processes under different time-scale aggregation. Extensive experimental tests validate the efficiency and accuracy of the proposed loss probability estimation approach and its superior performance for network connection applications with respect to some well-known approaches suggested in the literature.

Jeferson Wilian de Godoy Stênico, Lee Luan Ling
Polynomial Compensation of Odd Symmetric Nonlinear Actuators via Neural Network Modeling and Neural Network Describing Function

This paper deals with polynomial compensation of odd symmetric nonlinear actuators. Most actuators are nonlinear, exhibiting odd symmetric nonlinearities such as dead zone and saturation. One way of dealing with such actuators is to compensate for the nonlinearities using a polynomial representation of the inverse nonlinearity. Compensated actuators can improve the behavior of the system, but the problem of stability analysis then arises, because the compensated nonlinearity is a complex nonlinearity not described in the common literature. One way of dealing with this problem is to perform stability analysis via the describing function method. The paper describes the method for compensating the nonlinearities, recording the describing function, and performing stability analysis.

O. Kuljaca, K. Horvat, J. Gadewadikar, B. Tare
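
The compensation idea described above, fitting a polynomial to the inverse of an odd symmetric nonlinearity, can be sketched as follows. This is a minimal illustration: the dead-zone half-width, polynomial degree, and sample grid are arbitrary choices for the sketch, not values from the chapter.

```python
import numpy as np

def deadzone(u, d=0.2):
    """Odd symmetric dead-zone nonlinearity with half-width d."""
    return np.where(np.abs(u) <= d, 0.0, u - np.sign(u) * d)

# Sample the (known) inverse of the dead zone and fit a polynomial to it.
v = np.linspace(-1.0, 1.0, 201)          # desired actuator outputs
u = v + np.sign(v) * 0.2                 # inputs that would produce them
coeffs = np.polyfit(v, u, deg=9)         # polynomial compensator

def compensate(v_des):
    return np.polyval(coeffs, v_des)

# Compensator followed by the dead zone should be closer to the identity
# than the raw dead zone is.
v_test = np.linspace(-0.9, 0.9, 50)
err_comp = float(np.max(np.abs(deadzone(compensate(v_test)) - v_test)))
err_raw = float(np.max(np.abs(deadzone(v_test) - v_test)))
```

The composite map (compensator followed by actuator) is the complex nonlinearity whose describing function would then be recorded for stability analysis, as the abstract outlines.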
Stream Collision Management in MIMO Ad-Hoc Network Sustaining the Lower Bound of QoS

An ad hoc wireless network operating in a rich multipath propagation environment is considered. It is assumed that node-to-node communication links are established with multi-element antennas, using a MIMO spatial multiplexing transmit-receive strategy. The signal-to-interference-plus-noise ratio (SINR) in the presence of interpath interference is analyzed. Cooperation among transmitting nodes is used to avoid stream collisions along concurrent paths when the quality of service (QoS) of peer-to-peer links is the performance criterion. The revised water pouring algorithm (RWPA), which redistributes power among the transmit antennas according to the QoS performance criterion, is discussed. The proposed stream collision management approach makes it possible to avoid interpath interference as well as to maintain at least a lower bound of QoS. The simulation part analyzes a scenario with two pairs of node-to-node communication links.

Viktor Zaharov, Angel Lambertt, Olena Polyanska
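
The water pouring idea underlying the RWPA can be illustrated with the classical (unrevised) algorithm over parallel MIMO eigenchannels. A minimal sketch, assuming unit-variance noise per channel; the gains and power budget below are made-up values, and the QoS-driven revision the chapter proposes is not reproduced here.

```python
def water_filling(gains, total_power):
    """Allocate power across parallel channels by water pouring:
    p_i = max(0, mu - 1/g_i), with the water level mu chosen so that
    the allocated powers sum to total_power."""
    order = sorted(range(len(gains)), key=lambda i: gains[i], reverse=True)
    active = list(order)
    mu = 0.0
    while active:
        mu = (total_power + sum(1.0 / gains[i] for i in active)) / len(active)
        if mu - 1.0 / gains[active[-1]] >= 0:
            break                      # weakest active channel still gets power
        active.pop()                   # otherwise drop it and re-level the water
    return [max(0.0, mu - 1.0 / gains[i]) if i in active else 0.0
            for i in range(len(gains))]

# Three eigenchannels; the weakest one is shut off by the algorithm.
alloc = water_filling([4.0, 2.0, 0.1], total_power=1.0)
```

Stronger eigenchannels receive more power, and channels too weak to clear the water level receive none.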
On Query-Based Search of Possible Design Flaws of SQL Databases

The system catalog, which is part of each SQL database, is a repository whose base tables describe the SQL-schemas (schemas) in the database. The SQL standard specifies the Information Schema, which must contain virtual tables (views) created based on the base tables of the system catalog. In this paper, we investigate to what extent one can find information about possible design flaws of an SQL database by querying the tables in its Information Schema, and possibly tables in its other schemas. We do this based on a set of SQL database design antipatterns, each of which represents a particular type of database design flaw.

Erki Eessaar
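
The kind of catalog query the chapter describes can be illustrated with SQLite, which exposes its catalog through sqlite_master and PRAGMA statements rather than the standard Information Schema. The antipattern checked here (a base table without a primary key) and the schema are illustrative only.

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
    CREATE TABLE customer (id INTEGER PRIMARY KEY, name TEXT);
    CREATE TABLE log_entry (message TEXT, created_at TEXT);  -- flaw: no key
""")

def tables_without_primary_key(con):
    """Query the catalog for base tables, then inspect each table's columns
    for a primary-key flag -- a simple design-flaw detector."""
    tables = [r[0] for r in con.execute(
        "SELECT name FROM sqlite_master WHERE type = 'table'")]
    flawed = []
    for t in tables:
        cols = con.execute(f"PRAGMA table_info({t})").fetchall()
        if not any(col[5] for col in cols):   # col[5] is the pk flag
            flawed.append(t)
    return flawed

flawed = tables_without_primary_key(con)
```

Against a standards-conforming DBMS the same check would query INFORMATION_SCHEMA.TABLES and TABLE_CONSTRAINTS instead of SQLite's catalog.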
An Architecture for Mobile Context Services

Determining the context of what a mobile user is doing currently, and will be doing in the near future, is central to personalizing the user’s experience around what is most relevant to them. Numerous methods and data sources have been used to try to garner this information, such as GPS traces, social network data, and semantic information, to name a few. In this paper we propose an architecture for combining various forms of data and processing into a service that provides a mobile user’s context to applications. The goal of this work is to establish an architecture that can provide a more complete model of the information relevant to a mobile user and make this data available to interested applications.

Chad Williams, Jisna Mathew
An Actuated Tail Increases Rapid Acceleration Manoeuvres in Quadruped Robots

The cheetah (Acinonyx jubatus) is arguably one of the most manoeuvrable terrestrial animals. For future time-critical missions, legged robots will need to possess capabilities similar to the cheetah. In this paper, a rapid-acceleration quadruped system is designed and is found to be limited in manoeuvrability. However, we show that the addition of an actuated tail yields a considerable increase in stride-averaged acceleration.

Amir Patel, M. Braae
PlantsVN: An User-Friendly Software for Creating and Managing Personal Plant Database and for Plant Family Identification

PlantsVN is a software package that combines plant database functions with a utility for identifying the family of a plant specimen. PlantsVN allows users to create and manage a personal plant database and to identify plant specimens. Apart from the key that accompanies the software, users can easily create their own keys using the key-file child window. The result of the identification is documented by displaying all taxa of the plant database that match the characters of the plant specimen.

This paper describes the technique of integrating the plant identification utility into the plant database software used in PlantsVN. The paper also documents the functionality of PlantsVN, which is freely available at the website of the Institute of Ecology and Biological Resources, Vietnam Academy of Science and Technology (http://iebr.ac.vn/pages/1PlantsVN.asp).

Nguyen Van Sinh
An Automated Tool to Support the Software Design Process

In this paper we present FESA (Forward Engineer and System Analyzer), which supports the software design process based on the Model-Driven Engineering paradigm. We show that, by following the proposed ORDEREXP design rules during the elaboration of the ER (Entity-Relationship) diagram, FESA is able to construct the software prototype. ORDEREXP supports the software designer by providing more information about the modeled process and about how the system’s final graphical user interface will look, while preserving the normal form of the database schema. FESA uses design patterns to support the analysis of the ER diagram, in order to obtain additional recommendations and optimizations for the model.

Fernando Uceda-Ponga, Gerardo Ayala-San Martin
A Low-Cost Force Measurement Solution Applicable for Robotic Grippers

Industrial robots find widespread use in today’s industries, and an important capability required of such robots in pick-and-place operations is determining the gripping force when picking up objects. This paper presents the use of a FlexiForce force sensor for measuring the gripping force when picking up work-pieces with the two-finger gripper of a pick-and-place robot. This thin and flexible analogue force sensor is capable of measuring both static and dynamic forces. It has an output resistance that is inversely proportional to the applied force, and it is easily calibrated and interfaced to a microcontroller with a built-in analogue-to-digital converter. Through experimentation, a relationship between the weight of the work-piece to be gripped and the force to be applied for gripping was determined. This was used to successfully manipulate work-pieces of various shapes and sizes up to the robot payload of 0.5 kg. An analysis of the various forces acting on the work-piece was also carried out.

R. V. Sharan, G. C. Onwubolu
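
Because the sensor's resistance is inversely proportional to force, its conductance (1/R) is approximately linear in force, which makes calibration a one-parameter fit. A minimal sketch with hypothetical calibration readings (the force and resistance values below are invented, not the chapter's data):

```python
# Hypothetical calibration pairs: (applied force in N, sensor resistance in kOhm)
samples = [(1.0, 100.0), (2.0, 50.0), (4.0, 25.0), (5.0, 20.0)]

# Least-squares fit of conductance G = k * F through the origin
k = (sum((1.0 / r) * f for f, r in samples) /
     sum(f * f for f, _ in samples))

def force_from_resistance(r_kohm):
    """Estimate the applied force from a measured resistance."""
    return (1.0 / r_kohm) / k

est = force_from_resistance(40.0)   # estimated force for a 40 kOhm reading
```

On a microcontroller the resistance would come from the ADC reading through a voltage-divider conversion before this calibration step.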
Analyzing Operating Systems’ Behavior to Crafted Packets

Operating systems are vulnerable to malicious packet injection because of their inherent design and implementation flaws. The TCP/IP stacks in different operating systems are especially vulnerable to this. Using crafted packets, we can analyze how each operating system responds to malicious packet injection. The main goal of this study is to analyze the behavior of different operating systems when receiving specially crafted packets. In this experiment, we crafted four types of packets: TCP SYN packets with data, packets with IP options, overlapping fragments, and tiny fragments. We used “Scapy” [1], a powerful Python-based packet crafting tool, to craft packets with customized headers and payloads. Results indicated that Windows and Linux behaved differently to these packets. Windows showed more vulnerability when receiving data in SYN packets, while Linux responded to packets with IP options. Both systems also handled overlapping fragments differently.

Thusith Abeykoon, Kasun Abeykoon, Tirthankar Ghosh
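
The first crafted-packet type, a TCP SYN that also carries data, can be sketched at the byte level. The chapter uses Scapy; this minimal stand-in builds the 20-byte TCP header by hand, leaves the checksum zeroed (a real probe would compute it over the pseudo-header), and uses an arbitrary payload.

```python
import struct

def tcp_syn_with_data(src_port, dst_port, seq, payload):
    """Build a minimal TCP segment with the SYN flag set but carrying data --
    an unusual combination on the wire, which is why OS responses differ."""
    data_offset = 5 << 4              # header length: 5 x 32-bit words, no options
    flags = 0x02                      # SYN only
    header = struct.pack("!HHIIBBHHH",
                         src_port, dst_port,
                         seq, 0,                  # sequence / acknowledgment numbers
                         data_offset, flags,
                         65535,                   # advertised window
                         0, 0)                    # checksum (left 0 here), urgent ptr
    return header + payload

seg = tcp_syn_with_data(1234, 80, 1000, b"GET / HTTP/1.0\r\n\r\n")
```

In Scapy the equivalent is roughly TCP(flags="S")/payload stacked under an IP layer, with the checksum filled in automatically at send time.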
An Access Control Model for a Grid Environment Employing Security-as-a-Service Approach

There is a continuous effort to address the security challenges of large-scale service-oriented computing (SOC) infrastructures such as grids. Much research effort has gone into the development of authentication and authorization models for grid systems, because existing grid security solutions do not satisfy some desirable access control requirements of distributed services, such as support for multiple security policies. However, most of these security models are domain- and/or application-specific. A domain/application-specific approach to providing security solutions duplicates effort and increases the cost of developing and maintaining applications. This paper presents the design of an access control model for grid-based systems that employs the security-as-a-service (SecaaS) approach. In the SecaaS approach, each atomic access control function (such as authentication or authorization) is provided as a reusable service that can be published and subscribed to by different grid entities. Each administrative domain therefore no longer needs its own domain-specific access control logic built into it; whenever an access control service is required, the domain administrator subscribes to it from SecaaS. This approach has a number of benefits, including the ability to change security policies on the fly.

E. K. Olatunji, M. O. Adigun, E. Jembere
Comparison of Collaborative-Filtering Techniques for Small-Scale Student Performance Prediction Task

Collaborative-filtering (CF) techniques have been successfully used for student performance prediction; however, research has mainly dealt with large and very sparse matrices representing (student, task, performance score) triples. This work investigates the usability of CF techniques for student performance prediction at small universities or in courses with only a few students. We compared several CF techniques on a real-world dataset collected at our university that is very small and not very sparse. The experiments show that in such cases the predictive accuracy of these models is not very good, and we need to utilize more information about students or tasks.

Štefan Pero, Tomáš Horváth
Career Center System Software Architecture

In today’s world, thousands of job seekers are looking for a new job. On the other hand, thousands of employers are trying to find new employees. This is a chaotic matching problem without a definite answer. Companies turn to career centers and web-based career software to answer the question “Should we find a suitable worker for a certain role and hire this person or not?” The solution is simple; just look at the beginning of the story: university career centers. In this study, a Career Center System Software has been designed and implemented for matching students with their ideal jobs. Career Center System Software (CCSS) is programmed in C# on the .NET platform with MS-SQL, and has been developed in Visual Studio 2010. CCSS enables users to apply for job announcements, to monitor courses, and to register for conferences and seminars. Furthermore, CCSS enables companies to view applicants’ curricula vitae. All job announcements, courses, seminars, and CVs are stored in the database. Software quality assessment and testing show that CCSS is successfully implemented and ready to use as career center software.

Taner Arsan, Safa Çimenli, Erhan Güneş
Mobile Camera Source Identification with SVD

A novel method for extracting the characterising sensor pattern noise (SPN) from digital images is presented. Based on the spectral decomposition technique of Singular Value Decomposition, the method estimates the SPN of each image in terms of its energy level by first transforming the image/signals into a linear additive noise model that separates the photo-response non-uniformity (PRNU) of the associated camera from the signal subspace. The camera reference signatures of the individual cameras are computed from a sample of their respective images and compared with a mixture of image signatures from a set of known camera devices. The statistical properties of the method were studied using the Student’s t-test constructed under the null hypothesis formalism. Our studies show that it is possible to determine the source device of digital images from camera phones using such a method of signature extraction, with encouraging results.

A. -R. Soobhany, K. P. Lam, P. Fletcher, D. J. Collins
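
The SVD-based separation of image content from sensor noise can be sketched on synthetic data. This crude stand-in (rank-2 truncation on made-up 32×32 "images" with Gaussian fixed-pattern noise) is not the chapter's estimator, but it shows why the residual outside the dominant subspace correlates with the camera's pattern noise.

```python
import numpy as np

rng = np.random.default_rng(0)

def noise_residual(img, rank=2):
    """Estimate sensor noise as the part of the image lying outside the
    dominant SVD subspace (the smooth scene content)."""
    u, s, vt = np.linalg.svd(img, full_matrices=False)
    smooth = (u[:, :rank] * s[:rank]) @ vt[:rank]   # low-rank 'content'
    return img - smooth

# Two synthetic cameras, each with its own fixed pattern noise
spn_a = rng.normal(0, 1, (32, 32))
spn_b = rng.normal(0, 1, (32, 32))
scene = np.outer(np.linspace(0, 1, 32), np.linspace(1, 2, 32)) * 50  # rank-1 scene

img = scene + spn_a                      # photo taken with camera A
res = noise_residual(img)

# The residual matches camera A's signature far better than camera B's
corr_a = np.corrcoef(res.ravel(), spn_a.ravel())[0, 1]
corr_b = np.corrcoef(res.ravel(), spn_b.ravel())[0, 1]
```

In the chapter's setting, reference signatures averaged over many images per camera replace the known spn arrays, and correlations of this kind feed the t-test.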
Comparison of Manual and Image Processing Methods of End-Milling Burr Measurement

This paper compares the results of the manual method of burr height measurement with those of the image processing technique for end-milled work-pieces under various conditions. The manual method refers to the traditional approach where a few readings are taken at random locations using a microscope and the burr height is approximated with an average value. In contrast, the image processing technique analyzes the whole burr profile as seen through the lens of the microscope and captured using a digital camera. Taking the results obtained with the image processing method as the reference, the results show a significant difference between the two average readings in most cases, and the percentage error is generally greater for work-pieces with irregular burrs.

R. V. Sharan, G. C. Onwubolu
Automatic Image Annotation for Description of Urban and Outdoor Scenes

In this paper we present a novel approach for the automatic annotation of objects or regions in images based on their color and texture. Following the proposed generalized architecture for automatic generation of image content descriptions, the detected regions are labeled by a cascade SVM-based classifier that maps them to a structure reflecting their hierarchical and spatial relations, which is then used by a text generation engine. To test the designed system, around 2,000 images with outdoor-indoor scenes from the standard IAPR-TC12 image dataset were processed, obtaining an average classification precision of about 75% with 94% recall. The precision of classification based on color features was improved by up to 15 ± 5% after extending the classifier with a texture detector based on Gabor filters. The proposed approach offers a good compromise between the classification precision for regions in images and speed, although processing takes up to 1 s per image. The approach may be used as a tool for efficient automatic image understanding and description.

Claudia Cruz-Perez, Oleg Starostenko, Vicente Alarcon-Aquino, Jorge Rodriguez-Asomoza
Autonomous Mapping and Navigation Through Utilization of Edge-Based Optical Flow and Time-to-Collision

This paper proposes a cost-effective approach to mapping and navigating an area with only a single low-resolution camera on a “smart robot,” avoiding the cost and unreliability of radar/sonar systems. The implementation is divided into three main parts: object detection, autonomous movement, and mapping by spiraling inwards and using the A* pathfinding algorithm. Object detection is achieved by adapting the Horn–Schunck optical flow algorithm to track pixel brightness across subsequent frames, producing outward vectors. These vectors are then focused on the objects using Sobel edge detection. Autonomous movement is achieved by finding the focus of expansion from those vectors and calculating times to collision, which are then used to maneuver. The algorithms are programmed in MATLAB and Java, and implemented on a LEGO Mindstorms NXT 2.0 robot for real-time testing with a low-resolution video camera. Through numerous trials in diverse situations, the validity of the results is ensured for autonomously navigating and mapping a room using solely optical inputs.

Madhu Krishnan, Mike Wu, Young H. Kang, Sarah Lee
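
The A* pathfinding component of the mapping step is standard and can be sketched on a grid. A minimal 4-connected version with a Manhattan heuristic; the occupancy grid below is illustrative, not data from the chapter.

```python
import heapq

def astar(grid, start, goal):
    """A* shortest path on a 4-connected occupancy grid (0 = free, 1 = obstacle)."""
    rows, cols = len(grid), len(grid[0])
    h = lambda p: abs(p[0] - goal[0]) + abs(p[1] - goal[1])  # Manhattan heuristic
    open_set = [(h(start), 0, start, [start])]               # (f, g, cell, path)
    best_g = {}
    while open_set:
        f, g, cur, path = heapq.heappop(open_set)
        if cur == goal:
            return path
        if best_g.get(cur, float("inf")) <= g:
            continue                                         # already expanded cheaper
        best_g[cur] = g
        r, c = cur
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if 0 <= nr < rows and 0 <= nc < cols and grid[nr][nc] == 0:
                heapq.heappush(open_set, (g + 1 + h((nr, nc)), g + 1,
                                          (nr, nc), path + [(nr, nc)]))
    return None                                              # unreachable

grid = [[0, 0, 0],
        [1, 1, 0],
        [0, 0, 0]]
path = astar(grid, (0, 0), (2, 0))   # must detour around the wall in row 1
```

In the robot, the grid cells would come from the map built while spiraling inwards, with obstacles marked by the optical-flow object detector.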
NS2IT: Simplification of Computer Network Simulation

Nowadays, Internet applications generate more and more communication traffic that needs to be handled wisely by computer networks. Therefore, there is great demand for communication protocols with smaller payloads and greater efficiency. In this paper, we briefly explain our solution for enhanced simulation and visualization of network protocols based on the ns-2 simulator, called NS2IT. Our goal is to simplify the simulation process, data visualization, and comparison of measurements. The proposed prototype teaches students about the characteristics of given protocols and informs developers about protocol performance in specific network situations. All this information is presented through a rich web interface.

Martin Nagy, Peter Magula
Survey on Decentralized Modular Robots and Control Platforms

Swarm robotics is a relatively new field that has seen significant progress in the area of multi-agent robotic systems over the last two decades. Swarm robotic systems often adopt a decentralized approach in which the desired collective behaviors emerge from local decisions made by the robots themselves according to their environment. Traditional multi-robot systems, on the other hand, basically use centralized communication control to coordinate each robot. The fact that a typical swarm consists of relatively simple and homogeneous robots allows the group to self-organize or dynamically reorganize the way individual robots are deployed. The swarm approach is therefore considered highly robust to the failure of individual robots. The decentralized approach not only addresses the shortage of available software frameworks for distributed control systems/robotics but also introduces system software for controlling multiple expandable and reconfigurable swarm agents. We investigate the behavior of many swarm systems that have been proposed in the literature. In this survey we provide a detailed summary of systems classified under four main categories of multi-robot system platforms, namely: self-reconfigurable, modular, self-replicating, and swarm systems. We present a preliminary taxonomy for swarm robotics and classify existing studies within this taxonomy.

Tamer AbuKhalil, Tarek Sobh, Madhav Patil
Utilising Fuzzy Rough Set Based on Mutual Information Decreasing Method for Feature Reduction in an Image Retrieval System

Content-Based Image Retrieval (CBIR) systems have become a focus of research in the area of image processing and machine vision. A general CBIR system automatically indexes and retrieves images using visual features such as colour, texture, and shape. However, current research has found a significant gap between visual features and the semantic features used by humans to describe images. In order to bridge this semantic gap, some researchers have proposed methods for managing and reducing image features and extracting useful features from a feature vector. This paper presents an image retrieval system utilising a fuzzy rough set based on the mutual information decreasing method together with the Support Vector Machine (SVM) classifier. The system has training and testing phases. In order to reduce the semantic gap, the proposed retrieval system uses relevance feedback to improve retrieval performance. This paper also compares the proposed method with traditional retrieval systems that use PCA, kernel PCA, Isomap, and MVU as their feature reduction method. Experiments are carried out using a standard Corel dataset to test the accuracy and robustness of the proposed system. The experimental results show that the proposed method can retrieve images more efficiently than the traditional methods. The use of the fuzzy rough set based on the mutual information decreasing method, SVM, and relevance feedback ensures that the proposed image retrieval system produces results that are highly relevant to the content of an image query.

Maryam Shahabi Lotfabadi, Mohd Fairuz Shiratuddin, Kok Wai Wong
IR-UWB with Multiple-Access Differential Detection Receiver

Non-coherent communication receivers have a simple design, but they always incur a bit error rate (BER) performance loss of up to 3 dB compared to coherent receivers. In this paper, a non-coherent impulse-radio ultra-wideband (IR-UWB) receiver is proposed that supports multiple access (MA) for IR-UWB signals. The signals are transmitted using the code-division (CD) differential phase shift keying (DPSK) technique. Although UWB is a multipath channel, our proposed communication system is analyzed under an additive white Gaussian noise (AWGN) corrupted MA channel to test its applicability. Its average probability of error is derived analytically, and its BER performance is compared against an existing reference coherent receiver. Simulation results are provided to affirm its practicality.

Walid Mahmoud
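
Differential detection of the kind analyzed here can be sketched for binary DPSK over AWGN. This is a toy real-valued baseband model, not the chapter's CD-DPSK IR-UWB system: information rides in the phase (sign) difference between consecutive symbols, and the detector never needs an absolute phase reference.

```python
import random

random.seed(1)

def dpsk_ber(bits, amplitude=4.0):
    """Binary DPSK over AWGN: encode bits as sign flips between consecutive
    symbols and detect non-coherently from products of consecutive samples."""
    # Differential encoding: the symbol flips sign when the bit is 1
    symbols = [1.0]
    for b in bits:
        symbols.append(symbols[-1] * (-1.0 if b else 1.0))
    # AWGN channel: unit-variance noise, amplitude sets the SNR
    rx = [amplitude * s + random.gauss(0.0, 1.0) for s in symbols]
    # Differential detection: a negative product means the phase flipped
    decoded = [1 if rx[k] * rx[k + 1] < 0 else 0 for k in range(len(bits))]
    errors = sum(d != b for d, b in zip(decoded, bits))
    return errors / len(bits)

bits = [random.randint(0, 1) for _ in range(2000)]
ber = dpsk_ber(bits)   # at this SNR the error rate should be very small
```

Noise enters both samples of each product, which is the source of the non-coherent penalty relative to a coherent receiver that the abstract quantifies.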
Digital System for the Study of Fast Processes

The evolution of numeric data processing systems has led to the deployment of high-performance data acquisition systems. Slow processes require acquisition systems with low processing speed, which are easy to design and implement, but fast processes require more complex acquisition systems with high-speed data acquisition and transmission rates.

Florin Grofu, Constantin Cercel
A Location-Based Movie Advisor Application for Android Devices

Android is one of the world’s most popular mobile platforms, with more than 600,000 applications available in today’s marketplace. Movie advisor applications are available in Google Play, but there is no location-based movie advisor application for Android devices in Google Play or any other marketplace. A location-based service is a mobile computing application that provides information and functionality to users based on their geographical location. In this study, a location-based movie advisor, a dedicated application for Android devices to find the nearest movie theaters, is developed and implemented. Android devices are getting smarter with new features, and by using these devices we can exploit new technologies and new ideas; location-based services are one of them. Wherever you are, you can search and find new possibilities for almost everything. The aim of the location-based movie advisor application for Android devices is to give a brief summary of movies and movie times, as well as the locations of the nearest movie theaters, depending on the location of the user.

Taner Arsan, Aykut Çayır, Hande Nur Umur, Tuğçe Güney, Büke Panya
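
The nearest-theater lookup at the core of such a location-based service reduces to a great-circle distance computation over candidate coordinates. A sketch using the haversine formula; the theater names and coordinates are invented for illustration.

```python
import math

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two (lat, lon) points in kilometres."""
    r = 6371.0                                   # mean Earth radius, km
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

# Hypothetical theater coordinates (illustrative values only)
theaters = {
    "Cinema A": (41.04, 28.99),
    "Cinema B": (41.11, 29.02),
    "Cinema C": (40.99, 29.03),
}

def nearest_theater(user_lat, user_lon):
    """Return the theater closest to the user's current position."""
    return min(theaters, key=lambda name: haversine_km(
        user_lat, user_lon, *theaters[name]))

best = nearest_theater(41.05, 29.00)
```

On Android the user coordinates would come from the platform's location provider, and the candidate list from a movie-theater database or web service.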
Eye Tracking and Head Movements Detection to Assist People with Disabilities: Unveiled

Much research over the past two decades has been devoted to developing technologies that assist people suffering from motor disabilities associated with problems in verbal communication. Eye tracking and head movement detection, and their use in empowering people with disabilities, have remained interesting subjects, especially in the current digital era. Generally, eye tracking involves measuring eye motion or monitoring the activities of the eye. Various studies have used different methods of eye tracking, providing evidence that the science is of value to society in general, and to disabled individuals in particular. Their methods have worked with individuals as well as groups. Head movement detection has been found to be a natural way of expressing direction, and as a control technique it has been shown to be simple and effective. This paper surveys the literature on eye tracking and head movement detection techniques for helping the disabled.

Amer Al-Rahayfeh, Miad Faezipour
A Comparative Study of Simulation Based Performance Evaluation of Routing Protocol for Ad-Hoc Networks

This study compares two simulation-based performance evaluation papers, the first entitled “A Performance Comparison of Multi-Hop Wireless Ad Hoc Network Routing Protocols” and the second “Simulation-based Performance Evaluation of Routing Protocols for Mobile Ad Hoc Networks,” and critiques the choices made in each simulation. The two papers present performance evaluations of four typical ad hoc network routing protocols, DSDV, TORA, DSR, and AODV, using simulation. As the performance of an ad hoc network protocol can vary significantly with different mobility models, the first and second papers choose different mobility models, the “Waypoint” and “Gauss-Markov” models, respectively, which leads to different behavior in each paper’s results. The first paper was found to be more systematic and realistic, and its performance evaluation has a better level of detail, including MAC and link layer details in the simulation, whereas the second paper is superior mainly in its choice of the “Gauss-Markov” mobility model. Furthermore, the first paper’s “end-to-end delay” metric was found to be a more appropriate choice than the second paper’s “path optimality” metric, since it depends on the algorithm more than on the load.

Ola Alsaqour, Raed Alsaqour, Tariq Alahdal, Rashid Saeed, Mohammed Al-Hubaishi
Vulnerability Studies of E2E Voting Systems

In recent years, the number of end-to-end voter-verifiable (E2E) voting systems has increased significantly, and some of the most promising ones have been used in medium- to large-scale elections. We have also developed one (eVote). In this paper we review these systems' ability to provide individual and universal verifiability, incoercibility, and receipt-freeness to ensure election integrity. We compare these properties along with each system's resistance to malicious attacks.

Lauretha Rura, Biju Issac, Manas Haldar
Reliability Assessment of an Intelligent Approach to Corporate Sustainability Report Analysis

This paper describes our efforts in developing intelligent corporate sustainability report analysis software based on a machine learning approach to text categorization, and illustrates the results of executing it on real-world reports to determine the reliability of such an approach. The document ultimately aims to show that, given sufficient training and tuning, intelligent report analysis could at last replace manual methods and bring about drastic improvements in efficiency, effectiveness and capacity.

Amir Mohammad Shahi, Biju Issac, Jashua Rajesh Modapothala
Ubiquitous Text Transfer Using Sound: A Zero-Infrastructure Alternative for Simple Text Communication

Even today, when data networks have advanced greatly in speed, bandwidth and penetration, the need for low-power, low-bandwidth, zero-infrastructure networks is more pronounced than ever. As devices get smaller, their power supply also becomes limited, in accordance with the definitions of "dust", "skin" and "clay" in the ubiquitous computing paradigm. The practicality of such devices in the real world depends heavily on one key capability: being network enabled, ubiquitously. This paper looks at the possibility of using the ever-present signal, sound, as a ubiquitous medium of communication. We are currently experimenting with various schemes and protocols that can use sound for text transmission between two electronic devices, and this paper reports some attempts in this direction. The initial phase of the experiment encoded the entire ASCII character set over the audible sound spectrum, which required a very large spectrum spread with a very narrow frequency gap. The experimental results showed good improvement when the frequency gap was increased.
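The frequency-gap trade-off described in this abstract can be sketched as a simple tone-per-character mapping. The base frequency, the linear mapping, and the gap values below are illustrative assumptions, not the authors' actual encoding:

```python
# Hypothetical tone-mapping scheme: each ASCII code is assigned its own
# frequency, separated by a configurable gap.

BASE_FREQ_HZ = 1000.0  # assumed start of the audible band used

def char_to_freq(ch, gap_hz):
    """Map an ASCII character to a distinct tone frequency."""
    return BASE_FREQ_HZ + ord(ch) * gap_hz

def freq_to_char(freq_hz, gap_hz):
    """Recover the character from a (possibly noisy) detected frequency
    by snapping to the nearest tone slot."""
    code = round((freq_hz - BASE_FREQ_HZ) / gap_hz)
    return chr(int(code))

def spectrum_span(gap_hz, alphabet_size=128):
    """Total bandwidth needed to encode the full ASCII set."""
    return (alphabet_size - 1) * gap_hz
```

A narrow gap keeps the total spectrum small but leaves little tolerance for frequency-detection error; widening the gap trades bandwidth for robustness, which is consistent with the improvement the authors observed.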

Kuruvilla Mathew, Biju Issac
Web Based Testing in Science Education

The paper describes the analysis and characterization of partial results (in mathematics) of research focused on detecting the key knowledge and skills of pupils and students at primary and secondary schools in selected regions of Slovakia and the Czech Republic (the border area between the two countries). The aim was to determine whether there are regional and gender differences in mathematical competence, and at the same time to develop and test a suitable tool for detecting these differences effectively. The results are currently being further analyzed in the context of educational programs and policies, along with cross-curricular links, with a view to identifying the causes of the existing differences. The results of this research are also applied in the implementation of research-oriented virtual excursions, supported by a KEGA grant.

M. Hostovecky, M. Misut, K. Pribilova
Determination of Optimal Hyper- and Multispectral Image Channels by Spectral Fractal Structure

Multiband aerial mapping technology, with its high spectral and high spatial resolution images, complements traditional aerial mapping techniques, which are more reliable compared to data obtained during the foundation stages of the process. Over the last decade, airborne data recording technology has developed considerably due to its research applications, and the integrated processing of multiband, high spatial resolution data has become an increasingly central theme. This has a significant impact on assessment results. Using practical examples, the authors show that properly selected data reduction and selection procedures based on spectral fractal structure contribute significantly to the optimal exploitation of the additional information in hyper- and multispectral data cubes.

Veronika Kozma-Bognár, József Berke
ICT as a Means for Enhancing Flexibility and Quality of Mathematical Subjects Teaching

Some results of educational experiments conducted as part of a long-term research project are described in this paper. The primary goal of the research was to improve the quality and flexibility of mathematical education by implementing a new teaching model with ICT support. The model was designed to increase the flexibility and quality of teaching mathematical subjects at the Faculty of Materials Science and Technology of the Slovak University of Technology. The described experiments were oriented towards verifying the proposed teaching model.

Mária Mišútová, Martin Mišút
Efficient High-Level Coding in a PLC to FPGA Translation and Implementation Flow

In the automation industry, PLCs have been the preferred implementation platform for many decades, due to their reliability, robustness and flexibility. However, the advances of the electronics industry have always kept automation engineers busy looking for alternative platforms for the most demanding applications. Recently, the introduction of powerful and energy-efficient FPGA devices has turned their interest towards methodologies to implement PLC applications with FPGAs in automated or semi-automated ways. This paper evaluates such a methodology, which involves a fresh and productivity-boosting technology: C-based FPGA programming. Just as FPGAs have made hardware design more widely accepted (compared to ASICs), C-based FPGA programming promises to make it more widely accepted still (compared to HDL programming), provided that specific, hardware-related C-level coding guidelines are followed, which can greatly improve the quality of results. The methodology proposed in this paper starts from low-level PLC code (IL/STL) and, after a disassembly-like phase, generates C code ready for FPGA programming. Through experimentation with demanding applications that involve floating point calculations, it is shown that when proper C-level coding guidelines are followed, performance gains (faster hardware) of up to 90 % can be achieved.

Christoforos Economakos, George Economakos
Modeling a Cold Rolling Mill for Optimization

A mathematical model of a rolling mill is produced using models of its constituent components based on physical laws. These models are combined to form a nonlinear model of the whole rolling mill process. In order to design control systems for the rolling mill, the multidimensional set of nonlinear differential equations is linearized, and the behavior of the resulting linear differential equations is compared with the response of the nonlinear model. This was done through simulations in which the gap opening and the rolling speed were varied and their effect on the gap opening and the inter-strip length variation was observed.

Meshack M. Nzioki
Technological Development in Therapeutic Applications of Alternating Electric Fields: Review

Bacteria, viruses and other unhealthy cells must be killed to rid the body of them. For more than a century, antibiotics have been used effectively to kill bacterial pathogens, and chemical drugs to fight cancer cells. However, some bacteria and cancer cells are drug resistant; overcoming this may require stronger drugs or higher dosages, which can have detrimental side effects. Non-drug methods to aid the effect of these drugs have therefore long been under research. Electrochemotherapy, which applies electric fields together with a topically administered drug, has been one of the successful approaches. One of the most recent methods, Tumor Treating Frequencies (TTF) for a brain cancer, has been FDA approved. This article details the use of TTF, as well as other recent research in which alternating fields are used as antibacterial agents.

S. Talele
Multi-Touch Gesture Recognition Using Feature Extraction

We are motivated to find a multi-touch gesture detection algorithm that is efficient, easy to implement, and scalable to real-time applications using 3D environments. Our approach attempts to solve gesture recognition using feature extraction, without the need for any prior training samples. Before presenting our proposed solution, we describe some algorithms that attempt to solve similar problems. Finally, we describe our code for performing off-line gesture recognition.

Francisco R. Ortega, Naphtali Rishe, Armando Barreto, Fatemeh Abyarjoo, Malek Adjouadi
Model Driven Testing for Cloud Computing

In this paper, the authors present a proposal to support the creation of test cases for software systems in cloud computing environments. The approach is based on Model Driven Engineering (MDE). A methodology and metamodels are proposed to support the generation of test cases, including metamodels specific to cloud computing environments. Business models are created conforming to UML (including profiles), and test cases are created conforming to a platform-independent testing metamodel. Both models are manipulated by model transformations that generate test cases for cloud computing environments. These metamodels are used in conjunction with the tools MT4MDE and SAMT4MDE for developing testing models. An illustrative example helps to understand the proposed approach.

Jéssica Oliveira, Denivaldo Lopes, Zair Abdelouahab, Daniela Claro, Slimane Hammoudi
Implementing a Sensor Fusion Algorithm for 3D Orientation Detection with Inertial/Magnetic Sensors

In this paper a sensor fusion algorithm is developed and implemented for detecting orientation in three dimensions. Tri-axis MEMS inertial sensor and tri-axis magnetometer outputs are used as input to the fusion system. A Kalman filter is designed to compensate for inertial sensor errors by combining accelerometer and gyroscope data. A tilt compensation unit is designed to calculate the heading of the system.
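A minimal sketch of the two stages this abstract describes, using a complementary filter as a simplified stand-in for the paper's Kalman filter; the axis convention, blend constant `alpha`, and function names are assumptions, not the authors' design:

```python
import math

def fuse_tilt(accel_angle, gyro_rate, prev_angle, dt, alpha=0.98):
    """Complementary filter: blend gyro integration (accurate short-term)
    with the accelerometer angle (drift-free long-term)."""
    return alpha * (prev_angle + gyro_rate * dt) + (1 - alpha) * accel_angle

def tilt_compensated_heading(mx, my, mz, roll, pitch):
    """Project the magnetometer vector onto the horizontal plane using
    the roll/pitch estimate (one common axis convention; real devices
    may use a different one)."""
    xh = mx * math.cos(pitch) + mz * math.sin(pitch)
    yh = (mx * math.sin(roll) * math.sin(pitch) + my * math.cos(roll)
          - mz * math.sin(roll) * math.cos(pitch))
    return math.atan2(yh, xh)
```

With the device level (roll = pitch = 0) the heading reduces to `atan2(my, mx)`, which is the familiar flat-compass formula.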

Fatemeh Abyarjoo, Armando Barreto, Jonathan Cofino, Francisco R. Ortega
Introducing Problem-Based Learning in a Joint Masters Degree: Offshoring Information Technologies

A young offshore software industry has grown up in Morocco. The University of Brest has set up a network of major software companies and Moroccan universities, providing two mobility schemes towards France. Both schemes include a final internship on the French side of global companies, with pre-employment on the Moroccan side—a successful internship being the key that opens the door to recruitment. Student heterogeneity, and student reluctance to move towards a professional attitude are important barriers to employability. Hence, we redesigned a significant proportion of our technical courses to use a problem-based learning (PBL) approach. The PBL approach is illustrated through drawing parallels with the production of a TV series. Three aspects of the approach are presented: (1) set-up of the studio in which sessions are run, i.e. a real software project, its work products and its software development environment; (2) pre-production tasks including the screenwriting of problem-based learning scenarios and the procurement of input artefacts; and (3) acting, i.e. students’ interpretation of characters (roles) and teacher direction.

Vincent Ribaud, Philippe Saliou
Composition of Learning Routes Using Automatic Planning and Web Semantics

This article describes how to combine automatic planning techniques and web semantics technology to organize, design and personalize learning routes for e-learning by composing learning objects. The purpose of automatic planning is the unanticipated creation of a sequenced plan of learning objects (i.e., a learning route) obtained from pre-existing objects (i.e., composition); the web semantics paradigm, on the other hand, is used to deal with the significant heterogeneity among the mental models of faculty members, students and authors of their own learning objects.

Ingrid-Durley Torres, Jaime Alberto Guzmán-Luna
A Novel Dual-Error Approach to System Identification

Single error system identification techniques are widely used to estimate the parameters of dynamic mathematical models that are needed in a range of industrial applications. A novel Dual-Error system identification technique is proposed. It is based on a modification of the traditional single-error methods and shown to offer better accuracy for the estimation of model parameters. The benefits of the proposed method are demonstrated by a comparison with traditional methods when applied to both a simulated system and a DC motor.

H. Greeff
On Verification of the Software Development Process

Implementing the best software engineering process does not guarantee the best result. The ratio of failures is still very high even when a formal review of the processes shows no defects. The most common problem is the amount of additional work discovered at the very end of projects, leading either to delays and extra cost or to a significant decrease in product quality. All this is unacceptable in a modern, rapidly evolving world with a high level of competition and the demand to produce software on time with an acceptable level of quality. The paper describes how the software process can be examined and verified to ensure that it is not only established but also followed, and that all potential risks and uncertainties are resolved when they occur instead of being suppressed.

Deniss Kumlander
Control-Flow Checking Using Binary Encoded Software Signatures

Correct execution of program code is an essential part of modern information systems. Due to various external causes, process execution can fail and lead to unpredictable consequences. The proposed solution detects control-flow errors caused by faulty execution of jump instructions, by means of a program execution control technique based on inserting software signatures into the source code. In this paper we propose a new algorithm for control-flow checking called CFCBS.
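The signature idea can be illustrated with a toy checker; the block signatures, the edge set, and the check itself are hypothetical stand-ins, not the CFCBS encoding:

```python
# Each basic block gets a compile-time signature; at run time a signature
# register is updated on every jump and compared with the expected value
# for the destination block. An illegal jump or a corrupted signature is
# flagged as a control-flow error.

BLOCK_SIG = {'B0': 0b0001, 'B1': 0b0011, 'B2': 0b0110}  # assumed values
ALLOWED = {('B0', 'B1'), ('B1', 'B2')}                  # legal edges

def check_jump(runtime_sig, src, dst):
    """Return True only if src->dst is a legal edge and the runtime
    signature matches the destination's expected signature."""
    return (src, dst) in ALLOWED and runtime_sig == BLOCK_SIG[dst]
```

A fault that redirects a jump to the wrong block, or that corrupts the signature register, makes the comparison fail and the error is detected at the next check point.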

H. Severínová, J. Abaffy, T. Krajčovič
Transformation of the Software Testing Glossary into a Browsable Concept Map

The authors propose a method for transforming the ISTQB document "Standard Glossary of Terms Used in Software Testing" into a basic concept map. By applying natural language processing techniques and analyzing the discovered relations between concepts, the most essential aspects of the software testing domain are elicited and integrated. As a result, a browsable concept map is created, which can be used as a learning support tool.

Guntis Arnicans, Uldis Straujums
Metallographic Image Processing Tools Using Mathematica Manipulate

The objective of this research is to present digital image processing (DIP) modules specifically designed for use with metallographic images. The goal of the application is to make digital processing algorithms accessible to users with limited background in programming, a specific interest in metallurgical applications of DIP, and the need to setup interactive, easily modified modules.

Sara McCaslin, Adarsh Kesireddy
Using Mathematica to Accurately Approximate the Percent Area of Grains and Phases in Digital Metallographic Images

The objective of this paper is to present an effective methodology to determine the cumulative percentage area of grains and phases present in a digitally captured metallographic image, using image processing commands available in Mathematica 8.
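The binarize-and-count idea behind such percent-area measurements can be sketched in a few lines (shown here in Python rather than Mathematica; the fixed-threshold rule is an assumed simplification of the paper's workflow):

```python
def percent_area(pixels, threshold):
    """Percent of pixels whose intensity exceeds `threshold`:
    binarize the grayscale image, then count the foreground fraction."""
    flat = [p for row in pixels for p in row]
    hits = sum(1 for p in flat if p > threshold)
    return 100.0 * hits / len(flat)
```

For a real metallographic image the threshold would typically be chosen per image (e.g. from the histogram) rather than fixed, and the foreground pixels correspond to one grain or phase.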

Adarsh Kesireddy, Sara McCaslin
Real-Time Indexing of Complex Data Streams

The paper deals with indexing of complex-type data streams stored in a database. We present a novel indexing schema and framework referred to as ReTIn (Real-Time Indexing), whose objective is to allow indexing of complex data arriving as a stream to a database, with respect to soft real-time constraints met with some level of confidence for the maximum duration of insert and select operations. The idea of ReTIn is a combination of sequential access to the most recent data and index-based access to less recent data stored in the database. The collection of statistics makes balancing the indexed and unindexed parts of the database efficient. We have implemented ReTIn using the PostgreSQL DBMS and its GIN index. Experimental results presented in the paper demonstrate the properties and advantages of our approach.
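A minimal sketch of the ReTIn idea of splitting the store into an unindexed recent tail and an indexed older part; the fixed merge trigger and in-memory data structures here are simplifications, not ReTIn's statistics-driven balancing or its PostgreSQL/GIN implementation:

```python
import bisect

class RetinLike:
    """Toy hybrid store: recent items live in an unindexed tail that is
    scanned sequentially; once the tail grows past a limit, it is merged
    into a sorted list that stands in for the index."""
    def __init__(self, tail_limit=4):
        self.indexed = []          # sorted part (index stand-in)
        self.tail = []             # most recent, unindexed part
        self.tail_limit = tail_limit

    def insert(self, key):
        self.tail.append(key)      # O(1) append keeps inserts fast
        if len(self.tail) > self.tail_limit:
            for k in self.tail:    # periodic merge into the index
                bisect.insort(self.indexed, k)
            self.tail.clear()

    def contains(self, key):
        if key in self.tail:       # sequential scan of recent data
            return True
        i = bisect.bisect_left(self.indexed, key)
        return i < len(self.indexed) and self.indexed[i] == key
```

The merge threshold is the knob that trades insert latency against lookup cost; ReTIn tunes this balance from collected statistics rather than a fixed limit.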

Petr Chmelar, Michal Drozd, Michal Sebek, Jaroslav Zendulka
Self-Organized Teams: A Contradictory Technique to Motivate Personnel

The self-organized team is a core technique of the modern software development methodologies known as agile techniques. It improves a team's ability to make efficient decisions and move a project toward its ultimate goal. This technique has been recognized as a motivational practice in many companies, while in others applying it has produced a significant negative impact. In this article the meaning of the practice is revisited. We study the cases in which applying it has a positive or negative effect, and how its efficiency can be improved even in the negative cases.

Deniss Kumlander
Pre-MEGa: A Proposed Framework for the Design and Evaluation of Preschoolers’ Mobile Educational Games

With the spread of mobile games targeting preschoolers there is an increased need for the creation of high-quality, research-based content for this age group. But how can “quality” be defined here? To answer this question, an extensive review of literature and available rating systems was needed which resulted in a detailed set of attributes which constitute a fun, usable, beneficial and, above all, successful mobile learning game targeting preschoolers. This framework (Pre-MEGa) is presented in this paper with the aim of facilitating the process of translating research into concrete, measurable characteristics for designing and evaluating this type of software.

Laila Shoukry, Christian Sturm, Galal H. Galal-Edeen
A Review on Three Dimensional Facial Averaging for the Assessment of Orthodontic Disorders

The introduction of rapid, non-invasive, and reproducible imaging technology such as three dimensional (3D) surface scanners, Cone Beam CT (CBCT) and low-dose CT have made it possible to develop tools for evaluating facial morphology and accurately planning dentofacial surgery. Facial averaging using 3D facial data is one such effective tool. This paper comprehensively reviews different techniques of facial averaging and their clinical applications. All the approaches are classified according to their methodologies and their scope and limitations are discussed. Future research directions are also identified.

Syed M. S. Islam, Mithran S. Goonewardene, Mauro Farella
Reducing Stereotypes of Women in Technology Through Analysis of Videogame Blog Entries

In this paper we analyze some of the most frequent stereotypes about women found in videogames. One way to negate these beliefs is to look at real data. Since contemporary students prefer blog entries to hardcopy or longer articles, in this paper we examine some inaccurate but widely held beliefs found in the videogame industry and analyze blog entries that can be used to negate them. This idea grew out of a mixed-gender class in game programming at Northern Illinois University in Spring 2012.

Reva Freedman, Georgia Brown
Improving Student Learning While Converting a Computer Architecture Course to Online Format

A required Computer Architecture course for Computer Science majors was converted to online form in Spring 2011. In this paper we discuss the changes made to make the course successful online, especially in content preparation, course organization, and the construction and handling of assignments. We discuss why we feel the revised course produces improved student learning in terms of the basic principles of scaffolding, self-explanation and multimodal learning. We also discuss how we made the course practical to administer on an ongoing basis. We hope that this experience will be helpful to other faculty members planning to convert courses in Computer Science or Engineering to an online format.

Reva Freedman
Bioclimatic Modelling: A Machine Learning Perspective

Many machine learning (ML) approaches are widely used to generate bioclimatic models that predict the geographic range of organisms as a function of climate. Applications such as predicting range shifts in organisms, or the range of invasive species under climate change, are important in understanding the impact of climate change. However, the success of machine learning-based approaches depends on a number of factors. While no particular ML technique is effective in all applications, and the success of a technique depends predominantly on the application or the type of problem, it is useful to understand their behaviour to ensure an informed choice of techniques. This paper presents a comprehensive review of machine learning-based bioclimatic model generation and analyses the factors influencing the success of such models. Considering their wide use, our discussion also includes conventional statistical techniques used in bioclimatic modelling.

Maumita Bhattacharya
nFTP: An Approach to Improve Performance of FTP Protocol on the Virtual Network Environment in the Same Physical Host

The future of the Internet lies in applications and services based on virtualized environments. The strong points of virtualization technology are high availability, high flexibility and cost-effective application management. Currently, network applications always run over some physical network. There are two main approaches to network virtualization: the first is based on network device virtualization; in the second, the connected network devices and servers are virtual machines. In the second approach, information is transmitted between virtual servers primarily via traditional network protocols. However, when the virtual machines of a virtual network are located in the same physical host, the traditional network protocols do not take full advantage of virtualization technology: the time for routing packets through the network devices (virtual machines) on the routing path is not reduced even though all virtual network devices are on the same physical host. In this paper, the authors offer a new approach to improving the speed and performance of network protocols in the virtual environment by copying data directly from one virtual machine to the other. Within the scope of this paper, to illustrate the idea, the authors focus on improving the performance of the traditional FTP protocol. The results of our experiments show that the performance of the improved FTP protocol (nFTP) increases significantly in the virtual network environment. This approach opens a wide range of research topics for improving the performance of network protocols on virtual networks.

Nguyen Tan Cam, Huynh Van Tho, Nguyen Hoang Sang, Cao Dang Tan
Hardware Architecture Review of Swarm Robotics System: Self Reconfigurability, Self Reassembly and Self Replication

Swarm robotics is one of the most fascinating new research areas of recent decades, and one of the grand challenges of robotics is the design of swarm robots that are self-sufficient. This can be crucial for robots exposed to environments that are unstructured or not easily accessible to a human operator, such as the inside of a blood vessel, a collapsed building, the deep sea, or the surface of another planet. In this paper, we present a comprehensive study of the hardware architecture and several other important aspects of modular swarm robots: self-reconfigurability, self-replication, and self-assembly. The key factors in designing and building a group of swarm robots are cost and miniaturization combined with robustness, flexibility and scalability. In robotic intelligence, self-assembly and self-reconfigurability are among the most important characteristics, as they can add capabilities and functionality to swarm robots. Simulation and model design for swarm robotics is highly complex and expensive, especially when attempting to model the behavior of large swarm robot groups.

Madhav Patil, Tamer Abukhalil, Sarosh Patel, Tarek Sobh
Internet and Transdisciplinary Based Teamwork Formula for Elaborating a Bachelor’s or a Master’s Thesis

The last step in a student’s schooling experience is the presentation of his/her bachelor’s or master’s thesis. This constitutes a good opportunity for the students to “show off” their capabilities, their individual performances related to the theme of the thesis, and the amount of knowledge that they have accumulated over their years of study in a higher educational institution. Such reality has fueled an important question across the educational system of whether it is enough to expect that from a bachelor’s or master’s thesis, or it is time to change the methodology of thesis elaborating by implementing new methodologies. The paper examines possible methods that aim to shift from the traditional way of writing a bachelor’s or a master’s thesis to a modern approach, based on national/international teamwork. Such a method would include intensive usage of the Internet for information, documentation and collaboration, and would also valorize the methodology of transdisciplinarity. The discussion begins with the explanation of the major challenges that students with different majors, cultural, religious, and ethnic backgrounds face when they attempt to write a good thesis. This is followed by arguments that support the idea of teamwork. The paper finally concludes with examples of actions that will help academic authorities to implement such a new, complex and innovative methodology. The authors intend to turn this article into a good starting point for a pilot project, which aims to create a partnership between several universities/entities that are ready to embrace and implement this initiative.

Liciniu A. Kovács, Mihai F. Talpoş
A Dynamic Pricing Algorithm for Super Scheduling of Computational Grid

In this paper, we propose a dynamic pricing strategy for peer-to-peer grid computing, used with an SLA-based super scheduling and greedy backfilling resource allocation mechanism. Our goal is to balance load across the overall grid system by adjusting the prices of resources. Without a pricing policy in the grid, the increasing load on cheaper resources and the lack of demand for expensive ones push the system out of balance.

Our attention is directed to the load factor of the overall system and the load factor on each resource. The rates of demand and supply for each resource are two further criteria for overall price balancing in the system. If the price of a resource is high, its demand becomes low, so a price-adjusting algorithm must be active to ensure acceptable utilization of resources. On the other hand, if the price of a resource is low, it will receive many demands, which in turn will increase the turnaround time of the requests. These algorithms are developed in this research and their overall effectiveness is evaluated. As a result, the overall system nearly reaches a balanced state.

The key advantages of our approach are (1) an increase in the total earnings of resource owners and users, (2) an increase in the number of resources being used, (3) an increase in the number of accepted jobs, and (4) a decrease in the rate at which users' budgets are spent.
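The price-adjustment feedback described above can be sketched as a simple multiplicative update; the target utilization, gain, and floor price are illustrative assumptions, not the paper's parameters:

```python
def adjust_price(price, utilization, target=0.7, k=0.5):
    """Multiplicative feedback: raise the price of overloaded resources
    (utilization above target) and lower it for underused ones, pushing
    demand toward the cheaper, idler resources and balancing load."""
    return max(0.01, price * (1 + k * (utilization - target)))
```

Iterating this rule across all resources each scheduling round drives heavily loaded resources to become expensive and idle resources to become cheap, which is the balancing behavior the abstract describes.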

Reihaneh Bazoubandi, Fatemeh Abdoli
A Comparative Analysis of Bayesian Nonparametric Inference Algorithms for Acoustic Modeling in Speech Recognition

Nonparametric Bayesian models have become increasingly popular in speech recognition for their ability to discover the underlying structure of data in an iterative manner. Dirichlet process mixtures (DPMs) are a widely used nonparametric method that does not require a priori assumptions about the structure of the data. DPMs, however, require an infinite number of parameters, so inference algorithms are needed to make posterior calculations tractable. The focus of this work is an evaluation of three variational inference algorithms for acoustic modeling: Accelerated Variational Dirichlet Process Mixtures (AVDPM), Collapsed Variational Stick Breaking (CVSB), and Collapsed Dirichlet Priors (CDP).

A phoneme classification task is chosen to more clearly assess the viability of these algorithms for acoustic modeling. Evaluations were conducted on the CALLHOME English and Mandarin corpora, consisting of two languages that, from a human perspective, are phonologically very different. In this work, we show that these inference algorithms yield error rates comparable to a baseline Gaussian mixture model (GMM) but with a factor of 20 fewer mixture components. AVDPM is the most attractive choice because it delivers the most compact models and is computationally efficient, enabling its application to big data problems.
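For background, the stick-breaking construction underlying CVSB can be written in a few lines; this is the standard truncated construction of DP mixture weights, not the collapsed variational inference itself:

```python
def stick_breaking_weights(vs):
    """Truncated stick-breaking: weight_k = v_k * prod_{j<k}(1 - v_j).
    Each v_k breaks off a fraction of the remaining stick, so the
    weights sum to at most 1 and decay with k on average."""
    weights, remaining = [], 1.0
    for v in vs:
        weights.append(v * remaining)
        remaining *= (1 - v)
    return weights
```

Truncating the sequence at a finite length is what makes variational inference over the infinite-dimensional DP tractable, and explains how the fitted models can end up far more compact than a fixed-size GMM.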

John Steinberg, Amir Harati, Joseph Picone
Requirements Based Estimation Approach for System Engineering Projects

In this paper, requirements are used as the basis for estimation. Requirements are clustered according to their complexity. The new approach described in the paper is based on requirements analysis: the complexity of each requirement is determined and a Total Requirements Points (TRP) value is calculated. The TRP value is then modified by technical and environmental factors, which describe the problem domain and the development team's experience, and can be used as a coefficient for system size. Using this approach, system engineering projects can be compared and priced.
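A hedged sketch of the TRP idea; the complexity weights and adjustment factors below are hypothetical placeholders, since the paper defines its own values:

```python
# Hypothetical complexity weights; the paper's clustering and weighting
# scheme may differ.
COMPLEXITY_WEIGHT = {'simple': 1, 'average': 2, 'complex': 3}

def total_requirements_points(reqs, tech_factor=1.0, env_factor=1.0):
    """Sum complexity-weighted requirements, then scale by the technical
    and environmental adjustment factors describing the problem domain
    and the team's experience."""
    base = sum(COMPLEXITY_WEIGHT[c] for c in reqs)
    return base * tech_factor * env_factor
```

The resulting value serves as a size coefficient, so two projects estimated with the same weights and factors can be compared directly.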

Radek Silhavy, Petr Silhavy, Zdenka Prokopova
Improvement of the Time Calculation of Cloud Radiance of One Atmosphere by the Method TDMAP

In this work we used the TDMAP model (Tree Driven Mass Accumulation Process), a generalization of a suitable wavelet decomposition of fractional Brownian motion (fBm), to improve the CPU time of computing solutions of the radiative transfer equation. In the case of radiances, these wavelets are called luxlets. We calculated radiances for a cloudy atmosphere; the calculation is performed on an altocumulus with 1.5 g per cubic centimeter of Liquid Water Content (LWC). The results were compared, for the same atmosphere, with the results of SHDOM (Spherical Harmonic Discrete Ordinate Method). A statistical study shows good correlation between the results of both models, with CPU computation time improved by 9.3 %.

Bouya Diop, Adoum M. Moussa, Abdou K. Farota
Process of Transformation, Storage and Data Analysis for Data Mart Enlargement

Creating an information system is a complex and long process, but it is only the first step in the system's existence. Most information systems undergo further development during their life cycle; very often users define requests to enlarge the functionality and the volume of the displayed data. The paper presents the process of transformation, storage and analysis for data mart enlargement based on users' requests, concretely for the personal transport data mart. Expressions from the field of Business Intelligence, and the systems used for obtaining data, analyzing data, and creating forms and reports, are explained.

Zdenka Prokopova, Petr Silhavy, Radek Silhavy
Exact Algorithm for Matrix-Based Multilevel Project Planning Problems

Besides network planning methods, matrix-based methods can also be used in project planning and scheduling. In this case, either the importance or the probability of task completions can be described, and thus the importance or probability of possible project scenarios and project structures can be determined and ranked. When using matrix-based project planning methods, the main challenge is to select the project scenario and project structures that meet management requirements. This approach can also be applied when a most desired or most probable project portfolio or multi-project has to be specified instead of a single project plan. In this study, fast exact algorithms are introduced for selecting the most important project scenarios or the least cost/time-demanding project structures. The new algorithm is a framework algorithm, which can serve as a fundamental basis for a project expert system and for decision-making.
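As a toy illustration of ranking project scenarios by probability (the tasks, the probability values, and the independence assumption are ours, not the paper's exact algorithm):

```python
# Illustrative sketch: given per-task completion probabilities (as might
# appear on the diagonal of a matrix-based project plan), enumerate the
# possible project scenarios (subsets of realized tasks) and pick the
# most probable one. Task independence is an assumption made here.

from itertools import product

# Assumed completion probability for each of three optional tasks.
task_prob = {"A": 0.9, "B": 0.6, "C": 0.3}

def scenario_probability(scenario):
    """scenario maps task -> included?; independence is assumed."""
    p = 1.0
    for task, included in scenario.items():
        p *= task_prob[task] if included else 1 - task_prob[task]
    return p

scenarios = [dict(zip(task_prob, bits))
             for bits in product([True, False], repeat=len(task_prob))]
best = max(scenarios, key=scenario_probability)
print(best)  # {'A': True, 'B': True, 'C': False}
```

Exhaustive enumeration is exponential in the number of tasks; the exact algorithms in the paper are aimed precisely at avoiding this blow-up.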

Zsolt T. Kosztyán
Operation of a Microgrid System with Distributed Energy Resources and Storage

In a heat-power system, the use of distributed energy generation and storage improves the system's efficiency, reliability, and emissions. This paper focuses on the operation of a microgrid consisting of a PV system, a hydrogen fuel cell stack, and a PEM electrolyzer. As a grid-tied system, there is two-way power flow between the system and the grid. The microgrid generates electric power for the local electric load and heat for the local heat demand. With the proposed performance indexes, the system is simulated under an electricity-led scenario. The price, emission, service quality, and overall performance indexes are all between 0.8 and 0.9. Moreover, the electric demand is 100 % met and the heat demand is 45.8 % met.

Linfeng Zhang, Xingguo Xiong, Junling Hu
The SOC Estimation of a Lead Acid Rechargeable Battery

A model of a lead-acid battery is presented with an equivalent circuit, and its parameters are determined experimentally. An inductor is added to the circuit based on the output of the impedance spectrum. An extended Kalman filter for the nonlinear system is used to estimate three state variables and, from these, to calculate the state-of-charge (SOC). The algorithm is simplified so that it can be implemented for real-time estimation, with an error of less than 1 % in testing over the SOC range between 60 % and 100 %. Testing was also conducted with a battery that was not fully charged; there the estimation error is higher, close to 2 %.
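The paper's filter tracks three state variables of the equivalent circuit; a minimal single-state sketch of the same predict/update idea (all parameter values and the linear OCV curve are illustrative assumptions, not the paper's model) might look like:

```python
# Minimal one-state extended-Kalman-filter sketch for battery SOC
# estimation: predict by coulomb counting, correct with the measured
# terminal voltage. Parameters and the OCV curve are assumed values.

CAPACITY_AS = 3600.0   # battery capacity in ampere-seconds (assumed)
R_INTERNAL = 0.05      # internal resistance in ohms (assumed)

def ocv(soc):
    """Assumed open-circuit voltage curve, linear in SOC."""
    return 11.8 + 1.2 * soc

def ocv_jacobian(soc):
    return 1.2  # d(OCV)/d(SOC) for the linear curve above

def ekf_soc(soc, p, current, voltage, dt, q=1e-5, r=1e-3):
    """One predict/update step; returns the new (soc, covariance)."""
    # Predict: coulomb counting (discharge current is positive).
    soc_pred = soc - current * dt / CAPACITY_AS
    p_pred = p + q
    # Update: compare predicted terminal voltage with the measurement.
    h = ocv_jacobian(soc_pred)
    k = p_pred * h / (h * p_pred * h + r)        # Kalman gain
    v_pred = ocv(soc_pred) - current * R_INTERNAL
    soc_new = soc_pred + k * (voltage - v_pred)
    p_new = (1 - k * h) * p_pred
    return soc_new, p_new

# Example: true SOC is 0.8 but the estimate starts at 0.6; repeated
# voltage measurements pull the estimate toward the truth.
soc_est, p = 0.6, 1e-2
for _ in range(50):
    true_voltage = ocv(0.8) - 1.0 * R_INTERNAL   # 1 A discharge
    soc_est, p = ekf_soc(soc_est, p, current=1.0, voltage=true_voltage, dt=1.0)
print(round(soc_est, 3))
```

With a three-state model, `soc`, `p`, `h`, and `k` become vectors and matrices, but the predict/update structure is the same.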

Linfeng Zhang, Xingguo Xiong
Fast Computation of Frobenius Map Iterates in Optimal Extension Fields

The j-th iterate of the Frobenius map is required in computing field multiplication and inversion, which are necessary for code-theoretic and cryptographic applications in elliptic curve cryptography. In this paper, we propose a fast method for computing the Frobenius map operation in optimal extension fields GF(p^m), using a polynomial basis representation for the field elements. In comparison with other existing approaches in the literature, our approach achieves negligible execution time in exchange for a slight increase in space requirements.
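For concreteness, the map itself is x → x^p, and its j-th iterate is x → x^(p^j). A naive toy implementation over a small field (the field parameters and irreducible polynomial below are our own choices, and this is not the paper's fast method) can be sketched as:

```python
# Illustrative sketch of the Frobenius map x -> x^p in GF(p^m) with a
# polynomial basis. The j-th iterate is x -> x^(p^j). Toy parameters:
# GF(3^4) with the assumed irreducible polynomial f(t) = t^4 + t + 2.

P, M = 3, 4
F = [2, 1, 0, 0, 1]  # coefficients of f, low degree to high (monic)

def poly_mul_mod(a, b):
    """Multiply two field elements (coefficient lists) modulo f and p."""
    prod = [0] * (len(a) + len(b) - 1)
    for i, ai in enumerate(a):
        for j, bj in enumerate(b):
            prod[i + j] = (prod[i + j] + ai * bj) % P
    # Reduce modulo f(t): replace t^M by minus the lower terms of f.
    for d in range(len(prod) - 1, M - 1, -1):
        c = prod[d]
        if c:
            prod[d] = 0
            for k in range(M):
                prod[d - M + k] = (prod[d - M + k] - c * F[k]) % P
    return prod[:M]

def frobenius(a, j=1):
    """Return a^(p^j), raising to the p-th power j times."""
    for _ in range(j):
        result = [1] + [0] * (M - 1)
        for _ in range(P):               # a^p by p repeated multiplications
            result = poly_mul_mod(result, a)
        a = result
    return a

x = [1, 2, 0, 1]                          # element 1 + 2t + t^3
# The m-th Frobenius iterate is the identity on GF(p^m).
assert frobenius(x, j=M) == x
```

In practice, Frobenius-map methods exploit the linearity of x → x^p over GF(p), precomputing the images of the basis elements t^i so that each iterate costs only a matrix-vector product; the naive exponentiation above is for illustration only.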

Walid Mahmoud
Product Owner Responsibilities in the Project Assurance Process: Bridging Uncertainties Gaps

The product owner plays a key role in software engineering under agile principles. The product owner has many traditional responsibilities, such as initiating a project and defining and accepting functionality and the entire product. At the same time, other responsibilities remain hidden or undefined in many projects, and this is exactly why many projects fail. We therefore review and discuss the product owner's responsibilities and duties across other project elements and stages, in order to show how important this input is to the project and to other team members' activities, primarily project verification and validation. The product owner's contribution to the team's domain knowledge, prototyping, test planning, and the project backlog is crucial and cannot be omitted by companies that want to succeed by meeting actual customer expectations.

D. Kumlander
Micro Design and Value Analysis. The Selection of the Material for Die

Selecting a material for an engineering application, or replacing it with another material that is superior in terms of engineering, economics, and environmental impact, is an important stage in the design process of a product. This paper presents a modern and original method for optimizing the selection of the material needed to obtain a new product, maximizing its performance and minimizing its cost, in line with sustainable development objectives. The work strategy involves setting the functions of the product, building the matrix and the related programs to select the optimal material from the existing database, and applying the value analysis approach to obtain a material for the selected product, thereby optimizing the product's value (its performance/cost ratio). To automate the calculations and ease the design work, the authors have developed calculation software. The literature presents techniques for selecting materials to obtain a new product or to find a new material for a given product, but the complexity of the problems specific to the selection process calls for new research. Based on our experience in the field and on relevant examples of applying the value analysis approach to industrial products, we present a paper that challenges different fields of expertise and contributes to the emerging field of material selection.

Florin Chichernea, Ana Vețeleanu
The Internet of Things in Community Safety and Crime Prevention for South Africa

One of the major tasks of the South African (S.A.) government is to reduce crime levels year on year. The use of information and communications technologies (ICTs) is crucial in facilitating the search for solutions to crime. This paper is about taking advantage of a particular subset of ICTs, the internet of things (IoT), integrated with biometric technologies, in the fight against crime. The paper identifies not only the sectors of the economy that fall under community safety and crime prevention, such as police efficiency and accountability and partnerships between the police and communities, but also a number of IoT and biometric applications that can be of value in these sectors. By drawing on the characteristics of the identified applications, the research produced the architecture of an integrated biometric IoT system for tracking parolees who have violated their bail conditions, as a case study for the S.A. environment. Parolees are tagged with GPS-enabled tracking devices that locate the parolee at any point in time.

Nomusa Dlodlo, Paul Mbecke, Mofolo Mofolo, Martin Mhlanga
Research Trends in Existing Technologies that are Building Blocks to the Internet of Things

The internet of things (IoT) is based on interfacing the digital and physical worlds and making the information generated as a result available via the internet. The challenge that the IoT faces as a new research area is the identification of further research areas and of the way forward. This paper is therefore the culmination of research to identify further research areas in the IoT and the relevance of the IoT to South Africa. The research first reviewed 28 IoT European Union (EU) Framework projects available over the internet. From the reviews, the research extracted the technologies on which the EU is conducting research as building blocks of the IoT. Using an adaptation of the Graham Vickery and Sacha Wunsch-Vincent framework, which analyses developments in ICT research and development, this research interviewed experts working with the identified technologies and produced research trends in the various technologies that are building blocks of the IoT, along with their relevance to South Africa.

Nomusa Dlodlo, Mofolo Mofolo, Lionel Masoane, Stanley Mncwabe, George Sibiya, Lawrence Mboweni
Efficient Partitioning and Allocation of Data for Workload Queries

Our aim is to provide efficient partitioning and replication of data. We seek to accommodate a variety of transaction types (both short- and long-running, read- and write-oriented) to support workloads in cloud environments. We do so by introducing an approach that partitions and allocates small units of data, which we call micropartitions, to multiple database nodes. Only the necessary data is made available to the workload, in the form of micropartitions, and transactions are routed directly to the appropriate micropartitions.

First, we use an agglomerative hierarchical clustering technique to group the workload queries based on their data requirements. We represent each cluster with an abstract query definition: a query statement that represents the minimal data requirements that would satisfy all the queries belonging to that cluster. A micropartition is realized by executing the abstract query.

We show that our abstract query definition is complete and minimal. Intuitively, completeness means that all queries of the corresponding cluster can be correctly answered using the micropartition generated from the abstract query. The minimality property means that no smaller partition of the data can satisfy all of the queries in the cluster.
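The clustering and abstract-query steps can be illustrated with a toy sketch; grouping queries simply by table, and the query strings themselves, are our own simplifications of the approach described above, which uses agglomerative clustering over data requirements:

```python
# Toy sketch: group workload queries by the table they touch and derive,
# per cluster, an "abstract query" selecting the union of the columns its
# member queries need. That column union is minimal (drop any column and
# some member query fails) and complete (every member query is answerable).

from collections import defaultdict

# Each workload query is modeled as (table, columns it reads).
workload = [
    ("orders", {"id", "total"}),
    ("orders", {"id", "customer_id"}),
    ("users",  {"id", "name"}),
]

clusters = defaultdict(set)
for table, cols in workload:
    clusters[table] |= cols        # minimal column set covering the cluster

abstract_queries = {
    table: f"SELECT {', '.join(sorted(cols))} FROM {table}"
    for table, cols in clusters.items()
}
print(abstract_queries["orders"])
# SELECT customer_id, id, total FROM orders
```

Executing each abstract query would then materialize the corresponding micropartition on its database node.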

Our empirical results show that our approach improves data access efficiency over standard partitioning of data.

Annamaria V. Kish, John R. Rose, Csilla Farkas
Toward the Automatic Construction of Strategic Plans Based on Ontologies

In this paper, the authors propose an approach toward the automatic construction of strategic plans based on ontologies. Transnational corporations have units located in different geographical regions, staffed by people of many nationalities. Aligning targets and values with the global vision and mission of the company is an activity that needs constant updating. However, strategists in different places may interpret the objectives differently, so that the alignment drifts from the original; as a consequence, the image and profits of the organization could deteriorate. Automatically verifying the alignment of local goals with those of the global organization is important because it saves time and money, and the whole team can move in the same direction. The approach is based on ontologies, in which the knowledge and information about the plans are structured as classes, concepts, and data. It also comprises a set of data retrieval and manipulation operations that allow the approach to identify patterns across the set of strategic plans. Finally, the evaluation is carried out by performing a series of tests to determine the level of adherence to the proposed approach.

Liliana Ibeth Barbosa-Santillán, José Pablo Nuño-de-la-Parra, Juan Jaime Sánchez-Escobar, Carlos Arturo Vega-Lebrun
E-Learning Environments: Actor Network Theoretic Inspirations into Localized Discovery

Virtual delivery of learning is widespread among educational institutions, many of which deploy learning management systems for this purpose. However, these systems still fail to deliver an experience similar to that of face-to-face learning environments, despite the added benefits they possess compared to face-to-face learning. This article studies conceptual ideas, in particular actor-network theory, that can be used to design virtual learning environments.

Ashoka Jayawardena
Educational Tools: A Review of Interfaces of Mobile-Augmented Reality (mAR) Applications

This paper reviews the types of mobile augmented reality (mAR) interfaces utilized in various applications such as education, advertisement, production, and tourism. The objective is to examine the limitations of the types of mAR used in higher education in terms of the interaction between learning, teaching, and instructional design. Based on the review, it can be concluded that insufficient mAR interfaces are being used for viewing augmented images in classroom learning; for example, only two interfaces were found to be applied in current mAR education applications. The comparative review presented in this paper suggests appropriate mAR interfaces that can be implemented in education and could enhance learning outcomes.

Siti Salmi Jamali, Mohd Fairuz Shiratuddin, Kok Wai Wong
Automatic Extraction of Relationships Among Software Patterns

A software pattern is a powerful tool that perpetuates proven software engineering knowledge and enables its reuse in different situations. Reusing several patterns to elaborate a solution requires awareness of the relationships between patterns (inter-pattern relationships), which indicate what patterns can work together and in what manner. However, those relationships are difficult to discern when they are not explicitly mentioned within the patterns, and their extraction is a hard task. In this context, the present paper exposes our approach to the automatic extraction of inter-pattern relationships, based on a relationship analysis method.

Asma Hachemi, Mohamed Ahmed-Nacer
Smart Email: Almost An Agent Platform

Network organizations today suffer from information overload and strain that raise their operational costs. One reason for this is the dominance of email messaging as the principal means of document exchange between workers. Proactive documents can rationalize these costs and augment email systems with a process view based on collaboration patterns.

Magdalena Godlewska, Bogdan Wiszniewski
Improving Trace Analysis Using Ontologies for Hardware Resourcing

Testing is one of the traditional techniques used to verify the quality of complex systems. Traditionally, black-box testing relies on the degree of controllability and observability of the system under test; a system with increased controllability and observability is easier to test. One common observation point for testing is execution traces: sequences of events representing observations of the system under test, usually stored as plain text files (i.e., logs). The current size and complexity of systems make execution trace analysis a complex and time-consuming task, given the size and format of the information. This paper presents the application of ontological methods to facilitate execution trace analysis, defining an initial Execution Trace Ontology that is used by different ontology query tools. The queries over the ontology allow us to identify errors present in the execution trace associated with different aspects of the case study. Results showed the feasibility of this approach: ontologies helped to provide semantics to the information, and reasoning engines (ontology query engines) facilitated the definition of test goals.

Manuel Corona-Pérez, Gerardo Padilla-Zárate, Liliana Ibeth Barbosa Santillán
An Energy Efficient Self-healing Mechanism for Long Life Wireless Sensor Networks

In this paper, we provide an energy-efficient self-healing mechanism for wireless sensor networks. The proposed solution is based on our probabilistic sentinel scheme. To reduce energy consumption while maintaining good connectivity between sentinel nodes, we build our solution on two main concepts: node adaptation and link adaptation. The first algorithm uses a node adaptation technique to distributively schedule node activities and select a minimum subset of active nodes (sentries) to monitor the region of interest. Second, we introduce a link control algorithm to ensure better connectivity between sentinel nodes while avoiding the appearance of outliers. Performance evaluations show that, without increasing control message overhead, our solution is scalable with steady energy consumption. Simulations also show that the proposed mechanism ensures good connectivity between sentry nodes while considerably reducing the total energy spent.

Dame Diongue, Ousmane Thiare
Novel Steganography over HTML Code

Different security strategies have been developed to protect the transfer of information between users; this has become especially important after the tremendous growth of internet use. Encryption techniques convert readable data into a ciphered form, other techniques hide the message in another file, and some powerful techniques combine hiding and encryption. In this paper, a new security algorithm is presented using steganography over HTML pages. Hiding the information inside HTML code comments and employing encryption reduces the possibility of the hidden data being discovered. The proposed algorithm applies statistical concepts to create a frequency array that determines the occurrence frequency of each character. The encryption step depends on two simple logical operations that change the form of the data to increase the complexity of the hiding process. The last step is to embed the encrypted data as comments inside the HTML page. The new algorithm has many advantages: it is general, applicable to different spoken languages, and can be extended to other web page formats such as XML and ASP.
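The embed-in-a-comment idea can be sketched in a few lines; the XOR keystream and hex encoding below are stand-ins for the paper's frequency-array and two-operation encryption steps, so the whole example is an illustrative assumption:

```python
# Toy sketch of HTML-comment steganography: XOR-encrypt a secret message
# and embed its hex form as an HTML comment. Browsers ignore comments,
# so the page renders unchanged. Not the paper's exact algorithm.

def xor_bytes(data: bytes, key: bytes) -> bytes:
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

def embed(html: str, secret: str, key: bytes) -> str:
    cipher = xor_bytes(secret.encode("utf-8"), key).hex()
    # Insert the comment just before the closing body tag.
    return html.replace("</body>", f"<!-- {cipher} --></body>")

def extract(stego_html: str, key: bytes) -> str:
    start = stego_html.index("<!-- ") + 5
    end = stego_html.index(" -->", start)
    cipher = bytes.fromhex(stego_html[start:end])
    return xor_bytes(cipher, key).decode("utf-8")

page = "<html><body><p>Hello</p></body></html>"
stego = embed(page, "meet at noon", b"k3y")
assert extract(stego, b"k3y") == "meet at noon"
```

Because UTF-8 is used for the payload, the same round trip works for messages in other spoken languages, one of the advantages the abstract highlights.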

Ammar Odeh, Khaled Elleithy, Miad Faezipour, Eman Abdelfattah
Secure Cost Effective M-Learning Through Cloud Computing

Mobile learning (m-learning) has been recognized as an efficient tool with the potential to enhance and support the traditional way of learning. However, due to the increasing number of users, services, education contents, and resources, deploying m-learning becomes problematic. E-learning has many aspects, among them providing study materials, teacher-student interaction, and the timely distribution of information. Providing virtual classrooms with the help of multimedia technology is an advanced e-learning method, and an efficient way to provide an effective e-learning service is to deliver video and audio content online; recent technologies such as screencasting show how effective this can be. The learning information, such as an institution's or individual's video and audio, is an important asset. This means that e-learning video and audio content should be protected at rest, in process, and in motion when an internet-based e-learning system is used; data does not stay in one place on any network, and this is especially true of data in cloud-based m-learning. Cloud computing is a promising technology for overcoming the problems of m-learning, providing reliable, customized, and dynamic computing environments for end users. In this paper we present a new, simple, and cost-effective encryption algorithm (MBBXOR), used as selective encryption for securely storing mobile distance learning content and streaming it in an extended cloud computing environment.

K. Kartheeban, M. Venkatesulu
A Multi-Level Privacy Scheme for Securing Data in a Cloud Environment

Privacy concerns are often cited as one of the key factors impeding large-scale adoption of the cloud computing paradigm by enterprise customers. Existing solutions to the privacy issue in cloud computing, commonly based on encryption mechanisms, often result in performance problems. This paper proposes a multi-level privacy support scheme that addresses the trade-off between the privacy of user data stored in the cloud and system performance. This is achieved by using encryption algorithms of varying strengths to protect different categories of user data, depending on their privacy sensitivity. Simulation results, using the Rijndael AES encryption algorithm as a case study, lend credence to the efficacy of the proposed privacy scheme.
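The core of such a scheme is a policy that maps each data category to a cipher strength. The sketch below illustrates the idea only: the sensitivity categories and key sizes are assumptions, and a trivial XOR transform stands in for real AES at the chosen key size:

```python
# Sketch of multi-level privacy: choose an encryption strength per data
# category by privacy sensitivity, so cheap (or no) protection is spent
# on low-sensitivity data. Categories, key sizes, and the placeholder
# XOR transform are illustrative; a real system would use AES here.

import os

# Assumed mapping from sensitivity level to cipher strength (key bits).
POLICY = {"public": 0, "internal": 128, "confidential": 256}

def protect(record: bytes, sensitivity: str) -> tuple[bytes, bytes]:
    """Return (key, protected_record) using a strength chosen by policy."""
    key_bits = POLICY[sensitivity]
    if key_bits == 0:
        return b"", record                 # public data stored in the clear
    key = os.urandom(key_bits // 8)
    # Placeholder transform; a real system would run AES with this key size.
    protected = bytes(b ^ key[i % len(key)] for i, b in enumerate(record))
    return key, protected

key, blob = protect(b"salary=90000", "confidential")
assert len(key) == 32 and blob != b"salary=90000"
_, public_blob = protect(b"press release", "public")
assert public_blob == b"press release"
```

The performance gain comes from the `public` and `internal` tiers: only the most sensitive data pays the full cost of the strongest cipher.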

Ezekiel K. Olatunji, Matthew O. Adigun, Paul Tarwireyi
Metadata
Title
Innovations and Advances in Computing, Informatics, Systems Sciences, Networking and Engineering
Editors
Tarek Sobh
Khaled Elleithy
Copyright Year
2015
Electronic ISBN
978-3-319-06773-5
Print ISBN
978-3-319-06772-8
DOI
https://doi.org/10.1007/978-3-319-06773-5