
2019 | Book

Advances in Information and Communication Networks

Proceedings of the 2018 Future of Information and Communication Conference (FICC), Vol. 2


About this Book

The book, gathering the proceedings of the Future of Information and Communication Conference (FICC) 2018, is a remarkable collection of chapters covering a wide range of topics in the areas of information and communication technologies and their applications to the real world. It includes 104 papers and posters by pioneering academic researchers, scientists, industrial engineers, and students from all around the world, which contribute to our understanding of current research trends in communication, data science, ambient intelligence, networking, computing, security, and the Internet of Things.

This book collects state-of-the-art chapters on all aspects of information science and communication technologies, from classical to intelligent, and covers both the theory and applications of the latest technologies and methodologies.

Presenting state-of-the-art intelligent methods and techniques for solving real-world problems along with a vision of the future research, this book is an interesting and useful resource.

Table of Contents

Frontmatter
Hybrid Data Mining to Reduce False Positive and False Negative Prediction in Intrusion Detection System

This paper proposes an approach based on data mining and machine learning methods for reducing false positive and false negative predictions in existing Intrusion Detection Systems (IDS). It describes our proposal for building a strong, confidential, intelligent intrusion detection system that can protect data and networks from potential attacks, with recognized movements or infringements reported promptly or gathered centrally. We have addressed different data mining methodologies and presented some recommended approaches which can be combined to enhance the security of the system. The approach will reduce the overhead of administrators, who can be less concerned about alerts because they have already been classified and filtered, leaving fewer false positive and false negative alerts. We have used the KDD-99 IDS dataset for a detailed analysis of the procedures and algorithms that can be implemented.

Bala Palanisamy, Biswajit Panja, Priyanka Meharia
Adblock Usage in Web Advertisement in Poland

Research concerning users blocking advertisements constitutes a new research area, spanning the analysis of collected data on the topic, the determinants of ad blocking, and the IT tools involved. The paper systematizes knowledge about the types of online advertisements and the methods for blocking them with an adblocker, and it identifies the reasons, and the main categories of reasons, why users block advertisements. The research presented in the paper is confronted with the results of an analysis of adblock usage. The obtained results will facilitate further, more thorough research. The considerations included in the paper can serve as a set of recommendations for publishers displaying advertisements on websites, and they can be useful for drawing conclusions and preparing guidelines for projects supporting sustainable development in online advertising.

Artur Strzelecki, Edyta Abramek, Anna Sołtysik-Piorunkiewicz
Open Algorithms for Identity Federation

The identity problem today is a data-sharing problem. The fixed-attributes approach adopted by the consumer identity management industry provides only limited information about an individual and is therefore of limited value to service providers and other participants in the identity ecosystem. This paper proposes the use of the Open Algorithms (OPAL) paradigm to address the increasing need for individuals and organizations to share data in a privacy-preserving manner. Instead of exchanging static or fixed attributes, participants in the ecosystem will be able to obtain better insight through a collective sharing of algorithms, governed through a trust network. Algorithms for specific datasets must be vetted to be privacy-preserving, fair, and free from bias.

Thomas Hardjono, Alex Pentland
A Hybrid Anomaly Detection System for Electronic Control Units Featuring Replicator Neural Networks

Due to steadily increasing connectivity combined with the trend towards autonomous driving, cyber security is essential for future vehicles. The implementation of an intrusion detection system (IDS) can be one building block in a security architecture. Since the electric and electronic (E/E) subsystem of a vehicle is fairly static, the usage of anomaly detection mechanisms within an IDS is promising. This paper introduces a hybrid anomaly detection system for embedded electronic control units (ECU), which combines the advantages of an efficient specification-based system with the advanced detection measures provided by machine learning. The system is presented for - but not limited to - the detection of anomalies in automotive Controller Area Network (CAN) communication. The second part of this paper focuses on the machine learning aspect of the proposed system. The usage of Replicator Neural Networks (RNN) to detect anomalies in the time series of CAN signals is investigated in more detail. After introducing the working principle of RNNs, the application of this algorithm to time series data is presented. Finally, initial evaluation results from a prototype implementation are discussed.
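The detection principle described above can be sketched in miniature: a replicator scores each window of a CAN-signal time series by its reconstruction error, and high-error windows are flagged as anomalous. The sketch below substitutes a trivial moving-average "reconstructor" for the Replicator Neural Network, and the signal, window size, and threshold are illustrative assumptions.

```python
# Sketch of reconstruction-error anomaly detection on a CAN-signal time
# series. A simple mean-based "reconstructor" stands in for the paper's
# Replicator Neural Network; the principle is the same: a window is
# anomalous when its reconstruction error exceeds a threshold.

def reconstruct(window):
    """Trivial reconstructor: predict each sample as the window mean."""
    mean = sum(window) / len(window)
    return [mean] * len(window)

def anomalies(signal, window=5, threshold=4.0):
    """Return start indices of windows whose mean squared error is too high."""
    flagged = []
    for i in range(len(signal) - window + 1):
        w = signal[i:i + window]
        r = reconstruct(w)
        mse = sum((a - b) ** 2 for a, b in zip(w, r)) / window
        if mse > threshold:
            flagged.append(i)
    return flagged

# A roughly periodic signal with an injected spike (e.g. a spoofed value).
signal = [1.0, 1.1, 0.9, 1.0, 1.1] * 4
signal[10] = 9.0  # anomaly
print(anomalies(signal))  # windows overlapping index 10 are flagged
```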

Marc Weber, Felix Pistorius, Eric Sax, Jonas Maas, Bastian Zimmer
Optimizing Noise Level for Perturbing Geo-location Data

With the tremendous increase in the number of smartphones, App stores have been overwhelmed with applications requiring geo-location access in order to provide their users better services through personalization. Revealing a user’s location to these third-party Apps, no matter at what frequency, is a severe privacy breach which can have unpleasant social consequences. In order to prevent inference attacks derived from geo-location data, a number of location obfuscation techniques have been proposed in the literature. However, none of them provides any objective measure of privacy guarantee. Some work has been done to define differential privacy for geo-location data in the form of geo-indistinguishability with an ε-privacy guarantee. These techniques do not utilize any prior background information about the Points of Interest (PoIs) of a user and apply Laplacian noise to perturb all the location coordinates. Intuitively, the utility of such a mechanism can be improved if the noise distribution is derived after considering some prior information about PoIs. In this paper, we apply the standard definition of differential privacy to geo-location data. We use first principles to model various privacy and utility constraints, prior background information available about the PoIs (distribution of PoI locations in a 1D plane) and the granularity of the input required by different types of apps, in order to produce a more accurate and utility-maximizing differentially private algorithm for geo-location data at the OS level. We investigate this for a particular category of Apps and for some specific scenarios. This will also help us to verify whether Laplacian noise is still the optimal perturbation when we have such prior information.
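The baseline the paper starts from, Laplacian noise on a 1D coordinate, can be sketched with the standard Laplace mechanism, where the noise scale is sensitivity / ε. The sensitivity value and the sample coordinate below are illustrative assumptions, not the paper's calibration.

```python
import math
import random

# Minimal sketch of perturbing a 1D location coordinate with Laplace
# noise (the standard Laplace mechanism for differential privacy).

def laplace_noise(scale, rng=random):
    """Draw one sample from Laplace(0, scale) by inverse transform."""
    u = rng.random() - 0.5
    return -scale * math.copysign(1.0, u) * math.log(1 - 2 * abs(u))

def perturb_location(x, epsilon, sensitivity=1.0, rng=random):
    """Release x with epsilon-differential privacy (1D Laplace mechanism)."""
    return x + laplace_noise(sensitivity / epsilon, rng)

rng = random.Random(42)
print(perturb_location(48.8566, epsilon=0.5, rng=rng))  # a noisy latitude
```

Smaller ε means a larger noise scale and hence stronger privacy at the cost of utility, which is exactly the trade-off the paper optimizes.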

Abhinav Palia, Rajat Tandon
Qualitative Analysis for Platform Independent Forensics Process Model (PIFPM) for Smartphones

This paper details how forensic examiners determine the mobile device process and whether the Platform Independent Forensics Process Model for Smartphones (PIFPM) helps them achieve the goal of examining a smartphone. The researcher conducted interviews, presented the PIFPM process to the examiners, and administered surveys to them. Using convenience sampling, the frequency and percent distribution for each examiner is given, as well as the strengths and weaknesses of PIFPM as they relate to the examiner. The hypotheses given by the researcher were either refuted or supported through sampling from the forensic examiners. The goal of this paper is to uncover interesting details that the researcher overlooked when examining a smartphone.

F. Chevonne Thomas Dancer
Privacy Preserving Computation in Home Loans Using the FRESCO Framework

Secure Multiparty Computation (SMC) is a subfield of cryptography that allows multiple parties to jointly compute a function without revealing their inputs to one another. The technology can solve the potential privacy issues that arise when a trusted third party, such as a server, is involved. This paper aims to evaluate implementations of Secure Multiparty Computation and their viability for practical use. The paper also seeks to understand and state the challenges and concepts of Secure Multiparty Computation through the construction of a home loan calculation application. Encryption over Multi-Party Computation (MPC) is done within 2 to 2.5 s. For up to 10 K addition operations the MPC system performs very well, and 10 K additions will be sufficient for most applications.
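The additive secret sharing that underlies SMC frameworks such as FRESCO can be sketched in a few lines: each party splits its private input into random shares modulo a public prime, and sums can be computed share-by-share so no single node ever sees an input. The modulus, party count, and loan figures below are illustrative assumptions.

```python
import random

# Minimal sketch of additive secret sharing over Z_Q, the building block
# behind MPC addition (e.g. summing private income figures in a home
# loan calculation) without revealing the inputs.

Q = 2 ** 31 - 1  # a public modulus

def share(secret, n_parties, rng=random):
    """Split `secret` into n additive shares mod Q."""
    shares = [rng.randrange(Q) for _ in range(n_parties - 1)]
    shares.append((secret - sum(shares)) % Q)
    return shares

def reconstruct(shares):
    return sum(shares) % Q

# Two parties jointly add their private inputs via three compute nodes.
alice, bob = 75_000, 48_000
a_shares, b_shares = share(alice, 3), share(bob, 3)
# Each node adds only the shares it holds; no node sees an input.
sum_shares = [(a + b) % Q for a, b in zip(a_shares, b_shares)]
print(reconstruct(sum_shares))  # 123000
```

Addition is essentially free in this scheme, which is consistent with the paper's observation that tens of thousands of additions are fast; multiplication is where real MPC protocols incur their cost.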

Fook Mun Chan, Quanqing Xu, Hao Jian Seah, Sye Loong Keoh, Zhaohui Tang, Khin Mi Mi Aung
Practically Realisable Anonymisation of Bitcoin Transactions with Improved Efficiency of the Zerocoin Protocol

As transaction records are public and unencrypted, Bitcoin transactions are not anonymous and the privacy of users can be compromised. This paper explores several methods of making Bitcoin transactions anonymous, and Zerocoin, a protocol that anonymises transactions based on Non-Interactive Zero-Knowledge proofs, is identified as a promising option. Although theoretically sound, the Zerocoin research has two shortcomings: (1) Zerocoin transactions are vastly inefficient compared to Bitcoin transactions in terms of verification time and size; and (2) despite this inefficiency, the protocol has not been tested in an actual Bitcoin network to validate its practicality. This paper addresses these two problems by first making performance improvements to the Accumulator Proof of Knowledge (AccPoK) and Serial Number Signature of Knowledge (SNSoK) in the Zerocoin protocol, and then integrating both the original and improved protocols into the Bitcoin client software to evaluate their performance in a Bitcoin network. Results show that the improved Zerocoin protocol reduces the verification time and size of the SNSoK by 80 and 60 times, respectively, and reduces the size of the AccPoK by 25%. These translate to a 3.41 to 6.45 times reduction in transaction latency and a 2.5 times reduction in block latency in the Bitcoin network. Thus, with the improved Zerocoin protocol, anonymising Bitcoin transactions has become more practical.

Jestine Paul, Quanqing Xu, Shao Fei, Bharadwaj Veeravalli, Khin Mi Mi Aung
Walsh Sampling with Incomplete Noisy Signals

With the advent of massive data outputs at a regular rate, signal processing technology plays an increasingly key role. Nowadays, signals are not merely restricted to physical sources; they have been extended to digital sources as well. Under the general assumption of discrete statistical signal sources, we propose a practical problem of sampling incomplete noisy signals about which we have no a priori knowledge and for which the sampling size is bounded. We approach this sampling problem via Shannon’s channel coding theorem. Our main results demonstrate that it is the large Walsh coefficient(s) that characterize(s) discrete statistical signals, regardless of the signal source. Through the connection to Shannon’s theorem, we establish the necessary and sufficient condition for our generic sampling problem for the first time. Our generic sampling results find practical and powerful applications not only in statistical cryptanalysis but also in software system performance optimization.
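The Walsh coefficients the abstract refers to are computed by the Walsh-Hadamard transform; a minimal sketch of the fast, in-place variant is below. The example signal is illustrative: its single large coefficient reveals the signal's dominant structure, the effect the paper exploits.

```python
# Minimal sketch of the (unnormalized) fast Walsh-Hadamard transform,
# used to locate the large Walsh coefficients that characterize a
# discrete statistical signal. Input length must be a power of two.

def walsh_hadamard(signal):
    """Return the Walsh-Hadamard transform of `signal`, O(n log n)."""
    a = list(signal)
    h = 1
    while h < len(a):
        for i in range(0, len(a), h * 2):
            for j in range(i, i + h):
                # butterfly: sum and difference of paired entries
                a[j], a[j + h] = a[j] + a[j + h], a[j] - a[j + h]
        h *= 2
    return a

# A strictly alternating +/-1 "signal": all its energy concentrates in
# one Walsh coefficient, regardless of how the signal was produced.
signal = [1, -1, 1, -1, 1, -1, 1, -1]
print(walsh_hadamard(signal))  # [0, 8, 0, 0, 0, 0, 0, 0]
```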

Yi Janet Lu
Investigating the Effective Use of Machine Learning Algorithms in Network Intruder Detection Systems

Research into the use of machine learning techniques for network intrusion detection, especially with respect to the popular public dataset KDD cup 99, has become commonplace during the past decade. The recent popularity of cloud-based computing and the realization of the associated risks are the main reasons for this research thrust. The proposed research demonstrates that machine learning algorithms can be effectively used to enhance the performance of existing intrusion detection systems, despite the high misclassification rates reported in the literature. This paper reports on an empirical investigation to determine the underlying causes of the poor performance of some well-known machine learning classifiers, especially when learning from minor classes/attacks. The main factor is that the KDD cup 99 dataset, which is popularly used in most existing research, is an imbalanced dataset due to the nature of the intrusion detection domain, i.e. some attacks being rare and some being very frequent. Therefore, there is a significant imbalance amongst the classes in the dataset. Depending on the number of classes considered, the imbalanced-dataset issue can be treated as a binary problem or a multi-class problem. Most researchers focus on binary classification, as multi-class classification is complex. In the research proposed in this paper, we consider the problem as a multi-class classification task. The paper investigates the use of different machine learning algorithms in order to overcome the common misclassification problems faced by researchers who used the imbalanced KDD cup 99 dataset in their investigations. Recommendations are made as to which classifier is best for the classification of imbalanced data.

Intisar S. Al-Mandhari, L. Guan, E. A. Edirisinghe
Anonymization of System Logs for Preserving Privacy and Reducing Storage

System logs constitute valuable information for the analysis and diagnosis of system behavior. The analysis is highly time-consuming for large log volumes. For many parallel computing centers, outsourcing the analysis of system logs (syslogs) to third parties is the only option. Therefore, a general analysis and diagnosis solution is needed. Such a solution is possible only through syslog analysis from multiple computing systems. The data within syslogs can be sensitive, thus obstructing the sharing of syslogs across institutions, third-party entities, or in the public domain. This work proposes a new method for the anonymization of syslogs that employs de-identification and encoding to provide fully shareable system logs. In addition to eliminating the sensitive data within the test logs, the proposed anonymization method provides a 25% performance improvement in post-processing of the anonymized syslogs, and more than an 80% reduction in their required storage space.
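The de-identification-and-encoding idea can be sketched with keyed hashing: sensitive fields are replaced by short, stable pseudonyms, so logs remain correlatable across lines but reveal no identities. The field patterns (usernames, IPv4 addresses) and the secret key below are illustrative assumptions, not the paper's actual scheme.

```python
import hashlib
import re

# Minimal sketch of syslog de-identification: sensitive tokens are
# replaced by short keyed hashes so the logs stay analyzable (the same
# user/IP always maps to the same pseudonym) yet shareable.

KEY = b"site-secret"  # kept by the data owner, never shared

def pseudonym(value):
    """Stable 8-hex-digit pseudonym for a sensitive token."""
    return hashlib.sha256(KEY + value.encode()).hexdigest()[:8]

IP_RE = re.compile(r"\b\d{1,3}(?:\.\d{1,3}){3}\b")
USER_RE = re.compile(r"(user=)(\w+)")

def anonymize(line):
    line = IP_RE.sub(lambda m: "ip-" + pseudonym(m.group(0)), line)
    line = USER_RE.sub(lambda m: m.group(1) + "u-" + pseudonym(m.group(2)), line)
    return line

log = "sshd: failed login user=alice from 10.1.2.3"
print(anonymize(log))
```

Replacing long identifiers with fixed-width pseudonyms also shrinks the logs, which is in the spirit of the storage reduction the paper reports.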

Siavash Ghiasvand, Florina M. Ciorba
Detecting Target-Area Link-Flooding DDoS Attacks Using Traffic Analysis and Supervised Learning

A novel class of extreme link-flooding DDoS (Distributed Denial of Service) attacks is designed to cut off entire geographical areas such as cities and even countries from the Internet by simultaneously targeting a selected set of network links. The Crossfire attack is a target-area link-flooding attack, which is orchestrated in three complex phases. The attack uses a massively distributed large-scale botnet to generate low-rate benign traffic aiming to congest selected network links, so-called target links. The adoption of benign traffic, while simultaneously targeting multiple network links, makes detecting the Crossfire attack a serious challenge. In this paper, we present analytical and emulated results showing hitherto unidentified vulnerabilities in the execution of the attack, such as a correlation between coordination of the botnet traffic and the quality of the attack, and a correlation between the attack distribution and detectability of the attack. Additionally, we identified a warm-up period due to the bot synchronization. For attack detection, we report results of using two supervised machine learning approaches: Support Vector Machine (SVM) and Random Forest (RF) for classification of network traffic into normal and abnormal traffic, i.e., attack traffic. These machine learning models have been trained in various scenarios using the link volume as the main feature set.

Mostafa Rezazad, Matthias R. Brust, Mohammad Akbari, Pascal Bouvry, Ngai-Man Cheung
Intrusion Detection System Based on a Deterministic Finite Automaton for Smart Grid Systems

The smart grid is a target of many types of network attacks, such as Denial of Service (DoS) and Man-in-the-Middle (MITM), that could compromise users’ privacy and the network’s integrity and availability. For this reason, we developed a network-based intrusion detection system (IDS) that relies on a deterministic finite automaton (DFA) to recognize an attack language representing some of the attacks that can target the smart grid network. The attack language is defined using elements, or symbols, that help identify each type of attack. Results of our simulations show the efficiency of our IDS and its ability to detect dangerous cyber attacks.
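A DFA over attack symbols can be sketched directly as a transition table: events are mapped to alphabet symbols, and reaching an accepting state signals an attack. The alphabet, transitions, and SYN-flood-like pattern below are illustrative assumptions, not the paper's attack language.

```python
# Minimal sketch of a DFA accepting an "attack language": symbols
# abstract network events, and an accepting state signals an attack.
# Here the (illustrative) language is "two or more SYNs with no ACK
# in between", a crude SYN-flood-like pattern.

# States: 0 = start, 1 = one unanswered SYN, 2 = attack detected.
TRANSITIONS = {
    (0, "syn"): 1, (0, "ack"): 0,
    (1, "syn"): 2, (1, "ack"): 0,
    (2, "syn"): 2, (2, "ack"): 2,   # absorbing attack state
}
ACCEPTING = {2}

def detects_attack(events, start=0):
    state = start
    for e in events:
        state = TRANSITIONS[(state, e)]
    return state in ACCEPTING

print(detects_attack(["syn", "ack", "syn", "syn"]))  # True
print(detects_attack(["syn", "ack", "syn", "ack"]))  # False
```

Because the DFA processes one symbol per event in constant time, such a recognizer is cheap enough to run inline on network traffic.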

Nadia Boumkheld, Mohammed El Koutbi
The Detection of Fraud Activities on the Stock Market Through Forward Analysis Methodology of Financial Discussion Boards

Financial discussion boards (FDBs), or financial forums, on the Internet allow investors and traders to interact with each other in the form of posted comments and to exchange financial knowledge. Unfortunately, not all posted content on FDBs is truthful. While there are genuine investors and traders on FDBs, deceivers make use of such publicly accessible share-price-based FDBs to carry out financial crimes by tricking novice investors into buying fraudulently promoted stocks. Generally, Internet forums rely on default spam filtering tools such as Akismet. However, Akismet does not moderate the meaning of posted content. Such moderation relies on continuous manual work by human moderators, which is expensive and time-consuming. Furthermore, no relevant authorities are actively monitoring and handling potential financial crimes on FDBs, due to the lack of moderation tools. This paper introduces a novel methodology, forward analysis, employed in an Information Extraction (IE) system named FDBs Miner (FDBM). This methodology aims to highlight potentially irregular activities on FDBs by taking both comments and share prices into account. The IE prototype system first extracts the public comments and per-minute share prices from FDBs for selected companies listed on the London Stock Exchange (LSE). Then, in the forward analysis process, the comments are flagged using a predefined Pump and Dump financial-crime-related keyword template. By flagging the comments against the keyword template alone, results indicate that 9.82% of the comments are potentially irregular. Realistically, it is difficult to continuously read and moderate the massive number of comments posted daily on FDBs. Incorporating share price movements can help categorize the flagged comments into different price-hike thresholds. This allows investigators to prioritize the flagged comments by risk level, which can possibly reveal real Pump and Dump crimes on FDBs.
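The two-stage flagging described above, keyword matching followed by bucketing on the subsequent price hike, can be sketched as follows. The keyword set and the percentage thresholds are illustrative assumptions, not the paper's actual template.

```python
# Minimal sketch of forward-analysis flagging: match comments against a
# Pump-and-Dump keyword template, then bucket flagged comments by the
# share-price rise that follows them, so investigators can prioritize.

KEYWORDS = {"rocket", "guaranteed", "buy now", "going to explode"}

def flag(comment):
    """True if the comment matches the keyword template."""
    text = comment.lower()
    return any(k in text for k in KEYWORDS)

def price_bucket(pct_rise):
    """Categorize a flagged comment by the subsequent price hike."""
    if pct_rise >= 10:
        return "high risk"
    if pct_rise >= 5:
        return "medium risk"
    return "low risk"

comments = [
    ("This stock is a rocket, buy now!!!", 12.5),
    ("Solid earnings report this quarter.", 1.2),
    ("Gains are guaranteed, trust me.", 6.0),
]
flagged = [(c, price_bucket(r)) for c, r in comments if flag(c)]
for c, bucket in flagged:
    print(bucket, "-", c)
```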

Pei Shyuan Lee, Majdi Owda, Keeley Crockett
Security Enhancement of Internet of Things Using Service Level Agreements and Lightweight Security

In the era of the Internet, people can interconnect and obtain information via the network. Through the combined use of the Internet and various wireless sensing networks, objects can now communicate with each other through the Internet environment. Furthermore, as IT evolves and IPv6 technology matures, data transmission can be carried out between smart objects using sensing networks, networking, and computing functions. This emerging network environment is known as the Internet of Things (IoT). However, related IoT research has not addressed data insecurity issues. This study establishes security-level agreements and a lightweight security mechanism to ameliorate excessive computational loads, so that data can be protected in the perception layer while reducing the computational cost of encrypting data there.

Shu-Ching Wang, Ya-Jung Lin, Kuo-Qin Yan, Ching-Wei Chen
Enhancing the Usability of Android Application Permission Model

Visualization is a promising tool for the study and understanding of text. Effective visualization presents data in a way that requires minimal training for the user and is easy to understand. With user privacy and concerns about the Android application permission model in mind, the textual representation of permissions is transformed into a visualization and its effect is examined in depth. The results show that the purpose of the visualization has been achieved: with visualization, users read, understand, acknowledge, and are more aware of the permissions being accessed by an application.

Zeeshan Haider Malik, Habiba Farzand, Zahra Shafiq
MUT-APR: MUTation-Based Automated Program Repair Research Tool

Automated program repair (APR) techniques fix faults through the insertion of new code or the modification of existing code until the program under test passes a given set of test cases, called repair tests. MUTation-based Automated Program Repair (MUT-APR) is a prototype tool built on top of GenProg to fix binary operator faults in C programs. MUT-APR is a configurable mutation-based APR tool that varies the APR mechanisms and components to find the best combination for a problem under study. The implementation of the MUT-APR components is discussed in this paper, as well as how the tool can be used to repair faults. Limitations of the existing framework are also discussed.
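The core loop, generate variants by mutating binary operators and keep the first one that passes the repair tests, can be sketched in miniature. MUT-APR itself targets C programs; this sketch mutates Python via the `ast` module instead, and the buggy function and its repair tests are illustrative.

```python
import ast

# Minimal sketch of mutation-based repair: swap each binary operator,
# run the repair tests on every variant, keep the first that passes.

BUGGY = "def area(w, h):\n    return w - h\n"  # fault: should be w * h

OPERATORS = [ast.Add(), ast.Sub(), ast.Mult(), ast.Div()]

def variants(src):
    """Yield source variants with the binary operator(s) swapped."""
    for op in OPERATORS:
        tree = ast.parse(src)
        for node in ast.walk(tree):
            if isinstance(node, ast.BinOp):
                node.op = op
        yield ast.unparse(tree)

def passes_repair_tests(src):
    """The repair tests: the variant must satisfy all of them."""
    ns = {}
    exec(src, ns)
    return ns["area"](3, 4) == 12 and ns["area"](5, 5) == 25

repaired = next(v for v in variants(BUGGY) if passes_repair_tests(v))
print(repaired)
```

The quality of the repair tests decides the quality of the fix: a variant that merely satisfies weak tests may still be wrong, which is one of the framework limitations the paper discusses.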

Fatmah Y. Assiri, James M. Bieman
GPU_MF_SGD: A Novel GPU-Based Stochastic Gradient Descent Method for Matrix Factorization

Recommender systems are used in many of today’s applications. Providing real-time suggestions with high accuracy is one of the most crucial challenges they face. Matrix factorization (MF) is an effective technique for recommender systems, as it improves accuracy. Stochastic Gradient Descent (SGD) is the most popular approach for computing MF. SGD is a sequential algorithm that is not trivial to parallelize, especially for large-scale problems. Recently, many studies have proposed methods for parallelizing SGD. In this research, we propose GPU_MF_SGD, a novel GPU-based method for large-scale recommender systems. GPU_MF_SGD utilizes Graphics Processing Unit (GPU) resources by ensuring load balancing and linear scalability, and by achieving coalesced access to global memory without a preprocessing phase. Our method demonstrates a 3.1X–5.4X speedup over the state-of-the-art GPU method, CuMF_SGD.
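The sequential algorithm that GPU_MF_SGD parallelizes can be sketched as follows: for each observed rating r(u, i), the user and item factor vectors move along the gradient of the regularized squared error. The toy rating matrix, factor count, and hyperparameters are illustrative assumptions.

```python
import random

# Minimal sketch of sequential SGD for matrix factorization: learn
# P (user factors) and Q (item factors) so that P[u].Q[i] approximates
# the observed rating r for each (u, i, r) triple.

def mf_sgd(ratings, n_users, n_items, k=4, lr=0.05, reg=0.02, epochs=1000):
    rng = random.Random(0)
    P = [[rng.uniform(-0.1, 0.1) for _ in range(k)] for _ in range(n_users)]
    Q = [[rng.uniform(-0.1, 0.1) for _ in range(k)] for _ in range(n_items)]
    for _ in range(epochs):
        for u, i, r in ratings:
            pred = sum(P[u][f] * Q[i][f] for f in range(k))
            e = r - pred  # prediction error for this rating
            for f in range(k):
                pu, qi = P[u][f], Q[i][f]
                P[u][f] += lr * (e * qi - reg * pu)
                Q[i][f] += lr * (e * pu - reg * qi)
    return P, Q

ratings = [(0, 0, 5.0), (0, 1, 3.0), (1, 0, 4.0), (1, 2, 1.0), (2, 1, 2.0)]
P, Q = mf_sgd(ratings, n_users=3, n_items=3)
pred = sum(P[0][f] * Q[0][f] for f in range(4))
print(round(pred, 2))  # should approach the observed rating 5.0
```

The update for one rating touches only P[u] and Q[i], which is why concurrent updates to different (u, i) pairs can proceed in parallel; the hard part, the paper's focus, is scheduling them on a GPU without conflicts or load imbalance.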

Mohamed A. Nassar, Layla A. A. El-Sayed, Yousry Taha
Effective Local Reconstruction Codes Based on Regeneration for Large-Scale Storage Systems

We introduce Regenerating-Local Reconstruction Codes (R-LRC) and describe their encoding and decoding techniques in this paper. We then investigate their repair bandwidths for different failure patterns. We also explore an alternative form of R-LRC, which gives R-LRC lower repair bandwidth. Since R-LRC is an extended version of Pyramid codes, optimization of the repair bandwidth of a single failure also applies to R-LRC. Compared with Pyramid codes, Regenerating-Local Reconstruction Codes have two benefits: (1) on average, they use around 2.833 blocks to repair 2 failures, while Pyramid codes use about 3.667 blocks; hence, they have lower IOs than Pyramid codes; and (2) when 2 failures occur at a common block group and a special block group, they require only around M/2, compared with M for Pyramid codes when k ≥ 2. In addition, we present an efficient interference alignment mechanism in R-LRC, which performs algebraic alignment so that useless and unwanted dimensions are decreased. Therefore, network bandwidth consumption is reduced.
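The "local reconstruction" idea behind such codes can be sketched with XOR parities: data blocks are split into groups, each with its own local parity, so a single failure is repaired from its small group rather than from all blocks. The (4 data + 2 local parity) layout below is an illustrative toy, not the paper's R-LRC construction.

```python
# Minimal sketch of local reconstruction with XOR parities: repairing a
# single failed block reads only its group-mates, reducing repair IO.

def xor_blocks(blocks):
    """XOR a list of equal-length byte blocks together."""
    out = bytes(len(blocks[0]))
    for b in blocks:
        out = bytes(x ^ y for x, y in zip(out, b))
    return out

data = [b"AAAA", b"BBBB", b"CCCC", b"DDDD"]
groups = [data[0:2], data[2:4]]          # two local groups
local_parity = [xor_blocks(g) for g in groups]

# Block 1 fails; repair reads only its group (1 data block + 1 parity)
# instead of all four data blocks.
repaired = xor_blocks([data[0], local_parity[0]])
print(repaired == data[1])  # True
```

This locality is exactly what drives the block-count comparison in the abstract: fewer blocks read per repair means lower IO and lower network bandwidth.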

Quanqing Xu, Hong Wai Ng, Weiya Xi, Chao Jin
Blockchain-Based Distributed Compliance in Multinational Corporations’ Cross-Border Intercompany Transactions
A New Model for Distributed Compliance Across Subsidiaries in Different Jurisdictions

Multinational Corporations (MNCs) face increasing compliance and audit challenges in their complex, crisscrossing intercompany transactions, which are subject to varying regulations across the jurisdictions of their global networks. In this paper, we investigate these challenges and apply Blockchain-based distributed ledger technology to address them. We propose a Blockchain-based compliance model and then build a Blockchain-based distributed compliance solution to enable automatic, distributed compliance enforcement for an MNC’s cross-border intercompany transactions across all jurisdictions in the MNC’s network. This solution can help an MNC reduce compliance and audit risk and enhance its capability to respond to audits efficiently, even after many years, thus improving its trust relationships and reputation with auditors.

Wenbin Zhang, Yuan Yuan, Yanyan Hu, Karthik Nandakumar, Anuj Chopra, Sam Sim, Angelo De Caro
HIVE-EC: Erasure Code Functionality in HIVE Through Archiving

Most research conducted in the area of cloud storage using Erasure Codes concentrates on finding optimal solutions for lower storage capacity or lower bandwidth consumption. In this paper, our goal is to provide Erasure Code functionality directly from the application layer. For this purpose, we reviewed several application-layer languages, namely Hive, Pig, and Oozie, and opted to add EC support in Hive. We developed several Hive commands that allow Hive tables to be archived and then encoded or decoded, together with operations with different parameters such as join and union. We tested our implementation using the MovieLens dataset both locally and on the cloud, and compared its performance against a replicated system.

Aatish Chiniah, Mungur Utam Avinash Einstein
Self and Regulated Governance Simulation
Exploring Governance for Blockchain Technology

Blockchain technology and blockchain applications sit at the crossroads of data science and Internet of Things applications, where getting the governance right for this new technological paradigm is a core concern for leaders aspiring to realize smart city and smart living initiatives. In this research, we deploy computational simulations of self and regulated governance and extend the findings to new blockchain technology ecosystems. We propose that getting the governance approach right is as important as resolving the technological platform issues.

Hock Chuan Lim

Open Access

Emergency Departments
A Systematic Mapping Review

Emergency services are essential, and any person may require them at some point in their lives. Emergency services are run by complex management and consist of many different parts. It is essential to establish effective procedures to ensure that patients are treated in a timely fashion. By obtaining real-time information, it is expected that intelligent decisions can be made. Hence, thorough analysis of problems concerning appropriate, operationally effective management would help prevent patient dissatisfaction in the future. Mapping studies are utilized to configure and explore a research theme, whereas systematic reviews are utilized to combine evidence. The use of improvement strategies and quality measurements in the health care industry, specifically in emergency departments, is essential to assess patients’ level of satisfaction and the quality of the service provided based on patients’ experience. This paper explores the methodologies utilized by researchers from 2010 onward, with an emphasis on patient satisfaction in the emergency services sector.

Salman Alharethi, Abdullah Gani, Mohd Khalit Othman
Benchmarking the Object Storage Services for Amazon and Azure

Cloud computing is increasingly being used as a new computing model that provides users rapid, on-demand access to computing resources with reduced cost and minimal management overhead. Data storage is one of the most popular cloud services and has attracted great attention in the research field. In this paper, we focus on the object storage of the Microsoft Azure and Amazon cloud computing providers. The paper reviews the object storage performance of both Microsoft Azure Blob Storage and the Amazon Simple Storage Service. Security and cost models for both cloud providers are discussed as well.

Wedad Ahmed, Hassan Hajjdiab, Farid Ibrahim
An Improvement of the Standard Hough Transform Method Based on Geometric Shapes

The Hough Transform is a well-known image processing method for straight-line recognition, and it is very popular for detecting complex forms such as circles, ellipses, and arbitrary shapes in digital images. In this paper, we are interested in the variant that associates a point with a sine curve, named the Standard Hough Transform, applied to large sets of contiguous points such as triangles, rectangles, octagons, and hexagons, in order to overcome the time problem due to the small size of a pixel and to establish optimization techniques that reduce the time complexity of the Hough Transform, with the main purpose of recognizing thick analytical straight lines subject to certain parameters. The proposed methods, named the Triangular Hough Transform and the Rectangular Hough Transform, consider an image as a grid represented as a triangular or a rectangular tiling, respectively, and contribute accumulator data that reduce computation time while accepting limited noise in straight-line detection. The analysis also covers geometric shapes such as octagons and hexagons, where tiling the image space is necessary to obtain new Hough Transform methods based on these forms.
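The Standard Hough Transform the paper builds on can be sketched as a voting procedure: each point (x, y) votes for the sinusoid ρ = x·cos θ + y·sin θ, and a peak in the accumulator reveals a line. The accumulator resolution and the sample points below are illustrative choices.

```python
import math

# Minimal sketch of the Standard Hough Transform voting step: each
# point votes along its sinusoid in (rho, theta) space; collinear
# points pile their votes into one accumulator cell.

def hough_lines(points, n_theta=180, rho_step=1.0, max_rho=200.0):
    n_rho = int(2 * max_rho / rho_step)
    acc = [[0] * n_theta for _ in range(n_rho)]
    for x, y in points:
        for t in range(n_theta):
            theta = math.pi * t / n_theta
            rho = x * math.cos(theta) + y * math.sin(theta)
            r = int((rho + max_rho) / rho_step)  # shift rho to a bin index
            if 0 <= r < n_rho:
                acc[r][t] += 1
    return acc

# Twenty points on the vertical line x = 10 (so theta = 0, rho = 10).
points = [(10, y) for y in range(20)]
acc = hough_lines(points)
votes, r, t = max((v, r, t) for r, row in enumerate(acc)
                  for t, v in enumerate(row))
print(votes, r - 200, t)  # strong peak: 20 votes at rho = 10, theta index 0
```

The cost is O(points x n_theta), which is precisely the complexity the paper's tiling-based variants aim to reduce by voting per tile instead of per pixel.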

Abdoulaye Sere, Frédéric T. Ouedraogo, Boureima Zerbo
Recognition of Fingerprint Biometric System Access Control for Car Memory Settings Through Artificial Neural Networks

Recognition and authentication are important factors to implement in every computerized system, and they play a particularly significant role in electronic banking and luxury cars. A PIN code or key can be lost or stolen by an impostor; therefore, human characteristics are the best recognition points for authenticating a user. An Artificial Neural Network (ANN) is a computational network that mimics the working of the human brain and its neurons by adopting human features. In this research, we propose an algorithm for training a fingerprint biometric system that implements Artificial Neural Networks to recognize the finger features of a human. The method includes detecting the minutiae values of ridge termination and bifurcation points. The multilayer feed-forward network with the error back-propagation algorithm is a successful network for pattern recognition through supervised learning, and it is used in many recognition and control applications. This architecture is applied to finger minutiae extraction to recognize a car user and adjust features through memory settings. The network gives 99% correct classification for recognition of the user.

Abdul Rafay, Yumnah Hasan, Adnan Iqbal
Haar Cascade Classifier and Lucas–Kanade Optical Flow Based Realtime Object Tracker with Custom Masking Technique

Computer vision has proven to be a remarkable field of modern computer science, and its applications are used on a daily basis. In this paper, we propose a new model for tracking real-time objects (e.g., cars, human faces, interior objects, arms) in a video feed by training with Haar features and applying Lucas-Kanade Optical Flow together with a Custom Masking technique. Object tracking is very useful in augmented reality, security, virtual reality, simulation-based training, etc. In this research, we trained a classifier for a specific object for detection purposes. Upon successful training of the classifier, video footage is passed into it. First, the classifier detects a region of interest (ROI) containing the object(s). Second, with the necessary preprocessing techniques, we detect the contour area enclosing each object. Finding the contour within the detected object(s) enables us to fill the enclosed area with green; our Custom Masking technique keeps only the green contours and subtracts everything else from the frames of the scene. Finally, we track each object by trailing the green contours of the detected object(s) with Lucas-Kanade Optical Flow. Our system is able to detect and track different object(s) in a video feed at up to 30 frames per second (FPS).
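The Lucas-Kanade step can be illustrated with a minimal single-window implementation on synthetic frames; the window, the quadratic test pattern and the known shift are assumptions for the sketch, not settings from the paper:

```python
import numpy as np

def lucas_kanade_translation(I1, I2):
    """Single-window Lucas-Kanade: solve the 2x2 normal equations
    [sum Ix^2, sum IxIy; sum IxIy, sum Iy^2] (u, v) = -(sum IxIt, sum IyIt)."""
    I1 = I1.astype(float)
    Ix = np.gradient(I1, axis=1)          # spatial gradients of frame 1
    Iy = np.gradient(I1, axis=0)
    It = I2.astype(float) - I1            # temporal difference
    A = np.array([[np.sum(Ix * Ix), np.sum(Ix * Iy)],
                  [np.sum(Ix * Iy), np.sum(Iy * Iy)]])
    b = -np.array([np.sum(Ix * It), np.sum(Iy * It)])
    return np.linalg.solve(A, b)

# synthetic frames: an intensity bowl shifted by a known (0.3, 0.2)
y, x = np.mgrid[0:16, 0:16].astype(float)
I1 = x**2 + y**2
I2 = (x - 0.3)**2 + (y - 0.2)**2
u, v = lucas_kanade_translation(I1, I2)
```

The recovered (u, v) approximates the true shift; in the full tracker this is applied per feature point inside the green contour region, typically with image pyramids for larger motions.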

Karishma Mohiuddin, Mirza Mohtashim Alam, Amit Kishor Das, Md. Tahsir Ahmed Munna, Shaikh Muhammad Allayear, Md. Haider Ali
Modified Adaptive Neuro-Fuzzy Inference System Trained by Scoutless Artificial Bee Colony

Neuro-fuzzy systems have produced high accuracy in modeling numerous real-world applications. However, their built-in computational complexity and the curse of dimensionality often preclude implementations in applications with large input sizes. This is also true of the adaptive neuro-fuzzy inference system (ANFIS), as most applications in the literature have small input sizes. In this paper, the five-layer ANFIS architecture is modified to reduce computational cost. For effective parameter training, the popular swarm-based metaheuristic Artificial Bee Colony (ABC) algorithm is employed after being modified for enhanced convergence. The proposed ABC variant, which eliminates scout bees and is hence called ABC-Scoutless, outperforms standard ABC and particle swarm optimization (PSO) on benchmark test functions. The modified ANFIS trained by ABC-Scoutless performs as well as standard ANFIS on benchmark classification problems with different input ranges, but at lower computational cost due to the reduced number of trainable parameters.
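A scoutless ABC retains only the employed- and onlooker-bee phases of standard ABC, dropping the scout phase that randomly re-initializes abandoned food sources. The sketch below shows that structure on the sphere benchmark; the population size, iteration count and bounds are illustrative assumptions, not the paper's settings:

```python
import random

def sphere(x):
    return sum(v * v for v in x)

def abc_scoutless(f, dim, n_food=10, iters=200, lo=-5.0, hi=5.0, seed=1):
    """ABC without the scout phase: only employed and onlooker searches."""
    rng = random.Random(seed)
    foods = [[rng.uniform(lo, hi) for _ in range(dim)] for _ in range(n_food)]
    fit = [f(x) for x in foods]

    def neighbour_search(i):
        k = rng.randrange(n_food)
        while k == i:
            k = rng.randrange(n_food)
        j = rng.randrange(dim)
        cand = foods[i][:]
        cand[j] += rng.uniform(-1, 1) * (foods[i][j] - foods[k][j])
        cand[j] = min(hi, max(lo, cand[j]))
        fc = f(cand)
        if fc < fit[i]:                      # greedy selection
            foods[i], fit[i] = cand, fc

    for _ in range(iters):
        for i in range(n_food):              # employed bees: one per source
            neighbour_search(i)
        total = sum(1.0 / (1.0 + v) for v in fit)
        for _ in range(n_food):              # onlookers: roulette by fitness
            r, acc, i = rng.uniform(0, total), 0.0, 0
            for idx, v in enumerate(fit):
                acc += 1.0 / (1.0 + v)
                if acc >= r:
                    i = idx
                    break
            neighbour_search(i)
    best = min(range(n_food), key=lambda i: fit[i])
    return foods[best], fit[best]

best_x, best_f = abc_scoutless(sphere, dim=2)
```

In the paper this search would update the trainable ANFIS parameters rather than a raw coordinate vector.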

Mohd Najib Mohd Salleh, Norlida Hassan, Kashif Hussain, Noreen Talpur, Shi Cheng
Conditional Image Synthesis Using Stacked Auxiliary Classifier Generative Adversarial Networks

Synthesizing photo-realistic images has been a long-standing challenge in image processing and could provide crucial approaches for dataset augmentation and balancing. Traditional methods have trouble dealing with the rich and complicated structural information of objects resulting from variations in color, pose, texture and illumination. Recent advances in deep learning present a new perspective on this task. The aim of our paper is to apply state-of-the-art generative models to synthesize diverse and realistic high-resolution images. Extensive experiments have been conducted on the CelebA dataset, a large-scale face attributes dataset with more than 200 thousand celebrity images, each with 40 attribute labels. Inspired by existing structures, we present stacked Auxiliary Classifier Generative Adversarial Networks (Stack-ACGAN) for image synthesis given conditioning labels, which generates low-resolution images (e.g. $64\times64$) that sketch basic shapes and colors in Stage-I and high-resolution images (e.g. $256\times256$) with plausible details in Stage-II. Inception scores and Multi-Scale Structural Similarity (MS-SSIM) are computed to evaluate the synthesized images. Both quantitative and qualitative analyses show that the proposed model is capable of generating diverse and realistic images.

Zhongwei Yao, Hao Dong, Fangde Liu, Yike Guo
Intelligent Time Series Forecasting Through Neighbourhood Search Heuristics

Automated forecasting is essential to business operations that handle scores of univariate time series. Practitioners must deal with thousands of time series whose periodicity ranges from seconds to months. The sheer velocity and volume of these series make it impractical to identify the order of each time series manually, so an automated forecasting algorithm or framework is essential to complete the task. Such an approach must be robust in identifying the order of a time series and readily applicable to scores of series without manual intervention. Most modern automated forecasting algorithms are derived from exponential smoothing or ARIMA models. In this paper, the authors propose a new heuristic for identifying the initial starting point of a neighbourhood search that then obtains the most appropriate model. The results of this method are compared against the methods proposed in the literature.
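A neighbourhood search over model orders of the kind described above can be sketched as a hill climb on (p, q): from a heuristic starting point, move to the best-scoring adjacent order until no neighbour improves. The scoring function below is a synthetic stand-in for an ARIMA information criterion (fitting real models is out of scope for this sketch), and the starting point and bounds are assumptions:

```python
def neighbourhood_search(score, start, max_order=5):
    """Hill-climb over (p, q) orders, moving to the best-scoring neighbour."""
    current = start
    best = score(*current)
    while True:
        p, q = current
        neighbours = [(p + dp, q + dq)
                      for dp in (-1, 0, 1) for dq in (-1, 0, 1)
                      if (dp, dq) != (0, 0)
                      and 0 <= p + dp <= max_order
                      and 0 <= q + dq <= max_order]
        cand = min(neighbours, key=lambda pq: score(*pq))
        if score(*cand) >= best:             # local minimum reached
            return current, best
        current, best = cand, score(*cand)

def toy_aic(p, q):
    """Stand-in for an information criterion, minimized at order (2, 1)."""
    return (p - 2) ** 2 + (q - 1) ** 2 + 0.01 * (p + q)

order, aic = neighbourhood_search(toy_aic, start=(0, 0))
```

The paper's contribution concerns choosing a good `start`; a better starting point means fewer candidate models need to be fitted before the search converges.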

Murphy Choy, Ma Nang Laik
SmartHealth Simulation Representing a Hybrid Architecture Over Cloud Integrated with IoT
A Modular Approach

Every field is evolving in new directions through technological advancement. In the case of healthcare, the traditional system is being moved to the cloud and integrated with the Internet of Things (IoT), encompassing smart devices, wearable body sensors and mobile networks. Such a healthcare community cloud would be the beginning of context-aware services provided to patients at home or at the place of a medical incident. This context-aware platform, built over cloud- and IoT-integrated infrastructure, would save both the cost and the time of reaching a hospital while ensuring the availability of services from qualified staff. In this paper, the researchers simulate a healthcare community cloud that is aware of patients' context based on their current medical condition. The simulation model is compared with a previously simulated SelfServ Platform, combined with a societal information system, that used NetLogo.

Sarah Shafqat, Almas Abbasi, Tehmina Amjad, Hafiz Farooq Ahmad
SDWM: Software Defined Wi-Fi Mobility for Smart IoT Carriers

More and more companies are advocating additional unlicensed spectrum for next-generation Wi-Fi to cover the vast increase in Internet-connected devices: an additional 50 billion devices are projected to be connected by 2020. Wi-Fi is regarded as the “Oxygen for Innovation” that will incorporate new bundles of services in the smart IoT era thanks to its low-cost service delivery. This paper introduces a new mobility framework, called SDWM, based on Software Defined Networking (SDN), to extend residential/enterprise indoor real-time services across standard carriers and service providers with Network Functions Virtualization (NFV) in smart cities. Efficient data forwarding and traffic offload techniques are adopted to avoid core network congestion. Indoor services are extended over any type of infrastructure without requiring small cell setup. Mobility is achieved through an SDN overlay network that dynamically establishes a virtual path to a roaming mobile node's (MN) home network, using a new unique identifier forwarded during the DHCP IP allocation process. The distributed architecture simplifies integration with existing infrastructures, offering unified access to both wireless and wired networks. A physical prototype illustrates how mobile nodes can roam freely across the wireless hotspots of carriers holding direct agreements with their home networks, with seamless access to indoor services and without violating the security policies of the involved entities. Experimental results show clear improvements over existing mobility protocols and wireless controllers by eliminating tunnel overheads and VLAN/MPLS headers.

Walaa F. Elsadek, Mikhail N. Mikhail
Smart Eco-Friendly Traffic Light for Mauritius

Nowadays, going out anywhere, especially in urban areas, is becoming more and more of a headache: leaving for work or returning from it during peak hours and being stuck in traffic for a long time is simply frustrating. Traffic lights do not always help during peak hours, because traditional traffic lights do not account for differences in vehicle density across lanes. Police officers are therefore tasked with controlling traffic at junctions that would otherwise be chaotic. While controlling traffic, however, these officers are exposed to harmful gas emissions, which can be disastrous for their health, and commuters stuck in traffic are exposed to the same noxious emissions. The problem, then, is that traditional traffic lights neither react dynamically to changes in traffic density over time nor take into account the amount of pollutants to which drivers are exposed. To address these two problems, this paper proposes a smart traffic light that considers both the vehicle density in a lane and the level of vehicle emissions within it: if the emission level within a lane becomes harmful to health, the light turns green; otherwise it remains red until a threshold number of vehicles is reached in that lane. Using the Internet of Things, the smart traffic light works with magnetic and gas sensors to detect the number of vehicles and the emission levels on each lane, respectively. Each lane has a set of these sensors connected to an Arduino, and all Arduinos are connected to a central Raspberry Pi. The Raspberry Pi, being connected to the Internet, does all the processing via Node-RED, a graphical interface for Node.js. All data captured by the sensors are sent to the IBM Bluemix Cloud for analysis. With this system, a more fluid and dynamic traffic flow that takes vehicle emissions into account at the traffic light is envisaged.
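The two-rule lane decision described in the abstract (emissions override the vehicle-count threshold) can be sketched as a small decision function; the threshold values are illustrative assumptions, not the system's calibrated settings:

```python
def lane_signal(vehicle_count, gas_ppm, vehicle_threshold=10, gas_limit=400):
    """Decide one lane's light.

    Rule 1: if the measured emission level is harmful, release the lane.
    Rule 2: otherwise go green only once enough vehicles have accumulated.
    """
    if gas_ppm >= gas_limit:
        return 'green'
    return 'green' if vehicle_count >= vehicle_threshold else 'red'
```

In the deployed system these inputs would come from the gas and magnetic sensors on each lane's Arduino, with the Raspberry Pi evaluating the rule per lane.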

Avinash Mungur, Abdel Sa’d Bin Anwar Bheekarree, Muhammad Bilaal Abdel Hassan
Logistics Exceptions Monitoring for Anti-counterfeiting in RFID-Enabled Supply Chains

In recent years, radio frequency identification (RFID) technology has been used as a promising tool for anti-counterfeiting in RFID-enabled supply chains, owing to its ability to track and trace contactlessly identified objects through the unique electronic product code (EPC). While such a system improves performance in many ways, the uncertainties of daily operations can bring about one or more logistics exceptions, which may in turn trigger other dependent exceptions. These exceptions could be organized and exploited by adversaries to fool the system for counterfeiting, yet related reports are unfortunately very few. In this paper, we present our results on the inter-dependencies between logistics exceptions and on detection intelligence from a resource-centric view. A cause-effect relational diagram is first developed to express the relations among logistics exceptions explicitly, incorporating a taxonomy of exceptions and resource-based theory; an improved intelligent exception monitoring system is then designed to achieve autonomous, flexible, collaborative and reliable logistics services. Finally, a case study of two typical logistics exceptions indicates that our cause-effect diagram outperforms extant approaches in the understanding of grouped logistics exceptions, enabling the designed monitoring system to perform well for anti-counterfeiting.

Xiaoming Yao, Xiaoyi Zhou, Jixin Ma
Precision Dairy Edge, Albeit Analytics Driven: A Framework to Incorporate Prognostics and Auto Correction Capabilities for Dairy IoT Sensors

The Oxford English Dictionary defines a prognostic as “an advance indication of a future event, an omen”. Traditionally, prognostication has been confined to fortune tellers and has been largely subjective or intuition driven. Data science, on the other hand, enables us to model and predict the health condition of a system and/or its components based on current and historical system-generated data. The chief goal of prognostics is the precise estimation of the Remaining Useful Life (RUL) of equipment or devices. Through our research and the industrial field deployment of our dairy IoT sensors, we conclude that prognostics is a vital marker in the lifecycle of a device: it can serve as the inflection point that triggers auto-corrective, edge-analytics-driven behavior in dairy IoT sensors, so that the desired ship settings can be maintained with precision. Auto-corrective capability plays a pivotal role in satisfying dairy farmers and in reducing manufacturers' costs of maintaining the dairy sensors, as these sensors are deployed in geographically distant regions with intermittent or no network connectivity. In this paper, we propose an inventive, small-footprint machine learning (ML) dairy edge that incorporates supervised and unsupervised models to detect prognostic conditions and thereby infuse auto-corrective behavior, improving the precision of the dairy edge. The paper presents the industrial dairy sensor design and deployment as well as its data collection and field experimental results.

Santosh Kedari, Jaya Shankar Vuppalapati, Anitha Ilapakurti, Chandrasekar Vuppalapati, Sharat Kedari, Rajasekar Vuppalapati
An IoT System for Smart Building

With the notion of the Smart City, we have entered a hyper-connected world that puts the city, the challenges of massive urbanization and digital technology at the center of reflection. At the same time, several continents (Europe, Africa, North America, Asia) are experiencing a multi-faceted terrorist threat (attacks, cyberattacks) that uses infrastructure, city gathering places or the Internet as a platform for expression. The Internet of tomorrow is the Internet of Things: one where objects come alive, are attentive, and interact with each other, but also and especially with us, our families, our friends, our colleagues, our daily counterparts … one where they serve individuals, societies and the planet, to better accompany us. It is a network where billions of connected objects sense, understand and act to anticipate our needs, but above all to respond to our choices. The smart city is a promise of which security is an essential link, especially since it involves the citizen. We offer a platform dedicated to security in a building that uses IoT: a multi-purpose application that combines maximum security with rapid decision-making.

Khoumeri El-Hadi, Cheggou Rabea, Farhah Kamila, Rezzouk Hanane
Aiding Autobiographical Memory by Using Wearable Devices

In this paper, we investigate the effectiveness of two distinct techniques, the Special Moment Approach and the Spatial Frequency Approach, for reviewing lifelogs collected simultaneously over two days with a wearable camera and a bracelet. The special moment approach extracts episodic events; the spatial frequency approach associates visual information with temporal and location information, using a heat map as the spatial representation of frequency awareness. The participants were then asked to fill in two post-study questionnaires evaluating the effectiveness of the two techniques and their combination. The preliminary results show the positive potential of exploring individual lifelogs using our approaches.

Jingyi Wang, Jiro Tanaka
Intelligent Communication Between IoT Devices on Edges in Retail Sector

In this paper, we design a custom communication model for IoT devices at the edge to induce intelligence in the retail sector. Wireless technologies such as RFID (Radio Frequency Identification), NFC (Near Field Communication), BLE (Bluetooth Low Energy) and LPWAN (Low-Power Wide-Area Network) are delegated to perform various tasks within the available services. We introduce a Smart Shop (SmSH) architecture that integrates context-aware services, platform independence, edge computing and proximity sensing to establish seamless connectivity for customer-entrusted tasks. The architecture encompasses the M2M communication devices and gateways, generally connected to a cloud environment, needed to establish meaningful communication in commercial IoT applications. It has been proposed primarily for the retail sector but can easily be extended to other domains by altering the required functionalities, device types and services.

M. Saravanan, N. C. Srinidhi Srivatsan
A Survey on SDN Based Security in Internet of Things

The Internet of Things (IoT) is an emerging technology in which tens of billions of devices, from small wearable fitness bands, medical devices and smart appliances to factory automobiles, are connected to the Internet, making life easier with little or no human intervention. Though IoT has proven transformative, as its market grows it becomes a major challenge to secure such a large number of devices connected by a complex heterogeneous network with a variety of access protocols. Software-defined networking (SDN) decouples the control plane from the data plane, enabling fast reaction to security threats and enforcement of security policies; IoT security can thus be achieved by integrating SDN with IoT. SDN is an intelligent network paradigm that can open up new ways to secure IoT and support different access control mechanisms. This survey analyzes SDN-based IoT security mechanisms for securing communications in IoT and presents open research issues.

Renuga Kanagavelu, Khin Mi Mi Aung
Cerebral Blood Flow Monitoring Using IoT Enabled Cloud Computing for mHealth Applications

This paper presents a novel application of cloud computing, enabled by the Internet of Things (IoT), for monitoring the parameters affecting cerebral blood flow (CBF), the movement of blood through the network of cerebral arteries and veins supplying the brain. The example design implemented in this proposal can easily be adapted to similar applications, generalizing the concept offered by our work, and enables healthcare professionals to have ubiquitous access to processed medical data (in this case cerebral blood flow) of patients during treatment. With cloud computing, accessing data from multiple locations has become easier, and Big Data analytic frameworks enabled by IoT and cloud computing present new opportunities to extract knowledge and create novel applications in the healthcare domain. This paper shows one novel method to improve the quality of healthcare data processing in the cloud. In our scheme, cerebral circulation data is captured using sensors connected to a Raspberry Pi and then pushed to the cloud, stored in a database, processed, and analyzed. Results are then retrieved and distributed to medical professionals via an Android mobile application designed to keep track of cerebral circulation and process the data obtained through the sensors. Anomalies that can lead to serious health issues, such as oxygen imbalance, internal bleeding, swelling due to water accumulation, and disturbances in blood flow, can be detected. We use the Amazon Web Services (AWS) cloud platform for cloud services. Our approach is inspired by the Amazon Simple Beer Service (SBS) [10], a cloud-connected kegerator that sends sensor data (beer flow; in our case, cerebral circulation data) to AWS [5]. SBS publishes sensor data collected by an IoT-enabled device (Raspberry Pi) to an AWS application program interface (API) gateway over Hypertext Transfer Protocol Secure (HTTPS). To the best of our knowledge, this is the first scheme offered to replace the manual process of monitoring CBF using biomedical electronic devices.

Beulah Preethi Vallur, Krishna Murthy Kattiyan Ramamoorthy, Shahnam Mirzaei, Shahram Mirzai
When Siri Knows How You Feel: Study of Machine Learning in Automatic Sentiment Recognition from Human Speech

Opinions and sentiments are essential to human activities and have a wide variety of applications. As many decision makers turn to social media because of the large volume of opinion data available, efficient and accurate sentiment analysis is necessary to extract those data. Text sentiment analysis has therefore become a popular field that attracts many researchers; extracting sentiments from audio speech, however, remains a challenge. This project explored the possibility of applying supervised machine learning to recognize sentiments in English utterances at the sentence level, and also examined the effect of combining acoustic and linguistic features on classification accuracy. Six audio tracks were randomly selected as training data from 40 YouTube monologue videos with a strong presence of sentiment, in which speakers expressed opinions about products, films, or political events. These sentiments were manually labelled as negative or positive based on the independent judgment of three experimenters. A wide range of acoustic and linguistic features were then analyzed and extracted using sound editing and text mining tools, respectively. A novel approach was proposed that uses a simplified sentiment score to integrate linguistic features and estimate sentiment valence; this approach improved negation analysis and hence increased overall accuracy. Results showed that accuracy improved significantly when both linguistic and acoustic features were used, and that excellent prediction was achieved by training four classifiers, namely kNN, SVM, Neural Network, and Naïve Bayes. Possible sources of error and the inherent challenges of audio sentiment analysis are discussed to provide potential directions for future research.
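A simplified sentiment score with negation handling, in the spirit of the abstract's linguistic feature, can be sketched as follows; the lexicon, the single-word negation scope and the scoring are illustrative assumptions, not the project's actual scheme:

```python
POSITIVE = {'good', 'great', 'excellent', 'love'}
NEGATIVE = {'bad', 'terrible', 'hate', 'poor'}
NEGATORS = {'not', 'never', "don't", 'no'}

def sentiment_score(sentence):
    """Sum word polarities, flipping the sign of the next polar word
    whenever a negator precedes it."""
    score, negate = 0, False
    for word in sentence.lower().split():
        if word in NEGATORS:
            negate = True
            continue
        polarity = 1 if word in POSITIVE else -1 if word in NEGATIVE else 0
        score += -polarity if negate else polarity
        if polarity != 0:
            negate = False   # a negator scopes only over the next polar word
    return score
```

A positive total estimates positive valence; such a score would be one linguistic feature alongside the acoustic features fed to the classifiers.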

L. Zhang, E. Y. K. Ng
A Generic Multi-modal Dynamic Gesture Recognition System Using Machine Learning

Human computer interaction facilitates intelligent communication between humans and computers, in which gesture recognition plays a prominent role. This paper proposes a machine learning system that identifies dynamic gestures using tri-axial acceleration data from two public datasets, uWave and Sony, acquired with accelerometers embedded in Wii remotes and smartwatches, respectively. A dynamic gesture signed by the user is characterized by a generic set of features extracted across the time and frequency domains. The system was analyzed from an end-user perspective and modelled to operate in three modes, which determine the subsets of data used for training and testing. From an initial set of seven classifiers, three were chosen to evaluate each dataset across all modes, demonstrating the system's mode-neutrality and dataset-independence. The proposed system can classify gestures performed at varying speeds with minimal preprocessing, making it computationally efficient. Moreover, the system was found to run on a low-cost embedded platform, the Raspberry Pi Zero (USD 5), making it economically viable.
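Generic time- and frequency-domain features per acceleration axis, of the kind the abstract refers to, can be sketched as follows; the specific statistics (mean, variance, non-DC spectral energy via a naive DFT) are illustrative assumptions, not the paper's exact feature set:

```python
import math

def axis_features(samples):
    """Time-domain (mean, variance) and frequency-domain (spectral energy)
    features for one acceleration axis."""
    n = len(samples)
    mean = sum(samples) / n
    var = sum((s - mean) ** 2 for s in samples) / n
    energy = 0.0
    for k in range(1, n // 2):               # naive DFT, non-DC bins only
        re = sum(s * math.cos(2 * math.pi * k * i / n)
                 for i, s in enumerate(samples))
        im = sum(s * math.sin(2 * math.pi * k * i / n)
                 for i, s in enumerate(samples))
        energy += (re * re + im * im) / n
    return [mean, var, energy]

def feature_vector(ax, ay, az):
    """Concatenate per-axis features into one gesture descriptor."""
    return axis_features(ax) + axis_features(ay) + axis_features(az)
```

Because the features are statistics rather than raw samples, gestures performed at varying speeds map to similar descriptors with minimal preprocessing.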

G. Gautham Krishna, Karthik Subramanian Nathan, B. Yogesh Kumar, Ankith A. Prabhu, Ajay Kannan, Vineeth Vijayaraghavan
A Prediction Survival Model Based on Support Vector Machine and Extreme Learning Machine for Colorectal Cancer

Colorectal cancer is the third most common cause of cancer deaths in men and the second most common in women worldwide. In this paper, a prediction model based on a Support Vector Machine (SVM) and an Extreme Learning Machine (ELM), combined with feature selection, is developed to estimate colorectal-cancer-specific survival five years after diagnosis. Experiments were conducted on a dataset of colorectal cancer patients publicly available from the Surveillance, Epidemiology, and End Results (SEER) program. The performance measures used to evaluate the proposed methods are classification accuracy, F-score, sensitivity, specificity, positive and negative predictive values, and receiver operating characteristic (ROC) curves. The results show very good classification accuracy for 5-year survival prediction for the SVM and ELM models with an 80%/20% partition of the data and 16 features, which is very promising compared with the results of existing learning models.
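The ELM half of the model admits a compact sketch: a random, untrained hidden layer followed by a least-squares solve for the output weights. The toy binary problem below stands in for the SEER features (the hidden-layer size and data are assumptions for illustration):

```python
import numpy as np

def train_elm(X, y, hidden=20, seed=0):
    """Extreme Learning Machine: random hidden layer, then output weights
    by Moore-Penrose pseudoinverse (no iterative training)."""
    rng = np.random.default_rng(seed)
    W = rng.normal(size=(X.shape[1], hidden))   # fixed random input weights
    b = rng.normal(size=hidden)                 # fixed random biases
    H = np.tanh(X @ W + b)                      # hidden-layer activations
    beta = np.linalg.pinv(H) @ y                # least-squares output weights
    return W, b, beta

def predict_elm(X, W, b, beta):
    return np.tanh(X @ W + b) @ beta

# toy binary survival stand-in: label 1 when x0 + x1 > 1
rng = np.random.default_rng(42)
X = rng.uniform(size=(200, 2))
y = (X[:, 0] + X[:, 1] > 1).astype(float)
W, b, beta = train_elm(X, y)
acc = np.mean((predict_elm(X, W, b, beta) > 0.5) == (y > 0.5))
```

Because only `beta` is learned, and in closed form, ELM training is far cheaper than iterative backpropagation, which suits repeated evaluation under feature selection.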

Preeti, Rajni Bala, Ram Pal Singh
Sentiment Classification of Customer’s Reviews About Automobiles in Roman Urdu

Text mining is a broad field with sentiment mining as an important constituent, in which we try to deduce people's attitudes towards a specific item, merchandise, politics, sports, social media comments, review sites, etc. Among the many issues in sentiment mining, analysis and classification, one major issue is that reviews and comments can be in different languages, such as English, Arabic and Urdu, and handling each language according to its own rules is a difficult task. A great deal of research on sentiment analysis and classification has been done for English, but only limited work has been carried out on regional languages such as Arabic, Urdu and Hindi. In this paper, the Waikato Environment for Knowledge Analysis (WEKA) is used as a platform to execute different classification models for Roman Urdu text. The reviews dataset was scraped from different automobile sites; the extracted Roman Urdu reviews, comprising 1000 positive and 1000 negative reviews, were saved as labeled examples in the WEKA attribute-relation file format (ARFF). Training was done on 80% of the data, and the rest was used for testing with the different models; the results were analyzed in each case. The results show that Multinomial Naïve Bayes outperformed the Bagging, Deep Neural Network, Decision Tree, Random Forest, AdaBoost, k-NN and SVM classifiers in terms of accuracy, precision, recall and F-measure.

Moin Khan, Kamran Malik
Hand Gesture Authentication Using Depth Camera

Nowadays, people are increasingly concerned about their privacy, as traditional text passwords have become too weak to defend against various attacks. Meanwhile, somatosensory devices have become popular, making gesture authentication possible. This research uses dynamic human hand gestures to build an authentication system that imposes few constraints and feels natural. In this paper, we describe a depth-camera-based dynamic hand gesture authentication method and a template updating mechanism for the system. For simple gestures, the average accuracy is 91.38%; for complicated gestures, the average accuracy is 95.21%, with a 1.65% false acceptance rate. We have also evaluated the system with the template updating mechanism enabled.
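Gesture authentication systems of this kind typically compare a probe trace against an enrolled template with an elastic distance; dynamic time warping (DTW) is a common choice and is sketched below on 1-D traces. This is a generic illustration under that assumption, not necessarily the matcher used in the paper:

```python
def dtw_distance(a, b):
    """Dynamic time warping distance between two 1-D gesture traces."""
    INF = float('inf')
    n, m = len(a), len(b)
    D = [[INF] * (m + 1) for _ in range(n + 1)]
    D[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(a[i - 1] - b[j - 1])
            D[i][j] = cost + min(D[i - 1][j],      # insertion
                                 D[i][j - 1],      # deletion
                                 D[i - 1][j - 1])  # match
    return D[n][m]

def authenticate(probe, template, threshold=1.0):
    """Accept the probe when it warps onto the template cheaply enough."""
    return dtw_distance(probe, template) <= threshold

enrolled = [0.0, 0.5, 1.0, 0.5, 0.0]
genuine  = [0.0, 0.4, 1.0, 0.6, 0.0]   # same shape, slightly off
forgery  = [1.0, 1.0, 1.0, 1.0, 1.0]
```

A template updating mechanism like the paper's would periodically replace or blend `enrolled` with accepted genuine probes so the template tracks gradual drift in the user's gesture.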

Jinghao Zhao, Jiro Tanaka
FSL-BM: Fuzzy Supervised Learning with Binary Meta-Feature for Classification

This paper introduces a novel real-time Fuzzy Supervised Learning with Binary Meta-Feature (FSL-BM) method for big data classification. The study of real-time algorithms addresses several major concerns, namely accuracy, memory consumption, the ability to relax assumptions, and time complexity. Attaining a fast computational model that provides both fuzzy logic and supervised learning is one of the main challenges in machine learning. In this paper, we present the FSL-BM algorithm as an efficient solution for supervised learning with fuzzy logic processing, using a binary meta-feature representation together with Hamming distance and hash functions to relax assumptions. While many studies of the last decade focused on reducing time complexity and increasing accuracy, the novel contribution of the proposed solution is the integration of Hamming distance, hash functions, binary meta-features and binary classification into a real-time supervised method. The hash table (HT) component gives fast access to existing indices and therefore generates new indices in constant time, superseding existing fuzzy supervised algorithms with better or comparable results. In summary, the main contribution of this technique is to represent hypotheses through binary input as a meta-feature space and to create a fuzzy supervised hash table to train and validate the model.
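The core data structure described above, a hash table keyed by binary meta-feature codes with a Hamming-distance fallback for inexact hits, can be sketched as follows; the code width, labels and `max_dist` tolerance are illustrative assumptions, and the full FSL-BM algorithm adds fuzzy membership handling beyond this sketch:

```python
def hamming(a, b):
    """Hamming distance between two binary codes stored as ints."""
    return bin(a ^ b).count('1')

class BinaryMetaIndex:
    """Hash-table index over binary meta-feature vectors."""

    def __init__(self):
        self.table = {}

    def train(self, code, label):
        self.table.setdefault(code, []).append(label)

    def classify(self, code, max_dist=2):
        if code in self.table:                    # O(1) exact hash hit
            labels = self.table[code]
            return max(set(labels), key=labels.count)
        # fuzzy fallback: nearest stored code within the Hamming tolerance
        best = min(self.table, key=lambda c: hamming(c, code))
        if hamming(best, code) <= max_dist:
            labels = self.table[best]
            return max(set(labels), key=labels.count)
        return None

idx = BinaryMetaIndex()
idx.train(0b101100, 'pos')
idx.train(0b010011, 'neg')
```

The exact-hit path is what gives the constant-time access the abstract emphasizes; the Hamming fallback shown here is linear in the number of stored codes.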

Kamran Kowsari, Nima Bari, Roman Vichr, Farhad A. Goodarzi
A Novel Method for Stress Measuring Using EEG Signals

Stress is one of the major contributing factors leading to various diseases, including cardiovascular disease; stress monitoring is therefore essential for clinical intervention and disease prevention. In the present study, the feasibility of exploiting electroencephalography (EEG) signals to monitor stress during mental arithmetic tasks is investigated. This paper presents a novel hardware and software system that determines stress level with the help of the theta sub-band of EEG signals. The proposed system processes the EEG signal and recognizes peaks of the theta sub-band above a certain threshold value, using first-order difference information to identify each peak. This EEG-based method of stress detection can be used as a quick, noninvasive, portable and handheld tool for determining a person's stress level.
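The first-order-difference peak detection described above can be sketched as follows; the sign-change criterion and the peak-count-to-level mapping are illustrative assumptions, not the system's calibrated thresholds:

```python
def detect_theta_peaks(signal, threshold):
    """Flag samples where the first-order difference changes sign
    (rising then falling) and the amplitude exceeds the threshold."""
    peaks = []
    diff = [signal[i + 1] - signal[i] for i in range(len(signal) - 1)]
    for i in range(1, len(diff)):
        if diff[i - 1] > 0 and diff[i] <= 0 and signal[i] > threshold:
            peaks.append(i)
    return peaks

def stress_level(signal, threshold, low=2, high=5):
    """Map the count of supra-threshold theta peaks to a coarse level."""
    n = len(detect_theta_peaks(signal, threshold))
    return 'low' if n < low else 'medium' if n <= high else 'high'

# toy trace standing in for a theta sub-band segment
wave = [0, 3, 1, 4, 1, 5, 1, 0]
```

In the real system the input would be the band-pass filtered theta component of the recorded EEG rather than a raw trace.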

Vinayak Bairagi, Sanket Kulkarni
Fog-Based CDN Architecture Using ICN Approach for Efficient Large-Scale Content Distribution

Along with the continuing evolution of the Internet and its applications, the Content Delivery Network (CDN) has become a hot topic with both opportunities and challenges. CDNs were mainly proposed to solve content availability and download time issues by delivering content through edge cache servers deployed around the world. CDN technology helps optimize traffic on the origin servers, but with the increasing demand for Internet content, the edge servers themselves become overloaded. In this article, a new perspective on edge content delivery is described to overcome this bottleneck and optimize overall delivery performance. Fog computing has emerged recently with salient capabilities that motivated us to bring it into the CDN model. We propose introducing Fog nodes at the edge of the CDN to provide another level of content delivery based on content names rather than locations, much like Information-Centric Networking (ICN). ICN is a new networking architecture, much better adapted to current Internet usage, where users care only about the content or service they want and not about the machine that hosts it. This approach not only solves the bottleneck problem but also extends the CDN further toward the network edge, with the Fog node acting as a local content delivery point offering highly reliable and scalable content access. This novel solution is expected to extend the reach, scale and functionality of content delivery networks.

Fatimah Alghamdi, Ahmed Barnawi, Saoucene Mahfoudh
Connectivity Patterns for Supporting BPM in Healthcare

Health information technology (HIT) frequently leads to unintended consequences (UICs) post implementation. We believe a key cause of UICs is the variety of HIT-mediated connections between people and processes; to better manage UICs, we first need to understand the nature of these connections. Business Process Management (BPM) approaches can help support HIT design, but to date there are no methods focused on identifying patterns of HIT connectivity. This poster describes our three-stage method to identify and model HIT connectivity patterns and then map them to existing BPM workflow patterns. We apply the method to a case study of a perioperative information system to provide preliminary examples of individual and collaborative connectivity patterns.

Amos Harris, Craig Kuziemsky
A Systematic Review of Adaptive and Responsive Design Approaches for World Wide Web

The World Wide Web (WWW) is used today on many different devices. These devices communicate with each other, creating a design problem of layout and data mismatch. Consequently, there is a strong need to visualize web data with proper customization suiting different types of users. The purpose of this Systematic Literature Review (SLR) is to investigate the two widely used web design approaches, i.e. Adaptive Web Design (AWD) and Responsive Web Design (RWD). In particular, a systematic literature review has been used to identify 58 research works published during 2009–2017. We identify 23 research works regarding AWD, 14 related to RWD and 21 pertaining to both AWD and RWD. Moreover, 4 significant tools and 13 leading techniques have been identified in the context of AWD and RWD implementation. Finally, significant aspects of the two web development approaches (AWD and RWD) are compared with traditional web design. It is concluded that traditional web design is not sufficient to fulfill the needs of the ever-growing number of web users around the globe. Therefore, the combination of both AWD and RWD is essential to keep pace with technological advancements in the WWW.

Nazish Yousaf, Wasi Haider Butt, Farooque Azam, Muhammad Waseem Anwar
Quantum Adiabatic Evolution with a Special Kind of Interpolating Paths

We study a special kind of interpolating path in quantum adiabatic search algorithms. With this special kind of adiabatic path, we find, notably, that even when the parameter n within it tends to infinity, the adiabatic evolution fails if the initial state is orthogonal to the final target state. Superficially, it seems that a minimum gap occurring at the end of the computation would not affect the validity of the algorithm, since each ground state of the problem Hamiltonian encodes a solution to the problem. When the initial state has a nonzero overlap with the final state, and again the parameter n within the special interpolating paths tends to infinity, one may get the counterintuitive impression that the adiabatic evolution could be considerably faster than the usual simple models of adiabatic evolution, possibly even with constant time complexity. In fact, however, as in the usual case, a quadratic speedup is the performance limit that this kind of interpolating function can provide for the adiabatic evolution. We also expose other easily made mistakes that may lead to wrong conclusions about the validity of adiabatic search algorithms.

Jie Sun, Songfeng Lu, Chao Gao
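For orientation, interpolating paths of the kind studied above fit the standard adiabatic form (our schematic notation; the specific n-parameterized family used in the chapter is not reproduced here):

$$H(s) = \bigl(1 - f(s)\bigr) H_0 + f(s) H_P, \qquad f(0) = 0,\quad f(1) = 1,$$

where $H_0$ is the initial Hamiltonian whose ground state is the easily prepared starting state, $H_P$ is the problem Hamiltonian whose ground states encode the solutions, and the schedule $f(s)$, for instance a power-law family such as $f(s) = s^n$, determines where along the evolution the spectral gap closes and hence how long the evolution must take.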
Extraction, Segmentation and Recognition of Vehicle’s License Plate Numbers

In this paper, an automatic vehicle license plate recognition method for Western Australia license plates is proposed. The method consists of three stages, namely (1) plate extraction, (2) character segmentation and (3) character recognition. The primary techniques employed in these stages are edge detection, connected component analysis and template matching. An image set of 100 vehicles was generated and used to evaluate the algorithm. Experimental tests show success rates of 97%, 97% and 98% in Stages 1, 2 and 3, respectively. The respective average times taken in each stage were 234 ms, 37 ms and 29 ms.

Douglas Chai, Yangfan Zuo
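The template-matching stage of such a pipeline can be sketched simply. This is our own toy illustration (not the authors' code, and with made-up 3×3 "templates"): each segmented character image is compared against stored templates and labelled by the highest normalized cross-correlation.

```python
import numpy as np

# Toy sketch of character recognition by template matching: score each
# template against the character image with normalized cross-correlation
# (NCC) and return the best-matching label.
def ncc(a, b):
    a = a - a.mean()
    b = b - b.mean()
    denom = np.linalg.norm(a) * np.linalg.norm(b)
    return float((a * b).sum() / denom) if denom else 0.0

def recognize(char_img, templates):
    # templates: dict mapping label -> binary template of the same shape
    return max(templates, key=lambda label: ncc(char_img, templates[label]))

templates = {
    "I": np.array([[0, 1, 0], [0, 1, 0], [0, 1, 0]], float),
    "O": np.array([[1, 1, 1], [1, 0, 1], [1, 1, 1]], float),
}
assert recognize(templates["I"], templates) == "I"
assert recognize(templates["O"], templates) == "O"
```

In practice the segmented characters would first be binarized and resized to the template dimensions before scoring.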
A Comparison of Canny Edge Detection Implementations with Hadoop

Edge detection plays a large role in digital image processing; it makes objects within an image easier to identify from a human perspective and also opens the door for automated object detection via machine learning. One of the common edge detection algorithms, developed by John F. Canny in 1986, is a multi-stage algorithm that uses gradients within the image to find potential edges. Hadoop is a Java library built on the MapReduce framework, designed for distributed processing in big data projects. We adapt the Canny edge detection algorithm to run on Hadoop using two methods, a streaming Python implementation and a Java implementation, to compare their run times and to determine whether using Hadoop for such a problem is preferable to the classic sequential implementation. In Sect. 1, we introduce edge detection, explaining what it is and why it is important. In Sect. 2, we explore Canny's algorithm. In Sect. 3, we explain our methodology for parallelizing the sequential code and the issues that arise in the process. In Sect. 4, we present the results of our implementations. In Sect. 5, we conclude with a possible explanation of why our code does not perform as well as anticipated: due to memory limitations of our specific Hadoop cluster, the implementations do not perform well for typical images, but they do allow the processing of very large images. We also explore possibilities for future work.

Josiah Smalley, Suely Oliveira
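The gradient computation at the heart of Canny's algorithm can be illustrated briefly. This is our own minimal sketch, not the chapter's implementation: Sobel filters estimate horizontal and vertical gradients, and their magnitude marks potential edges. In a Hadoop streaming setup, logic like this could run inside a Python mapper over image tiles.

```python
import numpy as np

# Sobel-based gradient magnitude: the first stage of Canny edge detection
# after smoothing. Peaks in the magnitude map are candidate edge pixels.
KX = np.array([[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]], float)  # horizontal Sobel
KY = KX.T                                                    # vertical Sobel

def conv2_valid(img, k):
    # naive 'valid' 2-D convolution with a 3x3 kernel
    h, w = img.shape
    out = np.zeros((h - 2, w - 2))
    for i in range(h - 2):
        for j in range(w - 2):
            out[i, j] = (img[i:i + 3, j:j + 3] * k).sum()
    return out

def gradient_magnitude(img):
    gx = conv2_valid(img, KX)
    gy = conv2_valid(img, KY)
    return np.hypot(gx, gy)

# a vertical step edge: the gradient peaks along the boundary columns
img = np.zeros((5, 6))
img[:, 3:] = 1.0
mag = gradient_magnitude(img)
assert float(mag.max()) == 4.0  # strongest response straddles the step
```

The full algorithm would follow this with non-maximum suppression and hysteresis thresholding; parallelizing it requires splitting the image into overlapping tiles so each mapper sees the halo pixels its convolutions need.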
Analysis and Prediction About the Relationship of Foreign Exchange Market Sentiment and Exchange Rate Trend

This paper aims to find the relationship between market sentiment and market trend in the foreign exchange market, and to predict the future trend of the market within a specific time period. We analyze market sentiment from broadcast news and use a multinomial naïve Bayes model to classify the sentiment of the news. We set several time windows and use time series analysis to predict the future market trend within each window.

Wanyu Du, Mengji Zhang
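A multinomial naïve Bayes classifier of the kind used for the news sentiment step can be written in a few lines. This is our own toy sketch with made-up headlines, not the authors' data or code: class priors and Laplace-smoothed word likelihoods are estimated from labelled examples, and a new headline gets the higher-posterior class.

```python
import math
from collections import Counter

# Minimal multinomial naive Bayes with Laplace smoothing.
def train(docs):
    # docs: list of (label, [tokens])
    priors = Counter(lbl for lbl, _ in docs)
    counts = {lbl: Counter() for lbl in priors}
    vocab = set()
    for lbl, toks in docs:
        counts[lbl].update(toks)
        vocab.update(toks)
    return priors, counts, vocab

def classify(tokens, priors, counts, vocab):
    n = sum(priors.values())
    best, best_lp = None, -math.inf
    for lbl in priors:
        total = sum(counts[lbl].values())
        lp = math.log(priors[lbl] / n)          # log prior
        for t in tokens:
            # add-one (Laplace) smoothed log likelihood
            lp += math.log((counts[lbl][t] + 1) / (total + len(vocab)))
        if lp > best_lp:
            best, best_lp = lbl, lp
    return best

docs = [
    ("up",   ["dollar", "rises", "strong", "growth"]),
    ("up",   ["euro", "gains", "strong", "exports"]),
    ("down", ["dollar", "falls", "weak", "data"]),
    ("down", ["euro", "drops", "weak", "demand"]),
]
priors, counts, vocab = train(docs)
assert classify(["strong", "growth"], priors, counts, vocab) == "up"
assert classify(["weak", "data"], priors, counts, vocab) == "down"
```

The per-window sentiment labels produced this way would then feed the time-series model that forecasts the exchange-rate trend.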
Effectiveness of NEdT and Band 10 (8.3 μm) of ASTER/TIR on SSST Estimation

The effectiveness of Noise Equivalent delta Temperature (NEdT) and Band 10 (8.3 μm) of the Advanced Spaceborne Thermal Emission and Reflection Radiometer/Thermal Infrared Radiometer (ASTER/TIR) for SST estimation is confirmed with the MODerate resolution atmospheric TRANsmission (MODTRAN) code. In addition, the Skin Sea Surface Temperature (SSST) estimation accuracy of ASTER/TIR, with and without Band 10 (8.3 μm), is evaluated. Through regression analysis, it is found that NEdT noise has considerable influence, while Band 10 is very effective in improving SST estimation for the Tropical and Mid-Latitude Summer atmospheric models.

Kohei Arai
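The regression step can be illustrated schematically. This is our own synthetic example (the coefficients, band values and noise level are invented, not taken from the chapter): SST is fitted as a linear combination of multi-band brightness temperatures by ordinary least squares, and adding sensor noise of a given NEdT would degrade the fit.

```python
import numpy as np

# Schematic multi-band SST regression: fit SST as a linear combination of
# thermal-band brightness temperatures using ordinary least squares.
rng = np.random.default_rng(0)
true_coef = np.array([0.6, 0.3, 0.1])            # weights for three TIR bands
bands = rng.normal(290.0, 5.0, size=(200, 3))    # simulated band temps (K)
sst = bands @ true_coef + rng.normal(0, 0.05, 200)  # 0.05 K noise floor

# design matrix with an intercept column
X = np.column_stack([np.ones(len(bands)), bands])
coef, *_ = np.linalg.lstsq(X, sst, rcond=None)
rmse = float(np.sqrt(np.mean((X @ coef - sst) ** 2)))
assert rmse < 0.1  # the fit recovers the relation to within the noise level
```

Comparing the residual RMSE with and without one band's column in `X` mirrors the chapter's evaluation of ASTER/TIR with and without Band 10.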
Raspberry Pi Based Smart Control of Home Appliance System

The rapid development of the Internet of Things (IoT) is changing the world, and the steep decline in the cost of common IoT components is enabling people to develop new ideas and products at home. The popularity of home automation has increased because of its low cost and simplicity, thanks to the availability of smartphones, tablets and other smart devices. Smart homes can increase the comfort and security of the occupant through, for example, interfaces to control lighting, temperature or various electronic devices. The management of energy resources is an important issue for smart homes. It is thus possible to put heaters on standby when the occupants are absent, or to automatically adjust the use of electrical resources according to the needs of the inhabitants in order to reduce the waste of energy. Moreover, another fundamental goal of applying information technology to homes is the safety of people. This is achievable through systems that can anticipate dangerous situations or react to events that endanger the well-being of individuals. The present work designs a smart and secure system based on various sensors and a Raspberry Pi via the Internet of Things (IoT).

Cheggou Rabea, Khoumeri El-Hadi, Farhah Kamila, Rezzouk Hanane
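The occupancy-based energy rule described above can be sketched as follows. This is a hypothetical illustration of ours, not the authors' implementation; on a Raspberry Pi the `set_state` calls would drive relays through the GPIO pins instead of setting a flag.

```python
# Hypothetical occupancy-driven energy manager: appliances are put on
# standby when the home is reported empty and restored when occupants
# return. On a Raspberry Pi, set_state would toggle a relay via GPIO.
class Appliance:
    def __init__(self, name):
        self.name = name
        self.state = "on"

    def set_state(self, state):
        self.state = state  # on a Pi: switch the GPIO-driven relay here

def energy_manager(occupied, appliances):
    target = "on" if occupied else "standby"
    for a in appliances:
        a.set_state(target)

heater = Appliance("heater")
lights = Appliance("lights")
energy_manager(occupied=False, appliances=[heater, lights])
assert heater.state == "standby" and lights.state == "standby"
energy_manager(occupied=True, appliances=[heater, lights])
assert heater.state == "on"
```

A real deployment would feed `occupied` from a PIR motion sensor or smartphone presence detection and run the manager in a periodic loop.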

Open Access

Correction to: Emergency Departments
A Systematic Mapping Review

The original version of the chapter “Emergency Departments: A Systematic Mapping Review” has been revised. It has now been made available open access under a CC BY 4.0 license and the copyright holder has been updated to “The Author(s)”. The book has also been updated with this change.

Salman Alharethi, Abdullah Gani, Mohd Khalit Othman
Backmatter
Metadata
Title
Advances in Information and Communication Networks
edited by
Prof. Dr. Kohei Arai
Supriya Kapoor
Rahul Bhatia
Copyright Year
2019
Electronic ISBN
978-3-030-03405-4
Print ISBN
978-3-030-03404-7
DOI
https://doi.org/10.1007/978-3-030-03405-4