2021 | Book

Progress in Advanced Computing and Intelligent Engineering

Proceedings of ICACIE 2020

Editors: Dr. Chhabi Rani Panigrahi, Dr. Bibudhendu Pati, Prof. Binod Kumar Pattanayak, Prof. Seeven Amic, Dr. Kuan-Ching Li

Publisher: Springer Singapore

Book Series: Advances in Intelligent Systems and Computing

About this book

This book focuses on theory, practice and applications in the broad areas of advanced computing techniques and intelligent engineering. It includes 74 scholarly articles accepted for presentation out of 294 submissions to the 5th ICACIE, held during 25–27 June 2020 at Université des Mascareignes (UdM), Mauritius, in collaboration with Rama Devi Women’s University, Bhubaneswar, India, and S‘O’A Deemed to be University, Bhubaneswar, India. The book brings together academicians, industry professionals, research scholars and students to share and disseminate their scientific research related to advanced computing and intelligent engineering. It provides young researchers with a platform to learn about the practical challenges encountered in these areas of research and the solutions adopted, and it highlights innovative and active research directions in advanced computing techniques and intelligent engineering, along with current issues and applications of related topics.

Table of Contents

Frontmatter

AI and Machine Learning Applications

Frontmatter
Forecasting Price of Indian Stock Market Using Supervised Machine Learning Technique

Interest in modelling and forecasting stock prices has been increasing over recent decades. A single artificial neural network alone is not adequate; rather, combining theoretical and empirical approaches, and mixing different models, can be a successful way to enhance performance when the series is highly irregular. In this paper, different strategies such as linear regression, kNN, the naïve method, moving average (MA), autoregression (AR), ARMA, ARIMA, and auto-ARIMA are used for forecasting stock markets, and the paper suggests which method is best suited to predicting the stock market. To address the forecasting of stock prices, we gathered monthly closing values of the SENSEX index. These models can subsequently be used for forecasting tasks, particularly when higher forecasting accuracy is required.

Mohit Iyer, Ritika Mehra
Speaker Recognition Using Noise Robust Features and LSTM-RNN

Tremendous growth has been observed in active research in the field of speaker recognition, mainly due to the increasing need for zero-touch interfaces in devices and mobile biometric authentication systems. This paper discusses the implementation of a text-independent speaker verification system using a long short-term memory (LSTM)-based neural network for speaker modeling, combined with various approaches for front-end feature extraction including Mel Frequency Spectral Coefficients (MFSC), Mel Frequency Cepstral Coefficients (MFCC), Gammatone Filter Spectra (GTF), and Gammatone Filter Cepstral Coefficients (GFCC). Additionally, to determine the best-suited speaker verification system for given noisy environmental conditions, all the combined systems are tested under induced noisy conditions with white noise at −20 and −40 dB, as well as under a clean environmental condition. The results show that the MFSC-based LSTM-RNN combination tends to perform better than all the other combinations regardless of the noise added to the dataset.

Mohit Dua, Pawandeep Singh Sethi, Vinam Agrawal, Raghav Chawla
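
As a rough illustration of the front-end/back-end split described in the abstract above, a minimal sketch is given below, assuming librosa for MFCC extraction and Keras for the LSTM; the audio file path, sampling rate, and number of speakers are placeholders, not values from the paper.

```python
# Sketch: MFCC front-end feeding an LSTM speaker model (illustrative only).
import librosa
import numpy as np
from tensorflow.keras import layers, models

signal, sr = librosa.load("speech.wav", sr=16000)          # mono, 16 kHz (placeholder file)
mfcc = librosa.feature.mfcc(y=signal, sr=sr, n_mfcc=20)    # shape: (20, frames)
features = mfcc.T[np.newaxis, ...]                          # shape: (1, frames, 20)

num_speakers = 10                                           # placeholder
model = models.Sequential([
    layers.LSTM(128, input_shape=(None, 20)),               # consume the frame sequence
    layers.Dense(num_speakers, activation="softmax"),       # speaker posterior
])
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy")
scores = model.predict(features)                            # untrained model, demo only
```

Swapping the MFCC call for an MFSC or gammatone front end changes only the feature-extraction step; the LSTM back end stays the same.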
Brain Tumor Segmentation Using Random Walks from MRI Images

Magnetic resonance imaging (MRI) is a vital and universally recognized medium to assess brain neoplasms. This paper presents a study on brain tumor segmentation based on the random walk algorithm, a graph-based method in which the pixels of a brain MR image are treated as nodes. Segmentation is performed by interactively labeling certain nodes as foreground and background seeds, followed by computing the probability of each unlabeled node to reach all the labeled nodes using random paths. The method is applied to two different MR modalities, viz. T2-weighted MRI with fluid attenuated inversion recovery (FLAIR) and T2 MRI, to segment the complete tumor and the tumor core regions, respectively, by utilizing visual traits of MRI images and identifying local and global brain tissue information. Efficacy is validated quantitatively as well as qualitatively by performing experiments on the publicly available brain tumor segmentation challenge (BRATS-2013) dataset. Results demonstrate that the proposed method performs favorably as compared to several existing methods.

Shiv Naresh Shivhare, Nitin Kumar
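
For readers unfamiliar with seed-based random walker segmentation as described above, a minimal sketch follows, using scikit-image's generic implementation rather than the paper's own code; the slice file and seed coordinates are placeholders.

```python
# Sketch: seed-based random walker segmentation of a single 2-D MR slice.
import numpy as np
from skimage.segmentation import random_walker

slice_img = np.load("flair_slice.npy").astype(float)   # placeholder FLAIR slice
seeds = np.zeros(slice_img.shape, dtype=int)
seeds[120:125, 130:135] = 1    # foreground (tumor) seeds, chosen interactively
seeds[5:15, 5:15] = 2          # background seeds

# Each unlabeled pixel receives the label its random walks reach first with
# highest probability; edge weights depend on intensity differences.
labels = random_walker(slice_img, seeds, beta=130)
tumor_mask = labels == 1
```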
Factors Accountable for Diabetes Using Artificial Intelligence in Medico-Care

The new advancements in technology have given us heaps of data, and the healthcare industry is one industry that has accumulated mounds of data from varied sources. To manage this large volume of heterogeneous data, usually referred to as 'big data', data analytic tools have been developed. With the changing demands of healthcare management, such as the shift from disease-centric to patient-centric and from volume-based to value-based care, healthcare delivery models need to be built using artificial intelligence-based analytical tools. This paper presents the factors accountable for diabetes, identified using artificial intelligence tools. The Python programming language has been used to produce the results.

Karuna Babber, Shruti Wadhwa
An Evaluation of Deep Learning Networks to Extract Emotions from Yelp Reviews

User reviews are increasing exponentially in the online world, and they play a crucial role in the decision-making process of buying products or services. Many considerations cross a customer's mind when making online purchases. While reviews are important for customers, businesses can also derive various advantages from the data generated by their user base: they can leverage user opinions, sentiment and emotion to gain customer insights. While several successful studies have already been conducted in analysing sentiments expressed in documents, we find that the latest trends in machine learning, specifically deep learning, are still not well studied when it comes to emotion detection. We therefore present a study to evaluate the application of two deep learning networks, namely CNN and LSTM, in identifying emotions in Yelp reviews.

Yasser Chuttur, Leevesh Pokhun
Classification of Brain Tumor MRIs Using Deep Learning and Data Augmentation

Brain tumor identification and classification is a crucial task. This paper focuses on a four-class classification problem that differentiates between three prominent types of brain tumor, namely glioma, meningioma and pituitary tumors, and the no-tumor case. The proposed system uses deep transfer learning with two pre-trained models and one custom model to classify brain MRI images. The empirical work is performed using a custom dataset built from existing public datasets. The proposed system registers a classification accuracy of up to 99%, and performance measures such as precision, recall and F-score have also been calculated. Moreover, since the dataset is not large, the results show that transfer learning is a useful technique when the availability of medical images is limited.

Gulshansingh Bhagbut, Zahra Mungloo-Dilmohamud
Deep Neural Network Optimization for Handwritten Text Recognition

These days a large number of handwritten documents are available as scanned images. The aim of Handwritten Text Recognition (HTR) is to transcribe offline document images by computer through the use of deep neural networks. Even though hybrid architectures designed for this purpose are gaining popularity, we aim to further optimize these models by studying the effect of the hyperparameters associated with the number of convolutional neural network layers on the overall accuracy. We explore the implementation of hybrid architectures and show that the depth of the network, in particular, plays a noteworthy role in improving the accuracy of the trained model.

Chahat Goel, Aishwarya Chaudhary, S. Indu, Sudipta Majumdar
Consolidating Online Real Estate Data Using Image Analysis and Text Processing

Property buyers often access online real estate websites to search for properties which they might be interested in. However, property sellers may in turn have their properties listed on several real estate agency websites to increase their chances of reaching a large audience. Consequently, the same properties are often found on multiple websites, and problems arise when the same properties are listed with different prices and different details on different websites. For users, such a situation impedes the decision-making process. Ideally, a single portal consolidating all data pertaining to the same property from different real estate agency websites would be useful to potential buyers before making a decision. Identifying similar real estate property posts, however, is not a straightforward exercise: with different information available from different websites, additional processing is required. In this paper, we propose an approach that makes use of image analysis and text processing to detect similar real estate property adverts posted on different websites for the Mauritius market.

Yasser Chuttur, Haydar Mahadooa
Time Series Visualization of Customer Emotions Using Artificial Neural Network

Online customer reviews have been recognized as having a high influence on the decisions of both customers and business managers. While the current approach of displaying a general customer rating score is useful for customers to obtain a quick indication of the overall customer experience with a given business, a fixed score does not give managers useful information about the different customer service strategies adopted over a given period of time. In this study, we propose a method to analyze customer reviews over a period of time, so that emotions expressed by customers can be visualized across a timeline. We make use of Long Short-Term Memory (LSTM) as an artificial neural network technique on reviews posted on marideal.mu, a hotel booking website in Mauritius. Emotions such as 'love', 'sadness', 'joy', 'surprise', 'anger' and 'fear' were identified from 4270 reviews for 110 hotels found on the marideal.mu website. We demonstrate how hotel managers can drill through the classified comments so as to identify various degrees of satisfaction in their customers' experience and gain better insight into the performance of their businesses over a desired period of time. The findings of this proposal can help managers develop means to better identify business strategies to improve both customer satisfaction and business performance.

Yasser Chuttur, Nandishta Rawoteea
Sentiment Analysis Using Deep Learning for Recommendation in E-Learning Domain

Sentiment analysis (SA) is one of the methods that can assist in extracting information from a large amount of data. It is considered one of the research fields in text mining, and it has become vital to employ within recommendation systems as well as in e-learning environments. In the current work, we present a new recommendation model utilizing sentiment analysis based on a convolutional neural network (SABCNN) and natural language processing (NLP) techniques. Starting from the collection and analysis of learners' review sentiments for e-content, together with the corresponding ratings on e-platforms, a sentence or a specific text is classified into multiple levels by determining what feeling it conveys. Our research aims at recommending learning resources that are relevant to a learner's preferences with the aid of previous reviews from other learners, sharing the top preferences with him/her.

Rawaa Alatrash, Hadi Ezaldeen, Rachita Misra, Rojalina Priyadarshini
Selection of Best K of K-Nearest Neighbors Classifier for Enhancement of Performance for the Prediction of Diabetes

Today, getting meaningful information from an ocean of data obtained from numerous sources has become very tedious work. A number of methods are applied to analyze the various types of results obtained from such data, and classification is one of them. Machine learning algorithms are used to train classifiers and enhance the capability of different classification methods. In this research paper, the focus is on the K-nearest neighbor (KNN) algorithm, whose performance in classifying patients into diabetic and non-diabetic classes is analyzed using the Pima Indian Diabetes Dataset obtained from the UCI repository. Diabetes mellitus, a metabolic disorder of the human body, is one of the major health threats in the world; due to it, the body becomes unable to properly use the insulin released by the pancreas. Experimental analysis has been performed on the PIMA dataset using Python. A detailed study is made using the KNN algorithm with different values of K to identify the best value of K for which KNN returns the best result. KNN provides a better accuracy of 81.17% after applying a feature selection method.

Subhash Chandra Gupta, Noopur Goel
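
A minimal sketch of the K-selection loop described in the abstract above is given below, assuming scikit-learn and pandas; the CSV path, column name, and range of K are placeholders, and the paper's exact preprocessing and feature selection are not reproduced.

```python
# Sketch: scanning K for a K-nearest-neighbors classifier on the Pima data.
import pandas as pd
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier

data = pd.read_csv("pima_diabetes.csv")        # placeholder path
X, y = data.drop(columns="Outcome"), data["Outcome"]

scores = {}
for k in range(1, 31):                          # candidate values of K
    knn = KNeighborsClassifier(n_neighbors=k)
    scores[k] = cross_val_score(knn, X, y, cv=10).mean()   # 10-fold CV accuracy

best_k = max(scores, key=scores.get)
print(f"best K = {best_k}, accuracy = {scores[best_k]:.4f}")
```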
Phishing Website Prediction: A Machine Learning Approach

Phishing is an act of stealing precious, personal, sensitive data such as the credentials we use for accessing resources and services across cyberspace. Given the rapid growth of phishing attacks and their adverse effects on businesses as well as individual users, it has become necessary for organizations and individuals worldwide to effectively predict phishing websites and differentiate them from legitimate ones. The aim of this research paper is to efficiently predict phishing websites so that users may benefit from this study and avoid getting trapped. In this paper, machine learning techniques are used for prediction. Data mining is used worldwide by almost every facet of society, viz. business organizations, government organizations, and other kinds of data collectors, to extract knowledge from collected data, and machine learning is a data mining technique used to analyze and classify data and efficiently predict results for estimation and planning by organizations around the globe. Classification algorithms, namely logistic regression, decision tree, and random forest classification, are used to predict fake websites, and a comparison of their predictions is presented. The results are given in numeric as well as graphical form with the help of charts. The data used are taken from the UCI machine learning online repository. The seed value is varied and analyzed, and the results range from a lowest accuracy of 95.93% to accuracies of 97.96% and 98.78%, the highest. Further study and the application of good practices may help in designing a better and more accurate solution for the prediction of phishing websites, just by examining the URL and its features.

Anjaneya Awasthi, Noopur Goel
COVID-19 Sentimental Analysis Using Machine Learning Techniques

With the rise in patients affected by the coronavirus, the World Health Organization declared it a pandemic. Globally, people are forced to stay at home to maintain social distancing, and they share their feelings through social media platforms like Twitter. Twitter data helps to understand the grief and pain caused by the coronavirus. In this paper, a Twitter dataset has been used for sentiment analysis of people's opinions related to coronavirus (COVID-19), a vital issue these days all over the world, with many countries affected by this pandemic. To analyze people's sentiment regarding this pandemic, machine learning techniques and sentiment analysis models such as Naive Bayes, Support Vector Machine, Logistic Regression, and Random Forest Classifier have been applied. Further, a comparison of these models has been made to determine which is the most effective and accurate for the analysis of feelings and opinions regarding coronavirus (COVID-19).

Chhinder Kaur, Anand Sharma
Multimodal Music Mood Classification Framework for Kokborok Music

This article describes an application of music information retrieval (MIR) integrated with natural language processing: music mood classification for Kokborok, one of the North-eastern regional languages of India. Kokborok is widely spoken in the North East (NE) states of India and in other countries such as Nepal, Bhutan, Myanmar and Bangladesh. The songs selected are Kokborok songs collected from the Bible, written in the Romanized script, which is accepted worldwide. We developed a multimodal corpus of audio and lyrics for Kokborok songs, performed coarse-grained annotation to create a mood-annotated dataset, and then performed the classification task on audio and lyrics separately. We proposed a mood taxonomy for Kokborok songs and built a mood-annotated corpus with the corresponding taxonomy. Initially, we used 48 parameters for audio classification and six text stylistic features for lyrics-based classification. An SVM classifier with a linear kernel function is used for classification. Finally, a mood classification system was developed for Kokborok songs consisting of three different systems based on audio, lyrics and multimodal (audio and lyrics together) inputs. We also compared different classifiers to assess the performance of the above three systems, achieving 95% accuracy for audio, 97% for lyrics, and about 96% for the multimodal system.

Sanchali Das, Sambit Satpathy, Swapan Debbarma
Forecasting of Daily Demand’s Order Using Gradient Boosting Regressor

Supply chain management is an important business process, and forecasting is an important part of it; in particular, order forecasting is essential for the personnel concerned. In this paper, forecasting of daily demand orders is done based on data from a Brazilian logistics company. Previously, an artificial neural network (ANN) was applied to this dataset: to get the best accuracy, different settings of a multi-layer perceptron neural network were used with proper optimization. However, using a deep neural network for forecasting orders with limited data can make a model overfitted and complex. So, a standard machine learning-based approach is used to avoid these issues, and it also improves the model's accuracy. A gradient boosting regressor is applied in a more practical way, with optimization of different parameters, which decreases the error rate of the previous work by about 0.86% on average and by 1.44% at best.

Tansif Anzar
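
The sketch below illustrates the general idea of fitting a gradient boosting regressor to lagged daily-order features and tuning its parameters, assuming scikit-learn; the CSV path, column name, lag depth, and parameter grid are placeholders, not the paper's actual setup.

```python
# Sketch: gradient boosting on lagged daily-demand features with a small grid search.
import pandas as pd
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.model_selection import GridSearchCV

series = pd.read_csv("daily_demand.csv")["orders"]          # placeholder data
frame = pd.DataFrame({f"lag_{i}": series.shift(i) for i in range(1, 8)})
frame["target"] = series
frame = frame.dropna()                                      # drop rows without full lags

X, y = frame.drop(columns="target"), frame["target"]
grid = GridSearchCV(
    GradientBoostingRegressor(),
    {"n_estimators": [100, 300], "learning_rate": [0.05, 0.1], "max_depth": [2, 3]},
    cv=5,
    scoring="neg_mean_absolute_error",
)
grid.fit(X, y)
print(grid.best_params_, -grid.best_score_)                 # best settings and MAE
```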
Improving Impulse Noise Classification Using Ensemble Learning Methods

Medical image denoising is an essential pre-processing step in medical image processing which improves the performance of clinical diagnosis and prognosis. High-level medical image processing algorithms such as segmentation and classification work better if the image is denoised appropriately. The main objective of this research work is to find and replace only the corrupted pixels in medical images with suitable pixel estimates; the remaining uncorrupted pixels are left undisturbed, thereby preserving image quality for proper diagnosis. For the primary task of finding the corrupted pixels, an ensemble of machine learning (EML) classifiers, namely Naïve Bayes (NB), Support Vector Machine (SVM), Decision Tree (DT) and Random Forest (RF), is trained using supervised learning. The final classification output is determined by majority voting over the outputs of the ML classifiers, which work in parallel. By adopting this method, a classification accuracy of 99.87% is achieved.

Kunaraj Kumarasamy, S. Maria Wenisch, S. Balaji, L. J. Jenifer Suriya, A. Jerlin, S. Robert Rajkumar
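
A minimal sketch of the majority-voting ensemble described above is shown below, built from scikit-learn's VotingClassifier as a stand-in for the paper's own setup; X_train/y_train would hold per-pixel feature vectors and corrupted/clean labels, and are not provided here.

```python
# Sketch: hard-voting ensemble of NB, SVM, DT and RF for corrupted-pixel detection.
from sklearn.ensemble import RandomForestClassifier, VotingClassifier
from sklearn.naive_bayes import GaussianNB
from sklearn.svm import SVC
from sklearn.tree import DecisionTreeClassifier

ensemble = VotingClassifier(
    estimators=[
        ("nb", GaussianNB()),
        ("svm", SVC()),
        ("dt", DecisionTreeClassifier()),
        ("rf", RandomForestClassifier()),
    ],
    voting="hard",   # final label = majority vote of the four classifiers
)
# ensemble.fit(X_train, y_train)            # X_train, y_train: placeholder pixel features/labels
# corrupted_flags = ensemble.predict(X_test)
```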
Image Data Preservation with Fractional Sine Transform and Dual Chaotic Sequence

The security of visual data is a prevalent issue in today's era because of the fast growth of communication technology. Cutting-edge technologies like machine learning, deep learning, and cloud computing have taken data communication to new heights. In this virtual age, the preservation of digital images is of prime importance because they carry much personal and sensitive information, from an individual's personal data to a nation's defence data. The growth in communication technology has not only benefited mankind but has also attracted attackers to mischievous acts of data theft. To safeguard against this, various image security methods have already been proposed. In this line, a hybrid method combining the discrete fractional sine transform with the logistic map and the Arnold cat map is proposed here. This hybrid combination is noteworthy in that it achieves a higher PSNR and a lower reconstruction error compared with its predecessor algorithms.

Sharad Salunke, M. Venkatadri, Md Farukh Hashmi, Bharti Ahuja
Enhancing Deep Learning Capabilities with Genetic Algorithm for Detecting Software Defects

Regardless of existing and well-defined processes, some defects are inevitable, resulting in software performance degradation. The use of traditional machine learning techniques can automate the prediction of software defects; this automated approach significantly improves the quality of the finished product and reduces the cost incurred during the development and maintenance stages. The accuracy of artificial neural networks for the automatic prediction of software bugs can be further enhanced with the use of metaheuristic algorithms. We propose a hybrid approach which combines a Genetic Algorithm (GA) and a Deep Neural Network (DNN) to better classify software defects. The GA is used as a pre-learning phase to automatically optimize the input features for the DNN, as irrelevant variables have a substantial negative impact on prediction accuracy. Results from experiments using the PROMISE dataset demonstrate that a DNN consuming optimized features yields better results.

Kajal Tameswar, Geerish Suddul, Kumar Dookhitram
Application of Classifier for Breast Cancer Cell Detection

Due to the increased number of breast cancer cases, there is rising concern about the disease. In today's era, machine learning classification can play a vital role in classifying the degree of malignancy of the cancer. This paper uses four different classifiers, namely KNN, logistic regression, naive Bayes and decision tree, to classify the cancer into the two classes of benign and malignant. The proposed model is simulated over the Wisconsin diagnostic breast cancer dataset and evaluated using the accuracy metric. The model classifies the disease based on the most accurate classifier; the KNN model shows significantly better results than the other classifiers.

Ashutosh Mishra, Unnati Mantry, Dibyasha Garhnayak, Sourav Panda, Rasmita Rautray, Rasmita Dash, Rajashree Dash
Apriori-Backed Fuzzy Unification and Statistical Inference in Feature Reduction: An Application in Prognosis of Autism in Toddlers

Weak Artificial Intelligence (AI) allows the application of machine intelligence in modern health information technology to support medical professionals in bridging physical/psychological observations with clinical knowledge, thus generating diagnostic decisions. Autism, a highly variable neurodevelopmental condition marked by social impairments, reveals symptoms during infancy with no abatement over time due to comorbidities. Genetic, behavioral and neurological factors play roles in the making of the disease, and this constitutes an ideal pattern recognition task. In this research, the Autism Screening Data (ASD) for toddlers was first exploratorily analyzed to hypothesize impactful features, which were further condensed and inferentially pruned. An interesting application of the business intelligence algorithm Apriori has been made on transactions consisting of ten features, constituting a novel preprocessing step derived from market basket analysis. The huddling features were fuzzily modeled into a single feature whose membership function evaluated the degree to which a toddler could be called autistic, thus paving the way to the first optimized Neural Network (NN). Features were further eliminated based on statistical t-tests and Chi-squared tests, admitting only features with p-values < 0.05, giving rise to the second and final optimized model. The research showed that the unremitted 16-feature and the optimized 5-feature models were equivalent in terms of maximum test accuracy (99.68%), with lower computation in the optimized scheme. The paper follows a 'hard (EDA, inferential statistics) + soft (fuzzy logic) + hard (forward propagation) + soft (backpropagation)' pipeline, and similar systems can be used for similar prognostic problems.

Shithi Maitra, Nasrin Akter, Afrina Zahan Mithila, Tonmoy Hossain, Mohammad Shafiul Alam
Developing a Framework for Generating Hotel Recommendation

Recommendation Systems (RS) are utilized in a variety of areas. RS provide suggestions for items that a particular user is most likely to be interested in, and there has been a huge amount of research in this field. An intelligent approach to generate recommendations using heterogeneous data is proposed in this paper. To increase the performance of the recommendation system, we consider the surrounding environment of the considered hotels. In information technology (IT), the feedback system is realized as a computer program, and analysis of the feedback provided by users is essential for improving the accuracy of the RS. By analyzing both textual and numeric feedback, better suggestions can be provided to users.

Md. Shafiul Alam Forhad, Mohammad Shamsul Arefin
Rumor Source Identification on Social Networks: A Combined Network Centrality Approach

The source identification of any particular rumor in a social network is a critical task due to its intricate network connectivity. In this paper, we have analyzed network centrality measures to detect the rumor source on real social network datasets. We used the susceptible-infected (SI) model to construct a graph of infected nodes initiated from a single source and proposed a combined network centrality approach (CNCA) to identify that source. Our approach combines the well-known rumor centrality value with the betweenness centrality value to maximize the likelihood of source detection. Simulations were performed on varied network structures with varied complexities to analyze the performance of the proposed approach. We compared our approach with the rumor centrality, betweenness centrality, and Jordan centrality approaches and observed that the combination of rumor centrality and betweenness centrality outperforms the individual centrality measures in source identification.

Abhijit Das, Anupam Biswas
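
The sketch below illustrates the general shape of combining two centrality scores over the infected subgraph, using networkx. The rumor_centrality function here is a hypothetical stand-in (closeness centrality is used only as a proxy for the paper's rumor-centrality computation), and taking the product of the two scores is just one simple way to combine them; the paper's exact combination rule may differ.

```python
# Sketch: rank candidate rumor sources by a combination of two centralities.
import networkx as nx

def rumor_centrality(graph):
    # Hypothetical placeholder: a real implementation would compute the
    # paper's rumor centrality on the infected subgraph.
    return nx.closeness_centrality(graph)

def combined_source_estimate(infected_subgraph):
    betweenness = nx.betweenness_centrality(infected_subgraph)
    rumor = rumor_centrality(infected_subgraph)
    combined = {v: betweenness[v] * rumor[v] for v in infected_subgraph}
    return max(combined, key=combined.get)   # node most likely to be the source

G = nx.karate_club_graph()                    # stand-in for an SI-infected subgraph
print(combined_source_estimate(G))
```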
Sentiment Polarity Detection on Bengali Book Reviews Using Multinomial Naïve Bayes

Recently, sentiment polarity detection has received increased attention from NLP researchers due to the massive availability of customers' opinions and reviews on online platforms. With the continued expansion of e-commerce sites, the rate of purchase of various products, including books, is growing enormously, and readers' opinions/reviews affect the buying decision of a customer in most cases. This work introduces a machine learning-based technique to determine sentiment polarities (either positive or negative) from Bengali book reviews. To assess the effectiveness of the proposed technique, a corpus of 2000 reviews on Bengali books is developed. A comparative analysis with various approaches (such as logistic regression, naive Bayes, SVM, and SGD) is also performed, taking into consideration unigram, bigram, and trigram features, respectively. Experimental results reveal that multinomial naive Bayes with unigram features outperforms the other techniques with 84% accuracy on the test set.

Eftekhar Hossain, Omar Sharif, Mohammed Moshiul Hoque
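
A minimal sketch of the winning unigram multinomial naive Bayes pipeline described above is shown below, assuming scikit-learn; the two example reviews and their labels are placeholders, not corpus samples from the paper.

```python
# Sketch: unigram bag-of-words + multinomial naive Bayes for review polarity.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.naive_bayes import MultinomialNB
from sklearn.pipeline import make_pipeline

reviews = ["বইটি চমৎকার", "একদম ভালো লাগেনি"]   # placeholder Bengali reviews
labels = ["positive", "negative"]

model = make_pipeline(
    CountVectorizer(ngram_range=(1, 1)),   # unigrams; (1, 2) would add bigrams
    MultinomialNB(),
)
model.fit(reviews, labels)
print(model.predict(["চমৎকার বই"]))
```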
Real-Time Facial Emotions Analysis in Videos

Deciphering human facial expressions is an integral part of achieving seamless human-to-machine communication. This may assist in various cognitive tasks and convey important information such as emotions, intentions and opinions in a medium where machines can understand beyond the traditional binary form of communication. In this paper, we present significant progress in this direction. We adapted a machine learning approach based on a neural network to recognize human facial expressions in real time. With a core revolving around a multi-layered convolutional network responsible for training the model, we were able to successfully detect frontal faces in a video stream and encode, in real time, the seven emotions of joy, disgust, neutral, anger, fear, sadness and surprise.

Babajee Phavish, Suddul Geerish, Armoogum Sandhya, Foogooa Ravi
An Extended Genetic Algorithm-Based Prevention System Against DoS/DDoS Flood Attacks in VoIP Systems

VoIP frameworks need protection against unexpected network threats, the plausibility of which stems from the ongoing convergence of telecommunication and IP network infrastructures. The flood attack is now considered the most harmful of these threats. In this paper, a new mitigation model is proposed using a genetic algorithm that is capable of detecting distributed attacks with convincing performance when evaluated using known parameters. The model can detect two unexpected attacker behaviors. The results of the performance analysis show that the detection rate is almost 100% for an attack rate of 40 messages per second and above, the false alarm rate is close to zero, and the sensitivity is near 100%.

Sheeba Armoogum, Nawaz Mohamudally
Entropy Based Cluster Selection

Clustering has emerged as a method of unsupervised partitioning of a given set of data instances into a number of groups (called clusters) so that instances in the same group are more similar to each other than to instances in other groups. However, there does not exist a universal clustering algorithm that can yield satisfactory results for every dataset. In this work we consider an ensemble (collection) of clusterings (partitions) of a dataset obtained in different ways and devise two methods that judiciously select clusters from different clusterings in the ensemble to construct a robust clustering. The superior performance of the proposed methods over well-known existing clustering algorithms on several benchmark datasets is empirically reported.

Arko Banerjee, Arun K. Pujari, Chhabi Rani Panigrahi, Bibudhendu Pati
Human Activity Recognition Using Machine Learning: A Review

With the enrichment of technologies, humans want to maximize automation by reducing manpower and time. Human Activity Recognition (HAR) has a broad range of significant applications such as health care, theft detection, work monitoring in an organization and detecting emergencies. In the literature, various machine learning (ML) classification algorithms are applied to publicly available HAR datasets to recognize human activities. In this work, we have identified different HAR datasets involving different levels of activities and the methods used for data acquisition. We have also carried out a detailed review of HAR approaches implemented with various ML classifiers, along with specific future directions in this area.

Ankita Biswal, Sarmistha Nanda, Chhabi Rani Panigrahi, Sanjeev K. Cowlessur, Bibudhendu Pati

Advanced Computer Networks and Algorithms

Frontmatter
Ensuring Secure Communication from an IoT Edge Device to a Server Through IoT Communication Protocols

The Internet of things (IoT) is seen as one of the next Internet revolutions. More and more IoT devices are connecting to the Internet. These devices sense the physical world and perform tasks in response to occurring events. Such devices are identified through unique IP addresses and often have to send data, which might be sensitive and confidential, over the network. Thus, the security of the data packets sent by IoT devices over the Internet has to be considered. The Constrained Application Protocol (CoAP) supports the RESTful HTTP functionalities and has been proposed specifically for IoT devices, which are characterized by low processing power and energy. However, CoAP uses the UDP protocol and must rely on Datagram Transport Layer Security (DTLS), and sometimes on IPSec, for security. In this paper, the focus has been on the practical evaluation of the different mechanisms that can be used to secure data packets sent by IoT devices, to ensure that communication from the IoT node to the cloud/server is not compromised and that data is encrypted, ensuring full end-to-end security. The IoT communication protocols considered are the transport and application layer protocols HTTP, CoAP, and FTP. A testbed was implemented which allows connectivity between a resource-constrained device, namely a Raspberry Pi, and a cloud server. Using this testbed, the different communication protocols for secure IoT connectivity have been assessed.

Roshni Vidya S. Boolakee, Sandhya Armoogum, Ravi Foogooa, Geerish Suddul
A QoI Assessment Framework for Participatory Crowdsourcing Systems

Participatory Crowdsourcing Systems have the potential to improve services in our daily life, such as health care and transportation, and even to monitor the urban landscape using participatory sensing strategies. Data are the core mechanism that enables Participatory Crowdsourcing Systems to operate, so it is very important to understand the evolution and relevance of data in these systems. Thus, this paper proposes a Quality of Information assessment framework which all Participatory Crowdsourcing Systems should strive to achieve to ensure data quality. The framework operates as a matrix schema that consists of four independent classes (horizontally) with various dimensions within each class (vertically). The proposed framework is flexible, as it can incorporate new quality classes in the case of emerging technologies or domain areas. The vertical layer has two subsections, namely the mandatory and desired features contained within a class.

Ashley Rajoo, Kavi Kumar Khedo, Utam Avinash Einstein Mungur
An Approach to Personalize VMware vSphere Hypervisor (ESXi) Using HPE Image Streamer

Server virtualization technology is an automation method to control and monitor tasks running in multiple virtual machines. It uses software to divide a physical server into multiple virtual machines, each of which runs its own operating system, thereby increasing the effective use of the server. The VMware ESXi operating system can be personalized with the required designs, updates, and patches. This study focuses on personalization of the ESXi operating system with a customized application stack: the bare-metal compute hardware boots directly into this application stack and is ready to host virtual machines. The HPE Image Streamer is used to host, configure, and serve the operating systems to HPE Synergy compute modules.

Richa, Jyoti Singh
A Proposed IoT Architecture for Corals Research Using AI and Robotics

Coral reefs are one of the most diverse ecosystems on the planet and are vital in providing nursery, spawning, refuge, and nurturing areas for a multitude of different organisms. However, coral reefs are degrading owing to climate change, overfishing, industrial pollution and invasive species. As a result, mass coral bleaching events and infectious disease outbreaks take place, causing corals to die. This paper aims at contributing to the monitoring of corals in the Mauritian marine ecosystem by utilising a Hybrid Underwater Vehicle (HUV) autonomously to collect oceanographic data as well as images and videos of surveyed coral reef sites. The collected data will be transmitted to a base station via cellular IoT communication for analytics using two Artificial Intelligence (AI) methods, namely machine learning and deep learning. The conceptual design is thoroughly described together with the software applications.

Wafiik Aumeer, Nabeelah Pooloo, Rajeev Khoodeeram
Voice Password-Based Secured Communication Using RSA and ElGamal Algorithm

Secure voice-authentication-based communication is the main aim of this study. Eight speech keywords were recorded and stored in computer memory, and a speaker recognition model was used for voice password authentication. The speech keywords were then encrypted using a private key: the message was encrypted using the RSA or ElGamal algorithm, modulated using the FSK digital modulation technique, and sent through the communication channel. The speech samples were demodulated and decrypted at the receiver, and the received speech samples matched the original transmitted voice samples, with an equality ratio of 0.6 and above. In this study, a secure authentication technique has been adopted, and after voice authentication, secure communication has been carried out successfully.

Prashnatita Pal, Bikash Chandra Sahana, S. Ghosh, Jayanta Poray, Amiya Kumar Mallick
UD-RMM: A Remote Monitoring and Management Tool Using PowerShell Universal Dashboard for Universities

A Remote Monitoring and Management Tool is software that performs multiple operations such as monitoring a group of computers and observing their behavior, installing software remotely, and troubleshooting computers from a single location. Nevertheless, remotely monitoring a network of more than 500 computers is a cumbersome task. This project aims at implementing a hassle-free solution, particularly for computer laboratories in a university, by developing a GUI (Graphical User Interface) based tool with a Software-as-a-Service approach. The tool is built entirely with Microsoft PowerShell (PoSH), from scripting the functions of each feature to designing the tool's GUI, while making use of many networking protocols and standards wherever necessary. With PoSH available as a cross-platform tool, the tool can easily be deployed on both Windows and Linux operating systems.

Pranav Nagarajan, Jayavignesh Thyagarajan
Performance and Resource-Aware Virtual Machine Selection using Fuzzy in Cloud Environment

Cloud computing is an on-demand computing paradigm which provides elastic resources to users on a pay-as-you-go basis. The ever-increasing resource demand of data-driven applications has promulgated the deployment of data centers, which results in high energy consumption. Therefore, efficient resource management is crucial to serve users' requests. Live migration plays an important role in resource management; however, excessive migrations sometimes degrade application performance. Thus, careful virtual machine selection must be done to ensure a low migration count. This work develops a virtual machine selection policy, performance and resource-aware virtual machine selection using fuzzy (PRSF), that aims to utilize CPU resources to their maximum in order to reduce the migration count. An attempt is also made to decrease the migration time of a virtual machine by considering its main memory allocation. The policy implements a Mamdani fuzzy controller to optimize the selection decision. The performance evaluation of the proposed policy against benchmark algorithms shows reductions of 32.78% and 81.7% in energy consumption and migration count, respectively.

Vikas Mongia, Anand Sharma
Redefining Data Dimensionality Through Dynamic Linkages in Data-Space Continuum

Dimensionality of data has typically been defined by scholars as the number of attributes a dataset holds. As such, high-dimensional data is described as a curse to most analytics tools and algorithms, since even a small increase in the number of attributes grows the predictive space significantly, so much so that the model is no longer as accurate as it ought to be. This study, however, seeks to redefine data dimensionality by first viewing data as a vector that exists within a data space continuum. Following the redefinition of dimensionality, this study proceeds to show the robustness and benefit of highly dimensional data in improving the quality of analytics. This is achieved by introducing time as an important scalar in the data space continuum and showing the contribution of considering time during data analysis. This contribution explains the highly improved ability of the predictive space not only to show how, or to what extent, attributes affect each other or the target output, but also to explain the reasons why predicted events occur with relevance to a time series. This ability is thus deemed a blessing to analysis following the redefinition of data dimensionality. The assumption made is that data points existing at different time stamps experience a variety of effects resulting from natural or artificial third parties.

Benard Alaka, Bernard Shibwabo Kasamani
Implementation of Encryption Techniques in Secure Communication Model

Wireless communication has transformed the modern generation, making communication possible over a distance without any physical connection between two parties. Communication can basically be of two types, namely wired and wireless, with the latter mostly preferred over the former. However, secure wireless communication is needed in the case of industries, companies, e-commerce, communication technologies, etc. In this research, we transmit encrypted data wirelessly through the channel to provide data security using different encryption techniques. To do this, we propose a model which comprises two parts, namely a transmitter and a receiver. In the transmitter portion, the plaintext is encrypted and then, after modulation, sent through the channel. The receiver first receives the channel output, then demodulates it and finally decrypts the encrypted data to obtain the original plaintext sent from the sender. Here, we have considered two encryption techniques, i.e., Caesar and RSA, and compared them with no encryption in our proposed model, which we implemented using the MATLAB programming language. The overall investigation shows that the RSA algorithm delivers better performance than the Caesar cipher and no encryption, because RSA shows lower bit errors with respect to the signal-to-noise ratio.

Md. Sharif Hossen, Md. Shakhaowat Hossen
A Group Decision Making Problem Involving Fuzzy TOPSIS Method

Selection is a process to find the best alternative from the given alternatives, criteria and experts. The purpose of this manuscript is to develop a fuzzy technique into a group fuzzy technique through the fuzzy TOPSIS (FTOPSIS) method. We present a literature survey of different fuzzy models that have been applied in the field of decision making. Within the multi-criteria decision technique, fuzzy TOPSIS is proposed for the selection among four different projects using fuzzy TOPSIS software. Lastly, we determine the best project using the group fuzzy TOPSIS methodology, illustrate the resulting group ideal solution, and show our model to be structured and robust.

Prashanta Kumar Parida
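
For orientation, the sketch below shows the crisp TOPSIS ranking steps in numpy; the fuzzy variant used in the paper would replace the crisp scores with (e.g., triangular) fuzzy numbers and fuzzy distance measures. The decision matrix, weights, and criteria directions are illustrative placeholders.

```python
# Sketch: crisp TOPSIS ranking of four projects over four criteria.
import numpy as np

scores = np.array([[7, 9, 9, 8],      # rows: projects, cols: criteria (placeholder values)
                   [8, 7, 8, 7],
                   [9, 6, 8, 9],
                   [6, 7, 8, 6]], dtype=float)
weights = np.array([0.3, 0.2, 0.3, 0.2])
benefit = np.array([True, True, True, True])   # all criteria treated as benefits here

norm = scores / np.sqrt((scores ** 2).sum(axis=0))   # vector normalization
weighted = norm * weights
ideal = np.where(benefit, weighted.max(axis=0), weighted.min(axis=0))
anti = np.where(benefit, weighted.min(axis=0), weighted.max(axis=0))

d_plus = np.linalg.norm(weighted - ideal, axis=1)     # distance to ideal solution
d_minus = np.linalg.norm(weighted - anti, axis=1)     # distance to anti-ideal solution
closeness = d_minus / (d_plus + d_minus)              # higher = better project
print("best project:", int(np.argmax(closeness)) + 1)
```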
IoT-Based Smart Intravenous Drip Monitoring System

Health care, being the most important aspect of heading toward a contented life, also plays an essential role in India's progress. These days, automating health monitoring devices is leading to a drastic change in the medical sphere, as it ensures the safety of patients and even helps in reducing the stress of doctors and nurses. In this field, intravenous therapy plays an important role: it is the system wherein liquid substances are directly inserted into the patient's vein via an IV tube, but it can also worsen the situation if proper care is not taken. Thus, this paper emphasizes the necessity to overcome such consequences by introducing a solution. An automatic intravenous drip monitoring system is developed which directly sends an alert message to the assigned nurse when the fluid level of the bottle reaches a certain limit. The system measures the weight of the saline bottle with the help of a load cell and then sends the alert signal using an automatic alerting and indicating device, namely a GSM module. This system would be a significant step toward a different approach to intravenous therapy.

Muskan Jindal, Nidhi Gajjar, Nehal Patel
Cryptanalysis of Lightweight Ciphers Using Metaheuristics

There are several lightweight cryptographic ciphers (LWC) which have been proposed lately for constrained environments. When it comes to selecting one specific cipher for a particular application, the task becomes complicated. Several surveys have been conducted on hardware and software implementations of lightweight cryptographic primitives in order to evaluate and benchmark characteristics such as throughput, ROM and RAM requirements, power consumption, and so on. But tackling the question of the strength of lightweight cryptographic primitives and their ability to withstand known and potential attacks is not a straightforward activity. The underlying design of the primitives or the key size of an LWC are insufficient criteria to measure and compare their cryptographic strengths. This paper attempts to evaluate the relative strength of software implementations of lightweight block ciphers using metaheuristic algorithms and proposes a framework for the selection of an encryption algorithm. Experiments show that LWC algorithms are as strong as classical AES despite their simple design; moreover, they execute faster with fewer resources than conventional AES. Finally, it is observed that the relative strength of LWC is independent of their underlying architectural design.

Seeven Amic, K. M. Sunjiv Soyjaudah, Gianeshwar Ramsawock
Product Classification in E-Commerce Sites

Given a predefined catalog hierarchy where all the categories are defined, product classification is the task by which the catalog path of every product is automatically predicted. This paper explores some of the methods of product classification. With the help of supervised learning, we present various machine learning models to classify products into a set of known categories. With information such as the name and description of a product provided, the model can accurately place it under a designated category. The machine learning models deployed include decision trees, support vector machine (SVM), random forest, logistic regression, and Naïve Bayes, with logistic regression giving the highest accuracy of 91.55%. The proposed implementation has huge potential to slowly but substantially increase the automation of product categorization.

Anannya Patra, V. Vivek, B. R. Shambhavi, K. Sindhu, S. Balaji
Real-Time Detection of Inter-Frame Video Forgeries in Surveillance Videos

Detection of video forgery is a critical requirement to ensure the integrity of video data, and with the increasing use of cameras and dependence on surveillance systems, it becomes even more critical. In this paper, video forgery detection based on the exponential weighted moving average of optical flow variation factors is proposed. This mechanism relies on the fact that the optical flow variation factor is almost continuous in an original video, whereas discontinuity points, not visible to the naked eye, are introduced in a forged video. The proposed mechanism detects inter-frame forgery processes, i.e., frame deletion, insertion, and duplication, in surveillance videos. The detection mechanism is deployed on a cloud server where a live video stream is received from different ATM vestibules; upon detection, an alert message is sent to the concerned bank for immediate action. Experiments are performed on videos captured from the ATM vestibules of a bank. The accuracy and latency of the proposed mechanism demonstrate the practicality of the proposed idea.

Kshitij Saluja, Naveen Aggarwal
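
To make the detection idea above concrete, a minimal sketch is given below: it smooths a sequence of per-frame optical-flow variation factors with an exponentially weighted moving average and flags frames whose deviation from the smoothed trend is unusually large. The input sequence, smoothing factor alpha, and threshold multiple k are placeholders; the paper's exact factor definition and decision rule are not reproduced.

```python
# Sketch: EWMA-based discontinuity flagging over optical-flow variation factors.
import numpy as np

def ewma_anomalies(flow_factors, alpha=0.3, k=3.0):
    flow_factors = np.asarray(flow_factors, dtype=float)
    ewma = flow_factors[0]
    residuals, flags = [], []
    for i, x in enumerate(flow_factors):
        residual = abs(x - ewma)            # deviation from the smoothed trend
        residuals.append(residual)
        # Flag a frame whose deviation is k times the typical deviation so far.
        threshold = k * (np.mean(residuals) + 1e-9)
        flags.append(i > 0 and residual > threshold)
        ewma = alpha * x + (1 - alpha) * ewma   # update the moving average
    return [i for i, f in enumerate(flags) if f]

# A spike at index 4 (e.g., a deleted or inserted frame boundary) gets flagged.
print(ewma_anomalies([1.0, 1.1, 0.9, 1.0, 4.5, 1.0, 1.1]))
```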
An IoT-Based System Architecture for Environmental Monitoring

The Internet of Things (IoT) has grown in popularity in all spheres of life. Environmental issues in the modern era present a major concern for researchers, scientists and governments. In combating environmental issues, it is necessary to monitor, identify and quantify the different contaminating parameters in the environment, and for this purpose IoT technology can be extremely useful. In this paper, we propose an IoT-based environmental monitoring system that operates on a four-layer architecture, wherein environmental data are captured by a sensor network, aggregated and then communicated to a web server via a cloud service. These data are then analyzed by various IoT-specific analytical tools and delivered to applications for further processing. The system can be cost effective and easy to deploy, considering that the underlying architecture is simpler than that of other existing similar systems.

Binod Kumar Pattanayak, Deojeet Nohur, Sanjeev K. Cowlessur, Rajani Kanta Mohanty
Performance Evaluation of VM Allocation Strategies on Heterogeneous Environments in Cloud Data Center

The cloud computing paradigm provides on-demand utilities to customers and involves high resource requirements. Virtual machine allocation plays a vital role in the optimization of resource usage in a data center: this approach consolidates workload on a minimal number of servers to improve their resource utilization and thus contributes to energy efficiency. The literature evidences a number of virtual machine allocation strategies that vary considerably in the adopted approaches and evaluation environments. This research work aims to analyze the performance of commonly used heuristic allocation policies. To ensure a consistent comparison, policies that implement a bin-packing approach to allocation are selected. These algorithms are intensively compared on different workload datasets and varying experimental setups, and their performance with different threshold and virtual machine selection policies is also evaluated. The results conclude that policies which consider the power and computing capacity of the server perform better in almost all scenarios.

Rajni Garg, Indu Arora, Anu Gupta
QSens: QoS-Aware Sensor Node Selection in Sensor-Cloud Architecture

In this paper, we propose a Quality-of-Service (QoS)-aware sensor node selection scheme, QSens, for the sensor-cloud architecture. In this architecture, a Sensor-Cloud Service Provider (SCSP) provisions Sensors-as-a-Service (Se-aaS) to registered end-users, and the end-users pay charges for the services they avail. This work has two objectives: first, we define Service-Level Agreements (SLAs) in the sensor-cloud to bind sensor owners, the SCSP, and end-users together with certain contracts; second, with the help of these SLAs, the proposed scheme selects a suitable set of sensor nodes, based on the QoS value, to serve an application. The SLA between a sensor owner and the SCSP requires the former to share the detailed specifications of his/her sensor nodes with the SCSP, while the SLA between the SCSP and the end-users requires the SCSP to determine the optimal QoS of the different available sets of sensor nodes and share it with the end-users. We formulate the QoS of a sensor node from the specifications shared by the sensor owner. Further, we apply Karush–Kuhn–Tucker (KKT) conditions to obtain an optimal sensor node based on the QoS value. Extensive experimental results show that the total payable service price varies in the range 77.69–86.97% as the service price of the SCSP increases from 500 to 1000 units; with the change in the price of sensor nodes from 500 to 1000 units, the total payable service price varies from 35.79 to 54.6%.

Arijit Roy, Sudip Misra, Aditya Kotasthane
Introduction to Adjacent Distance Array with Huffman Principle: A New Encoding and Decoding Technique for Transliteration Based Bengali Text Compression

By constructing a binary tree, the Huffman algorithm introduced a method of text compression that reduces file size while preserving the original message. Nowadays, Huffman-based algorithms are assessed in two ways: in terms of space, and in terms of decoding speed. While the memory requirement for a text file remains reasonable, the time efficiency of Huffman decoding is becoming more significant. This research introduces the adjacent distance array with the Huffman principle as a new data structure for encoding and decoding in Bengali text compression using transliterated English text. Since transliterated English text helps reduce the number of distinct symbols, we transliterated the Bengali text into English and then applied the Huffman principle with the adjacent distance array. By calculating ASCII values, the adjacent distance array stores the distance between each pair of adjacent symbols. Unlike the regular Huffman algorithm, in which a codeword is produced by traversing the whole Huffman tree for each character, adopting the threshold value and the adjacent distance array makes it possible to skip lengthy codewords and perform decoding by estimating the distances between adjacent symbols instead of traversing the whole tree. Our findings achieved compression ratios of 27.54% and 20.94% for some specimen transliterated Bengali texts, and significant ratios on different corpora.

Pranta Sarker, Mir Lutfur Rahman
BSAT: A New Tool for Analyzing Cryptographic Strength of Boolean Function and S-Box of Symmetric Cryptosystem

Cryptographic primitives such as Boolean functions and S-Boxes are used as building blocks to design stream ciphers and block ciphers, respectively. It is important to evaluate their cryptographic strength to measure the resistance of such cryptosystems against important cryptanalytic attacks. We develop a new tool called the Boolean function and S-Box analysis tool (BSAT) to evaluate the cryptographic strength of Boolean functions and S-Boxes. As the size of an S-Box increases, it takes a lot of computational effort to evaluate all the required cryptographic properties; in this paper, we therefore develop the tool to evaluate important cryptographic properties of both Boolean functions and S-Boxes while minimizing the computational time. The computational time of BSAT is lower than that of the SET tool in terms of calculating the algebraic normal form and algebraic degree: BSAT takes only 0.404 s to evaluate the AES properties, faster than SET, which takes 650 milliseconds (0.65 s).

Pratap Kumar Behera, Sugata Gangopadhyay
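
As an example of the kind of property such a tool reports, the sketch below computes the nonlinearity of a Boolean function from its truth table via the Walsh-Hadamard transform; the three-variable example function is only an illustrative input, not one taken from the paper.

```python
# Sketch: nonlinearity of an n-variable Boolean function via the Walsh-Hadamard transform.
def walsh_hadamard(truth_table):
    # Convert bits to signs (-1)^f(x), then run the fast in-place transform.
    w = [1 - 2 * bit for bit in truth_table]
    h = 1
    while h < len(w):
        for i in range(0, len(w), 2 * h):
            for j in range(i, i + h):
                w[j], w[j + h] = w[j] + w[j + h], w[j] - w[j + h]
        h *= 2
    return w

def nonlinearity(truth_table):
    n = len(truth_table).bit_length() - 1          # number of input variables
    return 2 ** (n - 1) - max(abs(v) for v in walsh_hadamard(truth_table)) // 2

# Example: f(x0, x1, x2) = (x0 AND x1) XOR x2, truth table over x = 0..7.
f = [(x & 1) & ((x >> 1) & 1) ^ ((x >> 2) & 1) for x in range(8)]
print(nonlinearity(f))   # prints 2 for this 3-variable example
```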
Sorted Galloping Prevention Mechanisms Against Denial of Service Attacks in SIP-Based Systems

The IP telephony service has been gaining popularity over the past fifteen years. Statistical reports show that there has been a rise in the number of attacks on VoIP systems over the last five years. Among the many existing threats, the Denial of Service (DoS) flood attack is considered the worst for the VoIP environment. In this paper, we present three statistical models to mitigate flood attacks. The models can detect distributed attacks for two behaviors, and they show very high accuracy and very few false positive alarms. The proposed models can retrain the VoIP system sequentially every 500 ms to reduce false positive cases to zero. Based on all the criteria considered in this paper, we observe that the q-SGP prevention model is better than the two other models.

Sheeba Armoogum, Nawaz Mohamudally
Performance Enhancement and Reduce Energy Consumption with Load Balancing Strategy in Green Cloud Computing

Cloud computing is an innovative technique that provides on-demand access to computing resources such as processing power, memory, storage, and network. Cloud providers constantly struggle with the proper allocation of resources for the execution of requested tasks. Improper matching of tasks with computing resources leads to QoS and performance degradation and, ultimately, to higher energy consumption. Hence, in a cloud environment, load balancing is an important approach for allocating resources so that requested tasks execute with utmost efficiency and computing resources are fully utilized. With an effective load balancing mechanism, we can utilize cloud resources, enhance performance, achieve QoS, and significantly reduce energy consumption. The proposed load balancing algorithm focuses on the response time and processing time of requested tasks to better justify the results in a Green Cloud environment. By reducing the response time and processing time of the proposed system, we can enhance system performance and achieve better Quality of Service (QoS) compared with existing load balancing policies such as Round Robin, active monitoring, LFU, and Throttled. Cutting down processing time and response time also results in a more energy-efficient mechanism for Green cloud computing. Using the proposed load balancing algorithm, we can tentatively reduce response time by 25% and data center processing time by nearly 20% compared with the Round Robin algorithm. With this effective load balancing strategy, there is a significant reduction in the energy consumed by the cloud environment, which helps to make the Green Cloud environment more robust.

Hitesh A. Bheda, Chirag S. Thaker, Darshan B. Choksi
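The proposed policy is only outlined in the abstract; as a hedged sketch of a response-time-driven allocation baseline (send each task to the VM expected to finish it soonest, rather than rotating Round Robin style), one could write something like the following. All names and the timing model are assumptions.

```python
import heapq

def allocate(tasks, vms):
    """Greedy allocation: each task goes to the VM with the earliest expected finish time.

    tasks: list of task lengths (million instructions)
    vms:   list of VM speeds (MIPS)
    Returns the (task, vm) assignments and the makespan, a rough stand-in for processing time.
    """
    heap = [(0.0, i) for i in range(len(vms))]   # entries: (current finish time, vm index)
    heapq.heapify(heap)
    assignment = []
    for length in sorted(tasks, reverse=True):   # schedule the longest tasks first
        finish, i = heapq.heappop(heap)
        finish += length / vms[i]
        assignment.append((length, i))
        heapq.heappush(heap, (finish, i))
    return assignment, max(f for f, _ in heap)

plan, makespan = allocate(tasks=[400, 250, 900, 120, 600], vms=[1000, 500, 500])
print(plan)
print("estimated makespan:", round(makespan, 3), "s")
```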
A Framework for Secure Communication on Internet of Things (IoT)

The Internet of Things (IoT) encompasses millions of resource-constrained, wirelessly connected devices that are often prone to external malicious attacks. Hence, there arises a need to protect these devices from such attacks. In this paper, we propose a secure communication scheme for resource-constrained devices in which a device is identified by a key pair (public key and private key) and by the unique identification (ID) assigned to it by Google Cloud IoT Core; the device stores the private key for communication with the server. Key generation uses a cryptographic approach. Communication is facilitated by the device presenting a cryptographically signed JSON Web Token (JWT), generated with its private key, to the MQTT bridge, after which the connection can be established. The same JWT can be used for further communication as well.

Mohammad Reza Hosenkhan, Binod Kumar Pattanayak
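A hedged sketch of the device-side pattern described above: the private key never leaves the device, a short-lived JWT is signed with it, and that token is presented as the MQTT password to the bridge. The endpoint, topic format, and claim names follow Google's public Cloud IoT Core documentation but should be treated as assumptions here; the snippet uses the PyJWT, cryptography, and paho-mqtt libraries, and the connect/publish calls only succeed against a provisioned registry.

```python
import datetime
import jwt                                    # PyJWT (RS256 requires the cryptography package)
import paho.mqtt.client as mqtt
from cryptography.hazmat.primitives.asymmetric import rsa
from cryptography.hazmat.primitives import serialization

PROJECT_ID = "my-project"                     # assumed identifiers for illustration
REGION, REGISTRY, DEVICE_ID = "asia-east1", "my-registry", "my-device"

# In practice the key pair is generated once and the public key registered with the cloud;
# here we generate one in memory so the sketch is self-contained.
key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
private_pem = key.private_bytes(serialization.Encoding.PEM,
                                serialization.PrivateFormat.PKCS8,
                                serialization.NoEncryption()).decode()

def make_jwt(private_key_pem, project_id, lifetime_minutes=20):
    """Sign a short-lived JWT with the device's private key."""
    now = datetime.datetime.utcnow()
    claims = {"iat": now, "exp": now + datetime.timedelta(minutes=lifetime_minutes),
              "aud": project_id}
    return jwt.encode(claims, private_key_pem, algorithm="RS256")

token = make_jwt(private_pem, PROJECT_ID)
client_id = (f"projects/{PROJECT_ID}/locations/{REGION}"
             f"/registries/{REGISTRY}/devices/{DEVICE_ID}")
client = mqtt.Client(client_id=client_id)     # paho-mqtt 1.x style; 2.x also needs a CallbackAPIVersion
client.username_pw_set(username="unused", password=token)   # the password field carries the JWT
client.tls_set()                                             # TLS to the MQTT bridge
client.connect("mqtt.googleapis.com", 8883)                  # requires a registered device
client.publish(f"/devices/{DEVICE_ID}/events", b"hello", qos=1)
```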
COVTrac: Covid-19 Tracker and Social Distancing App

In order to break the chain of SARS-CoV-2 virus infection, most countries are taking the help of mobile apps. The existing apps lack the necessary features to address major issues such as detection of and alerts for social distancing violations, detection and prevention of large gatherings, and easy generation of travel permits during lockdown. The existing apps do not provide any solution for alerting users when they violate social distancing. Though many of these apps are used for contact tracing, they do not indicate which places the user has visited recently, which is a crucial parameter in contact tracing and in preventing community spread. Smartphones empowered with the latest technology, such as Google geo-location and Bluetooth Low Energy (BLE), together with such an app can complement a country's general Covid-19 control strategies comprising testing, contact tracing, seclusion, and social distancing. In this work, we develop a Covid-19 tracker and social distancing app named COVTrac, an Android app that uses various capabilities of the smartphone to address these gaps in existing mobile apps, along with a number of other concerns such as privacy, accurate contact tracing, and prevention of socio-physical interactions.

Subhashish Das Mohapatra, Suvendu Chandan Nayak, Sasmita Parida, Chhabi Rani Panigrahi, Bibudhendu Pati
JOB-DCA: A Cost Minimizing Jaya Optimization-Based Data Center Allocation Policy for IaaS Cloud Model

Cloud computing represents a new era of revolutionary Internet computing that fulfills user demands by providing a pool of services and resources distributed across different regions around the globe. End-user demands for adequate resources are processed on a pay-per-use basis. End-user computing is served through virtual machines (VMs), which are created and allocated at different data centers for on-demand user requests. Allocating data centers for on-demand requests is a challenging task, and the allocation mechanism must consider different parameters of the user, the data center, and the network. Most existing works address only a few of these parameters when applying optimization techniques; however, the implementation of new optimization techniques always opens up scope for researchers in different fields. In this work, we discuss a Jaya optimization-based data center allocation (JOB-DCA) mechanism for infrastructure as a service (IaaS) clouds. The work considers selected parameters to find an optimized data center list and to allocate on-demand resources at minimum cost. The presented JOB-DCA technique tries to allocate the preferred data centers with a reduced total VM cost. A new cost function is defined to minimize cost, and the existing benchmark cloud simulation tool CloudAnalyst is used to analyze the performance of the JOB-DCA technique. The proposed technique is compared with benchmark mechanisms, and the results show a 3.2% cost reduction with JOB-DCA.

Sasmita Parida, Bibudhendu Pati, Suvendu Chandan Nayak, Chhabi Rani Panigrahi
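The JOB-DCA cost function is defined in the chapter itself; as a hedged illustration of the underlying Jaya update rule (move every candidate towards the best solution and away from the worst, with no algorithm-specific tuning parameters), a generic minimizer could look like this. The toy cost surface is an assumption, not the chapter's data-center cost model.

```python
import numpy as np

def jaya_minimize(cost, bounds, pop_size=20, iters=200, seed=0):
    """Generic Jaya: x' = x + r1*(best - |x|) - r2*(worst - |x|); keep x' only if it improves."""
    rng = np.random.default_rng(seed)
    lo, hi = np.array(bounds).T
    pop = rng.uniform(lo, hi, size=(pop_size, len(lo)))
    for _ in range(iters):
        costs = np.apply_along_axis(cost, 1, pop)
        best, worst = pop[costs.argmin()], pop[costs.argmax()]
        r1, r2 = rng.random(pop.shape), rng.random(pop.shape)
        candidates = np.clip(pop + r1 * (best - np.abs(pop)) - r2 * (worst - np.abs(pop)), lo, hi)
        improved = np.apply_along_axis(cost, 1, candidates) < costs
        pop[improved] = candidates[improved]
    costs = np.apply_along_axis(cost, 1, pop)
    return pop[costs.argmin()], costs.min()

# Toy stand-in for a data-center cost surface (not the chapter's cost function).
best_x, best_c = jaya_minimize(lambda x: np.sum((x - 3) ** 2), bounds=[(-10, 10)] * 4)
print(best_x.round(3), round(best_c, 6))
```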

Sustainable Computing and Engineering

Teaching and Learning Concepts of Audio Modulation Using Tangible User Interfaces

Digital technologies form an integral part of sound production. To meet industry requirements, audio processing and modulation are taught in computer science-related courses within modules such as multimedia, data communications and science, technology, engineering, and mathematics (STEM). However, due to the complexity of the subject, teaching and learning audio modulation within traditional settings involves challenges, thereby adversely impacting student engagement. The use of tangible user interfaces (TUI) can potentially improve the student experience during teaching and learning, although this type of user interface has not been much explored in the area of sound processing and modulation. As such, this paper investigates the application of a novel TUI-based system to assist in teaching and learning concepts of audio modulation. Using a TUI evaluation framework, the proposed solution was evaluated to assess five key constructs, notably learnability, interaction, tangibility, enjoyment, and intention for future use. During the evaluation process, 29 students practically utilized the solution and provided feedback on the constructs assessed. Results revealed an inclination towards agreement for the different constructs investigated, although some limitations were identified. Based on these limitations, recommendations are provided towards improving the design of such systems.

Ah-Kwet Rémi Wong Suk Hee, Hadija Ramadhani Halfani, Girish Bekaroo
Performance Investigation of Dipole and Moxon Antennae for VHF Communication

VHF communication links are mostly used by defense personnel and amateur radio operators, commonly known as HAM radio operators. HAM radio operators communicate in the frequency band ranging from 144 to 146 MHz. Hence, in this paper, dipole and Moxon antennae are designed for 145 MHz and resonate at 144 MHz and 144.44 MHz, respectively. Both the dipole and Moxon antennae are simulated in HFSS using copper material and an element radius of 1.5 mm. The antennae are investigated mainly in terms of gain, directivity, return loss, and other parameters. The observed gains of the dipole and Moxon antennae are 2.4 dB and 6.2 dB, respectively, whereas the return loss of the dipole antenna is −16.78 dB and that of the Moxon antenna is −30.26 dB. The Moxon antenna is highly directive compared to the dipole antenna. As a result, the Moxon antenna provides a reliable solution for VHF communication.

Akshay Jain, Pranay Chavan, Pratik Maradiya, Kiran Rathod
Cloud Computing Load Balancing Using Amazon Web Service Technology

Amazon.com operates one of the largest and most heavily trafficked web services in the world. Amazon Web Services (AWS) is an infrastructure-based web service that provides a wide range of products and services via the Internet. AWS is a platform that supports the development of different applications by providing solutions for messaging and data storage with high scalability. Although AWS offers a wide range of services, servers can still become heavily loaded as the number of tasks requested for execution by end users increases. Therefore, load balancing is required to distribute the load across servers located in different regions around the globe.

Nagarjuna Hota, Binod Kumar Pattanayak
New Management Algorithms for Smart Electricity Network: Designing and Working Principles

Energy needs increase considerably with a country's economic development. The promotion of renewable energy sources has been strengthened by global awareness of fossil fuel depletion and of the climate changes induced by their use. Renewable energy sources are, however, intermittent by nature. Alternative solutions such as the integration of renewable energy systems into conventional electric grids have been recommended. The integration approach uses photovoltaic, wind, and other renewable energy sources to supply sustainable energy to built spaces during peak loads or as electric backup. In this article, we propose a new approach to manage the energy flow in a smart grid. Two algorithms were developed to manage the integration of renewable sources combined with a storage system into a conventional power grid. The first algorithm aims to smooth the consumption peak and reduce extraction from the conventional grid, while the second aims to maximize the use of renewable energy depending on the energy demand. Reliability tests of the algorithm behavior were conducted against the HOMER software, and the results show a maximum relative error of 4.78% on the management of grid extraction. These algorithms are based on scenarios and operational parameters to optimize the integration of renewable energy in the current electricity network. Their application will reduce the negative impact of fossil energy and enhance the energy transition.

Ando Ny Aina Randriantsoa, Ali Hamada Damien Fakra, Manitra Pierrot Ranjaranimaro, Mohamed Nasroudine Mohamed Rachadi, Jean Claude Gatina
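The two management algorithms are defined fully in the chapter; as a hedged sketch of the second objective only (prefer renewable generation, then storage, then the conventional grid at each time step, with surplus renewables charging the battery), a simplified dispatch step might look like this. All limits, names, and time steps are assumptions.

```python
def dispatch(demand_kw, renewable_kw, soc_kwh, capacity_kwh=50.0, max_rate_kw=10.0, dt_h=0.25):
    """One simplified dispatch step: renewables first, then battery, then grid."""
    from_renewable = min(demand_kw, renewable_kw)
    residual = demand_kw - from_renewable          # demand not covered by renewables
    surplus = renewable_kw - from_renewable        # renewable energy left over

    # Charge with any surplus, bounded by the charge rate and the remaining capacity.
    charge_kw = min(surplus, max_rate_kw, (capacity_kwh - soc_kwh) / dt_h)
    # Discharge to cover residual demand, bounded by the discharge rate and stored energy.
    discharge_kw = min(residual, max_rate_kw, soc_kwh / dt_h)
    from_grid = residual - discharge_kw            # extraction from the conventional grid

    new_soc = soc_kwh + (charge_kw - discharge_kw) * dt_h
    return {"renewable": from_renewable, "battery": discharge_kw,
            "grid": from_grid, "soc_kwh": round(new_soc, 2)}

print(dispatch(demand_kw=8.0, renewable_kw=3.0, soc_kwh=12.0))   # peak hour: grid use reduced
print(dispatch(demand_kw=2.0, renewable_kw=6.0, soc_kwh=12.0))   # off-peak: surplus charges storage
```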
Analyzing Key Barriers for Adoption of Digitalization in Indian Construction Industry: A Case Study

In India, infrastructure and construction are rapidly growing sectors, but the main issue faced by the industry is low productivity due to slow and delayed operational procedures. Because of this slowness, all three project components, i.e., cost, time, and quality, are critically compromised. Therefore, innovative plans for the construction process are required to build an excellent business model through the use of digitalization in construction. To adopt digitalization in construction as a general practice, the main barriers must first be studied. From an extensive literature review and input from experienced industry professionals, fourteen main barrier alternatives are identified, and their ordering is established using the Decision-Making Trial and Evaluation Laboratory (DEMATEL) method. The analysis shows that the alternative 'improper management approach for digitalization' has the strongest association with the remaining alternatives, whereas 'lack of IT policy and standard' has the weakest. The present case study reviews the key barriers to implementing digitalization as a practice, and this research work will be useful to construction professionals in their respective areas in identifying and assessing the constraints to implementing digitalization in construction processes.

Avirag Bajpai, Subhas Chandra Misra
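DEMATEL itself is a standard numerical procedure; as a hedged sketch (with a made-up 4×4 direct-relation matrix rather than the chapter's fourteen barriers), one normalizes the expert influence matrix, computes the total-relation matrix T = N(I − N)⁻¹, and ranks alternatives by prominence (R + C) and relation (R − C).

```python
import numpy as np

def dematel(direct):
    """Return the total-relation matrix plus prominence (R+C) and relation (R-C) per factor."""
    A = np.asarray(direct, dtype=float)
    s = max(A.sum(axis=1).max(), A.sum(axis=0).max())    # normalization constant
    N = A / s
    T = N @ np.linalg.inv(np.eye(len(A)) - N)            # total-relation matrix
    R, C = T.sum(axis=1), T.sum(axis=0)
    return T, R + C, R - C

# Hypothetical 0-4 influence scores among four barriers (not the chapter's data).
direct = [[0, 3, 2, 1],
          [1, 0, 3, 2],
          [0, 1, 0, 3],
          [2, 1, 1, 0]]
T, prominence, relation = dematel(direct)
for i, (p, r) in enumerate(zip(prominence, relation)):
    role = "cause" if r > 0 else "effect"
    print(f"barrier {i}: prominence={p:.2f}, relation={r:+.2f} ({role})")
```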
Improved Peak-to-Average Power Ratio Reduction Method for OFDM/OQAM System

A low-complexity peak-to-average power ratio (PAPR) reduction method is proposed for orthogonal frequency division multiplexing with offset quadrature amplitude modulation (OFDM/OQAM) systems. First, time-domain oversampling is performed; power normalization is then applied, followed by clipping. Finally, the clipped output signals are filtered by a low-pass filter. In addition, the effects of clipping on oversampled OFDM/OQAM signals are analyzed in terms of the signal-to-noise-and-distortion ratio (SNDR), the signal-to-distortion ratio (SDR), and the signal-to-noise ratio (SNR). Simulation results demonstrate that the proposed method can effectively reduce peak regrowth and PAPR after filtering.

V. Sandeep Kumar
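A hedged numerical sketch of the oversample-normalize-clip steps only (not the full OFDM/OQAM chain or the final low-pass filtering): compute the PAPR of an oversampled multicarrier symbol, normalize its power, and clip the envelope at a chosen ratio. The clipping ratio, QPSK stand-in for OQAM, and FFT sizes are assumptions.

```python
import numpy as np

def papr_db(x):
    """Peak-to-average power ratio of a complex baseband signal, in dB."""
    p = np.abs(x) ** 2
    return 10 * np.log10(p.max() / p.mean())

def clip(x, clip_ratio_db=4.0):
    """Envelope clipping: limit |x| to A = clip_ratio * rms while keeping the phase."""
    a = 10 ** (clip_ratio_db / 20) * np.sqrt(np.mean(np.abs(x) ** 2))
    mag = np.maximum(np.abs(x), 1e-12)
    return np.where(mag > a, a * x / mag, x)

rng = np.random.default_rng(1)
n_sub, oversample = 64, 4
symbols = rng.choice([1 + 1j, 1 - 1j, -1 + 1j, -1 - 1j], n_sub)   # QPSK stand-in
spectrum = np.zeros(n_sub * oversample, dtype=complex)
spectrum[:n_sub] = symbols                                        # zero-padding = time-domain oversampling
x = np.fft.ifft(spectrum)
x /= np.sqrt(np.mean(np.abs(x) ** 2))                             # power normalization
print(f"PAPR before clipping: {papr_db(x):.2f} dB, after: {papr_db(clip(x)):.2f} dB")
```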
Implementation of a Smart Parking System for a University Campus

In recent times, the number of vehicle users has been growing rapidly, requiring additional parking spaces. A university campus can be considered a small city and usually faces the same parking problems encountered by smart cities. Finding a parking space has always been a concern on university campuses and in smart cities alike. This paper presents a smart parking system based on the Internet of Things (IoT) to eliminate the stress, frustration, and dissatisfaction of users and to help them spot an available parking space for their vehicle on a university campus. When no parking space is available in a parking area, some users tend to roam around in search of one, thus wasting time and fuel. Hence, one possible solution is to create a mobile application that shows the real-time status of each parking space in the parking area and takes into consideration users' weekly time plans to allocate a parking space. An ultrasonic sensor and an ESP8266 DevKitC microcontroller are used to implement the system. A real-time database records the status of each parking space. The system prototype is presented in the paper.

M. Saheer Jhugroo, M. Shahil Kataully, Soulakshmee D. Nagowah
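The firmware itself runs on the ESP8266; purely as a hedged, host-side illustration of the occupancy logic (an ultrasonic distance reading below a threshold marks a bay as occupied, and the status is pushed to a real-time database), the sketch below uses assumed names, thresholds, and a hypothetical REST endpoint.

```python
import json
import urllib.request

OCCUPIED_BELOW_CM = 40          # assumed sensor mounting height / vehicle clearance
DB_URL = "https://example-parking.firebaseio.com/spaces"   # hypothetical endpoint

def classify(distance_cm, threshold=OCCUPIED_BELOW_CM):
    """A bay reads 'occupied' when the ultrasonic echo distance drops below the threshold."""
    return "occupied" if distance_cm < threshold else "free"

def push_status(space_id, distance_cm):
    """Write the bay status to the (hypothetical) real-time database over REST."""
    body = json.dumps({"status": classify(distance_cm), "distance_cm": distance_cm}).encode()
    req = urllib.request.Request(f"{DB_URL}/{space_id}.json", data=body, method="PUT",
                                 headers={"Content-Type": "application/json"})
    with urllib.request.urlopen(req) as resp:
        return resp.status

# Simulated sensor readings from two bays (no network call is made here).
print(classify(18.5), classify(160.0))   # -> occupied free
```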
High-Resolution Wind Speed Mapping for the Island of Mauritius Using Mesoscale Modelling

In the context of developing wind energy for the generation of electricity in many countries, proper knowledge and understanding of the availability of wind resources over these countries are of great importance. This paper demonstrates how the Weather Research and Forecasting (WRF) mesoscale model can be used to simulate the wind flow patterns over a complex terrain such as the island of Mauritius at a horizontal spatial grid resolution of 1 km × 1 km, in view of generating reliable wind speed data. The simulation period spans from January 2015 to December 2017. The WRF results are validated against ground truth data for various climatic conditions, such as calm days and the occurrence of two extreme events, namely a cyclone and an anticyclone. Statistical metrics such as Bias, RMSE, and Pearson correlation are used to compare the two datasets. Excellent agreement, with r-values above 0.6, is obtained for all the cases considered. Finally, high-resolution (1 km × 1 km) mean monthly, seasonal, and yearly wind maps are generated for the island of Mauritius.

Tyagaraja S. M. Cunden, Michel R. Lollchund
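The validation metrics named in the abstract are standard; as a minimal sketch, Bias, RMSE, and the Pearson r can be computed from paired model and station wind speeds as below. The sample arrays are made up, not the chapter's data.

```python
import numpy as np

def validation_metrics(model, observed):
    """Bias, RMSE and Pearson correlation between modelled and observed wind speeds (m/s)."""
    model, observed = np.asarray(model, float), np.asarray(observed, float)
    err = model - observed
    bias = err.mean()
    rmse = np.sqrt((err ** 2).mean())
    r = np.corrcoef(model, observed)[0, 1]
    return bias, rmse, r

wrf = [5.2, 6.8, 7.1, 4.3, 9.0, 10.5, 3.8]        # hypothetical 1 km grid-cell values
obs = [4.9, 6.1, 7.5, 4.0, 8.2, 11.0, 4.4]        # hypothetical station measurements
bias, rmse, r = validation_metrics(wrf, obs)
print(f"Bias={bias:+.2f} m/s  RMSE={rmse:.2f} m/s  r={r:.2f}")
```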
Analysis of Wind Energy Resources for the Island of Mauritius Using Concepts of Thermodynamics

Exergy is a term that relates to the 'quality' of the work potential of a system. It is usually split into kinetic exergy, physical exergy, chemical exergy, and potential exergy, giving a more realistic picture of the performance of the system. For wind energy systems, kinetic exergy and physical exergy are the only relevant quantities; kinetic exergy relates more to the wind turbine system, while physical exergy results from the environmental factors around the system. In this paper, a physical exergetic analysis of the wind energy resources over the island of Mauritius is conducted. The physical exergy is calculated using meteorological variables such as temperature, humidity ratio, pressure, and wind speed at a typical turbine hub height (60 m above ground level), obtained by running the Weather Research and Forecasting (WRF) model. A high-resolution physical wind exergy map is generated which can help energy planners and decision makers identify potential hotspot regions for the installation of wind turbine systems that will operate at maximum efficiency.

Tyagaraja S. M. Cunden, Naafeera B. R. Abdel Hassan, Michel R. Lollchund
Design and FPGA Implementation of an Efficient 8×8 Multiplier Using the Principle of Vedic Mathematics

This paper concerns the design and analysis of an efficient 8×8 Vedic multiplier based on the principles of Vedic mathematics [1]. A unique Vedic multiplier architecture is adopted, which is not based on the regular method of multiplication (addition and shifting). The design follows the "Urdhwa-Tiryakbhyam Sutra" of Vedic mathematics. Two 8-bit binary numbers are multiplied using this principle/sutra. "Urdhwa-Tiryakbhyam" means "vertically and crosswise", wherein the partial products are computed at once, thus lessening the delay and hence making the multiplication faster. This 8×8-bit multiplier is coded in Verilog HDL and tested on a DE10-Lite FPGA kit.

Smitha Bhat Kaje, Jagadish Nayak
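The chapter's implementation is in Verilog; as a language-neutral illustration of the Urdhwa-Tiryakbhyam ("vertically and crosswise") idea it uses, the sketch below forms every column of crosswise partial products at once and then propagates carries, for two 8-bit operands. The function name and test values are assumptions.

```python
def urdhwa_multiply(a, b, width=8):
    """Vertically-and-crosswise multiplication of two `width`-bit integers."""
    abits = [(a >> i) & 1 for i in range(width)]   # LSB first
    bbits = [(b >> i) & 1 for i in range(width)]
    # Column k collects all crosswise products a_i * b_j with i + j = k, computed "at once".
    columns = [sum(abits[i] * bbits[k - i]
                   for i in range(width) if 0 <= k - i < width)
               for k in range(2 * width - 1)]
    # Carry propagation turns the column sums into the final product bits.
    result, carry = 0, 0
    for k, col in enumerate(columns):
        total = col + carry
        result |= (total & 1) << k
        carry = total >> 1
    result |= carry << (2 * width - 1)
    return result

assert urdhwa_multiply(201, 157) == 201 * 157
print(urdhwa_multiply(201, 157))
```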
Fault Detection and Isolation in a Leaky Water Distribution Network Using Fuzzy Logic Control Based on Residual Pressure Analysis

In this research paper, we address the problem of leakage in an experimental water distribution network using the method of fault detection and isolation (FDI). The algorithm is based on the generation and analysis of residual nodal pressures for the faulty network compared with the ideal one. A fuzzy controller then determines the location of the critical leaking node, based on a Mamdani set of inference rules, for isolation using flow control valves. Results show that residual pressure analysis for FDI is very effective when the network contains two leaks, whereby isolation of the faulty pipes is successfully carried out to prevent wastage of water.

Lekhramsingh Latchoomun, Tsiatsipy Durand Brunel
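A hedged sketch of the residual-generation step only (not the chapter's Mamdani rule base): subtract measured nodal pressures from the pressures of the leak-free model, then grade each node's residual with a simple triangular membership to pick the most suspicious node. The membership breakpoints, node names, and pressure values are assumptions.

```python
def triangular(x, a, b, c):
    """Triangular membership function with feet at a and c and peak at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def leak_suspicion(ideal_bar, measured_bar):
    """Degree (0..1) to which a nodal pressure residual looks like a leak."""
    residual = ideal_bar - measured_bar              # a leak depresses the measured pressure
    return triangular(residual, 0.05, 0.4, 1.0)      # assumed breakpoints in bar

ideal = {"n1": 2.10, "n2": 1.85, "n3": 1.60, "n4": 1.40}     # leak-free model pressures
measured = {"n1": 2.08, "n2": 1.50, "n3": 1.58, "n4": 0.98}  # faulty-network readings

scores = {n: round(leak_suspicion(ideal[n], measured[n]), 3) for n in ideal}
print(scores)
print("isolate around node:", max(scores, key=scores.get))   # valves near this node are closed
```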
Robo-Friend: Can a Social Robot Empathize with Your Feelings Effectively?

Social robots are becoming more popular every day because of their resemblance to human behavior and interaction styles. They should be treated more like companions rather than just a fancy source of entertainment. Recent studies have shown great promise for robots acting as teachers, companions, caregivers, and so on. In this study, a feasibility analysis is first carried out to determine how a social robot can be used as a companion that can sense the emotions of users, empathize with their feelings, and provide feedback with a view to changing their mood. A context study was conducted to derive design implications, where user experience goals such as inspiration, sense of control, relaxation, accomplishment, and confidence were considered in the implementation. Furthermore, a prototype was designed based on a social robot, Pepper, which was then used in interactions with potential users, either in groups or individually. The results support the conclusion that a well-implemented social robot can effectively empathize with human users' emotions.

Eshtiak Ahmed, Ashraful Islam, Atiqul Islam Chowdhury, Mohammad Masudur Rahman, Shahnaj Chowdhury, Md Imran Hosen

Management and Banking Applications

Frontmatter
Using Stakeholder Expectations and Perceptions to Guide the Brand Refresh of a Tropical Airline

Literature on successful approaches for executing brand evolution exercises in airlines, along with effective livery designs, is scant. This paper aims to contribute to the existing literature around successful airline branding and livery design, by reporting on multiple stakeholder engagement in the brand evolution of a tropical airline, Air Seychelles. Research gleaning different stakeholder expectations and perceptions, at different stages, is elaborated upon. A mixed method approach was used, building on both quantitative and qualitative feedback. Initial focus groups were held to analyse the situation regarding the airline’s brand perception. Further focus groups were then executed to refine the Air Seychelles brand essence. Customer surveys were done on board Air Seychelles flights across different sectors, to situate customers’ viewpoints on the different brand touchpoints. From analysis of data, it was evident that Air Seychelles needed a ‘brand evolution’ rather than a ‘brand revolution’, thus guiding proposals for the refreshed brand for subsequent implementation.

Vimi Neeroo Lockmun-Bissessur, Swaleha Peeroo, David Savy
An Analysis of Communication Strategies of Fast Food Outlets on Social Media in Mauritius

Fast food outlets in Mauritius are leveraging social media as a marketing tool. However, studies on the adoption of social media by companies in Mauritius are scarce. This paper aims to examine the types of content strategies used by businesses in Mauritius to engage customers. Content analysis was used to analyze the posts on the Facebook pages of McDonald's, KFC and Pizza Hut. This study shows that KFC and McDonald's use engaging content, while Pizza Hut's posts are informative. By observing the interactions on the Facebook pages, we found that Pizza Hut embraces an egocentric communication strategy, while KFC and McDonald's use a conversational communication strategy. However, all three fast food outlets adopt a secretive communication strategy. This study adds to the body of knowledge by identifying the content strategies and social media communication strategies of fast food outlets on their local Facebook pages.

Swaleha Peeroo, A. Mooznah Auleear Owodally
Factors Affecting Task Allocation and Coordination in Distributed Agile Software Development

The advantages in terms of low-cost resources have been driving the software industry to shift towards global markets in recent years. Distributed agile software development (DASD) companies aim to achieve high productivity by using resources around the world. Together with its benefits, DASD presents issues and challenges that require further study and research. Adopting an appropriate task allocation process is important for the smooth running of DASD projects. Assigning tasks to remote teams requires a number of factors to be taken into consideration; therefore, there is a need to identify the factors that influence task allocation and coordination in agile distributed environments. In this study, a survey was conducted with agile professionals to assess the importance of the factors identified in the literature. A snowball sampling technique was chosen due to the small population size in Mauritius. The target population of the survey consists mainly of agile project managers, agile team leaders, agile team members, agile researchers, and other decision-makers involved in the process. The results show that some factors are valued differently from the literature.

Chitra Nundlall, Soulakshmee D. Nagowah
Optimizing Recruitment Process Within Businesses: Predicting Interview Attendance Using C4.5 Algorithm

An essential component of the job recruitment process involves conducting interviews, during which a decision is made on the appropriateness of a candidate for a particular job. An important challenge faced by interviewers during the recruitment process is determining whether or not selected candidates will attend the job interview. Candidates failing to attend interviews for various reasons leads to wasted time, and as such, human resource managers need an effective prediction of whether or not a candidate will attend an interview. In other words, if the human resource department knows whether the job seeker will attend the interview, a more efficient interview invitation strategy can be worked out. However, even though different machine learning techniques are available for prediction, limited work has been undertaken to address this recruitment-related issue. Taking cognizance of this problem, this paper investigates, analyzes, and predicts candidate attendance at interviews in order to optimize the recruitment process within businesses. For this, the C4.5 algorithm was applied to an open dataset on job interview attendance. Following the generation of a classification tree, the results showed some insightful patterns on when candidates do not attend job interviews.

Shivianee Sandhip Laldjee, Chiamaka Ann Marie Ajufo, Girish Bekaroo
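A hedged sketch of the modelling step: scikit-learn ships CART (a DecisionTreeClassifier with entropy splitting) rather than a true C4.5 implementation, so the snippet below is an approximation of the chapter's approach, and the feature names and toy records are invented, not the open dataset it uses.

```python
import pandas as pd
from sklearn.tree import DecisionTreeClassifier, export_text

# Toy stand-in for the interview-attendance dataset (features and values are assumptions).
data = pd.DataFrame({
    "expected_to_attend": [1, 1, 0, 1, 0, 1, 0, 1],
    "distance_far":       [0, 1, 1, 0, 1, 0, 1, 0],
    "permission_taken":   [1, 1, 0, 1, 0, 0, 0, 1],
    "attended":           [1, 1, 0, 1, 0, 1, 0, 1],
})
X, y = data.drop(columns="attended"), data["attended"]

# criterion="entropy" mimics C4.5's information-gain style splits (CART underneath).
tree = DecisionTreeClassifier(criterion="entropy", max_depth=3, random_state=0).fit(X, y)
print(export_text(tree, feature_names=list(X.columns)))

new_candidate = pd.DataFrame([[1, 1, 0]], columns=X.columns)
print("prediction for a new candidate:", tree.predict(new_candidate)[0])
```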
Training Engineers as Drivers of e-Learning in a University

e-learning has become unexpectedly popular in the educational sector in Mauritius. In this context, the Université des Mascareignes (UdM) has partnered with the Université de Caen to develop an e-learning strategy in which it trains academics to become training engineers with the perspective of implementing effective e-learning platforms. This paper considers the need for a structured form of virtual learning that depends on factors such as the relevance of e-learning, the importance of training engineers in e-learning, the approaches needed to develop effective e-learning, and the ways in which e-learning could be sustained. A survey was conducted among the 17 participants registered in the training engineers course. Results showed that the participants were, first of all, willing to be certified as training engineers, which would allow them to properly develop e-learning. Potential training engineers considered learning to be provocative, demanding involvement in developing dedicated platforms for effective interaction between learner and trainer. Trainer motivation and student capacity mattered most in supporting the sustainability of e-learning.

Nirmal Kumar Betchoo
Using Scratch Software as a Teaching-Learning Tool in French Language Classes: A Case Study at Université Des Mascareignes (Mauritius)

In the teaching and learning of languages, digital tools are said to allow learners to be more motivated and to produce more elaborate and better-quality work. The French language, compulsory in the Mauritian school curriculum from primary school to secondary school (Cambridge O-Level), is a subject that worries learners who have difficulties expressing themselves in writing. In this study, Scratch, free software developed at MIT, was used among students enrolled in the first year at Université des Mascareignes (Mauritius) to revise the basics of grammar in a fun way. Learners' comments at the end of the sessions were analysed to identify not only the positive aspects of this learning method, but also to review its shortcomings and the problems encountered. In other words, we pose the problem from the point of view of the learners, in order to know whether it is possible to strengthen one's knowledge and skills in the French language by moving away from the traditional method of teaching and by using Scratch software in groups.

Neelam Pirbhai-Jetha
Improving Fraud Detection Mechanism in Financial Banking Sectors Using Data Mining Techniques

The banking sector is facing increasing risks of fraud and malpractice. Thus, adopting new methods and approaches to detect, prevent, and predict fraud is essential. Data Mining (DM) techniques, when used as part of a machine learning approach, are among the latest methods for ensuring data integrity. This paper discusses how the banking sector can take advantage of one of these methods, logistic regression. It is an effective classification algorithm that gives high accuracy on binary problems and, from its outcomes, can predict irregular transactions. The paper presents a suggested model based on this algorithm as part of a framework for the development of a fraud detection mechanism. Finally, the paper tests the suggested model using the LR algorithm to demonstrate its performance.

Hanan Hamdan AL-Abri, Basant Kumar, Joseph Mani
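A hedged sketch of the logistic-regression step described above, using scikit-learn on synthetic transactions; the feature names, class balance, and synthetic ground truth are assumptions rather than the chapter's model or data.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import classification_report

rng = np.random.default_rng(7)
n = 5000
amount = rng.lognormal(mean=4.0, sigma=1.0, size=n)            # transaction amount
foreign = rng.integers(0, 2, size=n)                           # foreign-transaction flag
hour = rng.integers(0, 24, size=n)
night = (hour < 5).astype(int)                                 # late-night flag

# Synthetic ground truth: fraud is more likely for large, foreign, late-night transactions.
logit = -6.0 + 0.004 * amount + 1.5 * foreign + 1.2 * night
y = rng.random(n) < 1 / (1 + np.exp(-logit))
X = np.column_stack([amount, foreign, night])

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0, stratify=y)
model = LogisticRegression(class_weight="balanced", max_iter=1000).fit(X_tr, y_tr)
print(classification_report(y_te, model.predict(X_te), digits=3))
```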
Digital Learning for Millennials: IT in Education and/or IT for Education

Digitalisation has transformed organisations (existing or new), social systems and the global economy. Technology is one of the major drivers of the emergence of digital platforms in the education sector. Since we are saturated with technological devices, how can we neglect or refuse the role technology plays, or must play, in education? For many researchers and practitioners, the main topic of discussion remains whether to focus on the learner or the technology. In this paper, two main trending issues are addressed: the first accounts for how and why digital technology transforms the teaching–learning process; the second focuses on the evolution brought about by information technology in the higher education sector, bearing in mind that the twenty-first century is the century of 'digital natives', a term coined by Marc Prensky in [1] to refer to those who have grown up with digital technology. This theoretical/conceptual paper investigates the different tools and artefacts used in the teaching–learning environment, in order to design new research data for future use.

Neelam Pirbhai-Jetha, Pascal Boncoeur, Normada Bheekharry
Supply Chain Management—Marketing Integration a Key Element in the Digital Era

The purpose of this paper is to study the development and influence of digitalization in marketing processes and supply chain management, and how an integration of the two can be a useful tool in marketing decision making. Much research has been conducted on supply chain management and marketing; however, little attention has been paid to the combination of these two functions and the potential impact of information technology on decision making and value creation. Data generated from the Internet of Things, cloud computing, big data analytics and customer profiling have created new opportunities for businesses to supplement their business intelligence and better understand customer demand and the bargaining power of suppliers.

Normada Devi Bheekharry
Bank Customer’s Credit Score Prediction Using Feature Selection and Data Mining Algorithm

Data mining techniques for classification and prediction are used to analyze bank customers through their real data and to determine whether a customer is a performing or a non-performing asset for the bank. Every year, banks face the problem that many of their customers do not repay loan installments on time, and some eventually become defaulters. Every bank therefore needs a system that can predict whether a customer will be a profitable or an unprofitable asset in the future. A customer identified as a non-performing asset has a bad credit score; in such a case, if the customer requests a new loan, the bank can easily identify him or her as a defaulter and reject the request on the basis of our proposed models. In this way, the bank screens out its non-performing assets. Besides, the bank can also identify new customers with good credit scores and may offer them other services for the benefit of the bank. There are many such models used for prediction. This paper compares different classification and prediction algorithms to develop the model best suited to analyzing a loan-requesting customer's data set using their credit score. In particular, the paper compares the random forest algorithm and logistic regression, applying k-best feature selection to the data set, to determine which is better suited for predicting customers' credit scores. The aim of the proposed model is to reduce bankruptcy, non-performing assets, and bank losses.

Durgesh Kumar Singh, Noopur Goel
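A hedged sketch of the comparison described above (k-best feature selection, then random forest versus logistic regression), using scikit-learn on a synthetic stand-in for the bank records; all data and parameter values are assumptions.

```python
from sklearn.datasets import make_classification
from sklearn.feature_selection import SelectKBest, f_classif
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.ensemble import RandomForestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

# Synthetic stand-in for customer records (income, EMI ratio, past defaults, ...).
X, y = make_classification(n_samples=2000, n_features=20, n_informative=6,
                           weights=[0.8, 0.2], random_state=42)   # 1 = likely defaulter

models = {
    "logistic regression": make_pipeline(SelectKBest(f_classif, k=8), StandardScaler(),
                                         LogisticRegression(max_iter=1000)),
    "random forest":       make_pipeline(SelectKBest(f_classif, k=8),
                                         RandomForestClassifier(n_estimators=200, random_state=0)),
}
for name, model in models.items():
    auc = cross_val_score(model, X, y, cv=5, scoring="roc_auc").mean()
    print(f"{name:>20}: mean ROC-AUC = {auc:.3f}")
```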
The Use of the WhatsApp Platform as an Educational Tool During the Confinement Period of the Outbreak of Covid-19

The purpose of this research paper is to gather views on the use of the WhatsApp platform as an educational tool during the confinement period of Covid-19. Both quantitative and qualitative methods were used for data collection from first-year management students of the Université des Mascareignes (UdM). All the students owned smartphones, and WhatsApp was already part of the e-routines for communication for both learners and lecturer. It was also found that during the confinement period, this platform with its numerous features helped to create a positive and cordial climate among the students and the lecturer, thus creating a sense of belonging through the WhatsApp group. This platform provided the required support towards the successful completion of the set learning objectives. The survey revealed that the majority of students would like to continue using WhatsApp in their educational process in the post-Covid-19 period too.

Nundini Devi Akaloo
Backmatter
Metadata
Title
Progress in Advanced Computing and Intelligent Engineering
Editors
Dr. Chhabi Rani Panigrahi
Dr. Bibudhendu Pati
Prof. Binod Kumar Pattanayak
Prof. Seeven Amic
Dr. Kuan-Ching Li
Copyright Year
2021
Publisher
Springer Singapore
Electronic ISBN
978-981-334-299-6
Print ISBN
978-981-334-298-9
DOI
https://doi.org/10.1007/978-981-33-4299-6