
2021 | Book

Evolutionary Computing and Mobile Sustainable Networks

Proceedings of ICECMSN 2020

Editors: Prof. V. Suma, Dr. Noureddine Bouhmala, Dr. Haoxiang Wang

Publisher: Springer Singapore

Book Series: Lecture Notes on Data Engineering and Communications Technologies


About this book

This book features selected research papers presented at the International Conference on Evolutionary Computing and Mobile Sustainable Networks (ICECMSN 2020), held at the Sir M. Visvesvaraya Institute of Technology on 20–21 February 2020. Discussing advances in evolutionary computing technologies, including swarm intelligence algorithms and other evolutionary algorithm paradigms which are emerging as widely accepted descriptors for mobile sustainable networks virtualization, optimization and automation, this book is a valuable resource for researchers in the field of evolutionary computing and mobile sustainable networks.

Table of Contents

Frontmatter
Optimal Resource Sharing Amongst Device-to-Device Communication Using Particle Swarm Algorithm

Device-to-Device (D2D) communication has been described as one of the important innovations in the development of 5G networks. D2D networking offers improved network capacity and reduced power usage, making it a primary candidate for the upcoming 5G cellular networks. In this work, a single-cell scenario with one base station, multiple cellular users (CUs) and D2D pairs is considered. As sharing causes performance degradation due to interference between CUs and D2D pairs, a permutation optimization strategy based on Particle Swarm Optimization (PSO) is proposed to optimize resource sharing between cellular users and D2D pairs. This technique is found to maximize system performance through better resource sharing.

H. M. Nethravathi, S. Akhila
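A minimal sketch of the permutation-optimization idea described in the abstract above, using a random-key PSO: each particle holds continuous keys whose sort order gives an assignment of D2D pairs to cellular-user channels. The throughput table, swarm parameters and objective are illustrative assumptions, not the paper's interference model.

```python
import numpy as np

rng = np.random.default_rng(0)
n_pairs = 8                                  # D2D pairs / CU channels to pair up (assumption)
gain = rng.random((n_pairs, n_pairs))        # hypothetical throughput of pairing i with channel j

def fitness(keys):
    perm = np.argsort(keys)                  # random-key decoding: continuous keys -> permutation
    return gain[np.arange(n_pairs), perm].sum()

n_particles, n_iters, w, c1, c2 = 30, 200, 0.7, 1.5, 1.5
x = rng.random((n_particles, n_pairs))
v = np.zeros_like(x)
pbest, pbest_val = x.copy(), np.array([fitness(p) for p in x])
gbest = pbest[pbest_val.argmax()].copy()

for _ in range(n_iters):
    r1, r2 = rng.random(x.shape), rng.random(x.shape)
    v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)   # standard PSO velocity update
    x = x + v
    vals = np.array([fitness(p) for p in x])
    better = vals > pbest_val
    pbest[better], pbest_val[better] = x[better], vals[better]
    gbest = pbest[pbest_val.argmax()].copy()

print("best pairing:", np.argsort(gbest), "objective:", round(pbest_val.max(), 3))
```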
A Systemic Method of Nesting Multiple Classifiers Using Ensemble Techniques for Telecom Churn Prediction

In the contemporary world, almost every business deploys machine learning methods to make exemplary decisions. Predictive customer analytics helps derive significant insights from customer data. This trend is most distinct in the telecommunication industry, where the most challenging issue for service providers is the increasing customer churn rate. In existing works, combining various classification algorithms into hybrid algorithms and ensembles has reported better results than single classifiers, but selecting classifiers to create an effective ensemble combination is challenging and still under investigation. This work presents various types of ensembles, such as bagging, boosting, stacking and voting, combined with different base classifiers in a systematic way. Experiments were conducted with the benchmark UCI telecom customer churn dataset. It is inferred that ensemble learners outperform single classifiers due to their strong classifying ability. The designed models were validated using standard measures such as AUC, recall, F-score, TP rate, precision, FP rate and overall accuracy. Combining multiple classifiers in this way achieves the highest accuracy of 97.2% with bagged and boosted ANN.

J. Beschi Raja, G. Mervin George, V. Roopa, S. Sam Peter
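A hedged sketch of nesting base classifiers inside bagging, boosting and voting ensembles with scikit-learn, in the spirit of the abstract above; the synthetic data and the specific base-learner choices are assumptions rather than the authors' configuration.

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import BaggingClassifier, AdaBoostClassifier, VotingClassifier
from sklearn.tree import DecisionTreeClassifier
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import cross_val_score

# Synthetic stand-in for the telecom churn dataset.
X, y = make_classification(n_samples=1000, n_features=20, random_state=42)

bagged_ann = BaggingClassifier(MLPClassifier(max_iter=500), n_estimators=10)
boosted_tree = AdaBoostClassifier(DecisionTreeClassifier(max_depth=3), n_estimators=50)
voter = VotingClassifier([("bag", bagged_ann), ("boost", boosted_tree)], voting="soft")

for name, model in [("bagged ANN", bagged_ann), ("boosted tree", boosted_tree), ("voting", voter)]:
    auc = cross_val_score(model, X, y, cv=5, scoring="roc_auc").mean()
    print(f"{name}: mean AUC = {auc:.3f}")
```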
Hybrid Method in Identifying the Fraud Detection in the Credit Card

As the world moves rapidly towards digitalization and money transactions become cashless, the use of credit cards has risen sharply. The fraudulent activities associated with them have also increased, leading to huge losses for financial institutions. We therefore need to analyze and detect fraudulent transactions and separate them from non-fraudulent ones. This paper proffers a framework to detect credit card fraud using a combination of three techniques: Decision Trees, Neural Networks and K-Nearest Neighbors. For each new incoming transaction, the label is assigned by taking the majority of the labels output by these techniques. The model is expected to work fairly well for datasets of all sizes and kinds, as it combines the advantages of the individual techniques. We conclude the paper with a comparison of our model with existing ones.

Pooja Tiwari, Simran Mehta, Nishtha Sakhuja, Ishu Gupta, Ashutosh Kumar Singh
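A minimal sketch of the majority-vote idea from the abstract above: each new transaction is labelled by a Decision Tree, a Neural Network and a K-Nearest-Neighbour model, and the final label is the majority of the three. The imbalanced synthetic data and hyperparameters are illustrative assumptions, not the paper's setup.

```python
from collections import Counter
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier
from sklearn.neural_network import MLPClassifier
from sklearn.neighbors import KNeighborsClassifier

# Imbalanced stand-in for a card-transaction dataset (≈5% "fraud").
X, y = make_classification(n_samples=2000, weights=[0.95], random_state=7)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, stratify=y, random_state=7)

models = [DecisionTreeClassifier(), MLPClassifier(max_iter=500), KNeighborsClassifier()]
for m in models:
    m.fit(X_tr, y_tr)

def majority_label(x):
    # Each base model votes; the most common label wins.
    votes = [int(m.predict(x.reshape(1, -1))[0]) for m in models]
    return Counter(votes).most_common(1)[0][0]

preds = [majority_label(x) for x in X_te]
accuracy = sum(p == t for p, t in zip(preds, y_te)) / len(y_te)
print("majority-vote accuracy:", round(accuracy, 3))
```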
Real-Time Human Locator and Advance Home Security Appliances

In this modern era, human safety has become an important issue. In past years, crimes against girls and children have risen significantly. In this paper, a novel prototype Human Location Monitoring System is implemented using GPS and GSM. Through this device, a child in danger can send their location, and parents can check their children's location. The model is not restricted to humans; it can also be fitted in any vehicle, device or object that needs to be tracked. The results show that the proposed model is far better than existing solutions that use Bluetooth, RFID and Wi-Fi technology. It is a desirable device as it has worldwide range and is cost-efficient.

Anand Kesharwani, Animesh Nag, Abhishek Tiwari, Ishu Gupta, Bharti Sharma, Ashutosh Kumar Singh
A Novel Implementation of Haptic Robotic Arm

Robotics is the study of machines that are used to do different jobs; robots perform work traditionally done by humans. Haptic technology is a growing area that will be useful wherever human interaction is difficult or hazardous. The proposed system finds a range of applications in harmful environments and can be used for medical applications and other areas that are difficult for humans to reach. The method uses a master–slave concept and is demonstrated at different stages of the glove's performance with respect to the flex sensors.

A. Kavitha, P. Sangeetha, Aijaz Ali Khan, K. N. Chandana
A Survey on Partially Occluded Faces

Over the past decade, partially occluded face recognition has been a pressing challenge in computer vision due to unconstrained conditions. The main aim of a facial recognition system is to detect partially occluded regions of an individual's face and authenticate/verify that face. Existing neural networks perform well in analysing patterns for constrained faces but fail in analysing partially occluded faces, which are common in the real world. The paper discusses a trainable Deep Learning Neural Network (DLNN) for partially occluded faces that recognizes all possible faces in an image, whether resting, posing or projecting, matches them against the trained DLNN datasets and encodes the identified faces.

Shashank M. Athreya, S. P. Shreevari, B. S. Aradhya Siddesh, Sandeep Kiran, H. T. Chetana
Some Effective Techniques for Recognizing a Person Across Aging

Identifying a person across aging is a very challenging and interesting task. It has gained a lot of attention from researchers as it has a wide range of real-life applications, such as finding missing children, passport renewal, driving license renewal and finding criminals. Many researchers have proposed their own methodologies, yet there is still a gap to fill. Hence, the authors propose several techniques to improve system performance: the first uses GLBP as a novel feature, the second uses a Convolutional Neural Network (CNN), and the last modifies the previous method with biased face patches as inputs. CNN is found to be well suited to the face recognition problem across aging, as no complicated preprocessing or feature extraction steps are needed. The FGNET and MORPH II datasets are used to test system performance. All these techniques outperform available state-of-the-art methods in Rank-1 recognition rate.

Mrudula Nimbarte, Madhuri Pal, Shrikant Sonekar, Pranjali Ulhe
A Comprehensive Survey on Federated Cloud Computing and its Future Research Directions

The cloud computing paradigm is popular due to its pay-as-you-go model. As the demand for services increases, users have the great advantage of paying only for the services they currently need. In a federated cloud environment, one or more cloud service providers share their servers to serve user requests. This minimizes cost, improves utilization of services and improves performance. Clients benefit as there is a Service Level Agreement between both parties. The present paper surveys the benefits of the federated environment, its architecture, resource provisioning and future research directions. The paper also gives a comparative study of these aspects.

S. R. Shishira, A. Kandasamy
Decoy Technique for Preserving the Privacy in Fog Computing

Fog computing is a process that computes and stores data and facilitates network services between computing data centers and end devices. Fog computing is also called fogging or edge computing. It focuses on increasing efficiency and speeding up the movement of data to the cloud for storage and processing. This paper mainly focuses on securing personal information within the cloud by employing a fog computing facility and decoy techniques. In the first stage of the proposed system, both authorized and unauthorized users are referred to decoy information in fog computing. Using a user profiling technique, the authorized user is identified. An unauthorized user cannot access the original data. An authorized user proceeds to the second stage by verifying challenges; a challenge can be secured with a verification code or a private question. Once authorized users clear the security challenge, they can access the original data.

K. P. Bindu Madavi, P. Vijayakarthick
Design of Book Recommendation System Using Sentiment Analysis

In this paper, we propose a four-level process to recommend the best books to users. The levels are grouping of similar sentences by a semantic network, sentiment analysis (SA), clustering of reviewers, and the recommendation system. In the first level, grouping of similar sentences by the semantic network is done on pre-processed data using a parts-of-speech (POS) tagger over the reviewer and book datasets. In the second level, SA is done in two phases, training and testing, using a deep learning methodology, namely convolutional neural networks (CNN) with the n-gram method. The outcome of this level is given as input to the third level (clustering), which clusters the reviewers based on their age, locality and gender using the K-nearest neighbor (KNN) algorithm. In the last level, books are recommended based on the top-n interesting books using a collaborative filtering (CF) algorithm. The book recommendation system aims to achieve the best accuracy in less elapsed time.

Addanki Mounika, S. Saraswathi
Review of Python for Solar Photovoltaic Systems

In recent years, the usage of solar energy as a source to produce power has increased exponentially, as it provides a clean and efficient alternative to depleting non-renewable resources. The normal working period of a photovoltaic (PV) panel is 20 years, but due to manufacturing defects or changes in atmospheric conditions, the efficiency and lifespan of the panel decrease each year. The objective of this review article is to present and analyze the different methods that can be used to reduce the degradation rate of PV cells in an economically viable way. Open-source frameworks are important to make any solution affordable; hence we explore the usage of the Python language in developments relating to improving the performance of PV cells. Based on this review, a practically employable solution to improve working conditions for PV cells can be obtained.

R. Sivapriyan, D. Elangovan, Kavyashri S. N. Lekhana
Data Exploratory Analysis for Classification in Machine Learning Algorithms

The availability of big data has transformed the way machine learning works and the way data is used in machine learning. In real time, the data gathered from various sources might be unstructured, incomplete, unrealistic and incorrect in nature. Transforming data with these qualities and making it ready for analysis is a challenging task. As the quality of data has a direct impact on the efficiency of the trained model, data exploratory analysis (DEA) plays a major role in understanding the data and forming a quality training dataset for the machine learning algorithms. This paper emphasizes the importance of DEA in selecting significant attributes and filling in missing values to form a quality training dataset. The dataset considered for experimentation is the binary classification problem "Survival prediction of Titanic passengers". Experimental results show that training the model with the quality dataset improved accuracy compared to training the model with raw data.

Jesintha Bala Chandrasekar, Shivakumar Murugesh, Vasudeva Rao Prasadula
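A sketch of the exploratory steps the abstract above describes: fill missing values and rank attributes by their relevance to the target. Column names follow the public Titanic dataset, and the file path is a placeholder, not part of the paper.

```python
import pandas as pd

df = pd.read_csv("titanic.csv")                                   # hypothetical local copy of the dataset
df["Age"] = df["Age"].fillna(df["Age"].median())                  # numeric gap -> median
df["Embarked"] = df["Embarked"].fillna(df["Embarked"].mode()[0])  # categorical gap -> mode
df = df.drop(columns=["Cabin", "Ticket", "Name"])                 # mostly-missing / identifier columns

# Rank candidate attributes by the strength of their correlation with survival.
encoded = pd.get_dummies(df, columns=["Sex", "Embarked"], drop_first=True)
print(encoded.corr(numeric_only=True)["Survived"].abs().sort_values(ascending=False))
```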
Keystroke Dynamics for User Verification

With the evolution of the Internet, human dependency on it has increased. This has led to an increase in attacks, forgery, impersonation and so on, which require that a user and their privacy be protected. The need to protect users has therefore intensified protection, authentication and verification methods. There are many methods of authenticating a user, including traditional methods such as passwords and personal identification numbers. However, these methods have their drawbacks, and hence biometrics has replaced them in some cases, while in others it serves as an additional layer of security, thereby providing better protection. In this paper we propose one of the behavioral methods of biometric authentication, keystroke dynamics, which uses a user's typing rhythm for verification. One common example of this method is user verification using CAPTCHA, where the user is asked to type letters to be verified as genuine; the user's typing rhythm is captured, a match is generated and the user is verified. This method is commonly used in applications such as online banking and email verification. It acts as an additional layer of security for an existing system and helps protect the user's sensitive information.

Ashwini Sridhar, H. R. Mamatha
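A toy sketch of keystroke-dynamics verification in the spirit of the abstract above: a login sample's key-hold and inter-key timings are compared with a stored template of the user's typing rhythm. The features, timings and acceptance threshold are illustrative assumptions.

```python
import numpy as np

def features(press, release):
    press, release = np.asarray(press), np.asarray(release)
    hold = release - press               # dwell time of each key
    flight = press[1:] - release[:-1]    # gap between consecutive keys
    return np.concatenate([hold, flight])

def verify(sample, template, threshold=0.08):
    # Accept when the mean absolute timing deviation stays under the threshold (seconds).
    return np.mean(np.abs(features(*sample) - template)) < threshold

# Enrolment: average a few typing samples of the same passphrase (press/release timestamps).
enrol = [([0.00, 0.31, 0.58, 0.90], [0.09, 0.40, 0.66, 1.01]),
         ([0.00, 0.33, 0.60, 0.93], [0.10, 0.42, 0.69, 1.03])]
template = np.mean([features(*s) for s in enrol], axis=0)

probe = ([0.00, 0.30, 0.57, 0.91], [0.08, 0.39, 0.65, 1.00])
print("genuine user?", verify(probe, template))
```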
Activity Prediction for Elderly Using Radio-Frequency Identification Sensors

In hospitals and nursing homes, older people often fall due to weakness and disease. Standing or walking for a long time can be among the many reasons for falling. One of the better ways to prevent falls is to monitor patient movement. A new kind of batteryless, lightweight sensor is providing new opportunities for activity prediction, and the inconspicuous nature of such sensors makes them very suitable for monitoring the elderly. In our study, we analyze such sensors, known as radio-frequency identification (RFID) tags, to predict movements. We study a dataset obtained from 14 healthy older people between 66 and 86 years of age who wore RFID sensors with accelerometers over their clothes and performed a set of pre-specified activities. This study illustrates that the RFID sensor platform can be successfully used in activity recognition of healthy older people.

Prashant Giridhar Shambharkar, Sparsh Kansotia, Suraj Sharma, Mohammad Nazmud Doja
The Role of Predictive Data Analytics in Retailing

Big data analytics is a new practice in business analytics today. However, recent industry surveys find that big data analytics may fail to meet business expectations because of the absence of business context and batch-oriented architectures. In this paper, we present a goal-oriented big data analytics framework for better business decisions. It comprises a conceptual model that connects the business side with the big data side and provides context around the data; a case-based evaluation technique that helps focus on the best solutions; a procedure for how to use the proposed framework; and an associated tool, which is a real-time big data analytics platform. In this framework, problems against the business goals of the current process, and solutions for the future process, are explicitly captured in the conceptual model and validated on real big data using big queries or big data analytics. As an empirical study, a shipment decision process is used to show how the framework can support better business decisions in terms of comprehensive understanding of both business and data analytics, high priority and fast decisions.

Mohammed Juned Shaikh Shabbir, C. M. Mankar
High-Performance Digital Logic Circuit Realization Using Differential Cascode Voltage Switch Logic (DCVSL)

Most dual-rail CMOS circuits are loosely based on differential cascode voltage switch logic. DCVSL provides dual-rail logic gates with latching characteristics built into the circuits themselves. In DCVSL, output results are held until the inputs induce a change, so there is no loss of data, thereby saving energy and power. In today's digital applications, low power has become a key factor in high-speed computation. The proposed work gives an insight into the working of DCVSL and the proposed method, showing a reduced number of gates and thereby reducing area and power constraints. In this paper, detailed use of a pass-gate logic structure to replace the nMOS logic structure in the conventional DCVSL circuit is provided, along with the implementation of adders. The proposed circuit designs and results are compared and implemented using the Cadence software tool, and QCAD tools are used for quantum circuits. The study shows the optimization in power, area and speed achieved in comparison with conventional circuits.

S. S. Kavitha, Narasimha Kaulgud
Analysis and Classification of Ripped Tobacco Leaves Using Machine Learning Techniques

Tobacco is one of the major crops in the world and plays a vital role in the international market. After cultivation of tobacco plants, classification of ripped tobacco leaves is one of the challenging tasks. In India, the classification of ripped leaves is generally done by a manual process only. A machine learning-based classification model is introduced to remove human intervention from the classification of ripped tobacco leaves. The proposed model is designed around the classification of three important features: color, texture and shape. For experimental purposes, images of the leaves were captured using a mobile sensor and a dataset was created. The dataset consists of 2040 tobacco leaf images; 1378 images are used for training and 622 images for testing. The proposed model is validated with many machine learning algorithms, and the classification model achieved 94.2% validation accuracy and 86% testing accuracy. The proposed model performs significantly well on the classification of ripped tobacco leaves.

M. T. Thirthe Gowda, J. Chandrika
Next-Generation WSN for Environmental Monitoring Employing Big Data Analytics, Machine Learning and Artificial Intelligence

A worldwide network of wireless sensors is used to monitor dynamic environmental changes over time. The data provided by these sensor networks is crucial for collecting specific information, and hence data analytics is essential in such networks. For effective utilization of the gathered data, big data analytics can be one of the prominent solutions, since the data plays an important part in machine learning, allowing the WSN to adapt to dynamic changes in the environment and saving the cost and effort of redesigning the existing WSN. In this paper we present the advances of WSNs towards the next-generation wireless sensor network, employing software-defined networking (SDN), big data analytics, machine learning and artificial intelligence tools, along with their benefits and challenges. We also discuss the software-defined wireless sensor network (SDWSN), the possibility of applying artificial intelligence in it to meet the challenges of SDWSN, and its advantages. Finally, we discuss different problems associated with WSNs, specifically for environmental monitoring, their respective solutions using different machine learning paradigms, and how the adoption of big data analytics in ML and AI plays an important role in serving the improved performance requirements.

Rumana Abdul Jalil Shaikh, Harikumar Naidu, Piyush A. Kokate
Generating Automobile Images Dynamically from Text Description

Synthesis of a realistic image from a matching visual description provided in textual format is a challenge that has attracted attention in the recent artificial intelligence research community. Generating an image from text is a problem where, given a text input, an image matching the text description must be generated. A relatively new class of convolutional neural networks, referred to as generative adversarial networks (GANs), has provided compelling results in understanding textual features and generating high-resolution images. In this work, the main aim is to generate an automobile image from a given text input using generative adversarial networks and to manipulate automobile colour using a text-adaptive discriminator. This work involves creating a detailed text description of each car image to train the GAN model to produce images.

N. Sindhu, H. R. Mamatha
Body Mass Index Implications Using Data Analysis in the Soccer Sports

Soccer has been among the most popular sports in the world over the last few years, and it has become a prime target in developing countries like India and other Asian countries. As science and technology grow, sports grow with them; technology is used both to determine results and to improve overall performance. This paper presents the attributes and qualities a player needs to develop in order to play in the big leagues, such as the Premier League, La Liga, Serie A and the German leagues. Simple correlation and dependence techniques are used to obtain the proper relationships among the attributes. The paper also examines how body mass index affects the performance of soccer players with respect to their speed, acceleration, work rate, skill moves and stamina, and aims to find the relationship of these attributes with body mass index. As in international trade, football clubs can profit more if they specialize in what they have or can create a comparative advantage. In a world of scarce resources, clubs need to recognize what makes them successful and invest accordingly.

Akash Dasmondal, P. K. Nizar Banu
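A small sketch of the correlation analysis described in the abstract above: compute BMI from height and weight and correlate it with performance attributes. The tiny DataFrame is illustrative; the paper works with real player data.

```python
import pandas as pd

players = pd.DataFrame({
    "height_m":  [1.70, 1.83, 1.75, 1.88, 1.68],
    "weight_kg": [68, 80, 72, 90, 64],
    "speed":     [91, 77, 85, 62, 93],
    "stamina":   [80, 84, 78, 70, 88],
})
players["bmi"] = players["weight_kg"] / players["height_m"] ** 2   # BMI = kg / m^2

# Simple Pearson correlation of BMI with the performance attributes.
print(players.corr()["bmi"][["speed", "stamina"]])
```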
Biogeography-Based Optimization Technique for Optimal Design of IIR Low-Pass Filter and Its FPGA Implementation

A bio-inspired meta-heuristic biogeography-based optimization (BBO) algorithm, which imitates the migration and mutation processes of different species according to the habitat features, is used in this paper in order to get the optimal coefficients of an infinite impulse response (IIR) low-pass filter (LPF) of order 8. BBO mainly depends on the immigration rate (IR) and emigration rate (ER), through which the searching efficiency is enhanced. The simulation results have shown a better performance in terms of stopband attenuation, transition width, passband ripples (PBR), and stopband ripples (SBR). The optimized coefficients are utilized for the implementation of the IIR filter in the Verilog hardware description language (HDL) with the field-programmable gate array (FPGA).

K. Susmitha, V. Karthik, S. K. Saha, R. Kar
Invasive Weed Optimization-Based Optimally Designed High-Pass IIR Filter and Its FPGA Implementation

A meta-heuristic algorithm named the Invasive Weed Optimization (IWO) approach is considered in this paper for the design of an infinite impulse response (IIR) high-pass filter (HPF) of order 8. This optimization technique is inspired by nature and mainly depends on the colonizing characteristics of weeds. Unlike other optimization techniques, IWO converges quickly to the optimal solution and results in accurate solution parameters such as stopband attenuation, transition width, passband ripples (PBR), and stopband ripples (SBR). The optimally obtained coefficients are employed in the design of the IIR HPF of order 8, which is realized in the Verilog hardware description language (HDL) and thus dumped on the field-programmable gate array (FPGA).

V. Karthik, K. Susmitha, S. K. Saha, R. Kar
Identification of Online Auction Bidding Robots Using Machine Learning

The aim of this project is to identify bidding robots that bid in online auctions using machine learning. A bidding robot is an application that places bids or clicks automatically on a website. This project will help site owners prevent unfair auctions by easily flagging the robots and removing them from their sites. The major steps are feature extraction, feature selection, model implementation, and classification. Feature engineering is performed, which includes feature extraction, dropping unnecessary features, and selecting necessary features. Various machine learning classification models are applied with the new features to classify human and robot online auction bids, and the best performance achieved is an ROC score of 0.954 using Random Forest.

Pooja Maan, R. Eswari
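A sketch of the final modelling step named in the abstract above: train a Random Forest on engineered bidder features and report the ROC-AUC. The synthetic features are placeholders for the bid-behaviour features the paper extracts.

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import roc_auc_score

# Imbalanced stand-in: most bidders are human, a minority are robots.
X, y = make_classification(n_samples=3000, n_features=12, weights=[0.9], random_state=1)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, stratify=y, random_state=1)

rf = RandomForestClassifier(n_estimators=300, random_state=1).fit(X_tr, y_tr)
print("ROC-AUC:", round(roc_auc_score(y_te, rf.predict_proba(X_te)[:, 1]), 3))
```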
Machine Learning-Based Green and Energy Efficient Traffic Grooming Architecture for Next Generation Cellular Networks

In the year 2015, the United Nations adopted 17 Sustainable Development Goals (SDGs) to end poverty, save the planet and bring prosperity to all by the year 2030. Universal broadband connectivity is considered one significant contributing factor to achieving these goals. There is a close correlation between national Gross Domestic Product (GDP) and broadband availability. Broadband access has great potential in opening up work opportunities and boosting income for poverty-stricken people in remote and underdeveloped countries. It is estimated that about 1.2 billion people are still not connected to the Internet. Broadband requirements from this segment, along with rising broadband demand from urban consumers, have put pressure on the available frequency spectrum. Optical fiber communication has abundant bandwidth, but Internet Service Providers (ISPs) cannot provide optical networks in remote areas due to cost constraints, climate, weather, and high investment costs. Its wireless counterpart WiMAX has a short setup time and low deployment cost; hence universal broadband connectivity can be achieved with hybrid optical-WiMAX networks. Furthermore, the above-mentioned remote areas suffer from poor infrastructure and unreliable power supply. In this paper, we use alternative sources of energy to mitigate the problem of unreliable electricity supply, particularly in these areas. The proposed machine learning-based renewable energy prediction depends on the geographical location of the network node. The predicted renewable energy can be used as a source for serving traffic demands. Traffic aggregation methods are used to minimize network resource consumption, and the unpredictability in harnessing renewable energy is mitigated by using backup nonrenewable energy. The simulation results show that the proposed algorithm reduces nonrenewable energy consumption.

Deepa Naik, Pothumudi Sireesha, Tanmay De
Robust Image Encryption in Transform Domain Using Duo Chaotic Maps—A Secure Communication

The increase in cyberattacks has drawn great attention to information security in the digitalized environment. There is a demand for well-organised methods to transmit data securely over the Internet. In the proposed algorithm, a greyscale image is secured by an encryption technique in two stages, i.e. confusion and diffusion. Digital images can be expressed in terms of numeric values called pixels. The images are converted by an integer wavelet transform-domain technique, where they are separated into approximation and detailed sub-bands. The proposed encryption algorithm is applied to the approximation sub-bands, which contain the significant information of the digital image. 1D logistic and 2D tent maps generate chaotic random keys. Through various statistical and differential analyses, the robustness of the proposed algorithm is examined, achieving a maximum entropy of 7.997, near-zero correlation and a large keyspace of 10^80. The results confirm that the enhanced chaos-based Integer Wavelet Transform (IWT) encryption can resist cyberattacks.

S. Aashiq Banu, M. S. Sucharita, Y. Leela Soundarya, Lankipalli Nithya, R. Dhivya, Amirtharajan Rengarajan
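A hedged sketch of the diffusion idea mentioned in the abstract above: a 1-D logistic map generates a chaotic keystream that is XOR-ed with sub-band pixel values. The IWT decomposition, the 2-D tent-map confusion stage and the actual key parameters are not reproduced; the seed, control parameter and sub-band below are stand-ins.

```python
import numpy as np

def logistic_keystream(length, x0=0.3141, r=3.9999):
    # Iterate the logistic map x <- r*x*(1-x) and quantize each state to a byte.
    x, out = x0, np.empty(length, dtype=np.uint8)
    for i in range(length):
        x = r * x * (1.0 - x)
        out[i] = int(x * 256) % 256
    return out

band = np.random.default_rng(0).integers(0, 256, (64, 64), dtype=np.uint8)  # stand-in sub-band
ks = logistic_keystream(band.size).reshape(band.shape)
cipher = band ^ ks                          # diffusion by XOR with the chaotic keystream
assert np.array_equal(cipher ^ ks, band)    # the same keystream (same seed) decrypts
```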
Analysis of Attention Deficit Hyperactivity Disorder Using Various Classifiers

Attention Deficit Hyperactivity Disorder (ADHD) is a neurobehavioral childhood impairment that affects an individual from a very young age. Data mining classification techniques, which are becoming very important in every sector, play a vital role in the analysis and identification of such disorders. The objective of this paper is to analyze and evaluate ADHD by applying different classifiers such as Naïve Bayes, Bayes Net, Sequential Minimal Optimization, J48 decision tree, Random Forest, and Logistic Model Tree. The dataset employed is the first publicly available dataset, ADHD-200, and its instances are classified into low, moderate, and high ADHD. The analysis of the performance metrics and the results shows that the Random Forest classifier offers the highest accuracy on the ADHD dataset compared to the alternative classifiers. With the current need to provide proper evaluation and management of this hyperactive disorder, this research creates awareness about the influence of ADHD and can help ensure proper and timely treatment of those affected.

Hensy K. George, P. K. Nizar Banu
A Technique to Detect Wormhole Attack in Wireless Sensor Network Using Artificial Neural Network

Wormhole attack is a harmful attack that disrupts the normal functioning of the network by manipulating routing protocols and exhausting network resources. The paper presents a technique that detects the presence of wormhole attack in wireless sensor network (WSN) using artificial neural network (ANN). The proposed technique uses connectivity information between any two sensor nodes as the detection feature. The proposed technique has been implemented considering the deployment of sensor nodes in the wireless sensor network area under uniform, Poisson, Gaussian, exponential, gamma & beta probability distributions. The proposed technique does not require any additional hardware resources and gives a comparatively high percentage of detection accuracy.

Moirangthem Marjit Singh, Nishigandha Dutta, Thounaojam Rupachandra Singh, Utpal Nandi
A Survey on Methodologies and Algorithms for Mutual Authentication in IoT Devices

Authentication refers to the process of proving the identity of the IoT device or simply, it is an act of verifying the identity of the system. The scenario where two communicating parties authenticate each other at nearly the same time is called two-way authentication or mutual authentication. Mutual authentication provides trusted communication between remote IoT devices. This paper is aimed at performing a survey on various algorithms used for mutual authentication of IoT devices. The classification has been done based on the parameters used for authentication, its simplicity, optimality, and efficiency of the algorithm. An abstract analysis of the methodologies is illustrated.

Rashmi R. Sonth, Y. R. Pranamya, N. Harish Kumar, G. Deepak
Emotion Scanning of the World’s Best Colleges Using Real-Time Tweets

With the advent of technology and the popularity of social media, people have started sharing their points of view with the masses. Views of the people matter a lot to analyze the effect of dissemination of information in a large network such as Twitter. Emotion analysis of the tweets helps to determine the polarity and inclination of the vast majority toward a specific topic, issue, or entity. During elections, film promotions, brand endorsements/advertisements, and in many other areas, the applications of such research can be easily observed these days. The paper proposes performing the emotion scanning of people’s opinions on the three top colleges and universities of the world according to various surveys and indices (Harvard, MIT, and Stanford) using real-time Twitter data. Also, a comparison has been drawn between the accuracies of a few machine learning techniques used, for instance, K-nearest neighbor (KNN), support vector machines (SVM), and Naïve Bayes.

Sanjay Kumar, Yash Saini, Vishal Bachchas, Yogesh Kumar
Generating Feasible Path Between Path Testing and Data Flow Testing

Software testing plays a major role in developing error-free software. The scope of testing is to identify the errors and faults present in the software. Test data is generated at the initial stage of software testing, which is a complex task during the testing process. Several techniques are available to generate test data. This paper puts forth a method to produce test cases from the control flow graph based on a path-oriented approach. The technique determines the feasible paths among all possible paths. To do this efficiently, a genetic algorithm is applied to identify the optimal path. The results of the path-wise approach are compared with the data flow testing approach. The comparative results show that the data set produced for path testing yields more feasible paths than the data flow testing technique.

C. P. Indumathi, A. Ajina
Soft Constraints Handling for Multi-objective Optimization

Most real-world search and optimization problems naturally involve multiple objectives and several constraints. In this work, an idea for a generalized new approach for handling both hard and soft constraints in Multi-Objective Optimization Problems (MOOP) is demonstrated. The main purpose is to fully satisfy all the hard constraints and satisfy soft constraints as much as possible. A modification to the binary tournament parent selection approach is proposed. The proposed approach is integrated with the two most widely used multi-objective evolutionary algorithms, i.e., NSGA-II and SPEA2. A test is conducted on four benchmark problems and satisfactory results are achieved.

Md. Shahriar Mahbub, Fariha Tahsin Chowdhury, Anika Salsabil
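An illustrative sketch (not the authors' exact rule) of a binary-tournament parent selection that enforces hard-constraint feasibility first and then prefers the candidate with the smaller soft-constraint violation; the constraint and fitness functions below are hypothetical.

```python
import random

def tournament_pick(pop, hard_violation, soft_violation, fitness):
    a, b = random.sample(pop, 2)
    # 1. Hard constraints dominate: smaller total hard violation wins (feasible beats infeasible).
    if hard_violation(a) != hard_violation(b):
        return a if hard_violation(a) < hard_violation(b) else b
    # 2. Among equally hard-feasible candidates, prefer the smaller soft-constraint violation.
    if soft_violation(a) != soft_violation(b):
        return a if soft_violation(a) < soft_violation(b) else b
    # 3. Otherwise fall back to the usual comparison (e.g. Pareto rank / crowding in NSGA-II).
    return a if fitness(a) >= fitness(b) else b

pop = [{"x": random.random()} for _ in range(10)]
hv = lambda s: max(0.0, 0.2 - s["x"])    # hypothetical hard constraint: x >= 0.2
sv = lambda s: abs(s["x"] - 0.5)         # hypothetical soft preference: x close to 0.5
fit = lambda s: s["x"]
print(tournament_pick(pop, hv, sv, fit))
```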
Parking Management System Using Internet of Things

The rapid population growth and increasing private vehicle ownership in countries like India have largely contributed to vehicle congestion. Inappropriate vehicle parking areas are considered one of the root causes of slow-moving traffic and have drawn the attention of administration officials. The model proposed here serves as a three-stage parking system using the Internet of Things. The first stage consists of an automated gate connected to an on-screen vehicle counter. The second stage consists of sensors installed at each parking slot, connected to a cloud platform using a microprocessor board. These sensors send real-time data to the cloud by keeping track of the status of each parking slot and provide an online platform for the user to check the availability of parking slots. The final stage of the system provides an application interface that sends the user's parking details through e-mail and displays the same in the application.

Aditya Sarin, Deveshi Thanawala
Machine Learning based Restaurant Revenue Prediction

The food industry plays a crucial part in enhancing the financial progress of a country, and this is more true for metropolitan cities than for the small towns of our country. Despite the food industry's contribution to the economy, revenue prediction for restaurants has been limited. The agenda of this work is to predict the revenue of any upcoming restaurant. Three types of restaurants are considered: inline, food court, and mobile. In our proposed solution, we take into consideration the various features of the datasets for the prediction. The input features are ordered based on their impact on the target attribute, the restaurant revenue. Various other pre-processing techniques such as Principal Component Analysis (PCA), feature selection and label encoding are used; without proper analysis of the Kaggle datasets, pre-processing cannot be done. Algorithms are then evaluated on the test data after being trained on the training datasets. Random Forest (RF) was found to be the best performing model for revenue prediction when compared to the linear regression model. Model accuracy differs before and after pre-processing, and the accuracy increases after the applied pre-processing methods.

G. P. Sanjana Rao, K. Aditya Shastry, S. R. Sathyashree, Shivani Sahu
Solving Multi-objective Fixed Charged Transportation Problem Using a Modified Particle Swarm Optimization Algorithm

Particle Swarm Optimization (PSO) is a population-based algorithm established and enhanced to solve a wide variety of real-life problems. During the last decade, different aspects of PSO have been modified and many variants have been proposed. In this paper, a modified PSO is proposed to solve the multi-objective fixed charge transportation problem, wherein it optimizes the transportation cost (variable and fixed) as well as the time to deliver goods from sources to destinations while satisfying certain constraints. The method starts with the variable cost only and then, with the addition of the fixed cost, iterates toward the optimal Pareto pair. The simulation results show a significant performance gain by the proposed method and prove it to be a competent alternative to existing methods.

Gurwinder Singh, Amarinder Singh
Binomial Logistic Regression Resource Optimized Routing in MANET

MANETs consist of nodes that move continuously in random directions. In the MANET architecture, devices can move in any direction and have limited energy, computing power and memory. The nodes of the network are mobile and the topology changes rapidly. Node and link failures lead to frequent route discovery, high routing overhead, extended end-to-end delay and power consumption. In binomial logistic regression resource optimized routing in MANET (BLR-OR), a logistic regression technique is used to detect the optimal network path, which is obtained from neighbor nodes using the regression coefficient value. The regression coefficient is derived by considering the energy, latency, and transmission capacity of a mobile node; the coefficient value is used to choose mobile nodes with high energy, low latency, and good bandwidth utilization for communicating packets in the MANET, enhancing the network lifetime (N-L) and packet delivery ratio (P-D-R). The performance of the BLR-OR regression procedure is evaluated in terms of energy consumption (E-C) and end-to-end delay (E-E-D).

M. Ilango, A. V. Senthil Kumar, Amit Dutta
A Lightweight Approach for Policy-Based Messaging

The popularity of Resource-Constrained Networks (RCNs) is increasing rapidly in value, volume, velocity, variety, and veracity. Limited battery power and the verbosity of messages are the major limiting factors in the constrained wireless mobile environment. Hence, devising a standard that is adaptive to the limitations of the present-day wireless mobile environment is crucial for the emerging generations of constrained wireless mobile devices; it is also expected to accelerate the penetration of wireless mobile technologies to underprivileged classes of users. The eXtensible Markup Language (XML) is the natural standard for messaging across heterogeneous types of mobile devices, but it is not suitable for the wireless mobile environment due to its increased storage and processing requirements. This work presents a lightweight Policy-Based Messaging (PBM) mechanism for trusted transmission of information that can be used especially in Resource-Constrained Networks (RCNs). The work is based on a less verbose data format derived from YAML Ain't Markup Language (YAML), a lightweight data serialization language. Proposals include measures to define policy assertions and a two-level mechanism to ensure trusted transmission of information. The performance analysis indicates an advantage over existing methods.

P. P. Abdul Haleem
A Lightweight Effective Randomized Caesar Cipher Algorithm for Security of Data

In any kind of organization in the present scenario, raw or meaningful data needs to be shared among different personnel; therefore, the chances of fraud or treachery are higher, which creates vulnerability in the working environment. Protecting an organization's confidential and sensitive data across different levels of employees against such theft or illegal activity that violates company security policy is therefore a prerequisite. Attribute-Based Encryption (ABE) was introduced for this purpose; it plays a vital role in providing fine-grained access control for outsourced data in a data sharing system. Moreover, CP-ABE was introduced, which defines an access policy covering all the attributes within the system. In this scheme, the user's private key is associated with a group of attributes for encryption and decryption purposes, but due to its lack of efficiency and several other shortcomings it proved ineffective. Hence, to overcome the existing issues, we have implemented a new cryptographic algorithm that is competent enough to encrypt and decrypt any type of file containing data within the range of ASCII values. We name this algorithm the randomized Caesar cipher algorithm.

Vardaan Sharma, Sahil Jalwa, Abdur Rehman Siddiqi, Ishu Gupta, Ashutosh Kumar Singh
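A minimal sketch of a "randomized" Caesar cipher over the printable ASCII range, illustrating the idea named in the abstract above: each character is shifted by a value drawn from a keyed pseudo-random stream, so repeated characters no longer map to the same ciphertext value. This is an illustration of the concept, not the authors' exact algorithm, and not production-grade cryptography.

```python
import random

def rc_encrypt(text: str, key: int) -> list[int]:
    rng = random.Random(key)                 # the key seeds the per-character shift stream
    # Map each printable-ASCII char (32..126) into 0..94, shift, wrap, map back.
    return [(ord(c) - 32 + rng.randrange(95)) % 95 + 32 for c in text]

def rc_decrypt(cipher: list[int], key: int) -> str:
    rng = random.Random(key)                 # same seed reproduces the same shifts
    return "".join(chr((v - 32 - rng.randrange(95)) % 95 + 32) for v in cipher)

msg = "Attack at dawn!"
ct = rc_encrypt(msg, key=2021)
assert rc_decrypt(ct, key=2021) == msg
print(ct)
```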
RPL-Based Hybrid Hierarchical Topologies for Scalable IoT Applications

The applications built nowadays are mainly distributed, and most of them have sensors enabling this. The Internet of Things, in the same context, is solving real-world problems, although challenges such as on-ground scalability and efficiency, along with a number of other bottlenecks, remain. In our work we focus on the RPL routing protocol and its potential to scale in strained networks. Some real-world application scenarios, such as military and agriculture, are built in environments with "strained" transmission and interference ranges, which requires the nodes to be retained as part of the Destination-Oriented Directed Acyclic Graph (DODAG). A simulation study using the Contiki OS-based Cooja simulation environment on hierarchical and circular network topologies for highly scalable and strained networks shows high energy consumption and an impact on the radio duty cycle on a few selected nodes of the DODAG. Combining the features of hierarchical and circular network topologies, we propose a hybrid hierarchical topology with multiple sinks that resembles real-world applications. The testing and relative comparison of RPL's Objective Functions (OFs) considers the following parameters: power consumption, radio duty cycle, and possible topologies. The results of the simulation study of the RPL protocol show that the proposed hybrid network topology yields much more stable energy consumption and radio duty cycle while increasing the scale and strain on the network.

Animesh Giri, D. Annapurna
A Quick Survey of Security and Privacy Issues in Cloud and a Proposed Data-Centric Security Model for Data Security

The cloud is being utilized by the majority of Internet users. Businesses storing their critical information in the cloud have increased over the years due to the simple and attractive features the cloud possesses. In spite of this, users have raised concerns regarding cloud security, and hackers are using this opportunity to steal data stored in the cloud. In this regard, researchers have proposed techniques to provide security and privacy for the cloud: some focus on securing the cloud server, while others concentrate on securing the data transmitted to the cloud. In this research, a comparative analysis is done on papers published between 2013 and 2019. Based on the challenges identified in those techniques, a well-secured scheme is proposed to provide extensive security to data before it is transmitted to the cloud. The proposed technique is Data-Centric Security (DCS), and it utilizes Attribute-Based Encryption (ABE) as its framework. This scheme is at the implementation stage, and we believe it will address the security and privacy flaws observed.

Abraham Ekow Dadzie, Shri Kant
Homo Sapiens Diabetes Mellitus Detection and Classification

Diabetes mellitus can be defined as a set of deficiency disorders caused by the under-secretion of insulin; in other words, it results in very high blood sugar levels. Diabetes mellitus influences and is influenced by various factors. If it remains unidentified or untreated, it can lead to lethal disorders such as cardiovascular disease (heart attack, narrowing of arteries), nerve damage, kidney damage, skin conditions, depression, and many other complications. Statistics suggest that human beings are being affected by this disease at an alarming rate, yet it remains unidentified, and hence untreated, in most cases. Hence, machine learning is introduced in the field of biomedical sciences so that these disorders can be addressed at a larger scale without conducting pathological tests. This paper focuses on predicting, from a set of features for each person, whether that person has a tendency toward high blood sugar or diabetes mellitus. Building the classifier uses Python libraries such as NumPy, Pandas, Matplotlib, Seaborn, scikit-learn, and SciPy.

Anu Agarwal, Anjay Sahoo, Indrashis Das, Siddharth S. Rautaray, Manjusha Pandey
Learning Platform and Smart Assistant for Students

Students and knowledge have always been abstract entities. Earlier, students found it tough to find sources of knowledge; now they find it difficult to gather, curate and utilise the immense amount of knowledge available from various sources. The two scenarios are two faces of the same coin. Students face yet another problem: tracking and scheduling. More often than not, we have seen that with proper guidance and materials to study, people emerge with flying colours. If we can provide this to students using technology, we will be able to help many lead a stress-free and efficient academic life. In this paper, we detail the problems faced by students, look at some existing platforms that aim to address these problems, and propose a smart learning assistant for curating study materials and tracking the student's progress.

R. Rashmi, Sharan Rudresh, V. A. Sheetal, Dexler information Solutions Pvt Limited
Eye Disease Detection Using YOLO and Ensembled GoogleNet

Our research work integrates an ensemble into the GoogleNet image classification technique, aiming at higher accuracy and performance than existing models. The convolutional layers are apt for feature extraction from images. In the GoogleNet classifier, only one (weak) fully connected dense layer is used to determine the class from these output features, so we integrate an ensemble immediately after the convolutional layers for better classification output. Thus, the output (image features) of the convolutional layers is passed as a separate input to both the ensemble methods and the fully connected layers of GoogleNet to obtain the class of the image. The final class is determined by a specific strategy after analyzing the outputs of the ensemble and the GoogleNet fully connected layer. All earlier works focused on eye disease classification; here, we also experiment with YOLO for detecting the location and class of diseases. The eye is considered the most significant part of the body, yet it is easily subjected to various kinds of diseases, so early detection and prevention are needed. Our research work aims at detecting the top five common eye diseases with higher accuracy. Users can upload a picture in a mobile or cloud application, and the built-in AI algorithms detect the type of eye disease with high accuracy, offering prevention suggestions at an early stage without doctor intervention.

Saikiran Gogineni, Anjusha Pimpalshende, Suryanarayana Goddumarri
Comparative Analysis of MCT Load Balancing Approach in Cloud Computing Environment

The efficient working of cloud computing is based on a most important aspect, load balancing. This paper gives a comparison and analysis of load balancing algorithms in a cloud computing environment. Task and resource allocation is most important for the efficient working of cloud computing; load balancing refers to task and resource allocation in an efficient manner, which [5] is an NP-hard optimization problem. Distributing load efficiently in the cloud computing environment and handling all the problems that arise during load balancing is most important for load balancing algorithms. Researchers have studied various algorithms to overcome the load balancing problems arising during the task and resource allocation phase. This paper focuses on discussing the classification of load balancing and its algorithms, lays emphasis on heuristic algorithms, and measures performance using the CloudSim simulator.

Shabina Ghafir, M. Afshar Alam, Ranjit Biswas
A Comparative Study of Text Classification and Missing Word Prediction Using BERT and ULMFiT

We perform a comparative study on two types of emerging NLP models, ULMFiT and BERT. To gain insights into the suitability of these models for industry-relevant tasks, we use text classification and missing word prediction, and we emphasize how these two tasks can cover most of the prime industry use cases. We systematically evaluate the performance of the above two models using selective metrics and train them with various configurations and inputs. This paper is intended to assist industry researchers with the pros and cons of fine-tuning industry data with these two pre-trained language models for obtaining the best possible state-of-the-art results.

Praveenkumar Katwe, Aditya Khamparia, Kali Prasad Vittala, Ojas Srivastava
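A hedged sketch of the "missing word prediction" task mentioned in the abstract above, exercising a pre-trained BERT masked-language model through the Hugging Face transformers pipeline (requires the transformers package and a model download; ULMFiT would instead be exercised through fastai). The example sentence is an assumption, not from the paper.

```python
from transformers import pipeline

# Fill-mask pipeline with a standard pre-trained BERT checkpoint.
fill = pipeline("fill-mask", model="bert-base-uncased")

for candidate in fill("The service was quick and the food was [MASK]."):
    print(f'{candidate["token_str"]:>12}  score={candidate["score"]:.3f}')
```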
Data Formats and Its Research Challenges in IoT: A Survey

In the current context, the increase in data generated by heterogeneous IoT sensors creates several challenges for developers of IoT applications. One of the significant problems is handling nonstandardized zettabytes of heterogeneous IoT data. Further, there is a lack of understanding of the data generated by the sensors and incompatibility in representing the data. The objective of this survey paper is to introduce IoT sensors with their areas of application, the different data formats, representing the data in an interchangeable format, and handling data as a stream. Further, this paper also presents challenges and research openings in IoT data.

Sandeep Mahanthappa, B. R. Chandavarkar
Software Fault Prediction Using Cross-Validation

Software faults are dangerous. Software systems are often essential to a business operation or organization, and failures in such systems cause disruption of some goal-directed activity (mission critical). Faults in safety-critical systems may result in death, loss of property, or environmental harm. Run-time faults are the most damaging, as they are not always detectable during the testing process. Detecting faults before they occur gives designers a brief inner view of the possible failures and their frequency of appearance. This helps in focused testing and saves time during software development. Prediction models have the ability to differentiate between various patterns. This article showcases the effectiveness of cross-validation in the design and development of a neural network for software fault detection.

Yeresime Suresh
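A sketch of k-fold cross-validation applied to a software-fault classifier, as described in the abstract above. An MLP stands in for the paper's neural network, and software metrics (LOC, complexity, and so on) are simulated, since the actual fault dataset is not included.

```python
from sklearn.datasets import make_classification
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import StratifiedKFold, cross_val_score

# Synthetic, imbalanced stand-in: most modules are fault-free.
X, y = make_classification(n_samples=500, n_features=10, weights=[0.85], random_state=3)

cv = StratifiedKFold(n_splits=10, shuffle=True, random_state=3)   # 10-fold stratified CV
scores = cross_val_score(MLPClassifier(max_iter=1000), X, y, cv=cv, scoring="f1")
print("per-fold F1:", scores.round(3), "mean:", scores.mean().round(3))
```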
Implementation of Recommendation System and Technology for Villages Using Machine Learning and IoT

With the advancement of smart city technologies, there is a need to develop the rural parts of the country. This paper aims at a smart village equipped with an integrated online recommendation system, security and automation. There is a need for a system that provides smart security for homes/go-downs with smart face recognition for intruders and a fire sensor to detect fire and smoke; in case of an emergency, water is sprinkled. The villages in our country need a modern garbage system to segregate dry and wet waste, as well as a device that checks the moisture content of the soil and sprinkles water when required. Additionally, the proposed system contains a heartbeat sensor and a pulse sensor for quick checkups of the local people, thus implementing precision agriculture and health monitoring facilities for the smart village. The whole system works on renewable energy, namely sunlight, and the owner is alerted through emails and messages in case of any emergency through the Twilio app and ThingSpeak. E-learning is the new medium for learning and has grown in the last decade. The Internet has evolved and developed rapidly in an exponential manner; this new era of the Internet has given rise to many applications and has proven its potential to impact billions all over the world. One such domain is the use of the Internet to create an impact in education. With the introduction of massive open online courses, education has been made widely accessible to everyone. Here we discuss the recommendation system of the E-learning platform and the major problem faced by E-learning, i.e. dropouts. The main intention of this paper regarding the recommendation system is to develop an integrated application that combines the effects of the recommendation system and dropout prediction. Using this smart education system, poor kids can get online support where they lack proper guidance, and teachers can use this portal. We have done our best to implement the application by researching various papers.

B. Achyuth, S. Manasa
IoT Based Inventory Management System with Recipe Recommendation Using Collaborative Filtering

The internet is a huge pool of data. It consists of various websites that provide numerous recipes and it becomes difficult for a person to manually search for a recipe based on the available ingredients daily. Inventory management is a difficult task because it is not feasible to keep track of every food ingredient. The objective of this paper is to provide an effective solution to manage inventory of the user using the Internet of Things (IoT) as well as recommending a recipe to the user using the available inventory. A recipe scoring algorithm that follows a collaborative filtering approach is proposed to score the recipes and the recipes that yield a higher score will be recommended to the user. This reduces the time and efforts of the user which makes the system more efficient and user-centric.

Atharva S. Devasthali, Adinath J. Chaudhari, Someshkumar S. Bhutada, Snehal R. Doshi, Vaishali P. Suryawanshi
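A minimal sketch of how a recipe-scoring step blending ingredient availability with collaborative (neighbour-rating) evidence might look; the pantry contents, recipes, ratings and weights below are illustrative assumptions, not data from the paper:

```python
# Hypothetical pantry inventory reported by IoT sensors and a tiny recipe base.
pantry = {"tomato": 4, "onion": 2, "rice": 1}
recipes = {
    "tomato_rice": {"tomato": 2, "onion": 1, "rice": 1},
    "onion_soup": {"onion": 3, "butter": 1},
}
# Ratings from users with similar taste (the collaborative component), assumed values.
neighbour_ratings = {"tomato_rice": [5, 4, 4], "onion_soup": [3, 4]}

def availability(ingredients):
    """Fraction of required ingredients currently available in sufficient quantity."""
    have = sum(1 for ing, qty in ingredients.items() if pantry.get(ing, 0) >= qty)
    return have / len(ingredients)

def score(name, w_cf=0.6, w_inv=0.4):
    """Blend normalized neighbour ratings with ingredient availability."""
    ratings = neighbour_ratings.get(name, [])
    cf = (sum(ratings) / len(ratings) / 5.0) if ratings else 0.0
    return w_cf * cf + w_inv * availability(recipes[name])

print(sorted(recipes, key=score, reverse=True))  # highest-scoring recipe first
```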
Survey Paper on Smart Veggie Billing System

The purpose of this paper is to automate the billing process for vegetables. A Raspberry Pi is the heart of this project and monitors all the components in the system. Initially, we capture images of different vegetables and train the AI model with these images, which are then used to recognize the vegetables picked by the customer. The camera captures an image of the selected item, and an AI model based on a convolutional neural network recognizes it using an image recognition technique. Image recognition is not the only way to identify a vegetable; it can also be recognized by measuring its resistance, but the resistance method is not suitable because it may damage the vegetables, so we chose the image recognition method. The weight of the item is also measured using a sensor, which removes human intervention in weighing the item. The recognized item and its weight are displayed on the screen. Customers can add or delete items using the user interface. Finally, a bill is created based on the customer's selected items.

T. V. Niteesh, B. Y. Lohith, Y. M. Gopalakrishna, R. Ashok Kumar, J. Nagaraj
An Optimized Approach for Virtual Machine Live Migration in Cloud Computing Environment

Efficient management of the cloud requires virtualization in the emergent stage of cloud computing, where resources such as memory, servers, and virtual machines are shared across the World Wide Web. To avail facilities such as load balancing, auto-scaling, and fault tolerance, live migration of virtual machines is required in cloud computing. Migrating virtual machines from one node to another without suspending the VMs is an important feature of cloud computing, so that users do not have to deal with any kind of service downtime. In this research paper, a comparative study has been done of various methods of migrating data from one node server to another. It then identifies optimizations in live virtual machine migration techniques. An optimized approach has thus been identified that will be beneficial in the live migration of virtual machines without affecting the node servers in a cloud computing environment.

Ambika Gupta, Priti Dimri, R. M. Bhatt
Digital Image Retrieval Based on Selective Conceptual Based Features for Important Documents

Due to the increased level of digitalization, data exchange and collection have grown to a great extent. This bulk data needs categorization, else the data collected will be meaningless. The exchange of multimedia data has also increased due to the availability of the Internet. Images collected need a proper categorization that can help in fetching the required data. Government documents need to be uniquely identified from a large set of data to help in different kinds of proceedings, so that only images of the required government document are fetched. Due to the increased size and quantity of data, existing document image identification techniques are time-consuming. The proposed model quickly identifies specific government documents from a chunk of images with ease. Various filtering categories are applied to ease the process of categorization. To speed up categorization, selective efficient features are shortlisted which contribute most toward substantiating a government document.

Premanand Ghadekar, Sushmita Kaneri, Adarsh Undre, Atul Jagtap
A Novel Repair and Maintenance Mechanism for ‘Integrated Circuits’ of Ubiquitous IoT Devices by Performing Virtual IC Inspection Based on ‘Light Field Technology’

The proposed research discloses a novel mechanism for repairing and maintaining the electronic circuits during an event of failure without actually dismantling or opening-up the device circuit boards. The mechanism uses light field technology, wherein a 3D interactive view of the short circuit component is projected so that the technician can directly interact with the 3D display and in turn fix the wiring in the actual device circuit without physically interacting with the circuit. As the technician interacts with the 3D display, the light source within the internal circuitry generates multiple light beams that fluctuate with varying intensities and apply an optical force on the damaged circuit components so that the circuit components align and re-arrange themselves. We have used Spatial Light Modulator (SLM) for this purpose since the SLM optimizes and focuses light on the target (damaged) circuit component and can drive the component mechanically with greater magnitude. The optimization of SLM is carried out by using the Genetic Algorithm (GA). The simulation results for the optimized SLM based on GA are presented in the research proposal.

Vijay A. Kanade
Evolutionary Optimization of Spatial Light Modulator for Advanced Wavefront Control in an Optically Addressable ‘Electric See-Through Skin’

The research proposal discloses an electronic see-through skin that can be worn or wrapped around any object (i.e., human body, hands, legs, brain, etc.) and would allow one to see through the object irrespective of its physicality. The see-through skin modulates the light rays passing through the scattering opaque medium by utilizing the "Spatial Light Modulator (SLM)." The proposal elaborates on the optimization of the SLM for wavefront control of the light rays as they pass through the optically adaptive electric see-through skin. The optimized SLM helps to see through any opaque object by resolving the shape of an object that is hidden behind the opaque object and is not in direct view. The research proposal presents the simulation results of the evolutionary algorithm employed for optimizing the SLM.

Vijay A. Kanade
Retrieval of Videos of Flowers Using Deep Features

This paper presents an algorithmic model for the retrieval of natural flower videos using query by frame mechanism. To overcome the drawback of traditional algorithms, we propose an automated system using a deep convolutional neural network as a feature extractor for the retrieval of videos of flowers. Initially, each flower video is represented by a set of keyframes, then features are extracted from keyframes. For a given query frame, the system extracts deep features and retrieves similar videos from the database using k-nearest neighbor and multiclass support vector machine classifiers. Experiments have been conducted on our own dataset consisting of 1919 videos of flowers belonging to 20 different species of flowers. It can be observed that the proposed system outperforms the traditional flower video retrieval system.

V. K. Jyothi, D. S. Guru, N. Vinay Kumar, V. N. Manjunath Aradhya
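A minimal sketch of the retrieval step described above, assuming deep features have already been extracted from each video's keyframes by a CNN; random vectors stand in for those features, and the feature size, similarity metric and neighbour count are assumptions:

```python
import numpy as np
from sklearn.neighbors import NearestNeighbors

rng = np.random.default_rng(0)
# One pooled deep-feature vector per video; real vectors would come from a CNN extractor.
db_features = rng.normal(size=(1919, 512))
video_ids = np.arange(1919)

index = NearestNeighbors(n_neighbors=5, metric="cosine").fit(db_features)
query = rng.normal(size=(1, 512))        # deep features of the query frame
_, idx = index.kneighbors(query)
print("retrieved videos:", video_ids[idx[0]])
```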
Analysis of Students Performance Using Learning Analytics—A Case Study

Utilization of digital tools in our daily learning activities has increased enormously, generating data on a huge scale. This huge amount of data provides exciting challenges to researchers. Learning analytics effectively facilitates the evolution of pedagogies and instructional designs: it improves and monitors students' learning, predicts students' performance, detects unusual learning behaviors and emotional states, identifies students who are at risk, and provides guidance to students. Data mining is considered a powerful tool in the education sector to enhance the understanding of the learning process. This study uses predictive analytics, which helps teachers identify students at risk and monitor students' progress over time, thereby providing the necessary support and intervention to the students who need it.

Manjula Sanjay Koti, Samyukta D. Kumta
A Case Study on Distributed Consensus Problem on Cloud-Based Systems

Fault tolerance is a vital element in cloud computing to achieve high performance. In cloud computing fault tolerance, particularly in cluster management, network discovery and consistent master-node replication, consensus and coordination play a major role. This paper provides a comprehensive overview of fault tolerance, i.e., the problem of consensus in cloud computing, highlighting the important concepts along with an explanation of the Byzantine Agreement problem and consensus problems in multi-agent systems. There are multiple algorithms/protocols such as Raft and Paxos available to approach this problem. We present a generalized consensus implementation by solving consensus for dual failure nodes. We also describe Apache ZooKeeper as our coordination service to obtain consensus in a distributed system.

Ganeshayya Shidaganti, Ritu Pravakar, M. Shirisha, H. R. Samyuktha
An IoT Framework for Healthcare Monitoring and Machine Learning for Life Expectancy Prediction

The beginning of the IoT era, the shrinking of devices and the concept of intelligent, independently learning machines have led to improvements in the quality of human life. The application of machine learning to IoT data has led to the automation of the creation of analytical models. One key area that has seen such a revolution is the health care sector. This work aims to design a wireless healthcare system that detects patients' vitals using sensors, transfers the data to the cloud, and predicts approximate life expectancy using machine learning techniques. The notion of the Internet of Things (IoT) interconnects devices and offers an effective health care service to patients. Here the IoT architecture gathers the sensor data and transfers it to the cloud, where processing and analysis take place. Based on the analyzed data, feedback is sent back to the doctor, and using the present pulse rate of the patient, a nominal or approximate value of life expectancy is predicted using machine learning algorithms.

Anna Merine George, Anudeep Nagaraja, L. Ananth Naik, J. Naresh
A Study on Discernment of Fake News Using Machine Learning Algorithms

Due to recent events in world politics, fake news, or malevolently fabricated media, has taken a major role in world politics, distorting the opinion of the people. Fake news has a great impact on our modern world, as it creates a sense of distrust among people. Various sectors like security, education and social media are researching intensely in order to find improved methods to label and recognize fake news and protect the public from disingenuous information. In the following paper, we have conducted a survey on existing machine learning algorithms which are deployed to detect fake news. The three algorithms used are Naïve Bayes, Neural Network and Support Vector Machine (SVM). Normalization is used to cleanse the data before applying the algorithms.

Utkarsh, Sujit, Syed Nabeel Azeez, B. C. Darshan, H. A. Chaya Kumari
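A minimal sketch of one of the surveyed pipelines (Naïve Bayes and SVM over TF-IDF-normalized text); the example headlines and their labels are invented for illustration only and are not from the study:

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.naive_bayes import MultinomialNB
from sklearn.pipeline import make_pipeline
from sklearn.svm import LinearSVC

# Tiny invented corpus; a real study would use a labelled news dataset.
texts = ["celebrity endorses miracle cure", "parliament passes budget bill",
         "aliens built the pyramids, experts say", "central bank raises interest rates"]
labels = [1, 0, 1, 0]   # 1 = fake, 0 = genuine (assumed labels)

for model in (MultinomialNB(), LinearSVC()):
    pipe = make_pipeline(TfidfVectorizer(), model)   # TF-IDF acts as the normalization step
    pipe.fit(texts, labels)
    print(type(model).__name__, pipe.predict(["experts say miracle budget"]))
```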
Detection of Diseased Plants by Using Convolutional Neural Network

Agriculture accounts for a major share of a country's economic growth, and crop production plays an essential role in agriculture. A country's economic growth rate is reduced by low crop production. Food is essential for every living being, since proper food is needed for survival. Hence, it is essential for every farmer to cultivate healthy plants to increase crop production. In nature, every plant can be attacked by some sort of disease, but the level of damage to the crop differs from plant to plant: a simple disease will not seriously affect a fully matured plant, whereas the same disease can cause severe damage to a young plant. We cannot manually monitor the plants and detect diseases every day, and huge manpower and time would be needed to monitor every crop in the field. In this paper, image recognition using a convolutional neural network (CNN) has been proposed to reduce the time complexity and manpower requirement. The proposed algorithm accurately detects the types of disease that occur in plants.

M. Maheswari, P. Daniel, R. Srinivash, N. Radha
Emoticon: Toward the Simulation of Emotion Using Android Music Application

Music unquestionably affects our emotions. We tend to listen to music that reflects our mood, and music can drastically affect our current emotional state. Earlier, users would manually browse songs through a playlist. Over time, recommendation systems have used collaborative and content-based filtering for creating playlists, but not the current emotional state of the user. This paper proposes an Android music player application which recommends songs after determining the user's emotion at that particular moment by facial recognition using deep learning techniques. It creates a playlist by considering the user's emotion and recommends songs according to the user's current emotion.

Aditya Sahu, Anuj Kumar, Akash Parekh
Multi-document Text Summarization Tool

In today’s world, there is a massive amount of data being continuously generated every minute. This data can be utilised to gain a large amount of information that can have numerous uses. However, it is difficult to obtain this information because of the speed and volume of data being generated. One of the tools that can be useful in extracting useful information from textual data is a text summarization and analysis tool. Many text summarization tools are being developed but largely focus on summarising a single document effectively. This project aims to create a text summarization tool using abstractive and extractive text summarization techniques that can extract the relevant and important information from multiple documents and present it as a concise summary. The tool also performs multiple analyses on the data to obtain more useful information and make inferences based on the contents of the input textual data. This tool has various use cases as it can greatly reduce the time spent in gathering information from a large number of different documents such as surveys and feedback forms from various sources by providing an effective summary and analysis of the relevant data in these text documents.

Richeeka Bathija, Pranav Agarwal, Rakshith Somanna, G. B. Pallavi
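A minimal extractive-summarization sketch in the spirit of the tool described above: sentences from several documents are scored against the TF-IDF centroid and the top ones form the summary. The two toy documents and the summary length are assumptions; the paper's abstractive component is not reproduced here:

```python
import numpy as np
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

documents = [
    "The survey shows users want faster support. Response times doubled last year.",
    "Feedback forms highlight slow support. Many users also praised the new interface.",
]
sentences = [s.strip() for d in documents for s in d.split(".") if s.strip()]

tfidf = TfidfVectorizer().fit_transform(sentences)
centroid = np.asarray(tfidf.mean(axis=0))            # centroid of all sentence vectors
scores = cosine_similarity(tfidf, centroid).ravel()
summary = [sentences[i] for i in scores.argsort()[::-1][:2]]   # top-2 sentences
print(". ".join(summary) + ".")
```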
Cognitive Computing Technologies, Products, and Applications

Cognitive computing has made industries and business organizations operate in a different paradigm with respect to the use of technology, right from carrying out business operations to high-level decision-making strategy. The ability of human experts in any field to think and make the right decisions varies from person to person, which creates the demand for highly skilled people in an industry; but it becomes difficult for any human to obtain useful insights for carrying out business operations and taking the right decisions from the huge amount of data that is generated every day. Different technologies and platforms are necessary to process almost petabytes of data and make proper use of it to obtain patterns and insights.

N. Divyashree, Prasad K. S. Nandini
Breast Cancer Prognosis Using Machine Learning Techniques and Genetic Algorithm: Experiment on Six Different Datasets

The strategy used in this research is to select the best features using a genetic algorithm from various breast cancer datasets to get better prediction results with machine learning algorithms. This research involves two main phases: feature selection using a Genetic Algorithm (GA), and breast cancer prediction using Logistic Regression (LR) and k-Nearest Neighbor (k-NN) techniques.

S. Jijitha, Thangavel Amudha
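A compact sketch of the two-phase idea above: genetic feature selection (bit-mask chromosomes, one-point crossover, bit-flip mutation) with cross-validated logistic regression as the fitness function. The sklearn Wisconsin dataset merely stands in for the six datasets used in the paper, and the population size, generations and mutation rate are assumptions:

```python
import numpy as np
from sklearn.datasets import load_breast_cancer
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

X, y = load_breast_cancer(return_X_y=True)
rng = np.random.default_rng(0)
n_feat, pop_size, generations = X.shape[1], 16, 8

def fitness(mask):
    """Cross-validated accuracy of logistic regression on the selected features."""
    if not mask.any():
        return 0.0
    return cross_val_score(LogisticRegression(max_iter=5000), X[:, mask], y, cv=3).mean()

pop = rng.integers(0, 2, size=(pop_size, n_feat)).astype(bool)
for _ in range(generations):
    scores = np.array([fitness(ind) for ind in pop])
    parents = pop[scores.argsort()[::-1][: pop_size // 2]]        # keep the fitter half
    children = []
    for _ in range(pop_size - len(parents)):
        a, b = parents[rng.integers(len(parents), size=2)]
        cut = rng.integers(1, n_feat)
        child = np.concatenate([a[:cut], b[cut:]])                 # one-point crossover
        children.append(child ^ (rng.random(n_feat) < 0.02))       # bit-flip mutation
    pop = np.vstack([parents, children])

best = pop[np.argmax([fitness(ind) for ind in pop])]
print(f"selected {best.sum()} of {n_feat} features")
```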
Detection Classification and Cutting of Fruits and Vegetables Using Tensorflow Algorithm

This paper presents an automatic fruit and vegetable recognition and chopping mechanism. This work aims to reduce the time taken for cutting fruits and vegetables in a large kitchen. The mechanism has two parts. The first part of the system is based on image processing, which consists of capturing an image of the object, comparing it with the images stored in the database, and identifying the object. For this, the TensorFlow algorithm is used and the accuracy obtained is more than 90%. The second part of the project is the segregation of the fruits/vegetables and cutting them depending on the requirement.

Rupal Mayo Diline D’Souza, S. R. Deepthi, K. Aarya Shri
Detection of Counterfeit Cosmetic Products Using Image Processing and Text Extraction

The growing popularity of the cosmetics industry has made it vulnerable to duplication and counterfeiting. One in every five individuals purchases counterfeit cosmetics. Consumption of such cosmetics can be harmful to health; they might contain chemicals such as arsenic or other carcinogenic substances. The packaging of counterfeit products is designed in such a manner that one can barely differentiate it from the original. The packaging of duplicate products differs from the original in certain features such as dimension, font style, font size, color, and ingredients. This paper reports the development of a system that applies image processing and text extraction techniques to enable a user to determine the authenticity of a test product. The system uses the features of the authentic product and compares them with the features of the test product to determine its authenticity. The system has been tested with some counterfeit and original cosmetic products.

Siddharth Mehta, Prajakta Divekar, Aditi Kolambekar, Amol Deshpande
Multiclass Weighted Associative Classifier with Application-Based Rule Selection for Data Gathered Using Wireless Sensor Networks

The twenty-first century has seen an explosion in the amount of data that is available to decision-makers. Wireless Sensor Networks are everywhere continuously collecting and feeding data to various systems. Translating these enormous amounts of data into information and making decisions quickly and effectively is a constant challenge that decision-makers face. Decision Support Systems that analyze data using different approaches are constantly evolving. There are many techniques that analyze these large datasets collected from Wireless Sensor Networks. Associative Classifier is one such technique which combines Associative Rule Mining along with Classification to find different patterns generated from large datasets. A common challenge associated with sensor data is that it is unstructured and heterogeneous. In this research, a Multiclass Weighted Associative Classifier with Application-based Rule Selection is proposed which translates unstructured and heterogeneous data into a set of rules that enable decision-makers in any domain to make decisions.

Disha J. Shah, Neetu Agarwal
Void-Aware Routing Protocols for Underwater Communication Networks: A Survey

Underwater Acoustic Sensor Networks (UASNs) are a technology used in several marine applications like environment prediction, defense applications, and discovering mineral resources. UASNs have several challenges like a high bit error rate, high latency, low bandwidth, and the void-node problem during routing. In the context of routing protocols for underwater communication networks, the void-node problem is one of the major challenging issues. The void-node problem arises in underwater communication during greedy forwarding, due to which a packet will not be forwarded further toward the sink. In this review paper, we analyze the void-node problem in underwater networks and the issues related to the void node. We also elaborate on the significant classification of void-handling routing protocols, analyzing both location-based and pressure-based void-handling routing protocols.

Pradeep Nazareth, B. R. Chandavarkar
A Comprehensive Survey on Content Analysis and Its Challenges

In recent years the explosive development and growth of video technology and multimedia have created a new challenge in the field of computer vision. The tremendous increase in multimedia exchange like video, audio, image, etc., through the Internet and other social media has led the way to content analysis. Content analysis is a technique to interpret textual data, multimedia data, and communication artifacts. In this paper we provide an overview of content analysis and a deeper interpretation of video content analysis in different areas. Specifically, we focus more on affective content analysis, its methodology, trends, and challenges.

Ankitha A. Nayak, L. Dharmanna
Applications of Blockchain and Smart Contract for Sustainable Tourism Ecosystems

Blockchain is one of the most promising technologies for innovating business ecosystems in tourism. Blockchain enables secure, reliable, and efficient decentralized management systems without the trusted third parties that are a core part of centralized management systems. Many tourism business activities are conducted in globalized and decentralized environments. Thus, it is necessary to research applications of blockchain technology in the tourism industry for building sustainable tourism business ecosystems. The purpose of the present paper is to examine application cases of blockchain and smart contracts in the tourism industry and to identify the opportunity to innovate existing tourism business ecosystems. Stakeholders of tourism business ecosystems can use blockchain technology to innovate ecosystems conducive to their business. Application cases of blockchain in tourism such as TripEcosys and TravelChain can be helpful to tourism business managers who seek new opportunities for innovative businesses.

Jaehun Joo, Joungkoo Park, Yuming Han
Friendship and Location-Based Routing in Delay Tolerant Networks

A network with delay-tolerant capabilities enables communication in environments where only intermittent connectivity or peer-to-peer connections are available. Routing for these networks is a challenging problem; therefore, the concept of Social Delay Tolerant Networks has been proposed by researchers. Friendship is a social property of DTNs, where the friendship between nodes is used for transmitting messages between them. The problem with friendship-based routing is that a node without any friends cannot send data to the destination. Therefore, we propose a novel routing algorithm based on friendship and location to overcome this problem, adding location-based similarity, another social property of DTNs, to the friendship-based routing algorithm. As a result, delivery probability is increased and the average latency incurred in finding a friend is decreased.

Nidhi Sonkar, Sudhakar Pandey, Sanjay Kumar
Fake Account Detection Using Machine Learning

Nowadays the usage of digital technology has been increasing exponentially, and at the same time the rate of malicious users has been increasing. Online social sites like Facebook and Twitter attract millions of people globally. This interest in online networking has opened the door to various issues, including the risk of spreading false data through fake accounts, resulting in malicious content. Fake accounts are a popular way to forward spam, and to commit fraud and abuse through an online social network. These problems need to be tackled in order to give the user a reliable online social network. In this paper, we use different ML algorithms: Support Vector Machine (SVM), Logistic Regression (LR), Random Forest (RF) and K-Nearest Neighbours (KNN). Along with these algorithms we have used two different normalization techniques, Z-Score and Min-Max, to improve accuracy. We have implemented this to detect fake Twitter accounts and bots. Our approach achieved high accuracy and a high true positive rate for Random Forest and KNN.

Priyanka Kondeti, Lakshmi Pranathi Yerramreddy, Anita Pradhan, Gandharba Swain
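A minimal sketch of the comparison described above: Z-score and Min-Max normalization paired with Random Forest and KNN; the synthetic profile features merely stand in for real Twitter account attributes:

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import MinMaxScaler, StandardScaler

# Stand-in for account features (followers, friends, statuses, account age, ...).
X, y = make_classification(n_samples=1000, n_features=8, random_state=1)

for scaler in (StandardScaler(), MinMaxScaler()):          # Z-score vs Min-Max
    for model in (RandomForestClassifier(random_state=1), KNeighborsClassifier()):
        acc = cross_val_score(make_pipeline(scaler, model), X, y, cv=5).mean()
        print(type(scaler).__name__, type(model).__name__, round(acc, 3))
```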
Multi-objective Task Scheduling Using Chaotic Quantum-Behaved Chicken Swarm Optimization (CQCSO) in Cloud Computing Environment

Task scheduling is a challenging process with the increasing number of requests from the clients in a cloud system. Achieving efficient task scheduling with multiple objectives is much required in this modern era. A novel Chaotic Quantum-behaved Chicken Swarm Optimization (CQCSO) based task scheduling approach is presented in this paper. CQCSO is developed by applying chaotic theory and quantum theory to the standard Chicken Swarm Optimization to overcome its problem of premature convergence and local optima. CQCSO algorithm models the task scheduling as an optimization problem and solves it by formulating a multi-objective fitness function using task completion time, response time and throughput to ensure maximum Quality-of-service (QoS) satisfaction and minimum SLA violations. CQCSO identifies the task order and optimally schedules them to the suitable virtual machines with better performance. Experiments were conducted in CloudSim to evaluate the CQCSO approach and it provided efficient task scheduling than the prior existing algorithms.

G. Kiruthiga, S. Mary Vennila
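A toy sketch of the kind of multi-objective fitness the abstract above describes, combining completion time, response time and throughput for a candidate task-to-VM schedule; the weighting scheme and data are assumptions, not the paper's actual formulation:

```python
def fitness(schedule, tasks, vms, w=(0.4, 0.3, 0.3)):
    """schedule[i] = VM index for task i; tasks[i] = task length (MI); vms[j] = speed (MIPS)."""
    finish = [0.0] * len(vms)
    response = []
    for length, v in zip(tasks, schedule):
        finish[v] += length / vms[v]
        response.append(finish[v])            # time at which this task completes
    makespan = max(finish)
    avg_response = sum(response) / len(response)
    throughput = len(tasks) / makespan
    # Reward low makespan and response time, high throughput (maximised by the optimizer).
    return w[0] / makespan + w[1] / avg_response + w[2] * throughput

print(fitness(schedule=[0, 1, 0, 1], tasks=[400, 200, 300, 100], vms=[100.0, 50.0]))
```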
ARIMA for Traffic Load Prediction in Software Defined Networks

Internet traffic prediction is needed to allocate and deallocate resources dynamically and to provide QoS (quality of service) to the end user. Because of recent technological trends in networking, SDN (Software Defined Networking) is becoming a new standard. There are huge changes in the network traffic loads of data centers, which may lead to under- or over-utilization of network resources. We can allocate or deallocate network resources by predicting future traffic with greater accuracy. In this paper, we apply two machine learning models, AR (autoregressive) and ARIMA (autoregressive integrated moving average), to predict SDN traffic, which is viewed as a time series. We show that the prediction accuracy of ARIMA is higher than that of AR in terms of Mean Absolute Percentage Error (MAPE).

Sarika Nyaramneni, Md Abdul Saifulla, Shaik Mahboob Shareef
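A minimal sketch of the ARIMA forecast and MAPE evaluation described above, using a synthetic load series in place of real SDN traffic; the (2, 1, 2) model order and train/test split are assumptions, not the paper's chosen configuration:

```python
import numpy as np
from statsmodels.tsa.arima.model import ARIMA

rng = np.random.default_rng(0)
traffic = 100 + np.cumsum(rng.normal(size=200))    # synthetic traffic load (e.g. Mbps)
train, test = traffic[:180], traffic[180:]

forecast = ARIMA(train, order=(2, 1, 2)).fit().forecast(steps=len(test))
mape = np.mean(np.abs((test - forecast) / test)) * 100
print(f"MAPE: {mape:.2f}%")
```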
Improved Logistic Map Based Algorithm for Biometric Image Encryption

Chaotic maps are ordinarily favored for the generation of random numbers in encryption algorithms because of their high randomness, unpredictability, aperiodicity, and sensitivity. These algorithms depend entirely on the initial values and correspond closely to cryptographic algorithms. Here, the proposed encryption algorithm is based on an improved logistic map for biometric image encryption. The proposed logistic map is improved to overcome its shortcomings concerning randomness, random distribution, insecurity, and small parameter space. Besides, the proposed logistic map produces random numbers based on run-time factors and randomly chosen values, and the final numbers are diffused randomly through an XOR operation. Experimental results on various biometric fingerprint images exhibit better encryption parameters and an approximately flat histogram of the encrypted images, which increases the difficulty of unauthorized decryption. The proposed algorithm can be effectively considered for multi-biometric fingerprint image encryption in government and non-government organizations because of its lower complexity and shorter encryption time.

Mahendra Patil, Avinash Gawande, D. Shelke Ramesh
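A minimal sketch of the core idea behind such schemes: a logistic-map keystream XORed with the image pixels. The map parameters, key values and image size are illustrative assumptions, and this plain logistic map is not the improved map proposed in the paper:

```python
import numpy as np

def logistic_keystream(length, x0=0.61, r=3.99):
    """Byte keystream from the logistic map x_{n+1} = r * x_n * (1 - x_n)."""
    x, out = x0, np.empty(length, dtype=np.uint8)
    for i in range(length):
        x = r * x * (1.0 - x)
        out[i] = int(x * 256) % 256
    return out

# A random array stands in for a flattened fingerprint image.
image = np.random.default_rng(0).integers(0, 256, size=64 * 64, dtype=np.uint8)
key = logistic_keystream(image.size)
cipher = image ^ key                          # XOR diffusion
assert np.array_equal(cipher ^ key, image)    # decryption recovers the image
```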
Data Security System for A Bank Based on Two Different Asymmetric Algorithms Cryptography

A strong security system is very important for a safe banking system. To prevent hacking of important information belonging to the bank and its clients, a secure banking system is a must. This paper deals with a strong security system using a hash function and two different asymmetric algorithms (DSA and RSA) at the same time, which enhances data security. We use the RSA and DSA encryption algorithms to secure our system from unauthorized access. In RSA and DSA, we have to generate two keys, called the public and private keys. If we use the signer's private key for encryption, then we have to use the signer's public key for decryption. The system verifies by confirmation and certificate, and an OTP is sent via the receiver's mobile phone to confirm authentication. This is an efficient data security system to protect the bank from hacking.

Md. Ashiqul Islam, Aysha Akter Kobita, Md. Sagar Hossen, Laila Sultana Rumi, Rafat Karim, Tasfia Tabassum
Digital Signature Authentication Using Asymmetric Key Cryptography with Different Byte Number

Nowadays, each and every system needs proper security, and only proper security can protect important documents. There are different types of security systems that people use for safety, and the digital signature is one of them. The digital signature is a public-key primitive of message authentication: a technique that converts a handwritten signature into digital data. It is a cryptographic value calculated from the data and a secret private key known only to the signer. In this paper, we want to make the digital signature more secure by using the Ed25519 algorithm, which is an asymmetric algorithm. In this algorithm, the signature is converted into a different byte number, which makes the security system stronger.

Md. Sagar Hossen, Tasfia Tabassum, Md. Ashiqul Islam, Rafat Karim, Laila Sultana Rumi, Aysha Akter Kobita
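A minimal sign-and-verify sketch using an existing Ed25519 implementation (the Python `cryptography` package); the message is an invented placeholder, and the paper's byte-number conversion step is not reproduced here:

```python
from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

private_key = Ed25519PrivateKey.generate()
public_key = private_key.public_key()

message = b"transfer 500 to account 1234"   # placeholder payload
signature = private_key.sign(message)        # 64-byte Ed25519 signature

try:
    public_key.verify(signature, message)    # raises if the signature does not match
    print("signature valid")
except InvalidSignature:
    print("signature invalid")
```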
Digital Signature Authentication for a Bank Using Asymmetric Key Cryptography Algorithm and Token Based Encryption

Nowadays, security is an increasingly important issue. In modern science, technology is updated day by day, and our daily lives are becoming less secure. Through this project, a digital signature authentication security system has been designed to protect the bank from unauthorized access. In this paper, we produce an efficient and productive security system based on digital signature authentication using an asymmetric key cryptography algorithm and token-based encryption. In this project, we use public and private keys to encrypt and decrypt data with a hash function, provide digitally signed data, and use the RSA algorithm to encrypt the data. The receiver sends a certificate and confirmation request to the sender to verify the certificate, and the sender sends an OTP by phone through the Internet to authenticate the receiver. Together, these produce a good and valuable security system for the banking sector.

Rafat Karim, Laila Sultana Rumi, Md. Ashiqul Islam, Aysha Akter Kobita, Tasfia Tabassum, Md. Sagar Hossen
Accuracy Analysis of Similarity Measures in Surprise Framework

Recommender Systems (RS) are growing technologies, which can be very useful for consumers in finding items of their interest on the web. Collaborative filtering (CF), a popular approach of Recommender Systems, recommends items to users based on other users with the same taste. The key step of the collaborative filtering method is to compute the similarity between users and items. The success of any recommender model hugely depends on how accurately the notion of similarity has been modelled. There are many open-source Python frameworks useful in building and experimenting with RS models in industry as well as academia, such as Surprise, Python-recsys, Case Recommender, Polara, Spotlight, and RecQ. This work provides a brief introduction to Surprise, a Python library for Recommender Systems, explaining its architecture, implementation and main features. Furthermore, a comparative study of the Surprise framework with other related frameworks is provided to demonstrate why it is better than other frameworks in terms of implementing and handling new and complex recommender models. We have evaluated the accuracy of various built-in similarity measures provided by the Surprise framework using the real-world benchmark MovieLens datasets (100K and 1M). Thereafter, the accuracy of the recommendation is measured in terms of Mean Absolute Error (MAE) and Root Mean Squared Error (RMSE). The empirical results demonstrate that Pearson-baseline outperforms the other built-in similarity measures available in the Surprise framework.

Sanket Kamta, Vijay Verma
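A minimal sketch of the kind of experiment reported above, using the Surprise library's built-in MovieLens 100K loader and the pearson_baseline similarity; the choice of KNNBasic and 5-fold cross-validation are assumptions about the setup:

```python
from surprise import Dataset, KNNBasic
from surprise.model_selection import cross_validate

data = Dataset.load_builtin("ml-100k")      # downloads MovieLens 100K on first use
algo = KNNBasic(sim_options={"name": "pearson_baseline", "user_based": True})
cross_validate(algo, data, measures=["RMSE", "MAE"], cv=5, verbose=True)
```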
Computational Method for Cotton Plant Disease Detection of Crop Management Using Deep Learning and Internet of Things Platforms

Cotton is a major income crop in India. Cotton crops are damaged by early leaf fall or by leaves becoming infected with diseases. Due to sudden changes in climatic conditions, such as scorching temperatures in the crop field, plant diseases occur, and pesticides may be required in time. There are multiple systems that detect and restrain diseases on cotton leaves through soil monitoring, classifying and identifying numerous diseases like bacterial blight, Alternaria, and many more; after detection, the results are provided to the farmers using various machine learning algorithms and IoT-based systems. In this paper, the main focus is on a new deep learning method that automatically identifies a diseased plant from cotton leaf images, together with an IoT-based platform that collects various sensor data for detecting climatic changes. A deep CNN model is developed to perform cotton plant disease detection using infected and healthy cotton leaf images, with a complete pipeline of image preprocessing, augmentation, fine-tuning, training and validation. Different test cases were carried out to check the performance of the created model and to make the new system economical and independent. The newly created system detects cotton plant disease as accurately as possible and restrains it, thereby improving crop production; this paper provides an innovative path to researchers for developing a cotton plant disease identification system.

Bhushan V. Patil, Pravin S. Patil
Foot Ulcer and Acute Respiratory Distress Detection System for Diabetic Patients

Health care and wellness management for diabetics is one of the most promising applications of information technology in the field of medical science. A healthcare monitoring system is necessary to constantly monitor diabetic patients' physiological parameters. Hence, the major scope of this proposed project work is to develop a smart health monitoring system that overcomes many complications in diabetic patients by periodically monitoring the patient's heartbeat rate, SpO2 (peripheral capillary oxygen saturation) level, foot pressures, etc. Therefore, the IoT concept is used and sensors are connected to the human body with a well-managed wireless network that periodically monitors the physiological parameters of the body to avoid high risks in diabetic patients. Continuous remote health monitoring works because all components are integrated with wearable sensors and implantable body sensor networks, which improves the detection of emergency conditions. The proposed system can also be operated remotely because of its inbuilt Wi-Fi.

M. S. Divya Rani, T. K. Padma Gayathri, Sree Lakshmi, E. Kavitha
Cluster-Based Prediction of Air Quality Index

The present world is facing the crucial issue of air pollution, which threatens the human race and the environment equally. This situation requires effective monitoring of air quality and recording of the pollution levels of different pollutants: SO2, CO, NO2, O3 and particulate matter (PM2.5 and PM10). To achieve this, we need efficient prediction and forecasting models which not only monitor the quality of the air we breathe but also forecast the future so that we can plan accordingly. In this direction, various data mining and data visualization methods are adopted for analysing and visualizing the concentration levels of air pollutants in big data. In this work, we have explored the partition-based clustering technique to extract patterns from the air quality data. We have compared two prediction methods, Auto-Regressive Integrated Moving Average (ARIMA) and Long Short-Term Memory (LSTM), on representatives of each cluster and also produced forecasts for the year 2018. The results show that ARIMA works better than LSTM.

H. L. Shilpa, P. G. Lavanya, Suresha Mallappa
A Memetic Evolutionary Algorithm-Based Optimization for Competitive Bid Data Analysis

One of the most difficult decision-making problems for buyers is to identify which suppliers to provide contracts to of the biddable items for a competitive event after considering several conditions. This is an example of pure integer linear programming (ILP) problem with cost minimization where all the decision variables, i.e., the quantity of the biddable items to be awarded to suppliers, are always nonnegative integers. Normally for solving ILP, Gomory cutting plane or Branch and Bound technique using the simplex method are applied. But when the problem to be solved is highly constrained and a large number of variables are involved, finding a feasible solution is difficult and that can result in poor performance by these techniques. To address this, an improved memetic meta-heuristic evolutionary algorithm (EA) such as shuffled frog leaping algorithm (SFLA) is utilized to find the optimum solution satisfying all the constraints. The SFLA is a random population-based optimization technique inspired by natural memetics. It performs particle swarm optimization (PSO) like positional improvement in the local search and globally, it employs effective mixing of information using the shuffled complex evolution technique. In this paper, a modified shuffled frog leaping algorithm (MSFLA) is proposed where modification of SFLA is achieved by introducing supplier weightage and supplier acceptability to improve the quality of the solution with a more stable outcome. Simulation results and comparative study on highly constrained and a large number of items and suppliers’ instances from bidding data demonstrate the efficiency of the proposed hybrid meta-heuristic algorithm.

Pritam Roy
Tunable Access Control for Data Sharing in Cloud

Attribute-Based Encryption (ABE) suffers communication and computation overhead due to the linearly varying size of the ciphertext and the secret key, depending on the number of attributes in the access policy. This paper proposes a multilevel attribute-based access control scheme for secure data sharing in the cloud to reduce the overhead. It produces a constant size ciphertext and a compact secret key to efficiently utilize the storage space and reduce the communication cost. This method flexibly shares ciphertext classes among the randomly selected users with a specific set of attributes. All other ciphertext classes outside the set remain confidential. It allows dynamic data updates and provides access control of varying granularity, at user-level, at file-level, and attribute-level. Granularity levels can be chosen based on applications and user demands. This scheme tackles user revocation and attribute revocation problems, and prevents forward and backward secrecy issues. It allows the data owner to revoke a specific user or a group of users. It is very useful for secure data storage and sharing.

S. Sabitha, M. S. Rajasree
Methodology for Implementation of Building Management System Using IoT

In recent times the need for automation has grown manyfold. The need of the hour is to control and display the working of systems used in everyday life at the click of a button. This paper presents the design and implementation of a parking management system (PMS) as part of a building management system (BMS), and suggests a methodology for the effective working of the building management system. The communication protocol will be implemented both wired and wireless for effective working. The user will be able to see the occupancy status immediately, as the data is updated through the Internet periodically. The objective is to reduce the time required to search for a vacant parking slot and to indicate it using red or green LED indicators, sending all the real-time information to a master server for monitoring purposes or to a display at the entrance. The ultrasonic parking system has been successfully tested and its design verified; it detects the presence or absence of a vehicle under it and indicates this by blinking a red or green LED, respectively. The PMS is planned and strategised for interconnection using internet/intranet connectivity.

Ankita Harkare, Vasudha Potdar, Abhishek Mishra, Akshay Kekre, Hitesh Harkare
A Brief Understanding of Blockchain-Based Healthcare Service Model Over a Remotely Cloud-Connected Environment

Blockchain technology is in the limelight due to its immutable and anonymous data recording without a centralized authority. In this paper, we present a blockchain-based healthcare model to demonstrate a large-scale blockchain. Blockchain-based models cover numerous sectors like finance, IoT, health care, etc. Blockchain technology has proved its authenticity as virtually incorruptible cryptographic data storage that serves the end user in a secure way. The blockchain model is composed of a series of machines that can give on-demand service over a safety net. The access point for heterogeneous medical records over the network is maintained by a hub of computers which operates as a pseudo-anonymous system. Sensitive medical data is valuable from all aspects, ranging from patient symptom information to medical certificates and transaction details. The beauty of the blockchain model is its decentralization, by which it can withstand a single-point attack. The need to protect sensitive data grows stronger day by day, requiring more sophisticated services for devices that collect personal information through web-based applications or the cloud. This paper highlights a comprehensive insight into accessing structured medical data through blockchain; by prototyping, this model will identify, extract and analyze real-life data metric points in further implementation. Medical certificates have been counterfeited by certain groups of people, creating a headache for the government, and services meant for physically challenged persons are misused inappropriately. Electronic medical records still use age-old storage technology; they need modification and strict security regulation to showcase our technological upgradation. The pattern-based blockchain model in this paper makes medical data management more transparent for service seekers and providers. Blockchain is going to revolutionize the way future e-healthcare interacts with and stores data, but this is not going to happen overnight; it faces certain smaller problems that can be mitigated slowly at its nascent stage, and this model hopes to solve the regulatory hurdles. Blockchain is emerging as the plateau of productivity for distributing information in health care. The paper showcases a model that incorporates cryptographic proof of work over multiple locations; this strong chaining among data blocks protects data against theft. It could provide a convergence point over an interoperable network and decrease central administrative control. An electronic health record that uses blockchain integrates clinical data, medical certificates, etc.; medical data gathered across a range of smart devices, hospital databases over the cloud and other sources could all be securely encrypted according to individual demands for full-chain access, and every service interaction gains intrinsic security by implementing a healthcare blockchain.

Subhasis Mohapatra, Smita Parija
Ensemble Learning-Based EEG Feature Vector Analysis for Brain Computer Interface

Brain Computer Interface (BCI) can be normally defined as the process of controlling the environment with EEG signals. It is a real-time computer-based process which translates human brain signals into the necessary commands. There are many stroke-affected or neurologically affected patients in the world who are not able to communicate effectively or share their emotions and thoughts with the outside world. In extreme cases, tetraplegic, paraplegic (due to spinal cord injury) or post-stroke patients are effectively 'locked in' their bodies, unable to exert voluntary motor control after stroke, paralysis or neurodegenerative disease, and require alternative techniques for interactive communication and control of organs. So, BCI is one of the best solutions for this purpose. This paper mainly discusses ensemble learning approaches for EEG signal classification and feature extraction. Bagging, adaptive boosting, and gradient boosting are quite popular ensemble learning methods, which are very effective for many practical classification problems. EEG signals may be classified by adopting a set of features like autoregression, power spectrum density, energy entropy, and linear complexity. In this paper, we have used three illustrative procedures, the Adaptive Boosting, Bagging, and Gradient Boosting ensemble learning methods, and extracted features using the discrete wavelet transform (DWT), autoregression (AR), power spectrum density (PSD), and common spatial patterns (CSP).

Md. Sadiq Iqbal, Md. Nasim Akhtar, A. H. M. Shahariar Parvez, Subrato Bharati, Prajoy Podder
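A minimal sketch pairing one of the listed feature extractors (DWT sub-band energies via PyWavelets) with the three ensemble learners named above; the random signals and binary labels are stand-ins for real EEG epochs and classes:

```python
import numpy as np
import pywt
from sklearn.ensemble import (AdaBoostClassifier, BaggingClassifier,
                              GradientBoostingClassifier)
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
signals = rng.normal(size=(200, 256))        # stand-in for single-channel EEG epochs
labels = rng.integers(0, 2, size=200)        # two mental-task classes (assumed)

def dwt_features(sig, wavelet="db4", level=4):
    """Energy of each wavelet sub-band as a compact feature vector."""
    coeffs = pywt.wavedec(sig, wavelet, level=level)
    return np.array([np.sum(c ** 2) for c in coeffs])

X = np.array([dwt_features(s) for s in signals])
for clf in (AdaBoostClassifier(), BaggingClassifier(), GradientBoostingClassifier()):
    print(type(clf).__name__, cross_val_score(clf, X, labels, cv=5).mean())
```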
Cluster-Based Data Aggregation in Wireless Sensor Networks: A Bayesian Classifier Approach

The network is composed of wireless distributed sensor nodes with data computation capabilities. In each cluster, the cluster member nodes send the sensed data to their respective cluster head to aggregate and classify the data effectively. In this work, an algorithm for cluster-based aggregation of data using the Naive Bayesian Classifier is proposed. The proposed scheme provides better performance than existing algorithms in terms of accuracy, energy utilization, and computation overhead.

Lokesh B. Bhajantri, Basavaraj G. Kumbar
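A minimal sketch of a cluster head classifying member readings with a Naive Bayes model and forwarding only per-class aggregates; the feature layout (temperature, humidity) and labels are assumptions for illustration, not the paper's dataset:

```python
import numpy as np
from sklearn.naive_bayes import GaussianNB

# Readings forwarded by cluster members: [temperature, humidity] (assumed features).
readings = np.array([[24.1, 40.0], [24.3, 41.5], [55.0, 20.0], [23.8, 39.5]])
labels = np.array([0, 0, 1, 0])              # 0 = normal, 1 = anomalous (assumed)

clf = GaussianNB().fit(readings, labels)
new_batch = np.array([[24.0, 40.2], [58.2, 18.0]])
classes = clf.predict(new_batch)
# The cluster head forwards one aggregate per class instead of every raw reading.
for c in np.unique(classes):
    print("class", c, "mean reading:", new_batch[classes == c].mean(axis=0))
```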
An Optimized Hardware Neural Network Design Using Dynamic Analytic Regulated Configuration

Due to the inherent parallelism offered by the Artificial Neural Networks (ANNs) and the rapid growth of Field Programmable Gate Array (FPGA) technology, the implementation of ANNs in hardware (known as Hardware Neural Networks, HNN) for complex control problems have become a promising trend. The basic design challenge in such a realization is the effective utilization of FPGA resources and the high-speed restructuring of the ANN circuits. These two factors are having a direct impact on the size, response time and cost of any HNN system. But these two aspects are competing variables, and controlling one factor may result in the violation of the other. In this paper, a simplified optimization technique known as Dynamic Analytic Regulated Configuration (DARC) is proposed. It has been used as a basic design methodology to increase the performance of an ANN architecture by applying a regulated restructuring on FPGA. A single layer feed-forward ANN system has been considered for this work. A brief explanation has been given about the workflow, the DARC structure and the design challenges. A comparison is provided between the static and dynamic restructuring outcomes. The result shows that optimizing both the above constraints is not effective for larger systems at present due to the design limitations and it can be an effective approach for small-scale implementation.

V. Parthasarathy, B. Muralidhara, Bhagwan ShreeRam, M. J. Nagaraj
Conceptual Online Education Using E-Learning Platform of Cloud Computing

Education is the process of acquiring knowledge and accelerating learning, skills, values, beliefs, and habits. An e-learning platform is built with the goal of providing knowledge to everyone without requiring physical attendance, at minimum cost and without any hassle. e-Learning presents solutions that include Learning Management System (LMS) integration, online courses, evaluation engines, and test prep systems on diverse distribution channels such as the Internet and mobile apps, in step with the rapidly evolving digital revolution and smartphone growth. Today's e-learning platforms require huge costs to set up their required resources and various software applications. Adopting cloud architecture and cloud computing can help reduce expenditure on infrastructure, software, and human assets to a great extent. The paper highlights important aspects of the present e-learning framework and its related models, along with issues in present e-learning applications. The paper also throws light on cloud computing principles and analyzes the benefits of adopting them. A solution for e-learning applications using cloud computing is proposed herewith.

Vivek Sharma, Akhilesh Kumar Singh, Manish Raj
Efficient Deployment of a Web Application in Serverless Environment

In the recent era, to gain speed, scale up or down, and obtain cost-effective features, people are moving toward services provided by AWS and other computing platforms (Salesforce, etc.), i.e., AWS Lambda, which can be used to build a serverless environment. A serverless environment doesn't mean there is no server; rather, AWS not only provides all the services that a server provides but also meters usage on a pay-per-second basis. This project provides users with a full-stack interface where they can request a unicorn ride from any place they want.

Vivek Sharma, Akhilesh Kumar Singh, Manish Raj
Backmatter
Metadata
Title
Evolutionary Computing and Mobile Sustainable Networks
Editors
Prof. V. Suma
Dr. Noureddine Bouhmala
Dr. Haoxiang Wang
Copyright Year
2021
Publisher
Springer Singapore
Electronic ISBN
978-981-15-5258-8
Print ISBN
978-981-15-5257-1
DOI
https://doi.org/10.1007/978-981-15-5258-8