2021 | Book

ICCCE 2020

Proceedings of the 3rd International Conference on Communications and Cyber Physical Engineering

About this book

This book is a collection of research papers and articles presented at the 3rd International Conference on Communications and Cyber-Physical Engineering (ICCCE 2020), held on 1-2 February 2020 at CMR Engineering College, Hyderabad, Telangana, India. Discussing the latest developments in voice and data communication engineering, cyber-physical systems, network science, communication software, image and multimedia processing research and applications, as well as communication technologies and other related fields, it includes contributions from both academia and industry. The book is a valuable resource for scientists, research scholars, and PG students formulating their research ideas and identifying future directions in these areas. Further, it may serve as a reference for practicing engineers seeking to understand the latest technologies used in communication engineering.

Table of Contents

Frontmatter
A Multi-tier Architecture for Soft and Hard Real-Time Systems Involving Multiple Data Sources for Efficient Data Processing

The advancement of technology has driven the growth of IoT-based devices around the globe. With the introduction of wearable devices and smart appliances, the amount of accumulated data has increased exponentially. For soft real-time systems, this poses a major challenge for analytics and for providing accurate results for future strategies, from the profitability of an organization to the estimation of an individual's life expectancy. Soft real-time systems in which large-scale data processing is as important as context awareness, such as pervasive computing systems, can benefit from an additional layer in their data flow, and this paper examines an idea that benefits such systems. The paper introduces an intermediate layer between the user interface and the databases, alongside the traditional application layer and the context or networking layer that already exist. It also explains how this architecture can be implemented and used as a generic architectural model.

Suman De, Vinod Vijayakumaran
Types of Keyloggers Technologies – Survey

Keyloggers are rootkit malware that record the keystrokes of the victim's system and log them to the attacker's system. They can be used to capture sensitive data such as passwords, PINs, usernames, and confidential messages shared between two entities. We explain the different types of keyloggers and how they work, along with their applications and the measures needed to protect a system against keylogging.

Ashley Tuscano, Thomas Shane Koshy
Edge Computing Approach to DEVOPS

Lagging or slow responsiveness is very common among Jenkins users, and many issues have been reported about it. Slow Continuous Integration systems are frustrating: they increase end-to-end software development time. The combination of edge computing and Docker, a container-based technology, can bypass these performance problems and improve the user experience. Our paper presents a way to use an edge computing and Docker approach for Continuous Integration and Continuous Delivery: the Jenkins server runs on the local developer system, and once the checks and builds run successfully there, the deployment is made to the target environment.

Shivankit Bisht, Pratyush Shukla
A Game-Theoretic Approach for Cognitive Radio Networks Using Machine Learning Techniques

Cognitive Radio has been viewed as a promising technology for enhancing spectrum utilization significantly. In this work, we propose a model for Dynamic Spectrum Allocation in Cognitive Radio Networks using game theory. Furthermore, to accommodate all cases, we employ a Preemptive Resume Priority M|M|1 queuing model, supplemented by a priority-based scheduling algorithm called Incremental Weights-Decremental Ratios (IW-DR). To further improve efficiency, we make use of regression models.

S. Mangairkarasi, Rooppesh Sarankapani, D. Arivudainambi
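
The Preemptive Resume Priority M|M|1 model mentioned above has a compact closed form when all classes share one exponential service rate mu: the mean sojourn time of class k is (1/mu) / ((1 - sigma_{k-1})(1 - sigma_k)), with sigma_k the total load of classes 1..k. A minimal sketch of that textbook result (not of the paper's IW-DR scheduler) follows.

```python
# Mean sojourn times in a preemptive-resume priority M/M/1 queue with a
# single service rate mu for all classes (standard queueing-theory result):
#   E[T_k] = (1/mu) / ((1 - sigma_{k-1}) * (1 - sigma_k)),
# where sigma_k = sum of rho_i over classes 1..k and rho_i = lambda_i / mu.

def priority_mm1_sojourn(arrival_rates, mu):
    """arrival_rates: per-class Poisson rates, index 0 = highest priority."""
    times, sigma_prev = [], 0.0
    for lam in arrival_rates:
        sigma = sigma_prev + lam / mu
        assert sigma < 1.0, "queue is unstable"
        times.append((1.0 / mu) / ((1.0 - sigma_prev) * (1.0 - sigma)))
        sigma_prev = sigma
    return times

# Example: primary users (class 0) preempt secondary users (class 1).
print(priority_mm1_sojourn([0.2, 0.5], mu=1.0))   # [1.25, 4.1667]
```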
Classification Accuracy Comparison for Imbalanced Datasets with Its Balanced Counterparts Obtained by Different Sampling Techniques

Machine learning (ML) is accurate and reliable in solving supervised problems such as classification when training is performed appropriately for the predefined classes. In real-world scenarios, class imbalance may arise during dataset creation: one class has a huge number of instances while the other has very few, i.e., the class distribution is unequal. Such scenarios result in anomalous prediction results. Handling imbalanced datasets is therefore required to make correct predictions that consider all classes in equal ratio. This paper reviews various external and internal balancing techniques found in the literature, along with an experimental analysis of four datasets from different domains: medical, mining, security, and finance. The experiments are done using Python. Four external balancing techniques are applied: two oversampling techniques, SMOTE and ADASYN, and two undersampling techniques, Random Undersampling and Near Miss. The datasets are used for a binary classification task. Three classification algorithms (logistic regression, random forest, and decision tree) are applied to both the imbalanced and the balanced datasets, and their performances are compared for both cases. It was found that the undersampling techniques lose many important data points and therefore predict with low accuracy. Across all datasets, the oversampling technique ADASYN is observed to make decent predictions with an appropriate balance.

Tilottama Goswami, Uponika Barman Roy
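
For readers who want to reproduce this kind of comparison, a minimal sketch with the imbalanced-learn package covers the four balancing techniques named above; the synthetic dataset and classifier settings here are illustrative stand-ins for the paper's four real datasets.

```python
# Illustrative comparison of SMOTE, ADASYN, Random Undersampling and Near Miss.
from collections import Counter
from imblearn.over_sampling import SMOTE, ADASYN
from imblearn.under_sampling import RandomUnderSampler, NearMiss
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import classification_report

# Synthetic imbalanced binary dataset (95% / 5% class split).
X, y = make_classification(n_samples=2000, weights=[0.95, 0.05], random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, stratify=y, random_state=0)

samplers = {"SMOTE": SMOTE(random_state=0),
            "ADASYN": ADASYN(random_state=0),
            "RandomUnder": RandomUnderSampler(random_state=0),
            "NearMiss": NearMiss()}

for name, sampler in samplers.items():
    X_bal, y_bal = sampler.fit_resample(X_tr, y_tr)   # balance training split only
    clf = LogisticRegression(max_iter=1000).fit(X_bal, y_bal)
    print(name, Counter(y_bal))
    print(classification_report(y_te, clf.predict(X_te), digits=3))
```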
CNN Model for American Sign Language Recognition

This paper proposes a model based on a convolutional neural network for hand gesture recognition and classification. The dataset uses 26 different hand gestures, which map to the English alphabets A-Z. A standard dataset called Hand Gesture Recognition, available on the Kaggle website, is considered in this paper; it contains 27,455 images (of size 28 x 28) of hand gestures made by different people. A deep learning technique based on a CNN is used, which automatically learns and extracts features for classifying each gesture. The paper includes a comparative study with four recent works. The proposed model reports 99% test accuracy.

Tilottama Goswami, Shashidhar Reddy Javaji
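
A minimal Keras sketch of a CNN of the kind described (28 x 28 grayscale inputs, one output class per letter) is shown below; the layer sizes are illustrative assumptions, not the paper's exact architecture.

```python
# Small CNN for 28x28 grayscale gesture images, 26 output classes.
import tensorflow as tf
from tensorflow.keras import layers, models

num_classes = 26
model = models.Sequential([
    layers.Input(shape=(28, 28, 1)),
    layers.Conv2D(32, 3, activation="relu"),   # learns low-level edge features
    layers.MaxPooling2D(),
    layers.Conv2D(64, 3, activation="relu"),   # learns gesture-part features
    layers.MaxPooling2D(),
    layers.Flatten(),
    layers.Dropout(0.5),
    layers.Dense(128, activation="relu"),
    layers.Dense(num_classes, activation="softmax"),
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
model.summary()
# model.fit(x_train / 255.0, y_train, epochs=10, validation_split=0.1)
```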
Power Quality Issues in Commercial Load - Impact and Mitigation Difficulties in Present Scenario

Power quality in the electric power grid and in consumer-side networks is a significant contributing factor to a country's economic development, so maintaining system power quality is essential. Loads fall into three major categories: residential, commercial, and industrial. In this paper, a commercial load is taken as a case study covering a wide range of power quality issues: harmonic distortion, low power factor, unbalance, triplen harmonic currents, and high neutral currents. With the growing number of mitigation techniques, it is difficult to choose the best one. Based on various parameters, a split-capacitor shunt active power filter is proposed as the most suitable solution and is validated with MATLAB simulation to verify the results. The impact of power quality issues in commercial loads was ignored earlier, but it is now time to pay attention to such loads in order to minimize bills and other expenses caused by poor power quality. This case study offers a detailed power quality analysis and solution, including an economic analysis, that can be implemented easily and economically.

Karuna Nikum, Rakesh Saxena, Abhay Wagh
Optical Networks Implementation Using Survivability Capacity Connectivity Algorithm (SCCA)

Survivability is the basic phenomenon representing data protection in terms of connectivity, overcoming non-observance, path performance, and integration [1]. In optical networks, Multi Configurable Connectivity (MCC) is obtained through path propagation over short and/or long ranges. In this paper, a multi-hop network analysis captures Capacity Connectivity (CC) and Min-Max Throughput (MMT) [2], and further improves the apportionment used to perform packet scheduling. The model represents an NxN topology in which packets are transmitted and received in min-max fashion according to the traffic pattern. It also provides multiple channel allocation, and maximum channel utilization is achieved. Connectivity is expressed in terms of the topology, in which traffic flow is distributed towards adjacent flows. Further, the scheme is decentralized and globalized in order to achieve portability and extensibility.

K. V. S. S. S. S. Sairam, Shreyas Arunesh, Pranav Kolpe
A Comparison Analysis of Collaborative Filtering Techniques for Recommender Systems

In an e-commerce environment, a recommender system recommends products of interest to its users. Several techniques have been proposed for recommender systems; one of the most popular is collaborative filtering. Generally, the collaborative filtering technique gives personalized recommendations for a given user by analyzing the past activities of that user and of users similar to him/her. Memory-based and model-based collaborative filtering are two different models that address challenges such as quality, scalability, sparsity, and cold start. In this paper, we review traditional and state-of-the-art techniques with respect to how they address these different challenges, and we provide comparison results for some of them.

Amarajyothi Aramanda, Saifullah Md. Abdul, Radha Vedala
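
As a concrete illustration of the memory-based variant, here is a minimal user-based collaborative filtering sketch: cosine similarity between users' rating vectors, followed by a similarity-weighted average of neighbours' ratings. The rating matrix is a made-up toy example, not data from the paper.

```python
# User-based collaborative filtering with cosine similarity.
import numpy as np

R = np.array([[5, 3, 0, 1],      # rows: users, cols: items, 0 = unrated
              [4, 0, 0, 1],
              [1, 1, 0, 5],
              [0, 1, 5, 4]], dtype=float)

def predict(R, user, item):
    rated = R[:, item] > 0                      # users who rated this item
    sims = np.array([np.dot(R[user], R[v]) /
                     (np.linalg.norm(R[user]) * np.linalg.norm(R[v]) + 1e-9)
                     for v in range(len(R))])
    sims[user] = 0.0                            # exclude the target user
    w = sims * rated
    return np.dot(w, R[:, item]) / (w.sum() + 1e-9)

print(predict(R, user=1, item=2))   # predicted rating for user 1, item 2
```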
Biomimicry for Talent Acquisition

Hiring talented individuals is critical to an organization's success, and recent times have seen unprecedented growth in demand for top-quality candidates. Talent acquisition in this fierce job market is the biggest challenge faced by recruiters and HR managers today. In this paper, we propose an approach that tackles this problem using Biomimicry, a research stream that seeks sustainable solutions to complex problems by observing and drawing inspiration from the patterns and strategies used by nature. Our proposed solution takes the recruitment strategies employed in ant colonies and bee swarms as a reference and applies these patterns to human recruitment. It turned out to be an excellent fit for tackling the recruitment challenge in industries such as IT. Feedback on applying Biomimicry in real-life recruitment scenarios has been very encouraging, and the approach is regarded as an innovative, out-of-the-box solution to a widely known, pressing issue.

Nikhitha Mittapally, Ashok Baggaraju, M. Kumara Swamy
Deployment of a Simple and Cost-Effective Mobile IPv6 Testbed for the Study of Handover Execution

In the performance evaluation of any networking system, the testbed approach is more difficult, expensive, and time consuming than the analytical and simulation approaches; however, it has the advantage of realistic results. The purpose of this paper is to present, in detail, the deployment of a simple and cost-effective Linux-based Mobile IPv6 testbed for the study of handover execution, with testing checkpoints and debugging procedures. Further, this paper evaluates performance metrics such as bandwidth, packet delay, jitter, and handover delay with respect to TCP and UDP traffic, and compares them with MIPv6 NS2 simulation results. Numerical results show a marked variance between testbed and simulation results.

B. R. Chandavarkar
The Dependency of Healthcare on Security: Issues and Challenges

Information security and privacy in the healthcare sector is an important issue that must be given attention. With the growing adoption of electronic health records, the need to access and share information between different healthcare professionals is also increasing, which raises the level of attention that must be paid to securing that information. Moreover, the adoption of the Internet of Things in wireless body sensor networks leads to the use of Cloud and Fog in healthcare systems, pointing towards secure methods of accessing, storing, and processing sensitive data. In this paper, an overview of the different issues and challenges pertaining to the security of healthcare systems is presented, and solutions to address these security concerns are discussed.

Lakshmi Jayant Kittur, Richa Mehra, B. R. Chandavarkar
One Time Password (OTP) Life Cycle and Challenges: Case Study

In today's internet world, we give priority to securing our many online accounts. Whether it is a social media profile, a cloud storage account, or anything else, everything is accessed via a login interface. Traditional login systems follow a simple procedure of asking for a username and password to authenticate the user. But due to exponential developments in technology, this login interface is vulnerable to many attacks, compromising the user's privacy and data. One more level has therefore been added to the login interface, known as the OTP, or One Time Password. The concept of the OTP is that a randomly generated fixed-digit code is sent to the user's physical device if the entered password is correct. This way, we can be more certain that only the appropriate user is accessing the system. This combined system is known as 2-factor authentication. The OTP is sent directly to a physical device accessible only to the user. Because of its dynamic nature, the OTP has many other applications, such as bank transactions and the deletion of a social media or cloud account. This paper discusses the life cycle of the OTP and its issues and challenges in the current world.

Deepak Kumar, Uma Kant Gautam, B. R. Chandavarkar
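
The "randomly generated fixed-digit code" described above is commonly derived with HOTP (RFC 4226), where a counter (or a timestamp, in the TOTP variant) supplies the dynamism. A minimal sketch, verifiable against the RFC's published test vector, follows; the paper itself does not mandate this particular construction.

```python
# HOTP per RFC 4226: HMAC-SHA1 over a counter, then dynamic truncation.
import hmac, hashlib, struct

def hotp(secret: bytes, counter: int, digits: int = 6) -> str:
    mac = hmac.new(secret, struct.pack(">Q", counter), hashlib.sha1).digest()
    offset = mac[-1] & 0x0F                      # dynamic truncation offset
    code = struct.unpack(">I", mac[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

# RFC 4226 test vector: this prints 755224.
print(hotp(b"12345678901234567890", counter=0))
```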
Comparative Study Between RSA Algorithm and Its Variants: Inception to Date

RSA Public Key Cryptography (PKC), otherwise called asymmetric encryption, comprises two keys, known as the public key and the private key. The sender uses the receiver's public key to encrypt the message, and the receiver's private key is used to decrypt it, so there is no need to share a private key as in symmetric cryptography. This paper aims to investigate RSA and its variants, study their strengths and weaknesses, and propose inventive solutions to overcome the weaknesses. RSA remains one of the best asymmetric-key cryptographic algorithms for communication over networks.

Urvesh Rathod, S. Sreenivas, B. R. Chandavarkar
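
To make the public/private key relationship above concrete, here is textbook RSA with toy primes. Real deployments use 2048-bit or larger primes plus padding such as OAEP, so this is purely illustrative.

```python
# Textbook RSA with toy primes (illustrative only; requires Python 3.8+).
p, q = 61, 53
n = p * q                     # modulus, part of both keys
phi = (p - 1) * (q - 1)       # Euler's totient of n
e = 17                        # public exponent, coprime with phi
d = pow(e, -1, phi)           # private exponent: d*e = 1 (mod phi)

m = 42                        # message encoded as an integer < n
c = pow(m, e, n)              # encrypt with the receiver's PUBLIC key
assert pow(c, d, n) == m      # decrypt with the receiver's PRIVATE key
print(f"n={n}, d={d}, ciphertext={c}")
```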
Ways of Connecting Illiterates with the Security Mechanism: Case Study

Digital communication faces many security threats and attacks in the network. Security mechanisms are therefore used to ensure safety and to implement security services for text documents, audio, video, and other types of data communicated over the network. These mechanisms help to recognise, prevent, and recover from security breaches, threats, and attacks; examples include cryptography, public key certification, authentication, and digital signatures. Obviously, these are mostly used by educated people to keep their data integrated, confidential, and safe. But the bitter fact is that 30.90% of the people living in our country are illiterate, with no educational background, and their data is also very important to them. So we have to connect those illiterate people to these security mechanisms, so that they too can communicate without fear of data loss or data manipulation. There are many ways to do this, such as using facial recognition and fingerprints as passwords in systems and machines like ATMs, banks, and government offices, and providing information and knowledge about preventing possible fraud, theft, and malicious activities. This paper discusses these different ways of connecting illiterate people with security mechanisms and enlightening their lives.

Sunny Ranjan Kumar, Meghna Sonkar, B. R. Chandavarkar
Essential Requirements of IoT’s Cryptographic Algorithms: Case Study

Internet of Things (IoT) devices are increasing rapidly in today's world, but their security remains a major concern due to the limited memory and processing power of these devices, a consequence of their smaller size. There is a trade-off between security and performance: if security is increased, it comes with high complexity and hence degrades performance; if performance is to be increased, it comes at a cost in terms of security. Moreover, IoT devices can be used as bots, as they are globally accessible without much security. The most secure cryptographic algorithms use a lot of resources, which are not available at that scale in IoT, so there is a need to design secure algorithms (lightweight cryptography) that use fewer resources and hence do not hurt performance either.

Shubham Kumar, Zubair Ahmad Lone, B. R. Chandavarkar
Prime Numbers and Its Applications in Security: Case Study

Prime numbers are the major building blocks of the integer universe and play an important role in number theory and cryptography. Because of this unique nature, prime numbers are mainly used in security, and many security algorithms rely on their properties. In this paper, we discuss the importance of prime numbers and their applications.

Anshul Kumar Namdeo, Abhay Lomga, B. R. Chandavarkar
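
A minimal sketch of how security software typically obtains the large primes discussed above: random candidates are screened with the Miller-Rabin probable-prime test. The bit length and round count below are illustrative choices.

```python
# Probable-prime generation via the Miller-Rabin test.
import random

def is_probable_prime(n, rounds=20):
    if n < 2:
        return False
    for p in (2, 3, 5, 7, 11, 13):               # quick small-prime screen
        if n % p == 0:
            return n == p
    d, r = n - 1, 0
    while d % 2 == 0:
        d, r = d // 2, r + 1                     # n - 1 = d * 2^r, d odd
    for _ in range(rounds):
        x = pow(random.randrange(2, n - 1), d, n)
        if x in (1, n - 1):
            continue
        for _ in range(r - 1):
            x = pow(x, 2, n)
            if x == n - 1:
                break
        else:
            return False                          # composite witness found
    return True

def random_prime(bits=256):
    while True:
        c = random.getrandbits(bits) | (1 << bits - 1) | 1   # odd, full length
        if is_probable_prime(c):
            return c

print(random_prime(128))
```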
Nonce: Life Cycle, Issues and Challenges in Cryptography

We are living in the era of online processing, where most information is available online. As the facilities of computer technology have increased, the threats of losing personal and sensitive information have also increased. Cryptographic software and algorithms are good to some extent, but attacks such as plaintext attacks, the replay attack on Apple Pay, and the interleaving attack on PKMv2 show that our cryptographic software can be broken not through the mathematics itself, but through the deterministic behaviour of the underlying cryptographic algorithms. A nonce is one attempt to improve security against these kinds of attacks. A nonce is an input value that will not repeat in a given context. Nonces are used to prevent replay and interleaving attacks, and they also protect websites against malicious exploits based on Cross-Site Request Forgery (CSRF). The main aim of this paper is to introduce what a nonce is, how it works, and which issues and challenges in cryptography can be solved with nonces.

Shivam Sharma, Sajal Jain, B. R. Chandavarkar
An Intelligent Public Grievance Reporting System-iReport

Public grievance reporting is important in making governance effective. Generally, citizens are expected to report their grievances through a grievance reporting system, and the literature includes efforts built on interactive web portals, IVR systems, mobile applications, etc. However, none of these approaches provides a feedback mechanism or the status of a grievance instantly. In this paper, we address public grievance reporting using cloud vision. We built an intelligent public grievance reporting system that tracks citizens' grievances instantly: it captures images, segregates them, and directly forwards the information about the issue to the nearest responsible authority. The system resolves issues effectively without much human effort and can even be used by illiterate people. We conducted experiments on the real-time system, and the results show that grievance reporting improves effectively.

M. Laxmaiah, K. Mahesh
Challenges and Opportunities with ECC and Noncommutative Cryptography: A Survey Perspective

Due to the rapid growth of technological tools and techniques, various considerations must be taken into account in the field of information security, where security and privacy are always the key observations. In the overall scenario, Elliptic Curve Cryptography (ECC) is emerging as one of the most powerful algorithms, with better performance and security than the RSA algorithm. It works as lightweight cryptography for the following reasons: low computation cost, shorter key sizes, and the near-intractability of the discrete logarithm problem in terms of computational complexity. We consider implementing the ECC algorithm under noncommutative cryptography assumptions, one of the possible generalizations to noncommutative properties, which adds valuable research contributions towards perfect secrecy. Prospective directions along these lines are also considered.

Gautam Kumar, Sheo Kumar, Hemraj Saini
Feature and Sample Size Selection for Malware Classification Process

Android is a popular operating system due to its wide acceptance among people. As a result, several applications have been designed for Android, and a great many people use them. Because of this large user base, malware applications have been introduced into Android apps. Malicious applications generally cause serious damage such as password stealing, ransom attacks, viruses, etc. Efforts have been made in the literature to address malware detection; however, detecting new malware is an issue for existing malware detection systems. In this paper, we propose an improved malware detection approach using classification techniques, identifying the relevant features for characterizing malware. We conducted experiments on the real-world Derbin and Malegenome datasets and achieved significantly better results.

Raghunath Reddy, M. Kumara Swamy, D. Ajay Kumar
Adithri – (F2) The Farmer’s Friend

In this 21st-century internet world, the adoption of social networks and advanced technologies has nurtured a drastic shift towards automating communication through messaging applications using web robots, so-called chatbots. Chatbots are computer programs that simulate a real-time conversation between a human and a computer. For agriculture, it is important that the many relevant variables be known, updated rapidly, and easily available for use in farm management by farmers. The chatbot ADITHRI has been built for this domain using machine learning technology. Adithri focuses on the search and query of data about different types of crops, with queries posed through Facebook Messenger via the Messenger bot API; the chatbot returns a description answering the query posed by the user. Adithri, the farmer's friend, is designed to bring various individual app services previously developed for farmers, such as government schemes, weather information, and fertilizers, under the shade of one umbrella. ADITHRI provides services applicable to different types of crops and does not stick to one particular crop. It is expected that, with logical capacity over mass data, the farmer can be helped to act against harmful situations.

P. N. V. Sai Sri Gayathri, Sheo Kumar
Automatic Rice Quality Detection Using Morphological and Edge Detection Techniques

In the farming industry, estimating grain quality is an immense challenge. Quality management is most significant in the food industry since, post-harvest, food products are categorized and ranked into distinct grades on the basis of quality constraints. Grain quality evaluation is done manually, yet it is subjective, time-constrained, costly, and may produce inconsistent results. Image processing methods are an alternative solution that can be used for grain quality analysis to overcome these restrictions and shortcomings. Rice quality can be assessed through properties like grain size, whiteness, and moisture content. This paper presents an algorithm for identifying the quality of rice grains from properties such as size and shape using image processing algorithms.

R. Jegadeesan, C. N. Ravi, A. Nirmal Kumar
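
A minimal scikit-image sketch of the size/shape measurement pipeline described above; the filename, thresholds, and the full-vs-broken rule are illustrative placeholders, not the paper's tuned values.

```python
# Threshold, clean up with morphology, then measure each grain.
from skimage import io, color, filters, morphology, measure

img = color.rgb2gray(io.imread("rice.png"))      # hypothetical RGB input image
binary = img > filters.threshold_otsu(img)       # separate grains from background
binary = morphology.opening(binary, morphology.disk(2))   # remove small specks

labels = measure.label(binary)                   # one label per connected grain
for grain in measure.regionprops(labels):
    aspect = grain.major_axis_length / (grain.minor_axis_length + 1e-9)
    full_grain = grain.major_axis_length > 40    # placeholder size criterion
    print(f"grain {grain.label}: area={grain.area}, aspect={aspect:.2f}, "
          f"{'full' if full_grain else 'broken'}")
```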
Analysis of Machine and Deep Learning Approaches for Credit Card Fraud Detection

In modern times, digitalization is in ever greater demand because of the faultless, easy, and convenient use of online payments. More people choose to pay through online modes via safe gateways in e-commerce and e-trade, and today we appear to be moving fast towards a cashless society: as indicated by the World Bank report of 2018, most transactions are non-cash, and their share has increased to 25%. Consequently, many banking and financial companies are spending considerable money to develop applications that meet this demand. Fraudulent transactions can happen in different ways and can be placed into various classes, and learning-based classification approaches play an essential role in detecting credit card fraud online. There are two significant challenges in credit card fraud detection: first, determining whether card usage reflects normal or fraudulent behaviour; and second, the fact that most datasets are highly imbalanced, making classification challenging. In this paper, we investigate machine and deep learning approaches to credit card fraud detection and related work, their merits and demerits, and, of course, the challenges and opportunities.

P. Divya, D. Palanivel Rajan, N. Selva Kumar
Behavioural Analysis Based Risk Assessment in Online Social Networks

Despite the spectacular growth in the use of OSNs, many security concerns remain. In this situation, it would be very helpful to have a system able to assign a risk score to every OSN user. Here, we recommend a risk estimation based on the idea that the more a user's behaviour deviates from what can be considered 'regular behaviour', the more it should be regarded as unsafe. We take into consideration that the OSN population is genuinely varied in its behaviours, so it is difficult to define a single standard behavioural model that fits all OSN users. However, we do expect similar people to tend to follow similar rules, resulting in similar behavioural patterns. We therefore suggest a risk measurement organized in two stages: related users are first grouped together, and then, for every recognized group, one or more models of normal behaviour are constructed.

N. Teja Sree, G. Sumalatha
Robust Multimodal Biometric Recognition Based on Joint Sparse Representation

Traditional biometric recognition methods depend on a solitary biometric signal for verification. Although the benefit of using multiple sources of information to establish identity has been widely recognized, computational models for multimodal biometric recognition have only recently received attention. We propose a multimodal sparse representation technique that represents the test data as a sparse linear combination of training data, while constraining the observations from different modalities of the test subject to share their sparse representations. In this paper, we concurrently consider the correlations and the coupling information among the biometric modalities, and a multimodal quality measure is suggested for weighting each modality as it is fused. Moreover, we generalize the algorithm to handle nonlinear data, and the resulting optimization task is solved using an efficient alternating direction method. Several experiments show that the suggested method compares favorably with competing fusion-based schemes.

V. Sathiya Suntharam, Ravikumar Chandu, D. Palanivel Rajan
Wavelet Based Feature Extraction and T-Set Evaluation for Automatic Brain Tumor Detection and Classification

Brain image classification to detect diseases such as tumors, stroke, and intracranial bleeding (ICB) is being done manually from Magnetic Resonance Imaging (MRI) results. However, manual approaches are inaccurate, tedious, and time consuming. The proposed work classifies images using K-Nearest Neighbor (KNN), Support Vector Machine (SVM), random forest, and Decision Tree (DT) approaches, and the results obtained through these four approaches are compared. The input to the system is the MRI image, which is preprocessed to remove various noise sources and then decomposed into structure and texture components. The Discrete Wavelet Transform (DWT) is applied to perform the noise removal and decomposition of the MRI images. Classification techniques are then applied to the decomposed images to detect the condition of the brain as normal, tumor, Alzheimer's, or ICB. The classification techniques are implemented using R programming. The performance of the four approaches is measured in terms of precision, recall, and F-measure.

S. Ravi, V. SathiyaSuntharam, Ravikumar Chandu
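
The paper implements its pipeline in R; the DWT feature-extraction step can be sketched equivalently in Python with PyWavelets, taking simple sub-band statistics as the feature vector fed to the classifiers. The wavelet choice and statistics below are illustrative assumptions.

```python
# 2-D Haar wavelet decomposition with per-sub-band statistics as features.
import numpy as np
import pywt

def dwt_features(img, wavelet="haar", levels=2):
    feats = []
    coeffs = pywt.wavedec2(img, wavelet, level=levels)
    # coeffs = [cA_n, (cH_n, cV_n, cD_n), ..., (cH_1, cV_1, cD_1)]
    for band in [coeffs[0]] + [b for lvl in coeffs[1:] for b in lvl]:
        feats += [np.mean(np.abs(band)), np.std(band)]   # energy-like stats
    return np.array(feats)

img = np.random.rand(256, 256)        # stand-in for a preprocessed MRI slice
print(dwt_features(img).shape)        # 1 + 3*levels sub-bands, 2 stats each
```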
A Clue-Based Route Search on Road Networks Using Keywords and Spatial Relations

With the improvements in geo-positioning technology and location-based services, it is now relatively common for road networks to have textual content on their vertices. In recent years, prior work has studied the problem of identifying an optimal route that covers a sequence of query keywords. In many practical scenarios, however, an optimal route may not always be desired: for example, where the result can be far from the optimal one, a customized route query can be issued by providing clues that describe the spatial context between PoIs along the route. Hence, in this paper we explore the problem of clue-based route search (CRS), which permits a user to provide clues on keywords as well as spatial relations. We first propose a greedy procedure and a dynamic programming procedure as baselines. To improve efficiency, we develop a branch-and-bound procedure that prunes unnecessary vertices during query processing. To rapidly locate candidates, we propose an AB-tree that stores both distance and keyword data in a tree structure, and to further reduce the index size, we construct a PB-tree by using the properties of a two-level label index to pinpoint the candidates. Wide-ranging experiments are conducted and confirm the superiority of our procedures and index structures.

K. Vijaya Babu, Mrutyunjaya S. Yalawar, Parameswar Maddela
Secure Data Sharing Using Two Fold Cryptography Key Protection, Proxy Re-encryption and Key Separation Techniques

In data transmission technology, data sharing in cloud storage is receiving considerable attention, as it can offer users well-organized and effective storage facilities. Cryptographic methods are generally practiced to protect the privacy of the shared sensitive information. Nevertheless, data security still poses important challenges in cloud storage for data sharing; chief among them is how to protect and revoke the cryptographic key. To tackle this, we recommend a novel data security mechanism for cloud storage that embraces the following properties. 1) The cryptographic key is safeguarded by two factors, and the confidentiality of the key holds provided that one of the two factors works. 2) By incorporating proxy re-encryption and key separation techniques, the cryptographic key can be revoked efficiently. 3) By implementing attribute-based encryption, the data is protected in a fine-grained way. Moreover, the security analysis and performance evaluation demonstrate that our proposal is secure and efficient, respectively.

D. Uma Vishweshwar, A. BalaRam, T. Kishore Babu
A Fire Alarm Detection System with Speech Output Using Thresholding RGB and YCbCr Colour Space

Fire departments in Malaysia constantly face the problem of reaching a fire site in time for rescue operations owing to complications such as the lack of data about crowding on the roads leading to the site. Adding to these existing complexities, identifying fake calls from unidentified callers makes the problem even more difficult. The framework proposed in the current work develops a vision-based fire detection scheme as an alternative means of overcoming these complexities; nevertheless, the scope of this study remains restricted to fire detection only. The performance of the system is confirmed using one hundred images. To ensure the scheme remains robust to dissimilar ambient illumination, the images are captured from morning through evening; the images also undergo several stages of pre-processing to minimize noise. The YCbCr colour space shows the best performance compared to RGB, since it can discriminate luminance from chrominance more efficiently than the RGB colour space.

M. Archana, T. Neha
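
A minimal sketch of a YCbCr rule-based fire-pixel test of the kind described, using the widely cited Y > Cb and Cr > Cb conditions; the brightness floor and alarm threshold are illustrative assumptions, not the paper's tuned values.

```python
# Rule-based fire-pixel thresholding in YCbCr with OpenCV.
import cv2
import numpy as np

bgr = cv2.imread("scene.jpg")                    # hypothetical input frame
ycrcb = cv2.cvtColor(bgr, cv2.COLOR_BGR2YCrCb)   # OpenCV channel order: Y, Cr, Cb
Y, Cr, Cb = cv2.split(ycrcb)

fire = (Y > Cb) & (Cr > Cb) & (Y > 120)          # candidate fire pixels
mask = fire.astype(np.uint8) * 255
mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, np.ones((5, 5), np.uint8))

if cv2.countNonZero(mask) > 500:                 # placeholder alarm threshold
    print("fire region detected")                # speech output would go here
```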
Secure Cloud Encryption Using File Hierarchy Attribute

Secure Data Sharing enables account-to-account sharing of data through Snowflake database tables, secure views, and secure UDFs. With Secure Data Sharing, no actual data is copied or transferred between accounts; all sharing is accomplished through Snowflake's unique services layer and metadata store. This is an important concept because it means that shared data does not take up any storage in a consumer account and, therefore, does not contribute to the consumer's monthly data storage charges. The only charges to consumers are for the compute resources (i.e. virtual warehouses) used to query the shared data. Attribute-based encryption is a type of public-key encryption in which the secret key of a user and the ciphertext are dependent upon attributes (e.g. the country in which he lives, or the kind of subscription he has). In such a system, decryption of a ciphertext is possible only if the set of attributes of the user key matches the attributes of the ciphertext. There are mainly two types of attribute-based encryption schemes: key-policy attribute-based encryption (KP-ABE) and ciphertext-policy attribute-based encryption (CP-ABE). In KP-ABE, users' secret keys are generated based on an access tree that defines the privilege scope of the concerned user, and data are encrypted over a set of attributes; CP-ABE, by contrast, uses access trees to encrypt data, and users' secret keys are generated over a set of attributes. Secure data sharing is a challenging problem currently being faced by the cloud computing sector. This paper presents a novel ciphertext-policy attribute-based encryption (CP-ABE) method to overcome these issues. Data from sources such as medical and military systems, stored in the cloud, tend to have a multilevel hierarchy, but this hierarchy has not been explored in CP-ABE.

P. Prashanthi, M. D. Gulzar, S. Vikram Sindhu
Detection of Suicidal Tendency in Users by Analysing the Twitter Posts

Detecting suicidal ideation in online social networks is a prominent research area with many challenges. The main challenge of suicide prevention is understanding and detecting the major threat levels and warning signs that may trigger the event. In this paper, we present a new approach that uses media platforms such as Twitter to measure suicide warning signs for individuals and to identify posts or comments containing suicidal intentions. The main aim is to identify such intentions in user posts; to detect them, we use a general language-based technique, a martingale framework, combined with measures of the user's online behaviour. We show in practice that our text-scoring approach identifies warning signs in text better than traditional machine learning does. Moreover, the martingale framework highlights any change in online behaviour and eventually detects behavioural changes.

Mahesh Kumar Challa, Bairy Mahender, N. Prashanthi
A Secure Pre-existing Routing Technique for Enhanced Protection in MANETs

The mobility and elasticity of MANETs (Mobile Ad hoc Networks) have made them popular in many kinds of applications. Security protocols for protecting these networks have been developed to protect routing and application data. However, these protocols protect only either the route or the data transfer; routing and communication security practices must both be applied to provide full protection. Moreover, existing security procedures were developed initially for wired and wireless networks such as Wi-Fi and can impose a heavy load on the limited resources of a MANET. A novel secure framework (SUPERMAN) has been introduced to address these challenges. The framework is designed to allow the network and existing routing protocols to perform their functions, while providing node authentication, access control, and communication security mechanisms. This paper presents this novel security framework for MANETs, SUPERMAN. Simulation results comparing SUPERMAN with IPsec, SAODV, and SOLSR are presented to demonstrate the suitability of the proposed framework for secure wireless communication.

Ravi Kumar Chandu, Sathiya Suntharam, Ch. Sirisha
Attendance Management Automated System Based on Face Recognition Algorithms

This paper presents an automated attendance management system. The system is based entirely on face detection and recognition algorithms: it automatically detects a student when he or she enters the classroom and marks attendance by identifying the person. The system architecture and the algorithms used in each phase are explained here. Different real-world situations were considered to evaluate the performance of several face recognition systems. The paper also recommends techniques to handle threats such as spoofing. Compared to customary attendance marking, this system saves time and also helps to supervise the students.

E. Suresh Babu, A. Santhosh Kumar, A. Anilkumar Reddy
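
A minimal sketch of the recognize-and-mark step using the open-source face_recognition library; the filenames, the roster, and the tolerance value are illustrative assumptions, and the paper's own algorithms may differ.

```python
# Match faces in a classroom photo against enrolled students and mark attendance.
import face_recognition

# Enrollment: one reference encoding per student (hypothetical image files).
roster = {"alice": "alice.jpg", "bob": "bob.jpg"}
known = {name: face_recognition.face_encodings(
             face_recognition.load_image_file(path))[0]
         for name, path in roster.items()}

# Recognition: encode every face found in the classroom snapshot.
frame = face_recognition.load_image_file("classroom.jpg")
present = set()
for enc in face_recognition.face_encodings(frame):
    matches = face_recognition.compare_faces(
        list(known.values()), enc, tolerance=0.6)
    for name, hit in zip(known, matches):
        if hit:
            present.add(name)       # mark this student as present

print("present:", sorted(present))
```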
Lightweight Fine-Grained Search Over Encrypted Document

In fog computing, we mainly focus on lightweight fine-grained ciphertext search (LFGS), a composite of CP-ABE and SE technologies; fog computing is used as an extension of cloud computing. LFGS can transfer computation and storage from the user to fog nodes. It mainly focuses on attribute updates in order to prevent unauthorized access, which automatically resists CKA and CPA. Our demonstration shows the LFGS system's performance, efficiency, and practicality.

U. Mahender, S. Kiran Kumar
A Complete Home Automation Strategy Using Internet of Things

In this cutting-edge world, everyone wants to make their lives simple and secure, which can be accomplished with home automation. It is a trending field that is becoming popular on account of its many features. The concept of IoT enables the implementation of home automation: the Internet of Things makes it possible to control and monitor different devices through the Internet, and by connecting IoT devices to the Internet or a cloud network, users can experience an automated home. The considerable demand for network-enabled home automation systems comes from their simplicity, comfort, and affordability. IoT builds on several technologies, such as cloud computing, which is used to connect IoT devices to different platforms on the Internet and to access them at any time, anywhere in the world. In this project, we use the Blynk platform to monitor our connected devices through the Internet; it provides various widgets and a user-friendly interface to control devices. To control devices by voice, we provide access to voice assistants such as Google Voice Assistant, Alexa, and Cortana through a web-based service called IFTTT (If This Then That), an automation platform that provides communication between different web-based services, apps, and devices. In this project, we implement a model through which we can control and monitor all the connected sensors and actuators while also connecting them to our voice assistant.

Deva Sai Kumar Bheesetti, Venkata Nikhil Bhogadi, Saran Kumar Kintali, Md. Zia Ur Rahman
Retinal Vessel Tracking Using Gaussian and Radon Methods

Retinopathy, one of the causes of impaired eye vision, leads to damage to the retina. Irregular blood sugar levels, abnormal blood flow in the retina, and hypertension cause retinopathy. With the help of computer applications, tracking a blood vessel and estimating its diameter is possible. MATLAB software is used to track and measure the blood vessel: the retinal image is given as the input, and image processing methods are carried out to determine the diameter and track the retinal blood vessel. This technique distinguishes bifurcation points, which may be valuable for further post-quantitative and compositional investigation.

N. Jaya Krishna, Fahimuddin Shaik, G. C. V. Harish Kumar, D. Naveen Kumar Reddy, M. Bala Obulesu
Color Image Segmentation Using Superpixel-Based Fast FCM

A large number of improved variants of fuzzy c-means (FCM) have been used to segment grayscale and color images. Nevertheless, some of them take a long time to yield optimal results, and for two reasons they cannot produce adequate segmentation results for color images. This work proposes a simple superpixel-based fast FCM clustering algorithm (SFFCM), which is considerably faster and more robust for image segmentation applications based purely on color. In this research work, to attain an accurate-contour superpixel image with improved local spatial neighborhoods, an efficient algorithm referred to as multi-scale morphological gradient reconstruction (MMGR) is first described. The significance of the proposed method lies in the relationships between pixels and how they are utilized in the application flow. Compared to a conventional fixed-size, fixed-shape neighborhood window, the superpixel image provides adaptive and accurate irregular local spatial neighborhoods that help to enhance color image segmentation. In the next step, the original color image is summarized over the obtained superpixel image, and the number of pixels in each superpixel region directly determines its histogram. Finally, FCM is applied to the histogram over the superpixel image to obtain the final segmentation results. The performance of the proposed method is computed using the MATLAB computing language, and a statistical evaluation is carried out for different parameters with their respective graphical representations.

Jala Himabindhu, V. Sai Anusha
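
A minimal sketch of the superpixel-then-FCM idea using off-the-shelf pieces: SLIC superpixels (scikit-image) stand in for the paper's MMGR-derived superpixels, and fuzzy c-means from scikit-fuzzy clusters one mean colour per region, which is what makes the clustering step fast.

```python
# Superpixels first, then fuzzy c-means on one colour sample per region.
import numpy as np
import skfuzzy as fuzz
from skimage import io, segmentation

img = io.imread("image.jpg")                     # hypothetical RGB input
sp = segmentation.slic(img, n_segments=300, compactness=10)

# One mean-colour sample per superpixel: far fewer samples than pixels.
ids = np.unique(sp)
means = np.array([img[sp == s].mean(axis=0) for s in ids])

# skfuzzy expects data as (features, samples); c = number of clusters.
cntr, u, *_ = fuzz.cluster.cmeans(means.T, c=3, m=2.0, error=1e-4, maxiter=200)
labels = u.argmax(axis=0)                        # hard label per superpixel

seg = labels[np.searchsorted(ids, sp)]           # paint labels back onto pixels
print(seg.shape, np.bincount(seg.ravel()))
```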
An Investigation on the Impact of Machine Learning in Wireless Sensor Networks and Its Application Specific Challenges

The importance of Machine Learning (ML) in advanced system technologies is proven in the literature. This chapter investigates the role of ML in Wireless Sensor Networks (WSNs) and the challenges specific to its applications. We discuss the background literature of the renowned ML concepts and techniques, and then distinguish the role of ML in WSNs with a detailed literature review. Subsequently, ML techniques for WSNs are discussed from the literature. The chapter ends with a description of the functional and application-specific challenges.

K. Praghash, T. Karthikeyan, K. Suresh Kumar, R. Sekar, R. Ramesh Kumar, S. Arun Metha
Morphology and ADF Based Brain Tumor Detection System from MR Images

A brain tumor is a chronic disease that cannot be correctly identified without an MRI. In this paper, the MRI image must first be filtered using an Anisotropic Diffusion Filter (ADF) to minimize the contrast between consecutive pixels and pave the way for morphological operations on the image. After that, the image is resized and converted, via a threshold value, to a black-and-white image. This primary filter highlights potential locations where a tumor may be developing. Morphological operations are then performed on this binary image, and information on the solidity and areas of the candidate locations is obtained. The minimum value of both characteristics is determined from a statistical average over various tumor-containing MRI images. The key aim of this work is to classify the tumor by collecting 2D tumor image data from MRI photographs taken from various angles of a single individual, and to examine them to show the precise 2D location of the tumor. To achieve this, 2D detection and segmentation of tumors has been established so that detection can be more accurate. This research work is implemented in MATLAB version R2015a or higher.

Kanekal Chinna Kullayappa, G. Nagesham
Optic Disk Segmentation for Glaucoma Detection in Retinal Images

Segmentation of the optic disk and optic cup from retinal fundus images helps to diagnose abnormalities such as glaucoma and can help create awareness among the common man to plan a proper treatment in time and avoid complete visual morbidity. The original input image is first filtered by means of histogram processing and is then subjected to morphological image processing in order to locate the positions of the optic cup and optic disk. The complete computation procedure is simulated using the MATLAB technical computing language.

G. Obulesu, Fahimuddin Shaik, C. Sree Lakshmi, V. Vijay Vardhan Kumar Reddy, M. Nishanth, L. Siva Shankar Reddy
Speckle Based Anisotropic Diffusion Filter for Ultrasound Images

Ultrasound (US) imaging presents significant challenges for visual medical inspection and for the creation of automated analysis approaches, because speckle adversely influences tissue boundary detection and the efficacy of automatic segmentation techniques. A number of filtering strategies are usually applied as a pre-processing phase, before automatic analysis or visual inspection, to minimize the impact of speckle. Many state-of-the-art filters seek to reduce the speckle effect without recognizing its significance for tissue structure classification. This loss of information is further magnified by the iterative nature of some speckle filters, e.g. diffusion filters, which tend to over-filter during the diffusion process due to a progressive loss of details that are critical for diagnosis. Here we propose an anisotropic diffusion filter with a probabilistic-driven memory mechanism that can solve the over-filtering problem by pursuing a selective-tissue philosophy. In essence, we formulate the memory mechanism as a differential equation for the diffusion tensor whose behavior depends on the statistics of the tissue, speeding up the diffusion cycle in uninformative regions and exploiting the memory effect in places where valuable information must be preserved reliably. Tests on both real ultrasound and synthetic images confirm that the probabilistic memory mechanism maintains clinically relevant structures that state-of-the-art filters remove.

P. Siva Kalyani, S. Nazeer Hussain, N. Vishnu Teja, S. Younus Hussain, B. Amarnatha Reddy
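
For reference, the classical Perona-Malik anisotropic diffusion scheme that this filter family builds on can be written in a few lines of NumPy; the probabilistic memory term proposed above is not reproduced here, and the parameters are illustrative.

```python
# Classical Perona-Malik anisotropic diffusion (exponential conduction).
import numpy as np

def anisotropic_diffusion(img, n_iter=20, kappa=30.0, lam=0.2):
    u = img.astype(float).copy()
    g = lambda d: np.exp(-(d / kappa) ** 2)   # small across strong edges
    for _ in range(n_iter):
        # Finite differences to the four neighbours (np.roll wraps at the
        # borders, which is acceptable for a sketch).
        dN = np.roll(u, 1, axis=0) - u
        dS = np.roll(u, -1, axis=0) - u
        dE = np.roll(u, -1, axis=1) - u
        dW = np.roll(u, 1, axis=1) - u
        u += lam * (g(dN) * dN + g(dS) * dS + g(dE) * dE + g(dW) * dW)
    return u

noisy = np.random.rand(128, 128) * 50 + 100   # stand-in for a speckled US image
print(anisotropic_diffusion(noisy).std() < noisy.std())   # smoother output
```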
Investigation of Level Set Segmentation Procedures in Brain MR Images

The task in this research is to evaluate the efficiency of six level-set algorithms for 2D brain segmentation on a given MRI image. The initialization used is the same for all algorithms, as is the comparison contour used for computing the Dice criterion. A MATLAB-based application is used to measure the efficiency of the different level-set segmentation algorithms, particularly in biomedical image processing. This work includes a comparative study of the algorithms according to their performance. Although some findings indicate that MRI segmentation of brain tumors is time-consuming, it is essential work.

S. Fayaz Begum, B. Prasanthi
Medical Imaging Analysis of Anomalies in Diabetic Nephropathy

In diabetic patients, diabetic nephropathy is thought to be the leading cause of end-stage renal disease. Diabetic nephropathy develops slowly through proteinuria (excretion of excess protein in the urine). A non-invasive imaging algorithm is urgently required to identify anomalies early in order to provide faster and improved care. The proposed algorithm uses enhancement for pre-processing and segmentation. Histogram equalization increases the global image contrast; an adaptive variant computes multiple histograms, each corresponding to a separate section of the image, and uses them to redistribute and lighten image values. CLAHE is an improvement of the previous method that works on specific regions of the image, called tiles, rather than the whole image; another technique, dilation-based morphological reconstruction, is also used for preprocessing. Otsu thresholding is used as a post-processing tool for automatic thresholding of the images. The method is implemented in MATLAB R2018b or a later version.

U. Sudha Rani, C. Subhas
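
The paper works in MATLAB; the same preprocessing/post-processing chain (global histogram equalization, tile-wise CLAHE, Otsu thresholding) can be sketched with OpenCV as follows, with the input filename and CLAHE settings as illustrative assumptions.

```python
# Global equalization, CLAHE on local tiles, then Otsu's automatic threshold.
import cv2

img = cv2.imread("renal_scan.png", cv2.IMREAD_GRAYSCALE)  # hypothetical input

global_eq = cv2.equalizeHist(img)                 # global contrast stretch
clahe = cv2.createCLAHE(clipLimit=2.0, tileGridSize=(8, 8))
local_eq = clahe.apply(img)                       # tile-wise (CLAHE) version

# Otsu picks the binarization threshold automatically from the histogram.
_, mask = cv2.threshold(local_eq, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)
cv2.imwrite("mask.png", mask)
```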
Development of Hybrid Pre-coding Technique for MIMO Systems Based on Kalman Filter

Millimeter-wave communication is a crucial enabler for addressing the bandwidth shortage of future 5G networks. To overcome the higher path loss, millimeter-wave communications around the 60 GHz band require massive antenna arrays at the transmitter and receiver nodes to obtain beamforming gains. Analog-only approaches cannot exploit the full multiplexing gains; in addition, analog phase shifters can only be coarsely controlled electronically, ruling out sophisticated processing and resulting in poor performance. Hybrid architectures are exciting candidate approaches that transcend the drawbacks of shaping purely digital or purely analog beams, because they combine the advantages of both methods. By utilizing multiple concurrent beam transfers, hybrid systems grow spectral efficiency relative to analog-only systems: analog beamforming obtains a spectral efficiency of only around 3 bps/Hz at 20 dB, while under similar circumstances our suggested hybrid analog/digital beamforming achieves 10 bps/Hz. We propose an incremental Kalman-based multi-user hybrid approach that reduces the error between the data transmitted by the base station (BS) and the estimates obtained at the mobile stations (MS). The scheme includes a specially constructed error model, followed by a two-step process that first computes the RF pre-coding/combining matrices and then designs the digital baseband pre-coding at the BS.

C. H. Najaraju, G. Chandana, B. Manoj Kumar, C. Kishore Kumar
Enhancement of Cerebral and Retinal Vascular Structures Using Hessian Based Filters

A wide range of vascular disorders, such as stenosis, aneurysm, and malformations, involving different anatomical locations, are detected and handled through a set of imaging techniques. The need to diagnose and manage vascular disorders early has contributed to the creation of numerous techniques in vascular imaging, and image manipulation plays a significant part in medicine's study of images from different modalities. The goal of the novel enhancement method using Hessian-based filters is to highlight hidden vessels, improving angiographies as well as the visibility of possible pathological locations. The images used come from computed tomography and retinal data. The suggested enhancement has many applications, such as retinal vasculature, neck, lung, and fundus imaging, but only retinal and cerebral vasculatures are taken into account here.

Fahimuddin Shaik, J. Chittemma, S. Mohammed Islam, B. Laksminath Reddy, S. Damodhar Reddy
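
A Hessian-based vesselness enhancement of the kind discussed is available in one call via scikit-image's Frangi filter; the filename and scale range below are illustrative assumptions.

```python
# Multi-scale Hessian (Frangi) vesselness on a grayscale fundus image.
from skimage import io, color, filters

img = color.rgb2gray(io.imread("retina.jpg"))      # hypothetical fundus image
vessels = filters.frangi(img, sigmas=range(1, 6))  # multi-scale Hessian response
io.imsave("vessels.png", (vessels / vessels.max() * 255).astype("uint8"))
```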
Throughput Comparison of Majority Logic Decoder/Detector with Other Decoders Used in Communication Systems

Low Density Parity Check (LDPC) codes are linear block codes that approach the Shannon limit. These codes attain the lowest error floors for data transfer applications in communication systems. LDPC codes are more beneficial than Turbo codes because of their reduced decoding complexity and their ability to detect errors in less cycle time; this results in reduced decoding time, low decoding latency, and low error floors even when the transmitted data contains multiple bit errors. This paper presents majority-logic decoding/detection of LDPC codes, along with the generation of generator and parity-check matrices for both binary and non-binary LDPC codes. The proposed Majority Logic Decoder/Detector (MLDD) is a hard-decision decoding scheme that applies majority-logic decoding to the data transmitted and received over the communication channel. The paper also elaborates an effective implementation of LDPC encoding and decoding.

J. Chinna Babu, N. Mallikharjuna Rao
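
A compact way to see the majority-logic idea is Gallager's hard-decision bit-flipping decoder, in which a bit is flipped when most of its parity checks fail; the toy parity-check matrix below is illustrative and not one of the paper's code constructions.

```python
# Hard-decision bit-flipping decoding driven by failed parity checks.
import numpy as np

def bit_flip_decode(H, r, max_iter=50):
    x = r.copy()
    for _ in range(max_iter):
        syndrome = H @ x % 2
        if not syndrome.any():
            return x, True                     # all parity checks satisfied
        fails = H.T @ syndrome                 # failed checks touching each bit
        x = (x + (fails == fails.max())) % 2   # flip the worst offenders
    return x, False

H = np.array([[1, 1, 0, 1, 0, 0],              # toy parity-check matrix
              [0, 1, 1, 0, 1, 0],
              [1, 0, 1, 0, 0, 1]])
codeword = np.zeros(6, dtype=int)              # all-zero word is always valid
received = codeword.copy(); received[2] ^= 1   # inject a single bit error
print(bit_flip_decode(H, received))            # recovers the codeword
```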
A Review on OTA with Low Power and Low Noise Techniques for Medical Applications

Wearable electrocardiography (ECG) sensors are most commonly used to monitor a patient's heart condition and detect cardiovascular diseases such as heart failure, cardiac arrhythmia, and many more. The amplifier that determines the noise, power, and linearity performance of an ECG sensor is its crucial part. In existing systems, different approaches have been proposed for optimizing power and noise; the OTA can be implemented using various techniques that mainly either reduce power consumption or lower the Noise Efficiency Factor (NEF). In this paper, different research works are surveyed and studied, and their results are compared and discussed.

J. Chinna Babu, A Thrilokanatha Reddy
The LTE Indoor and Outdoor Performance Evaluation Using OFDM

In present-day technology, the rapid expansion of wireless digital data communications calls for wireless systems that are reliable and have high spectral performance. Orthogonal Frequency Division Multiplexing (OFDM) has been recognized for its good performance in achieving high data rates, with Fast Fourier Transforms (FFT) used to generate the orthogonal sub-carriers. Due to the drawbacks of FFT-based OFDM systems, such as a high peak-to-average ratio (PAR) and synchronization issues, many other works have replaced the Fourier transform part with the wavelet transform. In this paper, a suitable method for the OFDM system and the proper usage of the FFT is provided. The system shows superior overall performance compared with traditional OFDM-FFT systems over an Additive White Gaussian Noise (AWGN) channel. The bit error rate (BER) defines the performance of the system as a function of the Signal-to-Noise Ratio (SNR). In this work, OFDM is evaluated using LTE (Long Term Evolution), with LTE performance evaluated in indoor and outdoor applications. Moreover, the proposed system gives a nearly perfect reconstruction of the input signal in the presence of Gaussian noise. This work concentrates on reducing errors and improving the SNR by using digital modulation techniques such as Phase Shift Keying and Quadrature Amplitude Modulation, together with Fourier transforms.

K. Riyazuddin, S. Nazeer Hussain, O. Homa Kesav, S. Javeed Basha
Image Segmentation with Complex Artifacts and Correction of Bias

The Chan-Vese (CV) model solves many region-based image segmentation problems. Nevertheless, the procedure fails when the images of a particular application are skewed by objects (outliers) and by lighting bias that corrupts the true contrast values. Within a single functional energy, two ideas are implemented in this research work: first, a complex-artifact class that prevents intensity outliers from skewing the segmentation; and second, a Retinex-type procedure that decomposes the image into a piecewise-constant structural part and a smooth bias part. The CV segmentation parameters then operate only on the structural part, and only in regions not recognized as artifacts. Segmentation is treated as a parametric process using a phase field, effectively reducing to threshold dynamics. The proposed method is demonstrated on a collection of representative images from various modalities exhibiting artifacts and/or bias. It is useful where image distortion prevents conventional CV segmentation and where artifacts and bias are of particular concern, as in medical imaging, for instance the magnetic resonance imaging (MRI) modality, where identification of lesions and bias correction are most needed.

Fahimuddin Shaik, P. Pavithra, K. Swarupa Rani, P. Sanjeevulu
Low Power Enhanced Leach Protocol to Extend WSN Lifespan

A Wireless Sensor Network (WSN) is a network of wireless sensor nodes in which one of the most challenging issues is the routing technique. A critical issue with WSNs is that the sensor nodes have limited battery power, and battery power plays an important role in improving node lifespan. Among the various routing techniques for WSNs, energy usage is one of the most essential considerations, and hierarchical routing protocols are the best-known protocols for reducing energy consumption. The Low-Energy Adaptive Clustering Hierarchy (LEACH) was introduced as an application-specific enhanced protocol architecture for WSNs. The proposed En-LEACH (Enhanced LEACH) protocol is an enhanced energy-efficient routing protocol that saves a large portion of the communication power within the network. In doing so, the proposed topology chooses cluster head (CH) nodes from the sensor nodes with higher residual energy and a shorter distance to the base station (BS). It then manages the sensor nodes (SN) and forms clusters so as to maximize the lifespan of the WSN and reduce the average energy dissipation per sensor node.
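A sketch of how such a CH election could look: the classic LEACH threshold T(n) is scaled by residual-energy and BS-distance factors in the spirit of En-LEACH. The exact weighting used by the paper is not given here, so the two scaling factors and the node fields are assumptions for illustration.

```python
import random

P = 0.05  # desired fraction of cluster heads per round

def ch_threshold(node, rnd):
    """Classic LEACH threshold T(n), scaled by En-LEACH-style factors
    (the paper's exact weighting may differ)."""
    if not node["eligible"]:          # already served as CH in this epoch
        return 0.0
    t = P / (1 - P * (rnd % round(1 / P)))              # LEACH threshold
    energy_factor = node["e_residual"] / node["e_init"] # favour high energy
    distance_factor = node["d_ref"] / node["d_bs"]      # favour nodes near BS
    return t * energy_factor * distance_factor

node = {"eligible": True, "e_residual": 0.7, "e_init": 1.0,
        "d_ref": 50.0, "d_bs": 80.0}   # d_ref: reference distance (assumption)
is_ch = random.random() < ch_threshold(node, rnd=3)
```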

Shaik Karimullah, D. Vishnuvardhan, K. Riyazuddin, K. Prathyusha, K. Sonia
Automated Speed Braking System Depending on Vehicle Over Speed Using Digital Controller

The aim of this project is to construct a system that keeps the vehicle secure and protects it from intruders. The goal is to set up automatic speed control and an accident-avoidance framework for vehicles using a programmable logic controller (PLC) and an encoder sensor. The encoder sensor sends signals continuously to the PLC once the wheels begin turning, and when the speed reaches the maximum limit the PLC automatically cuts off the power supply. The PLC then sends a signal to the motor to reduce the car's speed, so that the speed is controlled quickly and the car operates automatically without any manual intervention; a buzzer also sounds to warn the driver.

Ch. Nagaraju, G. Thirumalaiah, N. Rajesh, B. Bala Manikanta, N. Sai Sivaram, T. Prakash Raj
Morphological Watershed Approach for the Analysis of Diabetic Nephropathy

Diabetic nephropathy is the main cause of progressive kidney failure and a significant cause of coronary mortality. Among image processing methods for detecting anomalies, the application of watershed segmentation and gradient magnitude has produced encouraging results. The suggested algorithm uses optimization-based pre-processing and post-processing approaches for segmentation. CLAHE histogram equalization, an improvement over ordinary equalization that operates on small regions of the image called tiles rather than the entire image, and dilation-based morphological reconstruction are used for pre-processing. Otsu thresholding is used as a post-processing tool to automate image thresholding. A median filter is also used to remove noise while retaining the image edges. Morphological watershed segmentation then accurately separates foreground objects from the background. The image collection for this work comes from CT images of patients with diabetic nephropathy, as well as from diabetic research institutes.
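The pipeline named in the abstract (CLAHE, median filtering, Otsu thresholding, watershed) can be sketched with OpenCV as below; the input file name, tile size and marker heuristics are assumptions, not the paper's settings.

```python
import cv2
import numpy as np

img = cv2.imread("ct_slice.png", cv2.IMREAD_GRAYSCALE)  # hypothetical CT slice

clahe = cv2.createCLAHE(clipLimit=2.0, tileGridSize=(8, 8))
enhanced = clahe.apply(img)                 # tile-wise contrast enhancement
denoised = cv2.medianBlur(enhanced, 5)      # edge-preserving noise removal

_, mask = cv2.threshold(denoised, 0, 255,
                        cv2.THRESH_BINARY + cv2.THRESH_OTSU)  # automatic threshold

# Marker generation for the watershed: sure foreground from the distance
# transform, sure background from dilation, unknown region in between.
dist = cv2.distanceTransform(mask, cv2.DIST_L2, 5)
_, sure_fg = cv2.threshold(dist, 0.5 * dist.max(), 255, 0)
sure_fg = sure_fg.astype(np.uint8)
sure_bg = cv2.dilate(mask, np.ones((3, 3), np.uint8), iterations=3)
unknown = cv2.subtract(sure_bg, sure_fg)

_, markers = cv2.connectedComponents(sure_fg)
markers += 1
markers[unknown == 255] = 0
labels = cv2.watershed(cv2.cvtColor(denoised, cv2.COLOR_GRAY2BGR), markers)
```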

P. Siva Kalyani, G. Sasikala
Robust Algorithm for Segmentation of Left Ventricle in Cardiac MRI

The left ventricle is one of the four heart chambers. It is situated underneath the left atrium in the bottom-left portion of the heart, separated by the mitral valve. The left ventricle is the thickest chamber of the heart and is essential for pumping oxygenated blood to tissues throughout the body. Left ventricular failure occurs when left ventricle dysfunction induces inadequate blood circulation to vital organs, causing breathing problems that threaten the patient. Non-invasive medical imaging techniques are effective for early diagnosis of left ventricle dysfunction. In this connection, different techniques such as image enhancement and image segmentation have been developed based on the fundamentals of image processing. The objective of this study is to develop a novel and robust algorithm that can enhance the efficiency of automatic LV segmentation on short-axis cardiac magnetic resonance imaging (MRI). The work is carried out using different thresholding methods and a related qualitative analysis in order to determine the best algorithm, and can be implemented in Matlab R2015b or above. The outcome of this work is aimed at early detection, enabling effective care and measures.

M. Venkata Dasu, P. Tabassum Khan, M. Venkata Swathi, P. Venkata Krishna Reddy
An Optimized Clustered Based Video Synopsis by Using Artificial Intelligence

This paper proposes a static video summarization method based on Artificial Bee Colony optimization, which summarizes a video by the most important frames present in it. First, the pixel clusters or regions of interest that capture the most important content variations are identified in the video frames. A novel set of features, measured as the average color values of each of these regions, is then used to describe the frames. Based on these features, the Artificial Bee Colony optimization procedure divides the video into segments by clustering frames. The segment lengths are grown until all the frames of a particular segment have similar features, while the middle frames of different segments differ significantly from one another. These middle frames are regarded as the key-frames of the video. Any redundancy among the selected key-frames is removed by comparing their color histograms. The proposed work is validated on the freely available SumMe dataset as well as on other arbitrarily selected web video recordings.

G. Thirumalaiah, S. Immanuel Alex Pandian, D. Teja Sri, M. Karthik Chowdary, A. Kumarteja
Performance Analysis of LTE Based Transceiver Design Using Different Modulation Schemes

This work creates wireless transceiver designs based on the Long-Term Evolution (LTE) architecture initiated by the 3GPP consortium. Initially, the Bit Error Rate (BER) versus Signal to Noise Ratio (SNR) performance is evaluated for a single-transmitter, single-receiver system, in both fading (Rayleigh) and non-fading (AWGN) channels. Specific modulation schemes are used, including Binary Phase Shift Keying (BPSK), Quadrature Phase Shift Keying (QPSK), and Quadrature Amplitude Modulation (QAM). Later, the transceiver designs implement Orthogonal Frequency Division Multiplexing (OFDM) with the modulation formats required by LTE, and the BER and SNR performance of the designed transceiver structures is analyzed. The paper further assesses the channel's efficiency, or throughput, using the Shannon capacity equation for a band-limited AWGN channel with an average transmit-power limit. The channel efficiency is evaluated using parameters such as bandwidth and transmit power as constraints; such parameters describe the boundaries of a communication device.
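The Shannon capacity bound referred to here is C = B log2(1 + SNR); a one-line computation follows, with an illustrative 20 MHz bandwidth and 10 dB SNR (not figures from the paper).

```python
import math

def shannon_capacity(bandwidth_hz, snr_db):
    """C = B * log2(1 + SNR) for a band-limited AWGN channel."""
    return bandwidth_hz * math.log2(1 + 10 ** (snr_db / 10))

# e.g. a 20 MHz channel at 10 dB SNR (illustrative numbers)
print(shannon_capacity(20e6, 10) / 1e6, "Mbit/s")   # ~69.2 Mbit/s
```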

C. H. Najaraju, P. Veera Prasad Reddy, Nidiginti Suneel, Gona Naveen Kumar
Unsupervised Segmentation of Image Using Novel Curve Evolution Method

Here, a novel algorithm for unsupervised low depth-of-field (DOF) image segmentation is described. To detect the object of interest (OOI) in the saliency space, a multi-scale re-blurring technique is used first: blurring is carried out to remove artifacts, followed by a re-blurring procedure. Thereafter, an active contour model based on a hybrid energy system is proposed to evaluate the OOI boundary. In this model, a global energy term related to the saliency map is implemented to find the global minimum, and a local energy term on the low-DOF picture is used to increase segmentation precision. Additionally, the model is equipped with an elastic parameter to balance the weights of the global and local terms. An unsupervised approach for initializing the curves is also designed to reduce the number of evolution iterations, since more iterations raise the complexity and computation time needed to obtain precise contours and slow down the process. Lastly, experiments on different low-DOF pictures demonstrate the high precision and robustness of the proposed method.

Fahimuddin Shaik, B. Vishwaja Reddy, G. Venkata Pavankumar, C. Viswanath
A Genetic Algorithm with Fixed Open Approach for Placements and Routings

The multiple traveling salesman problem (MTSP) can model and solve real-life applications including multiple scheduling, multiple vehicle routing and multiple path planning problems. While the classical traveling salesman problem concentrates on finding a minimum-distance route that visits every city exactly once, the goal of the MTSP is to find routes for m salesmen with minimized total cost, i.e., the sum of the travel costs of all salesmen over the cities covered. All salesmen must start from a designated hub, which is the place of departure and return. As the MTSP is NP-hard, a new effective genetic algorithm with local operators (GAL) is proposed to solve the MTSP and deliver high-quality solutions for real-life instances in a reasonable time. The new local operators, including crossover elimination, are designed to speed up convergence of the search process and improve solution quality. Results show that GAL finds a better set of routes compared with two current MTSP algorithms.
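A toy evolutionary sketch of the MTSP encoding (a city permutation plus split points assigning contiguous slices to each salesman, all routes starting and ending at the hub); the paper's GAL additionally uses crossover and local operators, which are omitted here for brevity.

```python
import random, math

CITIES = [(random.random(), random.random()) for _ in range(20)]
DEPOT, M = (0.5, 0.5), 3          # hub location and number of salesmen

def dist(a, b):
    return math.hypot(a[0] - b[0], a[1] - b[1])

def total_cost(perm, breaks):
    """Sum of route lengths; each salesman starts and ends at the depot."""
    cost, start = 0.0, 0
    for end in list(breaks) + [len(perm)]:
        route = [DEPOT] + [CITIES[i] for i in perm[start:end]] + [DEPOT]
        cost += sum(dist(route[i], route[i + 1]) for i in range(len(route) - 1))
        start = end
    return cost

def mutate(perm, breaks):
    p = perm[:]                                   # swap two cities
    i, j = random.sample(range(len(p)), 2)
    p[i], p[j] = p[j], p[i]
    b = sorted(random.sample(range(1, len(p)), M - 1))  # re-draw split points
    return p, b

# Minimal loop: mutation plus tournament survival only.
pop = [mutate(list(range(len(CITIES))), []) for _ in range(30)]
for _ in range(500):
    p1, p2 = random.sample(pop, 2)
    child = mutate(*min(p1, p2, key=lambda s: total_cost(*s)))
    pop.remove(max(p1, p2, key=lambda s: total_cost(*s)))
    pop.append(child)
best = min(pop, key=lambda s: total_cost(*s))
```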

Shaik Karimullah, Syed Javeed Basha, P. Guruvyshnavi, K. Sathish Kumar Reddy, B. Navyatha
Big Data and Social Media Analytics- A Challenging Approach in Processing of Big Data

As the world moves towards new technologies, an enormously growing population makes use of social media in everyday life, and social media statistics are scattered across various disciplines. The footprint of a social media data analytics system grows through four different steps: data discovery, data collection, data preparation and data analysis. The main objective of this article is to analyze the status and evolution of social network research on big data. The challenges and difficulties faced during the distinct data analysis methods and the distinct stages of data discovery, collection and preparation are rarely examined. The challenge is not only to handle big data, but to scrutinize it and provide valuable information that can be used for decision-making. In this article, we review the major challenges faced by researchers in processing social media data; the outcomes are frequently used to extend existing social media frameworks. The article presents a comprehensive discussion of the different studies linked with big data in social media.

Mudassir Khan, Aadarsh Malviya, Surya Kant Yadav
Open Switch Fault Diagnosis of Switching Devices in Three Phase VSI

Condition-based maintenance of complicated electrical apparatus is a major issue in the automation industry. It includes fault detection, diagnosis, and prognosis of the system. Different techniques are used for fault diagnosis, including knowledge-based, model-based and pattern-recognition-based approaches. This paper presents open-switch fault diagnosis of a three-phase voltage source inverter used to drive an induction motor. The three-phase current is sensed in real time without an additional sensor and transformed to the d-q plane. Different patterns are observed for the healthy and faulty conditions of the switching devices, and these patterns are used to diagnose the faulty switch. Real-time results are presented to validate the proposed method, which is effective and robust.
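The abc-to-d-q transformation mentioned here is the standard Clarke transform followed by a Park rotation; a sketch with synthetic balanced currents is shown below (the test waveform is an assumption for illustration).

```python
import numpy as np

def abc_to_dq(i_a, i_b, i_c, theta):
    """Amplitude-invariant Clarke transform followed by Park rotation."""
    i_alpha = (2.0 / 3.0) * (i_a - 0.5 * i_b - 0.5 * i_c)
    i_beta = (2.0 / 3.0) * (np.sqrt(3.0) / 2.0) * (i_b - i_c)
    i_d = i_alpha * np.cos(theta) + i_beta * np.sin(theta)
    i_q = -i_alpha * np.sin(theta) + i_beta * np.cos(theta)
    return i_d, i_q

# Balanced healthy currents trace a circle in the alpha-beta plane; an open
# switch distorts that pattern, which is what the diagnosis inspects.
t = np.linspace(0, 0.04, 1000)
w = 2 * np.pi * 50
ia = np.sin(w * t)
ib = np.sin(w * t - 2 * np.pi / 3)
ic = np.sin(w * t + 2 * np.pi / 3)
i_d, i_q = abc_to_dq(ia, ib, ic, w * t)
print(i_d.mean(), i_q.mean())   # near-constant values for the healthy case
```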

N. D. Thombare
Analysis of Dynamic Scheduling Algorithm for Reconfigurable Architecture

A dynamic scheduling algorithm is proposed to organize the upcoming tasks to be executed by the newly described architecture. To exhibit the concept, two tasks are considered third-party tasks and the third task is deemed a routine-activity task. These tasks are operated in concurrent mode and in priority-driven mode using the proposed dynamic scheduler. The scheduler is conceptualized and implemented using a concurrent architecture, and the system is critically analyzed with respect to the possible traits of the applications. For cogent analysis, the system is time-simulated using Modelsim and subsequently synthesized to compare the outcomes and to pick the best-performing device with respect to the application need. Finally, a tightly optimized version of the proposed system is introduced at different integration scales. Towards the end, the task completion time and task switching time are computed and compared with previously reported data.

Pushpa M. Bangare, M. B. Mali
Using Face Recognition to Find the Culprit from a Video Footage and Crime Mapping

Recently, face recognition has attracted ample publicity in the mass-communication and information-access society. With the abundance of video, areas such as public security, content-based indexing and retrieval, and video compression benefit from face recognition technology, since "people" are the center of focus. Face detection is one of the hottest analysis areas in computer vision: a technology employed in a variety of applications for recognizing human faces in digital images [1]. Research in this area is growing in several fields of science and technology. Localization of human faces is considered the main and initial stage of face detection analysis, for example in home video, police work, etc.; face localization is often described as extracting the recognition pattern from the facial appearance. Both MATLAB and OpenCV are frequently used to generate these prototypes and systems. In this paper we conduct our evaluation using OpenCV, since it uses less time and fewer resources in the image-processing stage and requires less coding, and crime visualization makes it easier to spot the wrongdoer across many locations.
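A minimal OpenCV sketch of the face-localization step on recorded footage, using the Haar cascade that ships with OpenCV; the video file name is a placeholder, and a recognizer trained on suspects (not shown) would consume the detected crops.

```python
import cv2

# Frontal-face Haar cascade bundled with OpenCV
cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

cap = cv2.VideoCapture("cctv_footage.mp4")   # hypothetical input file
while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    faces = cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    for (x, y, w, h) in faces:
        cv2.rectangle(frame, (x, y), (x + w, y + h), (0, 255, 0), 2)
        # Each crop could be passed to a recognizer trained on suspects.
    cv2.imshow("faces", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break
cap.release()
```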

Ch. Sushma, K. Padmini, P. Sunil
Comparison of Texture Based Feature Extraction Techniques for Detecting Leaf Scorch in Strawberry Plant (Fragaria × Ananassa)

This paper presents a comparison among techniques that use texture features as the basis for detecting diseases in captured leaf images of the strawberry plant. Local Binary Patterns (LBP), Complete Local Binary Patterns (CLBP) and Local Ternary Patterns (LTP) are used to extract the features, and their accuracies are compared. LBP is used because its features remain unchanged under monotonic gray-scale changes, such as the effects caused by illumination variations. CLBP is used since it conveys more discriminant information about the local structure that is ignored by LBP, and LTP is used since it is more robust to noise than LBP, which may improve the accuracy of the system. The images are taken from the PlantVillage dataset, which contains healthy leaf images and leaf-scorch-diseased leaf images.
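A sketch of the LBP feature-extraction step using scikit-image (the file name and the 8-neighbour, radius-1 configuration are illustrative assumptions); CLBP and LTP would replace this descriptor in the compared variants.

```python
import numpy as np
from skimage.feature import local_binary_pattern
from skimage.io import imread

leaf = imread("strawberry_leaf.png", as_gray=True)   # hypothetical image

P, R = 8, 1                                          # 8 neighbours, radius 1
lbp = local_binary_pattern(leaf, P, R, method="uniform")

# 'uniform' LBP yields P + 2 distinct codes; the normalized histogram is the
# texture feature vector fed to a classifier.
hist, _ = np.histogram(lbp, bins=np.arange(P + 3), density=True)
print(hist)
```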

Kirti, Navin Rajpal, Mukta Arora
Robotic Application in Stress Management Among Students in India

This paper explores the use of robots in the educational sector, with special reference to managing stress among students in India. As stress, anxiety and depression, in severe cases leading to suicide, have become common among students, the mental health of students is of utmost importance in order to create an inclusive environment for learning. Robots can be an effective apparatus for a hassle-free, stress-free learning environment, which will in turn reduce stress and its negative impact in the long run. At present only a few start-ups have taken up this initiative, and in future we can hope that government too will follow, towards building a knowledgeable, stress-free academic environment for learning and development.

K. S. Madhusudan, GeeVarghese
Periodical Fruit Quality Identification—A Broad View

Nowadays, maintaining health in a polluted environment is a crucial part of routine life. People who do not focus on their health may suffer from different types of diseases, and to maintain good health they must visit the nearest hospital, or a multi-specialty or super-specialty hospital, for diagnosis and treatment. To keep the body energetic and healthy, doctors often suggest eating fruit, which plays a very important role in keeping the body and health in proper condition. Therefore, this paper studies fruit quality identification. The study shows that many researchers have worked on finding fruit defects using different types of techniques. Overall, the work on periodical fruit quality identification is reviewed on the basis of color, texture and classification techniques.

Rahul J. Mhaske, Siddharth B. Dabhade, Suhas Mache, Khan Sohel Rana, Prapti Deshmukh
SMS Spam Filtering Using Machine Learning Technique

Sending and receiving SMS is an ordinary part of any individual's daily life. But when we frequently receive undesirable SMS that waste our time and money, the experience becomes unpleasant. Undesirable messages sent erratically to a huge volume of recipients cause displeasure to consumers but yield large profits to spammers. Factors such as high internet speed, very cheap smartphones and the user-friendly interfaces of the mobile web and mobile applications attract a huge volume of mobile phone users, and these key factors are expected to shape the future of the market. This paper focuses on SMS spam filtering techniques and compares their performance. We compared the performance of several machine learning models, and the results indicate that the Logistic Regression model performed well, with accuracy reaching up to 96.59%.
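A minimal scikit-learn sketch of the winning approach (TF-IDF features plus Logistic Regression); the four-message corpus is a toy stand-in for the labelled SMS dataset the paper evaluates.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Tiny illustrative corpus; the paper would use a full labelled SMS dataset.
texts = ["WIN a FREE prize now!!!", "Call this number to claim cash",
         "Are we meeting for lunch?", "I will reach home by 8"]
labels = [1, 1, 0, 0]                      # 1 = spam, 0 = ham

model = make_pipeline(TfidfVectorizer(), LogisticRegression())
model.fit(texts, labels)
print(model.predict(["free cash prize, call now"]))   # -> [1]
```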

Arvind Kumar Vishwakarma, Mohd Dilshad Ansari, Gaurav Rai
A Review on IOT Technology Stack, Architecture and Its Cloud Applications in Recent Trends

The Internet of Things (IoT) senses, gathers and transmits data over the internet without any human interference. This technology is a mixture of embedded technology, network technology and information technology. With the advancement of large networks and the spread of IoT, wireless sensor networks are considered part of a huge heterogeneous network. IoT architecture is a system of various elements such as sensor networks, protocols, actuators, cloud services and layers; the Internet of Things can also be viewed as an event-driven model. IoT devices connect to a gateway through radio frequency links, LoRaWAN, or NodeMCU pin-outs. This review paper describes the full protocol stack, the types of sensors in IoT, their applications in real-time environments, and the overall architecture. We also bring together two different technologies, cloud computing and IoT, to observe their most common features and to determine the benefits of their integration. The Cloud-IoT paradigm involves various applications, research issues and challenges.

Mandla Alphonsa
Lung Cancer Diagnosis from CT Images Based on Local Energy Based Shape Histogram (LESH) Feature Extraction and Pre-processing

Lung cancer is currently one of the most dreaded diseases, and it is harming humanity like never before. Mechanisms for detecting lung cancer early can bring down mortality and improve on the roughly 13% life expectancy seen for diagnosed cases of a cancer that accounts for some 24% of all cancer deaths. Although various methods are adopted to find cancer, there is still scope for improvement, and CT images remain the preferred way to look for cancer in the body; medical images are always a better means of finding cancer in the human body. The proposed idea is to improve the quality of diagnosis using pre-processing methods and the Local Energy-based Shape Histogram (LESH) to improve the quality of the images. Deep learning methods are imported to obtain the varied results from the training process and finally to analyse the result; medical examination is always part of our research, and the results are verified by technicians. The major pre-processing techniques used in this research work are discussed in this paper. The LESH technique is used to obtain better results, and we discuss how image manipulation can be done to achieve better results from CT images through various image processing methods. The proposed method includes smoothing of the images with median filters, enhancement of the image, and finally segmentation of the images with LESH techniques.

Denny Dominic, K. Balachandran
Comparative Evaluation of SMMD Values of Popular Social Media Sites: PGF-A High SMMD Case

In recent years, various social media applications have been using content in different multimedia forms. For the distribution of content as well as social interaction on the internet and social media, multimedia content is extensively used; this is called social multimedia. A major trend in current studies on social multimedia is using social media sites as a source of huge amounts of data for solving various problems in computer science applications. The wisdom of social multimedia lies in the usage of these multimedia elements. A few social media websites, along with the PGF site, are considered here for evaluation of their social multimedia degree (SMMD). PGF, the Peoples' Governance Forum, was established in 2017 without any iota of personal benefit, with the good cause of shouldering the national responsibility of disseminating national integration among students and providing basic awareness of how to use the internet positively. This paper presents an evaluation of the SMMD of "social multimedia" applications in tabular form. The PGF is observed to have a high SMMD.

B. Malathi, K. ChandraSekharaiah
Application of FACTS Controllers for Enhancement of Transient Stability

This paper illustrates power system transient stability enhancement of an 11-bus system using FACTS controllers. Three cases are analyzed on the IEEE 11-bus system: without fault, with fault, and the post-fault system; the simulation is done via the PSAT software. Simulation results are compared for different series FACTS controllers, and it is observed that the UPFC is the best series FACTS controller for enhancement of transient stability as compared to the other series FACTS controllers.

Lokesh Garg, Shagufta Khan
Cryptocurrency: Threat or Opportunity

Though a weak currency is a sign of a weak economy, and a weak economy leads to a weak nation, here we attempt to study cryptocurrency. Cryptocurrency, also known as the digital currency of the 21st century, moves in the form of cryptographic codes between people or institutions connected to peer-to-peer (P2P) networks. Blockchain technology plays a major role in the flow of cryptographic codes among the various nodes of P2P networks, converting them into cryptocurrency in a decentralized-ledger environment. This paper investigates the working of cryptocurrency and its impact on economies, especially listing the threats and opportunities for the Indian economy. We also attempt to differentiate cryptocurrency from fiat or real currencies on various aspects of the world economy. We used a descriptive cum exploratory research methodology to obtain the desired results in the present research work.

Venkamaraju Chakravaram, Sunitha Ratnakaram, Ester Agasha, Nitin Simha Vihari
The Role of Blockchain Technology in Financial Engineering

This research work studies and lists the processes and operational areas where Blockchain Technology (BCT) plays a greater role as a tool in the process of financial engineering (FE) in the insurance business. We studied the use of BCT as one of the InsurTech tools in the design and development of financially engineered insurance products. Here, the development of insurance products covers the design of new and innovative insurance policy models and their attractive features as per the needs and requirements of the target customers. Insurance processes cover the management and administration of the insurance business, i.e., marketing, sales and distribution, the underwriting process, claims management, etc. Financial engineering is the process of creating a new and innovative insurance model, either by merging existing policy models or from scratch. FE uses the tools and techniques of statistics, financial mathematics, econometrics and ICTs, which include FinTech and InsurTech tools like Blockchain Technology, Artificial Intelligence, etc. In this research work we used a descriptive cum explorative research methodology to study the role of BCT as an effective tool in the financial engineering process of the insurance business.

Venkamaraju Chakravaram, Sunitha Ratnakaram, Ester Agasha, Nitin Simha Vihari
Identification of Malignant Region Through Thermal Images: Study of Different Imaging Techniques

The human body has a uniquely defined structure, which can be differentiated from any unwanted growth of body tissues or muscles or, sometimes, diseased cells. In medical science, an unwanted growth of mass in the body is termed a malignant region, and abnormal growth of any cell is called cancer. This abnormal growth can affect any part or location of the body; each such location is called a malignant region. In this paper we study different imaging techniques, focusing on malignant-region discovery methods such as thermal imaging, X-ray imaging, Magnetic Resonance Imaging (MRI) and optical imaging, and on how thermal imaging detects abnormal cells in the human body. Malignant regions of the body help confirm the cancer cells present within the body, and early detection of a malignant region helps save an individual's life. To obtain thermal images, a thermal camera is used; it can detect a wide range of body temperatures and resolve differences as small as 0.1 °C. The temperature can vary with the physiological and emotional state of the human body.

K. Lakshman, Siddharth B. Dabhade, Sachin N. Deshmukh, Mrudul Behare, Ranjan Maheshwari
Multi Criteria Decision Making Under Fuzzy, Intuitionistic and Interval-Valued Intuitionistic Fuzzy Environment: A Review

Multi-criteria decision making (MCDM) problems can be handled with the help of fuzzy set theory; moreover, this theory has been extended to a new concept called the intuitionistic fuzzy set (IFS). There has been an increasing demand for research under the fuzzy environment because of numerous applications such as artificial intelligence, medical science, logic programming, medical diagnosis, neural networks and machine learning. We have systematically conducted a review of this topic after a deep analysis of 50 research papers, to provide a new framework drawn from the existing theoretical results, logic and applications.

Suman, Namita Saini, Neeraj Gandotra, Ravinder Kumar
Speech and Facial Based Emotion Recognition Using Deep Learning Approaches

Deep learning models based on static feature vectors as well as normalized temporal feature vectors were used to recognize the emotional state from speech. In addition, relative features, obtained by computing the changes of the acoustic features of emotional speech relative to those of neutral speech, were adopted to weaken the influence of individual differences. Methods to relativize static features and temporal features are introduced separately, and experiments on a German database and a Mandarin database were executed. The results show that, overall, the performance of relative features exceeds that of absolute features for emotion recognition. In the speaker-independent case, the hybrid of the static relative feature vector and the normalized relative temporal feature vector achieves the best results. A second purpose of this paper is to give an introduction to the needs and uses of facial expression recognition. Facial expression is a non-verbal form of communication: it conveys the human frame of mind and reveals mental state. A number of studies have been carried out over recent decades to improve human-computer interaction. This paper therefore also surveys facial expression recognition, its applications, related studies of face expression recognition systems, and their steps.
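A sketch of how such relative acoustic features could be formed, assuming MFCCs via librosa as the static descriptor and subtracting a neutral-speech baseline; the file names are hypothetical and the paper's exact feature set may differ.

```python
import numpy as np
import librosa

# Static spectral features for an emotional and a neutral utterance
# ("emotional.wav" / "neutral.wav" are hypothetical file names).
y_emo, sr = librosa.load("emotional.wav", sr=None)
y_neu, _ = librosa.load("neutral.wav", sr=None)

mfcc_emo = librosa.feature.mfcc(y=y_emo, sr=sr, n_mfcc=13).mean(axis=1)
mfcc_neu = librosa.feature.mfcc(y=y_neu, sr=sr, n_mfcc=13).mean(axis=1)

# Relative features: change of the emotional features with respect to the
# speaker's neutral baseline, weakening individual differences.
relative = mfcc_emo - mfcc_neu
print(relative.shape)   # (13,) -> input vector for the deep model
```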

M. M. Venkata Chalapathi
Graph: An Efficient Data Structure to Represent and Interpret Semantic Information

The graph, a popular data structure, is quite useful for modeling a variety of real-life problems. In the language-understanding domain, semantic analysis attempts to map syntactic structures of a language, such as sentences and paragraphs as a whole, to their language-independent meanings, focusing mainly on the context window surrounding individual words. To represent the context between words as relationships between pairs of words, the graph data structure is quite suitable. A Knowledge Base (KB) is a special type of graph which stores data in the form of entities as nodes and relations between entities as edges. Knowledge bases mostly follow the Resource Description Framework (RDF) standard to store relational data. Semantic knowledge graphs automatically identify relationships between entities to form a compact graphical representation of the knowledge in a given domain from a data corpus.
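A minimal sketch of an RDF-style knowledge base using the rdflib library; the namespace and triples are toy assumptions used only to show entities as nodes and relations as edges.

```python
from rdflib import Graph, Namespace

EX = Namespace("http://example.org/")   # toy namespace for illustration
kb = Graph()

# Entities are nodes, relations are edges: (subject, predicate, object) triples.
kb.add((EX.Einstein, EX.bornIn, EX.Ulm))
kb.add((EX.Einstein, EX.fieldOf, EX.Physics))
kb.add((EX.Ulm, EX.locatedIn, EX.Germany))

# Traverse every relationship attached to one entity.
for s, p, o in kb.triples((EX.Einstein, None, None)):
    print(s.n3(), p.n3(), o.n3())
```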

Ashwini V. Zadgaonkar, Avinash J. Agrawal
Application and Impact of Power System Optimization on Non Linear Problem

Several methods have been presented to obtain the optimized value of a given function in power system optimization. The main goal of all these efforts is to minimize the required effort or maximize the desired benefit in a practical situation; the literature on engineering optimization is vast and diverse [2]. Energy systems operate at peak levels across all sectors, from power generation through the transmission of energy to its distribution, which is the major issue [1]. With the increase in the usage of electric systems, a huge rise in system complexity can be seen; likewise, the utility of generation planning has come into focus and cannot be ignored. Thus there is an interconnection between the systems that involve transmission and the utility system [7].

Sadaf Qasim, Geetika Pandey
Compressive Sensing and Contourlet Transform Applications in Speech Signal

This paper explains a new method for performing two different processes, compression and encryption, in a single algorithm. Speech compression is the process of converting speech signals into a compact form that keeps good perceptual quality for communication and storage, minimizing the size of the data without losing the information quality of the original speech. Speech encryption, on the other hand, is the process of converting the usual format into an unrecognizable format to secure the data across an insecure channel from the transmitter. These two processes can be achieved together by a compressive sensing algorithm. In addition to compressive sensing, the contourlet transform, a two-dimensional transform method for image representations, is used to demonstrate the compressive sensing concept; it plays an important role in representing the sparse structure of the signal.
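A minimal compressive-sensing sketch under standard assumptions (a k-sparse signal, a random Gaussian sensing matrix, recovery via Orthogonal Matching Pursuit from scikit-learn); it stands in for the paper's speech pipeline, where sparsity would come from the contourlet transform.

```python
import numpy as np
from sklearn.linear_model import OrthogonalMatchingPursuit

rng = np.random.default_rng(0)
n, m, k = 256, 64, 8            # signal length, measurements, sparsity

x = np.zeros(n)                 # k-sparse stand-in for a transform-domain signal
x[rng.choice(n, k, replace=False)] = rng.standard_normal(k)

phi = rng.standard_normal((m, n)) / np.sqrt(m)   # random sensing matrix
y = phi @ x                                      # compressed measurements

omp = OrthogonalMatchingPursuit(n_nonzero_coefs=k, fit_intercept=False)
omp.fit(phi, y)
print(np.allclose(omp.coef_, x, atol=1e-6))      # exact recovery w.h.p.
```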

Korla Ramya, Vijayasri Bolisetti, Durgesh Nandan, Sanjeev Kumar
An Overview of Fog Computing

We all know that cloud computing is used for the processing, analysis and storage of data from client devices and networks. Since the evolution of IoT technology, data has been generated on a large scale, and until now it could be handled with cloud computing alone. However, with more than 45 billion IoT devices expected by the year 2021, the present cloud computing network is no longer sufficient to handle such a large amount of data, due to its volume, latency and huge bandwidth requirements. The fog computing model is used to address these concerns faced by cloud computing.

Jagadeeswari Sambangi, Parvateesam Kunda, Durgesh Nandan, Sanjeev Kumar
Multi-point Data Transmission and Control-Data Separation in Ultra-Dense Cellular Networks

In this paper, cell planning for upcoming wireless communication systems such as mobile and radar is investigated. Cell planning (CP) is the most significant stage in the life cycle of a cellular system, and cellular operators are dealing with network issues; this paper gives answers to a portion of these issues. However, the fact that small cells, a significant component of future networks, are expected to be deployed in an ad hoc fashion complicates CP for future 5G communications. Besides, emerging cellular systems that combine a wide range of cell sizes and types, heterogeneous networks (HetNets), energy efficiency, self-organizing network features, control- and data-plane split architectures (CDSA), massive multiple-input multiple-output (MIMO), cloud radio access networks, and millimeter-wave-based cells, in addition to the requirements to support the Internet of Things (IoT) and device-to-device (D2D) communication, necessitate a significant paradigm shift in the way cellular networks have been planned before. This paper also deals with the automatic selection and configuration of base stations for mobile cellular networks.

Krishna Pavani Karri, R. Anil Kumar, Sanjeev Kumar
Review of 5G Communications Over OFDM and GFDM

The future generations of wireless communication’s main aim is to provide increasing and required demands of the users, mainly focusing on the flexibility, bandwidth, low latency, spectral efficiency. Orthogonal frequency division multiplexing (OFDM) is used in 4G communication systems. To reduce the drawbacks of 4G moved in 5G communication systems in which Generalized frequency division multiplexing (GFDM) is worn. In this manuscript, mainly learn the modulation techniques used for 5G communications like GFDM and OFDM. These modulation techniques are operated in additive white Gaussian noise channel (AWGN). Fast Fourier Transform (FFT) and Inverse Fast Fourier transform (IFFT) are used for performance evaluation and analysis in OFDM. In GFDM, zero forcing receiver (ZFR) is used which represents a crucial task in digital broadcasting. When ZFR is considered the GFDM performs better interference.

Pasupuleti Sai Deepthi, Vura Sai Priyanka, R. Anil Kumar, Sanjeev Kumar
An Overview of Biometrics and Face Spoofing Detection

Biometric systems have been widely used and have improved substantially; they are used for person authentication and verification, and biometrics plays a key role in personal, national and global security. The main purpose of this paper is to convey the importance of biometric systems. Different types of biometric systems are in use these days, including iris recognition, palm vein biometrics, voice recognition and face recognition, and the main threat to all of them is spoofing. The facial recognition biometric system is the most widely used compared to other biometric systems. Different types of facial spoofing include mask attacks, photo attacks and video attacks, and the different methods used to detect face spoofing are discussed in this paper. These face spoofing detection techniques include the use of LBP, DMD, SVM, PReLU, IDA, LFHOG and CNN, as well as combinations of LBP, CNN and IDA. The results strongly indicate that implementing the above techniques will detect face spoofing.

Sista Venkata Naga Veerabhadra Sai Sudeep, S. Venkata Kiran, Durgesh Nandan, Sanjeev Kumar
Efficient Dual Axis Solar Tracking System

This generation is facing an enormous energy crisis, and the generation of electrical energy is not keeping up with demand. To overcome this scarcity, renewable energy is a better answer, and using solar energy, the most dominant renewable resource, could be a major solution. The performance of a dual-axis solar tracker using an Arduino is presented in this paper, and this research helps determine which is more efficient, a solar tracker or a static solar panel. The work involves two parts, software and hardware. The hardware part comprises light-dependent resistors (LDRs), which detect the light from the sun, and servo motors, which move the solar panel towards the position where the light indicated by the LDRs is maximal. The software part is written in C. The tracker's output is compared with a static solar panel, and it is found that the solar tracker is more efficient, capturing the maximum light and producing more power.
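The control loop can be sketched as below; this is a Python rendering of the logic only, since the actual build is Arduino C, and the `read_ldr`/`set_servo` helpers are an assumed, hypothetical hardware interface.

```python
import time

# Hypothetical hardware helpers: read_ldr(name) returns light intensity,
# set_servo(axis, angle) drives the pan/tilt servos (the real build wires
# these to Arduino pins and is written in C).
from hardware import read_ldr, set_servo   # assumed interface

az, el, STEP, DEADBAND = 90, 45, 1, 10

while True:
    east, west = read_ldr("east"), read_ldr("west")
    top, bottom = read_ldr("top"), read_ldr("bottom")
    if abs(east - west) > DEADBAND:          # rotate toward the brighter LDR
        az += STEP if east > west else -STEP
    if abs(top - bottom) > DEADBAND:
        el += STEP if top > bottom else -STEP
    set_servo("azimuth", max(0, min(180, az)))
    set_servo("elevation", max(0, min(180, el)))
    time.sleep(0.5)
```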

H. N. Shashank, C. Hithashree
Prediction of Water Consumption Using Machine Learning Algorithm

Machine learning has been successfully applied to real-world problems, where it is used to extract and identify valuable new knowledge from given data. In this paper, a model which can predict the annual water consumption of a person is developed. Water scarcity is a major problem in metropolitan cities these days, so the main goal is to predict water usage for upcoming years; predicting yearly water consumption is crucial for conserving water for future generations. In this regard, important data and information are gathered on a regular basis and maintained to quality standards by the appropriate authorities. We collect data on a person's water usage over the past few years, then integrate and utilise the data; the analysis of water consumption is accomplished by converting this data into knowledge. Supervised algorithms are used to predict the amount of water consumed, and the learning methods are evaluated on prediction accuracy, user-friendly characteristics and ease of learning.

P. Poornima, Sushmitha Boyapati
Simulation of Cascaded H-Bridge Multilevel Inverter Using MATLAB/SIMULINK

The growing demand for power finds a wide range of applications in hybrid/electric vehicles, portable consumer devices, industrial control systems, and solar power systems. Since a multilevel inverter has low harmonics, it is widely used in energy distribution and control. The cascaded H-bridge inverter topology is preferred most because of its simple control, reliability and capacitor balance. Sinusoidal Pulse Width Modulation (SPWM) is commonly implemented in an inverter circuit since it improves efficiency. This paper covers the basic theory of single-phase and three-phase multilevel inverters of different levels using the SPWM technique, their simulation models, results and switching patterns. The design uses the H-bridge topology with MOSFETs as switches, and the simulation of the system is constructed with the help of MATLAB/SIMULINK.
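The core of SPWM is comparing a sinusoidal reference against a high-frequency triangular carrier to derive the gate signal; a NumPy sketch follows (the 50 Hz fundamental, 2 kHz carrier and 0.8 modulation index are illustrative choices, not the paper's settings).

```python
import numpy as np

t = np.linspace(0.0, 0.02, 20000)           # one 50 Hz fundamental period
m_index, f_ref, f_carrier = 0.8, 50.0, 2000.0

reference = m_index * np.sin(2 * np.pi * f_ref * t)                   # sine reference
carrier = (2 / np.pi) * np.arcsin(np.sin(2 * np.pi * f_carrier * t))  # triangle wave

gate = (reference > carrier).astype(int)    # PWM gate signal for one switch
print("duty over the period:", gate.mean()) # ~0.5 for a symmetric reference
```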

C. Hithashree, M. K. Bharath, H. N. Shashank
Design of Two Way Solar Tracking

Solar tracking has created a great trend in the field of renewable energy production. In this paper we propose a two-way solar tracking system that increases the total efficiency by tracking the sun on two axes; to accomplish this, two motors provide the movement along the two axes. The paper also notes that the use of this method entails higher power consumption and operating cost.

M. Ashwin, S. Yashwanth Gowda
Authenticated and Privacy Ensured Smart Governance Framework for Smart City Administration

Managing cities efficiently requires a great combination of top-down and bottom-up ICT-enabled methodologies, which makes city governance a complex phenomenon. To develop smart city solutions, the upgrading of major sources of urban, socio-economic and environmental data needs to be considered. This has given the notion of smart cities huge momentum and has attracted researchers to work on the issue over the past decade. From the administrator's point of view, ICT collects and processes city data in order to achieve secure, effective and efficient city planning, decision making, and smart governance. We understand that developing innovative domain-specific applications to access urban data and administrate the city with the help of smart devices is a challenging task, and there arises a strict need for the administration to handle privacy and security issues. We suggest studies on ensuring security that yield various benefits, such as uninterrupted smart governance and authentication, with the help of high-performance computing paradigms like cloud computing enriching big data management approaches to provide smart governance.

Srinivas Jangirala, Venkamaraju Chakravaram
Booth Multiplier: The Systematic Study

The Booth multiplier plays a major role in digital integrated circuits, where multipliers are used for arithmetic operations. There are several digital multipliers used in different VLSI applications. This paper reviews different types of Booth multipliers, their comparison, advantages, drawbacks and extensions, along with the basic architecture of the Booth multiplier and its algorithm. Taking into consideration the power consumption, delay time, chip area and overall performance, we can judge the efficiency of the multipliers; remodeling the modules in the multipliers reduces partial-product generation in the Booth encoder. The Wallace-Booth multiplier uses a modified encoder to overcome the drawback reported in a 2009 paper, and the drawbacks of the array multiplier are also overcome by the Wallace-Booth multiplier. We observe that by modifying the modules in the Booth multiplier we can reduce power consumption and increase scalability.
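For reference, a software rendering of the radix-2 Booth algorithm that the hardware implements: inspect the two lowest bits, conditionally add or subtract the multiplicand, then arithmetically shift the combined register.

```python
def booth_multiply(m, q, bits=8):
    """Radix-2 Booth multiplication of two signed integers."""
    mask = (1 << bits) - 1
    M, A, Q, q_1 = m & mask, 0, q & mask, 0
    for _ in range(bits):
        last_two = (Q & 1, q_1)
        if last_two == (1, 0):            # 10: accumulator -= multiplicand
            A = (A - M) & mask
        elif last_two == (0, 1):          # 01: accumulator += multiplicand
            A = (A + M) & mask
        # Arithmetic right shift of the combined register [A, Q, q_1]
        q_1 = Q & 1
        Q = (Q >> 1) | ((A & 1) << (bits - 1))
        A = (A >> 1) | ((A >> (bits - 1)) << (bits - 1))
    result = (A << bits) | Q
    return result - (1 << 2 * bits) if result >> (2 * bits - 1) else result

assert booth_multiply(3, -4) == -12
assert booth_multiply(-3, 5) == -15
```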

B. Venkata Dharani, Sneha M. Joseph, Sanjeev Kumar, Durgesh Nandan
Systematic Observation on Non-orthogonal Multiple Access for 5th Generation Communication Technology

Non-Orthogonal Multiple Access (NOMA) has become a popular enabling technology that offers high capacity, low latency and massive connectivity to meet the diverse demands of fifth-generation (5G) wireless networks. It is a multiple access scheme. Since 5G networks are regarded as Heterogeneous Networks (HetNets), the performance of NOMA on 5G HetNets is widely studied. In this paper, we briefly show how NOMA strategies have evolved step by step from Single-Carrier NOMA (SC-NOMA) into Multi-Carrier NOMA (MC-NOMA). We then inquire into the essentials, enabling schemes and advances of the two most promising MC-NOMA strategies, namely Pattern Division Multiple Access (PDMA) and Sparse Code Multiple Access (SCMA). Visible light communication (VLC) builds on the existing lighting infrastructure as a platform for wireless communication; it is an efficient, secure and high-throughput wireless access technology. Our focus on multi-user VLC systems is an attempt to turn VLC into a scalable and fully networked wireless technology.
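The basic power-domain NOMA principle (superposition coding at the transmitter, successive interference cancellation at the receiver) can be sketched for two BPSK users; power split and noise level below are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 100_000
b_far, b_near = rng.integers(0, 2, n), rng.integers(0, 2, n)
x_far, x_near = 1 - 2 * b_far, 1 - 2 * b_near        # BPSK symbols

p_far, p_near = 0.8, 0.2          # more power to the far (weak-channel) user
y = np.sqrt(p_far) * x_far + np.sqrt(p_near) * x_near \
    + 0.1 * rng.standard_normal(n)                    # superposed + noise

x_far_hat = np.where(y < 0, -1, 1)          # decode the strong signal first
y_residual = y - np.sqrt(p_far) * x_far_hat # successive interference cancellation
x_near_hat = np.where(y_residual < 0, -1, 1)

print("far BER:", np.mean(x_far_hat != x_far),
      "near BER:", np.mean(x_near_hat != x_near))
```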

Muppana Sonika, S. B. G. Tilak Babu, Durgesh Nandan
Interactive Security of Ransomware with Heuristic Random Bit Generator

Nowadays the internet is an important part of our life, but we should use it carefully because many cyber threats lurk on the web. One crucial attack is the ransomware attack. The basic concept of ransomware was introduced in 1995 as a cryptovirus; nevertheless, for more than a decade afterwards it was considered merely a theoretical topic. In 2017, ransomware came to life, with many widely publicized incidents targeting critical computer systems around the world; the damage caused by CryptoLocker and WannaCry, for instance, is massive and worldwide. Criminals encrypt victims' data and demand an enormous amount of money to decrypt it. The recovery key cannot be found in the ransomware's footprint on the victim's system, because public-key encryption is used; consequently, after the damage is done, the system cannot be restored without paying the recovery cost. Antivirus researchers and network security experts have developed various methods to counter this risk. Nevertheless, a cryptanalytic defence is assumed to be infeasible, because recovering a victim's files is computationally as difficult as breaking a public-key cryptosystem. Recently, various techniques have been suggested to protect an OS's crypto-API from malicious code. Almost all ransomware uses the random-number generation services offered by the victim's operating system to derive encryption keys; therefore, if a user can monitor all the random numbers created by the program, he or she will be able to recover the random numbers the ransomware used for its encryption keys. In this paper we suggest a flexible ransomware defence approach which substitutes the OS's random number generator with a user-defined random number generator. Since the proposed method causes the malware to generate keys based on the output of the user-defined generator, an infected file system can be recovered by reproducing the keys the attacker used to perform the encryption.
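A minimal sketch of the idea of a user-controlled generator whose outputs are recorded for later key recovery; this is an illustration of the concept, not the authors' generator design, and a real deployment would have to protect the log itself.

```python
import os

class LoggingRNG:
    """User-defined RNG drop-in: returns OS randomness but keeps a copy,
    so keys derived from it can be reproduced after an infection (a sketch
    of the paper's idea, not its actual implementation)."""

    def __init__(self, log_path="rng_audit.bin"):
        self._log = open(log_path, "ab")

    def urandom(self, n: int) -> bytes:
        data = os.urandom(n)      # still cryptographically strong output
        self._log.write(data)     # ...but every byte handed out is recorded
        self._log.flush()
        return data

rng = LoggingRNG()
key_material = rng.urandom(32)    # what ransomware would consume for its key
# Recovery: replay rng_audit.bin to regenerate candidate keys and decrypt.
```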

Rahul Rastogi, Gaurav Agarwal, R. K. Shukla
Comparative Study of RSA with Optimized RSA to Enhance Security

Asymmetric cryptographic algorithms are a robust technology used for communicating text over a channel while reducing security risks. One of their drawbacks is the mathematics involved: the algorithms require a large amount of calculation, which increases the demand for computing power. This paper aims to refine the RSA encryption algorithm and thereby enhance information security, reliability and availability. The results show the information-security efficiency and usability of the RSA algorithm. We can also see that, when performing encryption and decryption, the time, space, processor and network overhead are lower than for other RSA solutions, since the computation is performed on both the client and the server.
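For readers unfamiliar with the baseline being optimized, textbook RSA looks as follows; the tiny primes are for illustration only, as real deployments use moduli of at least 2048 bits, padding such as OAEP, and vetted libraries.

```python
# Textbook RSA with tiny primes, for illustration only.
p, q = 61, 53
n, phi = p * q, (p - 1) * (q - 1)     # n = 3233, phi = 3120
e = 17                                 # public exponent, coprime with phi
d = pow(e, -1, phi)                    # private exponent: 2753

message = 65
cipher = pow(message, e, n)            # encryption: m^e mod n
plain = pow(cipher, d, n)              # decryption: c^d mod n
assert plain == message
```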

Amit Taneja, R. K. Shukla
A Generalized Framework for Technical Education and Implementation of Machine Learning Techniques

A framework is described in this paper. To enhance the teaching-learning process in education, it is important to find the critical factors and the ways they affect this process. In this paper a framework is designed for achieving highly specific and target-oriented objectives in education. The study is described in two sections: the first focuses on the framework, and the second on machine learning in education, along with the critical factors affecting teaching and learning.

Dipti Verma Nashine, K. Nirmala
Impact Study of Internet of Things on Smart City Development

The Internet of Things (IoT) is the best technology for developing a smart city. This paper gives brief information about developing a smart city with the help of IoT, which decreases expenses, provides efficient services and reduces wasted time. The most common problems for an IoT smart city are the parking system, water, the environment, and the drainage system; this paper gives solutions to these problems. IoT devices can send, receive and store data. Smart city development uses electronic devices, advanced sensors, and thousands of gadgets. The Internet of Things consists of sensors, networks, etc., all connected to the internet and the cloud to develop the city into a smart city. Nowadays technology is well developed, and people are more interested in smart work instead of hard work.

U. M. V. V. Hemanth, N. Manikanta, M. Venkatesh, M. Visweswara Rao, Durgesh Nandan
Modeling and Analysis of Security in Design Phase of IoT Based Applications Using Security Patterns

The Internet of Things (IoT) owes the growing popularity of its applications to its ability to transfer data through the Internet. Security is a critical concern in IoT-based applications, since sensitive data is communicated through networks, yet the major problem of IoT systems is that security is seldom considered. To address security in IoT, security features must be integrated into each phase of the IoT application development life cycle. In this paper, we model security features using mitigation use-case diagrams, which include security-pattern solutions, in the design phase of IoT applications.

E. R. Aruna, A. Rama Mohana Reddy, K. V. N. Sunitha
Trends in 6G Wireless Molecular Communications: A Succinct Study

In this paper, we discuss the potential of trends in 6G wireless molecular communications (MC) for upcoming generations of wireless networks. While 5G was expected to become significant in 2019, 6G is the burning topic of interest among researchers due to the various drawbacks of 5G, and initiatives have already been taken in numerous countries focusing on possible 6G research. The objective of this paper is to analyse the different aspects of 6G communication networks and motivate further investigation in this field. First, the advantages of 6G wireless MC are explained and compared with traditional wireless communication systems using electromagnetic waves at different micro and macro scales. Subsequently, the main challenges restricting the adoption of 6G wireless communication in the upcoming generation of wireless networks are identified. Finally, the significant applications of 6G wireless molecular communications in the fields of health care and smart infrastructure are discussed.

O. T. Ratna Deepthi, P. Sai Bhaktanjana Rao, P. Krishna Veni, Durgesh Nandan
Traffic Accident Injury and Severity Prediction Using Machine Learning Algorithms

Traffic crashes are a severe issue confronting the world, as they are the root cause of numerous deaths, injuries and fatalities, as well as financial losses, every year. An effective model to infer the severity of an accident is very helpful for both the traffic department and the general public. This study establishes models to select a set of influential variables and to classify the severity of injuries. The models are built using different machine learning methods: both supervised and unsupervised learning methods are applied to accident data. The major aim is to find the relationship between the various sorts of accidents and the kinds of injuries that may have occurred. The findings of this study demonstrate that unsupervised learning methods can be a promising tool for predicting the injury severity of traffic crashes.

Nithin Kashyap, Hari Raksha K. Malali, Koushik S. E, Raju G, T. H. Sreenivas
A Survey on Diabetes Prediction Using Machine Learning

Diabetes mellitus is a problem that affects people around the world; nowadays people from young to old suffer from diabetes. We can replace the traditional methods of diabetes prediction with modern technologies that save time. Much research has been carried out to predict diabetes, most of it using the Pima Indian dataset. We plan to use machine learning algorithms such as Support Vector Machines and Naïve Bayes; by using these algorithms to predict diabetes we can save time and obtain more accurate results.

K. J. Amulya, S. Divya, H. V. Deepali, S. Divya, V. Ravikumar
E-governance for Public Administration

The objective of public administration is to provide excellent governance in our country, and each country's plan differs from the others'. A public administration that is efficient and transparent, gives clear instructions to citizens, and reacts quickly and positively to their service requests is most important for the proper functioning of the country. Better e-governance involves government strategies through which multiple objective services can be delivered. An online interface likewise acts as a platform for citizens to avail themselves of information and for the administration office to spread awareness. Using this facility, the workings of offices can be made known to the general public, thereby making the system transparent.

Mahesh Kaluti, K. C. Rajani
Phishing URL Detection Using Machine Learning Techniques

Phishing is a criminal act performed online by impersonating others to obtain confidential data such as passwords, banking details and login credentials. Detecting such websites in real time is a complex and dynamic problem involving many factors. This work focuses on identifying the important features that distinguish phishing URLs from legitimate ones. To detect significant features, statistical analysis is performed on phishing as well as legitimate datasets, and based on this exploration, features derived from the URL, HTML, JavaScript and domain are extracted. The most prominent and relevant features for identifying phishing URLs are selected using correlation. The identified feature subsets are then used to train different machine-learning-based classifiers, and the accuracies obtained are compared. The experimental analysis shows that the extracted features detect phishing URLs efficiently, with the Decision Tree classifier found to have the highest prediction accuracy.
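A sketch of the URL-level portion of such a pipeline, pairing a handful of hand-crafted features with scikit-learn's Decision Tree; the feature list, example URLs and labels are illustrative assumptions, not the paper's feature set or data.

```python
from sklearn.tree import DecisionTreeClassifier

def url_features(url: str):
    """A few URL-level features of the kind the paper extracts (the full
    study also uses HTML, JavaScript and domain features)."""
    return [len(url), url.count("."), url.count("-"),
            int("@" in url), int(not url.startswith("https"))]

# Tiny illustrative sample; labels: 1 = phishing, 0 = legitimate
urls = ["https://www.example.com/login",
        "http://secure-update.account-verify.example-pay.ru/@signin",
        "https://docs.example.org/guide",
        "http://192.168.12.7/paypal.confirm.info/index"]
labels = [0, 1, 0, 1]

clf = DecisionTreeClassifier(random_state=0)
clf.fit([url_features(u) for u in urls], labels)
print(clf.predict([url_features("http://verify-account.example.biz/@login")]))
```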

A. Sirisha, V. Nihitha, B. Deepika
Stock Market Prediction Using ARIMA, ANN and SVR

Forecasting is the process of estimating the future price of market stocks and other financial commodities traded on an exchange. Successful estimation of a company's stock price may yield fruitful results for the company in terms of increased turnover. The efficient-market hypothesis holds that the current stock price reflects all currently available information and that price changes respond only to newly revealed information, making them inherently unpredictable and irregular. Others disagree, and those with this viewpoint possess myriad models, methods and expertise which purportedly permit them to estimate future prices. Machine learning methods such as Support Vector Regression (SVR), Artificial Neural Networks (ANN) and other models may be thought of as mathematical function approximators. The most familiar form of ANN for stock market prediction is the feed-forward network, which employs the backpropagation-of-errors algorithm to update the network weights. The dataset for the proposed work has been collected from MSFT (Microsoft Inc): historical daily price data is taken, and all stock price data is kept for consideration. The proposed work develops a stock prediction model based on SVR.
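A minimal SVR sketch for this kind of next-day prediction, assuming lagged closing prices as features; a synthetic random-walk series stands in for the MSFT data, and the kernel and hyperparameters are illustrative choices.

```python
import numpy as np
from sklearn.svm import SVR

prices = np.cumsum(np.random.default_rng(0).standard_normal(300)) + 100
# Placeholder series; the paper uses MSFT historical daily closes.

LAGS = 5                           # predict next close from the last 5 closes
X = np.array([prices[i:i + LAGS] for i in range(len(prices) - LAGS)])
y = prices[LAGS:]

split = int(0.8 * len(X))
model = SVR(kernel="rbf", C=100.0, epsilon=0.1).fit(X[:split], y[:split])
pred = model.predict(X[split:])
print("test RMSE:", np.sqrt(np.mean((pred - y[split:]) ** 2)))
```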

Divya Sharma, Sandeep Kumar Singla, Amandeep Kaur Sohal
A Mining Framework for Efficient Leakage Detection and Diagnosis in Water Supply System

A smart-city, smart-meter water grid has to be reliable and capable of safeguarding a 24 × 7 trustworthy water distribution network that minimizes wastage from pipeline leakages. Distributors and consumers are turning to the Internet of Things and deep learning to meet this requirement, since continuously monitoring the system and taking measurements manually is a tedious job. Smart nodes with Hall-effect sensors provide continuous measurements, captured from the smart-city water distribution network through smart meters and warehoused in a database. This paper deals with the detection of leakages using a deep learning technique. To estimate leakage and locate its exact position in water distribution pipelines, the proposed framework uses pulse rate, flow rate and quantity as the prime attributes. The experiments carried out exhibit the significance of deep learning in leakage detection.

P. Vasanth Sena, Sammulal Porika, M. Venu Gopalachari
Wireless Powered Uplink of NOMA Using Poisson Cluster Process with Two Orthogonal Signal Sets

NOMA is the mobile technology that embraces and satisfies the needs of the upcoming generation of mobile communication. In this regard, we analyze the behavior of NOMA in wireless communication with two orthogonal waveforms. In this paper, we review the previous generations of mobile communication technology and their features, discuss the need for NOMA, its basic principle and its features, and focus on the drawbacks and research challenges in NOMA by comparing different approaches and processes. We also discuss the parameters NOMA should achieve, and has already achieved, to deliver high efficiency.

Ashok Kumar Kona, R. Anil Kumar, Sanjeev Kumar
Documentation on Smart Home Monitoring Using Internet of Things

The Internet of Things (IoT) is essentially a connection of many things, appending them to the real-world environment by interconnecting them. These days we frequently encounter home automation systems, which have gained enormous popularity over the last few decades through the IoT. Amid this tremendous growth of technology, the IoT marks its own identity by making life easier in this busy world, focusing mainly on a safe and secure quality of life. The IoT plays a crucial role by handling a plethora of connections among billions and trillions of things, devices and people. From the start of the day to its end we must manage many things around us, and sometimes there is no time to take proper care of home automation. The IoT interrelates a set of things on a single base and controls them; it not only controls and takes care of the devices but also keeps users informed. Homes are places where people crave security and care, so the IoT earns its place by delivering satisfactory outcomes in home automation. In this paper we show how a large network of connections is built and how it controls and handles many home appliances using different types of communication; we identify the better and cheaper types of connections for a safe home; we show how a user can control the home effectively from far away by sending a single text; and we review the pros and cons and the different types of applications provided by the IoT.

S. K. Hajara Munvara Siddiqa, K. Apurva, Durgesh Nandan, Sanjeev Kumar
Implementation of Cloud Based Traffic Control and Vehicle Accident Prevention System

Road accidents are the most undesirable thing to happen to a road user, yet they happen regularly, costing many nations 3% of their GDP. More than half of all road traffic deaths are among vulnerable road users. This problem can be reduced by traffic sensors on streets interfacing with drivers through a 4G scheme, but not all roads are fitted with such sensors. We discuss various methods in this article. In particular, we examine an OpenGTS and MongoDB IoT cloud framework for traffic surveillance and alert notification. In addition, we use the H2O and WEKA mining tools to predict the age, gender and accident likelihood of drivers. A VCC (vehicular cloud computing) traffic management system is another scheme; we scrutinize VCC's role in the management of highway traffic.

Geetanjali Gundabathula, Parvateesam Kunda, Durgesh Nandan, Sanjeev Kumar
Modern Health Monitoring System Using IoT

In the present world, health monitoring of patients has become very difficult for doctors and family members; some people die in their sleep for lack of monitoring. Many technologies are therefore used by physicians and doctors, linking their equipment to the Internet with IoT and cloud technologies to monitor a patient's health condition every minute and every second. This paper focuses mainly on applications related to IoT-based health monitoring of patients. The IoT provides many benefits for the improvement of eHealth.

Satish Nimmakayala, Bhargav Mummidi, Parvateesam Kunda, Sanjeev Kumar
An Improved Method for Face Recognition with Incremental Approach in Illumination Invariant Conditions

In this paper we propose an enhanced method, with an acceptable level of accuracy, for face recognition with an incremental approach under invariant conditions such as illumination, pose, expression and occlusion. The proposed method holds the class-separation criterion for maximizing the input samples as well as the asymmetrical characteristics of the training data distributions. This enhanced approach helps the learning model adjust the weak features in line with the enhanced or boosted feature classifier for online samples, and also helps in calculating feature losses during the training process of offline samples. To represent illumination-invariant face features, local binary patterns (LBP) are extracted from the input samples, and IFLDA is used for representation and classification. This modified algorithm with an incremental approach gives acceptable results in detecting and recognizing faces under extremely varying illumination conditions.

Riyazoddin Siddiqui, Feiroz Shaikh, P. Sammulal, A. Lakshmi
A Robust Image Security System for Cloud-Based Smart Campus Using LBP and PCA

Nowadays, because of the increased threat to campus security, most educational institutes have started installing biometric security systems for personal identification. In real-time operation, a large amount of biometric data accumulates, which can be an unbearable burden for the biometric security system of a digital smart campus. This has attracted the research community in computer vision and security to use the convenience of cloud computing and store their large datasets on cloud servers. While transferring this bulky data to the cloud, it is necessary to reduce the computational and storage burden on the intelligent biometric security system. In this paper an enhanced method for personal identification using a biometric system in a cloud environment is proposed, in which biometric images are first encrypted using either a block encryption technique or a pixel encryption technique. From these encrypted images, local binary pattern (LBP) features are extracted for identification or classification, so that privacy is preserved. PCA is then applied to the LBP features, which reduces the computational time and the data transfer time to the cloud. The features extracted using LBP and PCA are used for equivalent image retrieval with a histogram equalization technique for personal identification.
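
A rough sketch of the LBP-plus-PCA feature pipeline described above, assuming scikit-image and scikit-learn as stand-ins; the random arrays stand in for the (encrypted) biometric images, and the encryption step itself is omitted.

    # Sketch: uniform-LBP histograms reduced with PCA before upload.
    import numpy as np
    from skimage.feature import local_binary_pattern
    from sklearn.decomposition import PCA

    def lbp_histogram(img, P=8, R=1):
        lbp = local_binary_pattern(img, P, R, method="uniform")
        hist, _ = np.histogram(lbp, bins=P + 2, range=(0, P + 2), density=True)
        return hist                                   # 10-bin texture descriptor

    images = np.random.randint(0, 256, size=(40, 64, 64)).astype(np.uint8)
    features = np.array([lbp_histogram(img) for img in images])

    reduced = PCA(n_components=5).fit_transform(features)  # smaller payload for the cloud
    print(features.shape, "->", reduced.shape)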

Mohd Ahmed Abdul Mannan, Gulabchand K. Gupta
Colour Image De-noising Analysis Based on Improved Non-local Mean Filter

The non-local means filter is a particular non-linear filter used in our paper to reduce Gaussian noise, and it performs well in doing so. Its major advantage is that it preserves the edges and details of the original image. In this paper, the non-local means filter and the bilateral filter are combined to obtain an enhanced filter for colour image de-noising. A novel weighting value is computed by adding consistency information into the weight used to evaluate the similarity of patches. The final stage of this paper shows that the proposed combination of NLM and BILF is a suitable method to reduce Gaussian noise and mixed noise.
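
For orientation, a plain non-local means call with scikit-image is sketched below; it shows only the NLM half of the proposed NLM-plus-bilateral combination, with illustrative parameter values.

    # Sketch: non-local means de-noising of a noisy colour image.
    import numpy as np
    from skimage import data, util
    from skimage.restoration import denoise_nl_means

    noisy = util.random_noise(data.astronaut(), mode="gaussian", var=0.01)
    clean = denoise_nl_means(noisy, patch_size=5, patch_distance=6,
                             h=0.08, channel_axis=-1)   # RGB image
    print(float(np.mean((noisy - clean) ** 2)))         # crude change measure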

Kanuri Alekya, Konala Vijayalakshmi, Nainavarapu Radha, Durgesh Nandan
Effective Data Acquisition with Sensors Through IoT Application: A Succinct Study

Today the Internet of Things is growing day by day due to its wide applications in many areas. The IoT is identified as one of the emerging techniques of the coming years, as technology turns towards the world of the internet and smart living. The concept of the IoT extends computer networks, because it has a wide range of applications, from our homes to the entire world. In the future, the growing demand for the IoT will require a great many sensors. In this paper we study the use of WSNs for long-term environmental data acquisition, an application widely adopted for its accuracy and efficient data collection, and other methods such as Zigbee and BLE, which serve the same purpose and are mostly used for their flexibility and low cost. Even when sensors fail in data collection, information can be obtained easily through an IoT application, because the IoT works with the physical environment too. Recently, such sensors with IoT applications have been utilized in many ways, such as industrial development, smart homes and smart irrigation. By the end of this paper, we see how data acquisition plays a major role in everyone's life.

P. Lakshmi Mounika, A. Konda Babu, Durgesh Nandan
Design of Dynamic Comparator for Low-Power and High-Speed Applications

Most real-world signals have analog behavior, and to convert these analog signals to digital we need an analog-to-digital converter (ADC). Comparators are the fundamental blocks in ADC architectures, and the usage of dynamic comparators is maximized because of the demand for low-power, area-efficient and high-speed ADCs. Dynamic comparator performance depends on the technology used. This paper presents the design and analysis of dynamic comparators; based on the analysis, a designer can obtain a new design to trade off speed against power. In this paper, a p-MOS latch is present along with a pre-amplifier, and p-MOS transistors are used as inputs in both the pre-amplifier and the latch. The circuit operates on a specific clock pattern: in the reset phase the circuit discharges, and during the evaluation phase, after enough pre-amplification gain is achieved, the latch is activated. The cross-coupled connection in the circuit enhances the amplification gain and reduces the delay. This design has optimum delay and reduces excess power consumption. The circuit simulations are done using the Mentor Graphics tool in 250 nm CMOS technology. Index Terms: analog-to-digital converter (ADC), static comparator, dynamic comparator, two-stage comparator, low-power, high-speed.

G. Murali Krishna, G. Karthick, N. Umapathi
Predicting Students’ Transformation to Maximum Depressive Disorder and Level of Suicidal Tendency

Suicide is the act of taking one's own life intentionally, and it is the second main cause of death among young persons aged between 10 and 24 years. Depression and unhappiness lead people to commit suicide. Major Depressive Disorder (MDD), or simply 'depression', can be mild or severe and involves low mood and a lack of interest and joy in usual activities; it can be short-lived or chronic. The paradox is that depression is among the most treatable of problems: it can be treated with the help of medication and psychotherapy, and suicides can be prevented by measuring the level of depression. An online questionnaire has been developed to assess a person's depression level and predict suicidal tendency by applying two machine learning algorithms, LVQ (Learning Vector Quantization) and KNN (K-Nearest Neighbor). The study shows that LVQ, a special case of neural network, gives higher accuracy than the KNN model.
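
As a toy illustration of one of the two classifiers compared above, the sketch below fits KNN to made-up questionnaire score vectors; the features and labels are invented, and the LVQ side is not shown.

    # Sketch: KNN over depression-questionnaire scores (toy data).
    import numpy as np
    from sklearn.neighbors import KNeighborsClassifier

    # each row: item scores from a short questionnaire (higher = worse)
    scores = np.array([[0, 1, 0, 1], [3, 3, 2, 3], [1, 0, 1, 0], [2, 3, 3, 2]])
    risk = ["low", "high", "low", "high"]

    knn = KNeighborsClassifier(n_neighbors=3).fit(scores, risk)
    print(knn.predict([[3, 2, 3, 3]]))    # expected: "high"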

G. Surya Narayana, Chalumuru Suresh, Kamakshaiah Kolli
To Identify the Sinkhole Attack Using Zone Based Leader Election Method

Wide-ranging research across the world witnesses the significance of Wireless Sensor Networks (WSN) in the present-day application world. In recent history, a variety of routing techniques have been presented to raise the lifetime of a WSN. Clustering mechanisms have been extremely successful in securing network behavior and have become a promising field for research, and several algorithms have been presented to resolve the intrusion problem in wireless sensor networks. To overcome this problem and to offer high security, a Zone Based Leader Election Method (ZBLEM) is proposed; it is utilized to identify the compromised node in each zone of the network. The proposed algorithm detects malicious nodes in the inter- or intra-zone. Malicious nodes simply communicate with any node in the zone, try to compromise those nodes and in turn ruin the character of the network. To decrease the damage inflicted by compromised nodes, they ought to be effectively identified and revoked as early as possible. Here, the leader is chosen region-wise based on energy level to identify the intruder and revoke the compromised node.

Dabbu Murali, P. Sunil Gavaskar, D. Udaya Suriya Rajkumar
Cascaded Adaptive Nonlinear Functional Link Networks for Modeling and Predicting Crude Oil Prices Time Series Data

In contrast to traditional neural networks, the functional link neural network (FLNN) is preferred for its single-layer structural design, lower computational complexity and higher convergence rate. It achieves a high-dimensional representation space of input patterns through functional expansion of the input signals. However, its nonlinear approximation capability is limited to a certain extent, and further improvement in performance may require enlarging the dimensionality of the input pattern, which increases the computational overhead significantly. The Chebyshev FLNN (CFLNN) is a special case of FLNN with universal approximation capacity and faster convergence, while the Legendre neural network (LeNN) uses simple polynomial expansion functions and possesses a computational gain over FLNN. This paper develops two cascaded neural networks to improve the performance of FLNN. The first model combines the input expansion capacity of FLNN with the better approximation of CFLNN to develop a model termed CCFLNN; similarly, the second model takes the advantages of FLNN and LeNN to develop a model termed CLeFLNN. The weight and bias vectors are adjusted by gradient descent based back-propagation learning. The proposed models are evaluated on forecasting crude oil prices, and extensive simulation and comparative performance investigation suggest the suitability of the proposed models.
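
The core ingredient of CFLNN is the Chebyshev expansion of each input pattern; a minimal sketch is given below, with a single gradient-style weight update on one sample. The activation, learning rate and target here are assumptions, not the paper's settings.

    # Sketch: Chebyshev functional expansion feeding a single-layer FLNN.
    import numpy as np

    def chebyshev_expand(x, order=4):
        # T_0..T_order via the recurrence T_{n+1}(x) = 2x*T_n(x) - T_{n-1}(x)
        T = [np.ones_like(x), x]
        for _ in range(order - 1):
            T.append(2 * x * T[-1] - T[-2])
        return np.concatenate(T)

    x = np.array([0.3, -0.7])          # one input pattern scaled to [-1, 1]
    phi = chebyshev_expand(x)          # expanded feature vector
    w = np.zeros_like(phi)
    target, lr = 1.0, 0.1
    y = np.tanh(w @ phi)               # network output
    w += lr * (target - y) * phi       # delta-rule style update
    print(phi.shape, y)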

Sarat Chandra Nayak, Santosh V. Kulukarni, Karthik Jilla
Fruit Detection Using Recurrent Convolutional Neural Network (RCNN)

An accurate image-based fruit detection model is crucial for agricultural tasks such as robotic harvesting. Features such as color similarity, shape irregularity and background are complex, which makes fruit detection a difficult task. Many machine learning techniques, such as Support Vector Machine (SVM), K-Nearest Neighbors (KNN) and Naïve Bayes, have been used for fruit recognition systems but do not yield good accuracy. This paper brings out the various techniques used in fruit detection models and shows how deep learning techniques can be used to detect fruit by considering its various features.

Kotagiri Ramadevi, A. Poongodai
Comparison of Diabetic Retinopathy Detection Methods

Diabetic Retinopathy (DR) is an eye disease associated with long-standing diabetes. Sugar levels in the blood cause harm to the blood vessels in the retina; these vessels can swell and leak, or they can close, preventing blood from passing through, and sometimes abnormal new blood vessels grow on the retina. These changes can lead to loss of vision. At present, identifying DR is a tedious and manual procedure that requires a trained clinician to examine and evaluate digital color fundus photographs of the retina. When human readers submit their reviews, often a day or two later, the delayed results lead to lost follow-up, miscommunication and postponed treatment. In this paper, we compare Diabetic Retinopathy Detection via Deep Convolutional Networks for Discriminative Localization and Visual Explanation with Automated Detection of Diabetic Retinopathy using Fluorescein Angiography Photographs.

Heena, Vijaya Kumar Koppula
IoT Based Automatic Irrigation System Using Wireless Sensor Networks

In earlier times, farmers estimated the fertility of the soil and their water reserves to generate revenue. Low humidity, falling water levels and certain climatic conditions are making farming increasingly difficult. A Wireless Sensor Network (WSN) contains different sensor nodes with sensing, computing and wireless communication capabilities, and WSN technology is used for control and monitoring of environmental and soil parameters in the field. WSNs are used in farming for several reasons, such as high resolution, increased harvest production, low energy consumption and distributed data collection. Effective water management plays an important role in agriculture: the shortage of water resources and high pumping costs make good water management critical. Today an automatic irrigation system (AIS) is used to improve the use of water resources and increase production; it allows cultivation in places with a water deficit, so that productive irrigation planning gives the highest efficiency with the lowest amount of water.

N. Penchalaiah, Jaladanki Nelson Emmanuel, S. Suraj Kamal, Kadiyala Ramana
IoT Based Smart Farming Using Thingspeak and MATLAB

Climate change has contributed to the growing importance of climate monitoring, and continuous monitoring of environmental parameters is important to assess the state of the atmosphere. IoT technology has brought revolution to every area of human life, making it digital and insightful. The IoT is a collection of things that make up a self-configuring network. Since the IoT is the most advanced technology, the collection of data from the sensor system plays a key role. This paper presents an Arduino UNO with a Wi-Fi module (ESP8266) that processes and transfers information sensed by various sensors, such as temperature, humidity and moisture, to the ThingSpeak cloud; the obtained parameters are stored on the cloud server, and a cloud computing system tracks environmental changes as a repository. ThingSpeak provides a public-channel function whose measurements can be viewed by the general public, and free access to the measured parameters is provided through an Android framework. This paper proposes a new smart IoT-based farming system that supports farmers in obtaining live data (temperature, soil humidity) for effective environmental monitoring, so that they can farm smartly and increase the overall production and value of their products. The novel intelligent IoT farming setup is built with Arduino technology and a breadboard, using different sensor modules and live data feeds available through Thingspeak.com. ThingSpeak, a supporting open API platform for IoT internet services, hosts a range of sensor systems that manage sensed information at cloud level and incorporates a special feature for transferring sensed data to MATLAB R2019a. A Channel ID and API key assigned through the service can be used to monitor data at specific intervals.
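
The ThingSpeak update call itself is simple enough to sketch; the snippet below pushes one reading over the documented REST endpoint, with a placeholder write key and field assignments chosen for illustration.

    # Sketch: posting one sensor reading to a ThingSpeak channel.
    import requests

    API_KEY = "YOUR_WRITE_API_KEY"     # per-channel write key (placeholder)
    reading = {"api_key": API_KEY, "field1": 28.4, "field2": 61.0}

    r = requests.get("https://api.thingspeak.com/update", params=reading, timeout=10)
    print(r.status_code, r.text)       # response body is the entry id; "0" means failure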

N. Penchalaiah, Jaladanki Nelson Emmanuel, S. Suraj Kamal, C. V. Lakshmi Narayana
Clustering Methods Analysis in the E-Learning

In this paper, students who were part of a distance learning program are analyzed using various educational data mining clustering algorithms, on the basis of their performance and the activities carried out as part of the process. Clusters are used to find the relation between attributes (activities involved in the e-learning process, such as downloading study material, exchanging messages with tutors and colleagues, and posting in the discussion forum) and the final grades of students, and to find which characteristics most affect the classes. To this end, we use various data mining techniques such as Agglomerative Hierarchical Clustering (using the Ward method) and non-hierarchical clustering methods such as KMeans, KMeans++ and CMeans. The implementation of these algorithms is done in the R language, and hierarchical and non-hierarchical clustering methods are compared to find the most efficient algorithm.
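
A compact k-means illustration of the idea is given below, written in Python for brevity even though the paper's implementation is in R; the activity counts and grades are invented.

    # Sketch: k-means over e-learning activity features plus final grade.
    import numpy as np
    from sklearn.cluster import KMeans

    # columns: downloads, messages, forum posts, final grade (toy values)
    students = np.array([[40, 25, 10, 85], [5, 2, 0, 40],
                         [35, 30, 12, 90], [8, 1, 1, 45]])

    km = KMeans(n_clusters=2, n_init=10, random_state=0).fit(students)
    print(km.labels_)                  # active/high-grade vs inactive/low-grade
    print(km.cluster_centers_.round(1))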

Ravinder Ahuja, Prem Prakash Agarwal, Tarun Kumar
Optimized KFCM Segmentation and RNN Based Classification System for Diabetic Retinopathy Detection

Human retinal images are helpful to ophthalmologists in finding and diagnosing various eye illnesses. Automated blood vessel segmentation helps diagnose numerous eye diseases such as diabetic retinopathy, retinopathy of prematurity and glaucoma. In this work, we propose an Optimized Kernel-based Fuzzy C-Means (OKFCM) segmentation and Recurrent Neural Network (RNN) based classification system for diabetic retinopathy detection. The proposed segmentation section consists of two main stages: optic disc removal and Modified Ant Colony Optimization (ACO) based KFCM segmentation. For the diabetic retinopathy classification, GLCM and moment-based features are used. The proposed system is also named OKFCM-MACO-RNN. The classification assessment is performed on the DIARETDB1 dataset, with the OKFCM-MACO-RNN method achieving accuracy, sensitivity and specificity of 99.33%, 81.65% and 99.42% respectively; the method is expected to detect exudates well. The OKFCM-MACO-RNN segmentation performance is analyzed in terms of the Jaccard coefficient, Dice coefficient and accuracy, which are 85.65, 72.84 and 93.15 respectively.

K. Loheswaran
Review on Predicting Student Performance

In the present educational system, student performance prediction is very useful: predicting student performance in advance can help students and their teachers track progress. Many institutes have adopted a continuous evaluation system today, which is done manually; such systems benefit students by helping to improve their performance. In data mining applications, neural networks are widespread and have many successful implementations across a wide range of problems. The goal is to know whether neural networks are the right classifiers to predict student performance in the domain of educational data mining. Neural networks surpass many algorithms tested on particular datasets and can be used for successful prediction of student performance. Classification is a popular technique for predicting student performance, and several methods are used under classification, such as decision trees, naïve Bayes, support vector machines, k-nearest neighbors, random forests and logistic regression.

Monagari Swathi, K. L. S. Soujanya, R. Suhasini
A Novel Association Approach to Generate Patterns for Multi-valued Data in Efficient Data Classification

Real-world applications have increased the demand for heterogeneous data classification for text, pictures, music, movies and medical datasets. The complexity of learning the class of an object that is associated with a set of values is a key issue for multi-value datasets. Present learning approaches are based on the feature preferences observed for similar sets of class values, but these preferences measure an object's deviation value instead of its association class. Such methods may be unfavourable for classification, as each value is made up of specific characterizing features. However, few studies have tried to solve the problem through associated-value learning over multi-value datasets. This paper presents a Multi-Value Association (MVA) approach for efficient data classification based on associated pattern generation using binary multi-value association among the data. The objective of the proposal is to identify a Single-Class-Value (SCV) that is most suitable, along with the additional feature patterns that best describe it. To evaluate the efficiency, we compare with some existing proposals using multi-value datasets, which shows an improvement.

LNC Prakash K., K. Anuradha, G. Surya Narayana
Social Media Analytics: Techniques, Tools, Platforms a Comprehensive Review

To determine which social media analytics tools, techniques and platforms have been developed in recent times, this paper reviews the tools, techniques and platforms related to social media analytics, and discusses the tools used to deal with various social media data (social networking, media, etc.). In the past decade there has been an advancement in the technologies used to deal with social media, as the number of people using social media to share information has increased and new social media platforms have been developed, leading to an increase in the amount of data to be handled. Social media platforms have a considerable number of users across the world, and that number is growing fast. These people share information through these sources, producing a large quantity of social data comprising data related to users, videos, web-based relations, interactions and so on, which needs to be analyzed. Analyzing social media data has therefore become a significant activity for researchers, mainly due to the availability of web-based APIs from social media platforms like Twitter, Facebook [1] and Gmail. This has also led to the development of data services and software tools for analyzing social media data. In this paper, there is a detailed review of the leading software tools and techniques used for scraping, cleaning and analyzing social media data.

Ravinder Ahuja, Anupam Lakhanpal, Surendra Kumar
A Novel Approach for Detecting Near-Duplicate Web Documents by Considering Images, Text, Size of the Document and Domain

Web mining is a branch of data mining; the web consists of an enormous amount of data. Search engines face serious problems due to the presence of near-duplicate documents on the web, which lead to irrelevant answers; the performance and reliability of search engines are critically affected by them. Two attempts at detecting near-duplicate web documents are found in the literature: the former considered the domain and size of the document, and the latter considered text and images as the search parameters. This article proposes a novel approach combining the parameters text, image, size and domain of the document to detect near-duplicate documents. The approach extracts the keywords and images of the crawled document and compares them with the existing documents for a similarity measure. If the similarity score value is less than 19.5 and the image comparison value is greater than 70%, the document is detected as a near-duplicate.
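
The decision rule quoted above is easy to state in code; the sketch below treats the two similarity measures as stubs, since the paper's exact keyword and image comparisons are not given in the abstract.

    # Sketch of the stated near-duplicate rule (thresholds from the abstract).
    def is_near_duplicate(text_score, image_match_pct):
        # text_score: keyword-based similarity value (below 19.5 flags a match)
        # image_match_pct: percentage of matching images between the documents
        return text_score < 19.5 and image_match_pct > 70

    print(is_near_duplicate(12.3, 84.0))   # True -> near-duplicate
    print(is_near_duplicate(25.0, 84.0))   # False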

M. Bhavani, V. A. Narayana, Gaddameedi Sreevani
Comparative Analysis of Horizontal and Vertical Etched Fiber Bragg Sensor for Refractive Index Sensing

Etching the Fiber Bragg Grating (FBG) section is a necessary process to make the FBG sensitive to external environmental changes, and an HF solution is used for this purpose. In the present work, a real-time monitoring based system for the vertical and horizontal etching of an FBG in a solution of hydrofluoric (HF) acid is designed. In our experiment we used a 40% concentrated HF solution at room temperature, which is corrosive in nature and capable of dissolving different materials; because of its reactivity with glass, we used a Teflon container to hold the HF solution. The results are easily analyzed through the wavelength shift as a function of etching time for the horizontal and vertical etching processes when the FBG is dipped in the HF solution. The result shows the comparison between the vertical and horizontal etching processes.

Azhar Shadab, Yadvendra Singh, Sanjeev Kumar Raghuwanshi, Mohd Dilshad Ansari
Formalizing Open Source Software Quality Assurance Model by Identifying Common Features from Open Source Software Projects

Not long ago, many scholars used to believe and write that open source software projects do not follow any decisive software quality assurance methods as traditional projects do; they would argue that the style adopted by open source communities is more or less ad hoc. However, successful open source collaborative projects have over the years developed typical software development quality assurance methods of their own, which may be described as a continuous develop-and-release quality assurance model. As a result, many have started acknowledging this change, and important writings are being published. In this vein, the present paper discusses the quality assurance models adopted by a few famous open source projects and attempts to formalize the common steps involved in the practices employed by these projects.

Ekbal Rashid, Mohan Prakash, Mohd Dilshad Ansari, Vinit Kumar Gunjan
An Approach for Morphological Analyzer Rules for Dravidian Telugu Language

Machine translation is a computational method for automating user queries or information posed over search engines or social media in a local Dravidian language such as Telugu. Computer-based translation has become global because the majority of domains are likely to use local languages to access universal resources. Machine translation is a major application area for transforming one language into another target universal language. In this era, analyzing the Telugu language at the syntactic granularity level through grammar is essential. This article emphasizes the classification of approaches to machine translation syntactic grammar for the Telugu Dravidian language. The authors also present notable research issues in machine translation for the Telugu language.

Midde Venkateswarlu Naik, Mohd Dilshad Ansari, Vinit Kumar Gunjan, G. Surya Narayana
A Traditional Analysis for Efficient Data Mining with Integrated Association Mining into Regression Techniques

A Wal-Mart salesman was trying to boost the store's sales by combining commodities and putting discounts on those products. Although the goods are clearly distinct, he found that parents, for whom nurturing kids is exhausting, decided to buy beer to relieve the strain, so the two items sold together. Data mining, also known as KDD, finds anomalies, associations, patterns and trends to forecast outcomes. The Apriori algorithm is a standard process in data mining, utilised for mining frequent itemsets and the related association rules. It is formulated to work on a database comprising a large number of transactions, and it is vital for effective market basket analysis, which helps customers buy their items with more ease and thereby escalates the sales of the markets. While finding goods to be associated together, it is imperative to have some association on which the commodities can be listed together. In this research work a hybrid method is proposed to attenuate association rules using the optimization algorithm Differential Evolution together with the Apriori algorithm. First, the Apriori algorithm is applied to get frequent itemsets and association rules; then AMO is employed to reduce the number of association rules with a new fitness function that incorporates frequent rules. It is observed from the experiments that, compared with the other relevant techniques, ARMAMO greatly reduces the computational time for frequent itemset generation, the memory for association rule generation, and the number of rules generated. Data mining is a process that uses a variety of data analysis tools to find patterns and relationships in data that can be used to make valid predictions, and association rule mining is one of the popular techniques used for mining data for pattern discovery in KDD. Rule mining is an important component of data mining: to find regularities and patterns in data, the most effective class is association rule mining, and mining has been utilized in many application domains. In this paper, an efficient mining based algorithm for rule generation is presented.
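
For the Apriori half of the hybrid, a minimal sketch with the mlxtend library is shown below; the DE/AMO rule-reduction step is not reproduced, and the three-basket dataset is a toy.

    # Sketch: frequent itemsets and rules with plain Apriori (mlxtend).
    import pandas as pd
    from mlxtend.frequent_patterns import apriori, association_rules

    baskets = pd.DataFrame([{"beer": 1, "diapers": 1, "milk": 0},
                            {"beer": 1, "diapers": 1, "milk": 1},
                            {"beer": 0, "diapers": 1, "milk": 1}]).astype(bool)

    frequent = apriori(baskets, min_support=0.5, use_colnames=True)
    rules = association_rules(frequent, metric="confidence", min_threshold=0.7)
    print(rules[["antecedents", "consequents", "support", "confidence"]])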

G. SuryaNarayana, Kamakshaiah Kolli, Mohd Dilshad Ansari, Vinit Kumar Gunjan
On Sudoku Problem Using Deep Learning and Image Processing Technique

Sudoku has been a challenging puzzle for many people; in fact it is considered a brain-teaser, and many find it difficult to solve. We generally apply different mathematical techniques to solve the puzzle, so it would be great if a computer could solve Sudoku. We have used deep learning concepts to solve the Sudoku puzzle in an optimized manner, with image processing and object localization algorithms used to detect the puzzle. Initially, an image containing a Sudoku is provided; using object localization and CNN algorithms we detect the 9 × 9 Sudoku square box. Using an ANN, we detect the numbers and empty spaces, and a backtracking algorithm is then used to solve the puzzle efficiently. The puzzle is detected, the digits, which can be printed or handwritten, are recognized, and finally the puzzle is solved. We achieved an accuracy of up to 99% in detecting the puzzle.
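
The backtracking stage is standard and is sketched below; `grid` is assumed to be a 9 × 9 list of lists with 0 for the empty cells produced by the digit-recognition step.

    # Sketch: backtracking solver applied after grid detection and OCR.
    def valid(grid, r, c, v):
        box = [grid[r // 3 * 3 + i][c // 3 * 3 + j]
               for i in range(3) for j in range(3)]
        return (v not in grid[r] and
                all(row[c] != v for row in grid) and v not in box)

    def solve(grid, cell=0):
        if cell == 81:
            return True                      # all cells filled
        r, c = divmod(cell, 9)
        if grid[r][c]:                       # pre-filled by digit recognition
            return solve(grid, cell + 1)
        for v in range(1, 10):
            if valid(grid, r, c, v):
                grid[r][c] = v
                if solve(grid, cell + 1):
                    return True
                grid[r][c] = 0               # undo and backtrack
        return False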

Dharma Karan Reddy Gaddam, Mohd Dilshad Ansari, Sandeep Vuppala
On Security and Data Integrity Framework for Cloud Computing Using Tamper-Proofing

Cloud computing has become an emerging topic in the IT industry as well as in academia due to its many advantages. Cloud computing consists of many computing resources, such as memory, storage and processors, which are not physically present at the user's site; these devices are stored off-site and controlled by the cloud service providers, so there is always a risk of losing data or of data being altered by internal as well as external attackers. Moreover, the cloud may not be fully trustworthy, because the user does not keep a copy of all stored data: while users can put their files into the cloud server, nobody knows where exactly they reside. This poses various security challenges, such as providing secure and reliable data storage over an unreliable cloud service provider. Data integrity is one of the prime concerns in cloud computing. This paper presents a framework for data integrity in cloud computing using tamper-proofing algorithms (TPA), namely tamper-proof checking code (TPC) and tamper-resistance (TPR).

Mohd Dilshad Ansari, Vinit Kumar Gunjan, Ekbal Rashid
A Framework for Private Hospitals Service Cost Recommendation Based on Page Ranking Technique

A search engine is a software program designed to identify and respond to specific queries, called keywords, and display the relevant information available on the web. Thousands of websites are available, but we want only specific information; the solution is a search engine, which finds the relevant information related to a keyword and displays it in an aggregated format. With the rapid growth of data and information sources on the internet, finding the relevant and required information is becoming more tedious as well as more important for internet users, and for this reason web mining is becoming popular day by day. We propose a system that aggregates private hospital service costs for a hospital recommendation system using a page ranking algorithm, and sample results have been collected. This paper describes different web mining methodologies and examines the three categories of web mining. The page ranking algorithm plays a noteworthy role in making the user's navigation of search engine results easier. A comparative summary of various page ranking algorithms is given in this paper, which helps in the best use of web resources by giving the expected information to users. For performance evaluation, we collected samples as well as real-time datasets from the UCI data repository and from hospitals. For cases 1 and 2, the implementation is done in Python with the NumPy and pandas packages; for case 3, the implementation is done in Python as well as C#. Deep learning methodology can be applied for greater efficiency and accuracy; the improved result can be in the range of 85 to 95%.
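
As an illustration of the ranking step only, the sketch below runs PageRank over an invented link graph of hospital pages using networkx; the cost-aggregation part of the system is not shown.

    # Sketch: PageRank over a toy hospital-page link graph.
    import networkx as nx

    links = [("hospA", "hospB"), ("hospB", "hospC"),
             ("hospC", "hospA"), ("hospA", "hospC")]
    G = nx.DiGraph(links)

    ranks = nx.pagerank(G, alpha=0.85)       # 0.85 is the usual damping factor
    for page, score in sorted(ranks.items(), key=lambda kv: -kv[1]):
        print(page, round(score, 3))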

Ahmed Yasin Syed, P. V. R. D. Prasada Rao
Sequence Alignment By Modified Teaching Learning Based Optimization Algorithm (M-TLBO)

Sequence alignment is a most important first step for a wide variety of analyses performed on biological sequences such as DNA, RNA or proteins. It is a daily practice of many biologists to determine the similarity among biological sequences, and it is considered an optimization problem. Researchers have developed many nature-inspired meta-heuristic optimization algorithms to produce optimal alignments; in all these heuristic algorithms, mutation and crossover are the most prominent steps, and every algorithm has different criteria for its mutation and crossover operations. In 2011, R. V. Rao et al. proposed a new algorithm called Teaching Learning Based Optimization (TLBO) to deal with constrained and unconstrained optimization problems. This paper uses TLBO to solve the sequence alignment problem and also proposes a new optimization algorithm called Modified-TLBO (M-TLBO). Both algorithms, TLBO and M-TLBO, are analysed through experiments with benchmark datasets from "prefab4ref" and "oxbench"; the newly proposed M-TLBO outperforms TLBO in solving the sequence alignment problem, producing the best fitness scores in reduced computational time.
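
For readers unfamiliar with TLBO, its teacher phase is sketched below on a toy continuous objective; the alignment-specific fitness function and the learner phase of the paper are omitted.

    # Sketch: TLBO teacher phase with greedy acceptance (toy objective).
    import numpy as np

    def sphere(x):                            # stand-in fitness (minimize)
        return float(np.sum(x ** 2))

    rng = np.random.default_rng(0)
    pop = rng.uniform(-5, 5, size=(10, 3))    # 10 learners, 3 variables

    for _ in range(50):
        teacher = pop[np.argmin([sphere(p) for p in pop])]
        mean = pop.mean(axis=0)
        TF = rng.integers(1, 3)               # teaching factor: 1 or 2
        cand = pop + rng.random(pop.shape) * (teacher - TF * mean)
        better = [sphere(c) < sphere(p) for c, p in zip(cand, pop)]
        pop[better] = cand[better]            # keep only improvements

    print(min(sphere(p) for p in pop))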

Lakshmi Naga Jayaprada Gavarraju, Kanadam Karteeka Pavan
A Comparative Study of Feed Forward Hybrid Neuro-Computing Framework with Multilayer Perceptron Model for Prediction of Breast Cancer

Cancer originates when cells start to grow in a disorderly manner; any cell of the body has the potential to become cancerous and spread to other areas of the body. A malignant tumor that starts in the cells of the breast is breast cancer; a tumor anywhere in the body can be either malignant (cancerous) or benign (non-cancerous). Many research works have been carried out to diagnose the disease. In this research, a hybrid neural network model (SOM and LVQ) is proposed, in which the output of the SOM is fed as input to the LVQ model. The classification of the dataset is done with the SOM network using a competitive learning algorithm, whereas LVQ is trained with a vector quantization method. The patient dataset contains nine attributes, which are considered as input to the model. The inputs are given to the SOM, where each data point is classified into clusters through a competitive learning process. The classes obtained from the SOM are then appended back to the training input data for the training of the supervised LVQ; after training, the LVQ can classify any unknown input data. The output of this supervised learning algorithm is used to diagnose the presence of a tumor leading to breast cancer. The labelled data from the SOM is also given as input to a Multilayer Perceptron (MLP), and the performance of that network is compared with the hybrid network. It has been observed that the hybrid model performs as well as the MLP model in diagnosing the disease.

M. R. Narasingarao, V. Satya Aruna
Analysis of Shape Signature in First and Second Derivatives by Using Wavelet Transformation

Object recognition techniques are popular in the computer vision and pattern recognition research fields. The present paper focuses on the design of a novel shape signature based on angular information, with wavelet coefficients also used in formulating the signature. The angular information is captured at two different derivatives of the input image and is used to estimate the tangential measure for each representative point of the input image. The resulting shape signature is described with the Fourier transformation, and the Fourier descriptors are used for the classification stage, which uses the Euclidean distance measure. The proposed approach is evaluated on a standard database, and the estimated performance measures show its efficiency.
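
A compact sketch of Fourier descriptors for a closed boundary is given below; the elliptical contour is synthetic, and the normalization shown is one common choice rather than the paper's exact formulation.

    # Sketch: Fourier descriptors of a closed boundary (complex form).
    import numpy as np

    t = np.linspace(0, 2 * np.pi, 64, endpoint=False)
    pts = np.cos(t) + 0.5j * np.sin(t)        # boundary points as x + iy

    coeffs = np.fft.fft(pts)
    coeffs = coeffs / np.abs(coeffs[1])       # scale invariance
    descriptor = np.abs(coeffs[1:9])          # drop DC term; magnitudes only
    print(descriptor.round(3))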

M. Radhika Mani, B. N. Jagadesh, Ch. Satyanarayana, D. M Potukuchi
An Ideal Big Data Architectural Analysis for Medical Image Data Classification or Clustering Using the Map-Reduce Frame Work

In the present-day scenario, huge volumes of data are being generated from various sources, and storing and processing these data using traditional systems is a big challenge. The majority of the data is unstructured, hence architectures should be designed to meet these continuing challenges. Among the possible solutions for the big data problem, one of the best for addressing huge volumes of unstructured data is Hadoop. In the medical field, huge volumes of clinical image data result from the respective hardware tools. The methods required to store, analyze, process and classify these medical images can be implemented with the MapReduce architecture on the Hadoop framework, thereby reducing the overall computational time, since the mappers perform parallel processing. This paper includes a detailed review of Hadoop and its components. The main motive of this work is to handle medical image data with an efficient architecture such that automatic clustering or classification of images is done within the architecture itself; clustering these medical images is essential for future predictions and disease diagnosis. In the MapReduce architecture, the use of combiners and partitioners along with the map and reduce phases improves the efficiency of medical image processing for clustering the image data. This paper also reviews recent work on image data clustering along with state-of-the-art techniques for image classification. The clustered medical images will be used for automatic prediction and diagnosis of various patient diseases by applying Convolutional Neural Network (CNN) techniques on top of the clustered or classified images.

Hemanth Kumar Vasireddi, K. Suganya Devi
Prediction of Guava Plant Diseases Using Deep Learning

Plant pathology is a field which deals with the analysis, diagnosis and treatment of diseases in plants. Agriculture is a main source of income in the Indian economy as well as being important for livelihoods. Identification of diseases in plants and crops is quite difficult unless one has great knowledge and experience. If not identified and controlled early, diseases in plants may cause severe damage to the whole crop, leading to loss of income for farmers and a decline in agricultural revenue for the Indian economy; early prediction can help this situation. This paper presents the prediction of plant diseases using images of the leaves given as input by the user, predicting the type of disease. In the background we use a convolutional neural network for image classification together with deep learning techniques, and the accuracy is obtained with the help of a confusion matrix. The algorithm is implemented in the Python language, using Flask as the micro web framework for the graphical user interface (GUI): the user gives an input image, and the predicted disease is printed.

B. Srinivas, P. Satheesh, P. Rama Santosh Naidu, U Neelima
Deep Learning in IVF to Predict the Embryo Infertility from Blastocyst Images

In Vitro Fertilization (IVF) is used to solve infertility caused by damaged, blocked, weak or totally absent fallopian tubes, sperm issues or endometriosis. Successful IVF depends on the assessment of embryo quality. In visual morphology, the assessments produced by different embryologists vary, and as a result a low IVF success rate is seen. To improve the success rate, multiple embryos are implanted, which leads to multiple pregnancies and complications. Artificial Intelligence (AI) methods can be followed to analyze embryo quality without human involvement. A deep learning model is proposed to analyze human blastocyst quality, achieving 85% test accuracy.

Satya kiranmai Tadepalli, P. V. Lakshmi
Towards a Framework for Breast Cancer Prognosis: Risk Assessment

Breast cancer is one of the most alarming and lethal types of cancer, causing many deaths among women across the world. Research contributions towards early detection of breast cancer, its prognosis and its treatment have helped improve the situation by decreasing the mortality rate. Nevertheless, the problem of breast cancer still attracts the attention of researchers and healthcare organizations, and prognosis with comprehensive intelligence can further improve breast cancer treatment. This is the motivation behind this research work, which proposes and implements a framework with three-fold mechanisms for better prognosis: breast cancer risk assessment models, breast cancer recurrence prediction models and breast cancer survivability prediction models. In this paper we present empirical results of breast cancer risk assessment, while the other two parts of the prognosis research are deferred to our future research papers. We built a prototype application to demonstrate the performance of different mechanisms employed for breast cancer risk assessment, and the empirical results revealed insights into the performance of those mechanisms.

Ravi Aavula, R. Bhramaramba
Role of Advanced Glycated End Products (AGEs) in Predicting Diabetic Complications Using Machine Learning Tools: A Review from Biological Perspective

AGEs are highly stable products formed by the glycation of reducing sugars, proteins and lipids. Diabetes mellitus is the condition in which cells cannot utilize the glucose present in the body, which then tends to glycate free proteins in its surroundings and cause various complications. Once AGEs glycate tissue proteins, they accumulate over a lifetime, thereby contributing to complications such as neurodegeneration, polyneuropathy, retinopathy, nephropathy and other macrovascular complications. AGEs can be formed in many ways and bind to the RAGE cell-surface receptor. Since the etiology of these diseases is now well understood and the biomarkers of the disease are available, it is important to consider them in all machine learning disease prediction tools. Machine learning and deep learning algorithms and tools are being used to predict various diseases, mostly on the basis of medical history and symptoms. This review focuses on how different AGEs play a role in the pathogenesis of different diabetic complications, and discusses why these glycated proteins should be considered in the prediction of diabetic complications using machine learning or by other means.

Vamsi Krishna Battula, P. Satheesh, B. Srinivas, A. Chandra Sekhar, V. Aswini Sujatha
A Comparative Study of Performance Metrics of Data Mining Algorithms on Medical Data

Computers have brought about significant technological improvements leading to the creation of enormous volumes of data, particularly in health care systems. The availability of vast amounts of data has contributed to a greater need for data mining techniques to produce useful knowledge. With the increase of data in the biomedical and healthcare communities, accurate analysis of medical data benefits early detection of illness and patient care. Data mining is one of the major approaches for developing sophisticated algorithms for the classification of data, though some have castigated data mining for not meeting all of the humanistic statistics specifications [5]. Classification of diseases is one of the main applications of data mining, and many important attempts have been made in recent years to improve the accuracy of disease diagnosis through data mining. We used four prominent data mining algorithms, the Naive Bayes classifier, K-Nearest Neighbors (KNN) classifier, Artificial Neural Networks (ANN) and Support Vector Machine (SVM), to develop predictive models using the ILPD (Indian Liver Patient Dataset) from the UCI Machine Learning Repository. For performance comparison, we used the 10-fold cross-validation method to calculate the estimates of the six predictive models. We find that the support vector machine delivers the best results, with a classification accuracy of 74.82 percent, while Naive Bayes performed the worst, with an accuracy of 56.55 percent. The performance metrics of the classifiers on the medical dataset are analyzed in the sections below.
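
A minimal sketch of the evaluation protocol, using synthetic data in place of the ILPD records, is shown below; the hyperparameters are scikit-learn defaults, not the paper's settings.

    # Sketch: 10-fold cross-validation over the four classifiers named above.
    from sklearn.datasets import make_classification
    from sklearn.model_selection import cross_val_score
    from sklearn.naive_bayes import GaussianNB
    from sklearn.neighbors import KNeighborsClassifier
    from sklearn.neural_network import MLPClassifier
    from sklearn.svm import SVC

    X, y = make_classification(n_samples=500, n_features=10, random_state=1)

    models = {"Naive Bayes": GaussianNB(),
              "KNN": KNeighborsClassifier(),
              "ANN": MLPClassifier(max_iter=1000, random_state=1),
              "SVM": SVC()}

    for name, model in models.items():
        scores = cross_val_score(model, X, y, cv=10)   # 10-fold accuracy
        print(f"{name}: {scores.mean():.3f}")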

Ashok Suragala, P. Venkateswarlu, M. China Raju
Sentiment Classification on Online Retailer Reviews

Sentiment classification is a continuing area of research in text mining. Sentiment analysis is the automatic identification of the ideas, emotions and subjectivity of text; its purpose is to determine the polarity of the text content, with the expressed opinion taking the form of binary ratings such as likes or dislikes, or a more granular set of choices, such as a 1-to-5 rating. This paper focuses primarily on a high-level, end-to-end workflow for solving text classification problems with machine learning algorithms such as the Naive Bayes classifier, applied to mining opinions from Amazon user reviews.

Kolli Srikanth, N. V. E. S. Murthy, P. V. G. D. Prasad Reddy
Effect of Excessive Alcohol on Liver: A Comprehensive Approach Using Machine Learning

Long-term use of alcohol can cause liver damage, leading to liver diseases such as fatty liver, alcoholic hepatitis and cirrhosis. Gastrointestinal problems, including pancreatitis and gastritis, can also be side effects of excessive drinking, as can various cancers and cardiovascular complications such as myocardial infarction, atrial fibrillation, alcoholic cardiomyopathy and hypertension. Long-term misuse of opioids can lead to respiratory infections, constipation, damage to the liver and kidneys, sexual dysfunction and infections of the heart lining. Both alcohol and opioid addiction can be associated with sleep problems and psychiatric disorders such as depression and anxiety. Alcohol overuse is believed to damage the liver over many years: when a person consumes too much alcohol, the liver begins metabolizing the alcohol so the poison can be released from the body. Alcohol is metabolized before other substances, and the liver must work very hard to perform its tasks if a person drinks large amounts. Little or no evidence of alcoholic steatohepatitis is present at these early stages; gradually, over time, patients start to experience tiredness, weight gain and helplessness. Here we build a Python-based machine learning model that integrates blood tests indicative of liver disorders, which may result from excessive drinking of alcohol, and compares them with the number of alcoholic drinks consumed every day.

Pativada Rama Santosh Naidu, Golagani Lavanya Devi
Detection and Analysis of Pulmonary TB Using Bounding Box and K-means Algorithm

Improving TB studies has long been an overlooked area, partly because of the complexities involved in assessing the risk of infection. The novelty of this work lies in the strategy of carrying out both computer vision methods and laboratory-based work to benefit mankind; the gap between clinical research and technical studies has been reduced by bringing the two together. Image processing is a field that requires no contact with patients for detection. Over the previous two decades, several algorithms have been created to extract the contours of homogeneous areas within a digital image. The input for the image processing algorithms can be acquired from scanned lung X-ray images, and fundamental image processing methods are applied to the CT scan image to detect the lung region. In this project, segmentation of the input images is carried out using a proposed technique developed from the K-means and bounding box algorithms, along with morphological image processing, to obtain the output images and compare the outcomes.
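
The k-means part of the pipeline is sketched below on a sample grayscale image standing in for a lung scan; the bounding-box and morphological steps are not reproduced.

    # Sketch: k-means intensity clustering as a first segmentation pass.
    import numpy as np
    from skimage import data
    from sklearn.cluster import KMeans

    img = data.camera().astype(float)             # stand-in for a lung image
    pixels = img.reshape(-1, 1)                   # one intensity per row

    km = KMeans(n_clusters=3, n_init=10, random_state=0).fit(pixels)
    segmented = km.labels_.reshape(img.shape)     # per-pixel cluster index
    print(np.bincount(segmented.ravel()))         # pixel count per cluster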

Vinit Kumar Gunjan, Fahimuddin Shaik, Amita Kashyap
A Tuberculosis Management Through ADR Study, Feature Extraction and Medical Bio Informatics

Enhanced research into tuberculosis has long been neglected because of the complexity of the risk of infection. The novelty of this project is the approach in which computer vision technology and laboratory work are combined for the benefit of mankind. Image processing techniques are important for automated analysis. In this work, the gap between clinical and technical research has been reduced through collaboration and successful analysis.

Vinit Kumar Gunjan, Fahimuddin Shaik, Amita Kashyap
Design and Implementation of System Which Efficiently Retrieve Useful Data for Detection of Dementia Disease

We analyze Hadoop techniques such as MapReduce, which help process data faster and more efficiently to detect dementia. For a voluminous dementia dataset, current solutions use different data partitioning strategies that incur large communication costs and an expensive mining process due to duplicate and unnecessary transactions transferred among computing nodes. To resolve these issues, the proposed algorithm uses data partitioning techniques such as MinHash and Locality Sensitive Hashing, which reduce processing time and improve the efficiency of the final result. We take the help of the MapReduce programming model of Hadoop [3] and implement this technique on a Hadoop platform; for pattern matching we use the FP-growth algorithm. Finally, we show that the proposed system requires less time to find frequent itemsets. The idea behind the research is to cope with the special requirements of the health domain related to patients.
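
A tiny MinHash sketch, one of the partitioning primitives named above, is given below with illustrative symptom-set transactions; the seeded-MD5 hash family and signature length are arbitrary choices, not the paper's.

    # Sketch: MinHash signatures and a Jaccard-similarity estimate.
    import hashlib

    def minhash(items, num_hashes=16):
        sig = []
        for seed in range(num_hashes):
            sig.append(min(int(hashlib.md5(f"{seed}:{it}".encode()).hexdigest(), 16)
                           for it in items))
        return sig

    a = minhash({"memory_loss", "confusion", "apathy"})
    b = minhash({"memory_loss", "confusion", "agitation"})
    print(sum(x == y for x, y in zip(a, b)) / len(a))   # ~ Jaccard similarity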

Sandhya Waghere, P. RajaRajeswari, Vithya Ganesan
Metadata
Title
ICCCE 2020
Editors
Amit Kumar
Stefan Mozar
Copyright Year
2021
Publisher
Springer Nature Singapore
Electronic ISBN
978-981-15-7961-5
Print ISBN
978-981-15-7960-8
DOI
https://doi.org/10.1007/978-981-15-7961-5