
2018 | Book

Intelligent Computing and Information and Communication

Proceedings of 2nd International Conference, ICICC 2017

Edited by: Subhash Bhalla, Prof. Dr. Vikrant Bhateja, Dr. Anjali A. Chandavale, Dr. Anil S. Hiwale, Prof. Dr. Suresh Chandra Satapathy

Publisher: Springer Singapore

Book Series: Advances in Intelligent Systems and Computing


About this Book

The volume presents high-quality research papers presented at the Second International Conference on Intelligent Computing and Information and Communication (ICICC 2017). The conference was held during 2–4 August 2017 at MIT College of Engineering, Pune, India, organized by Dr. Vishwanath Karad MIT World Peace University, Pune, and supported by the All India Council for Technical Education (AICTE) and the Council of Scientific and Industrial Research (CSIR). The volume contains research papers focused on ICT for intelligent computation, communications, and audio and video data processing.

Table of Contents

Frontmatter
35 W GaN Solid-State Driver Power Amplifier for L-Band Radar Applications

In this paper, a 35 W driver power amplifier is designed and simulated using a GaN HEMT for L-band radar. A GaN HEMT is used because it provides higher output power and higher gain than other semiconductor technologies. The 35 W output power is generated using the CGHV40030 GaN HEMT, which is sufficient to drive the subsequent power amplifier stages. The driver amplifier is designed at a center frequency of 1.3 GHz. The amplifier operates in class AB and achieves a PAE of 60.5%.
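
For reference, the quoted efficiency figure is the standard power-added efficiency of an amplifier stage with output power P_out, input drive power P_in, and DC power consumption P_DC:

```latex
\mathrm{PAE} = \frac{P_{\mathrm{out}} - P_{\mathrm{in}}}{P_{\mathrm{DC}}} \times 100\,\%
```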

Vivek Ratnaparkhi, Anil Hiwale
Validation of Open Core Protocol by Exploiting Design Framework Using System Verilog and UVM

Semiconductor technology is advancing tremendously today; system-on-chip (SoC) designs include a large number of intellectual property (IP) cores, interconnects, and buses, and their complexity keeps increasing with requirements. Hence, a standard protocol is needed for communication between these IP cores. The need for IP reuse, reduced design time, and manageable complexity makes large-scale SoC design challenging and motivates IP core reusability. An efficient non-proprietary protocol for communication between IP cores is the Open Core Protocol (OCP), an openly licensed, core-centric protocol with a socket-based interface. This paper addresses the verification of an implemented OCP design using SystemVerilog and the Universal Verification Methodology (UVM) in the SimVision tool.

Gopika Rani Alekhya Pamarthy, M. Durga Prakash, Avinash Yadlapati
Cellular Automata Logic Block Observer Based Testing for Network-on-Chip Architecture

The need to test logic circuits has grown rapidly with the increasing number of applications hosted on a single chip. This, in turn, demands testing architectures capable of providing high fault coverage with low resource utilization and minimal power usage. Cellular Automata Logic Block Observer (CALBO), a technique homologous to Built-In Logic Block Observer (BILBO), is considered in this paper to test routers, which are important components of a Network-on-Chip (NoC) architecture. Resource utilization and power reports of the design have been successfully generated to demonstrate the advantages of CALBO over BILBO for the architecture considered.

Shaik Mohammed Waseem, Afroz Fatima
Structural Strength Recognizing System with Efficient Clustering Technique

The Internet of Things (IoT) envisions a future in which the objects of everyday life are equipped with sensor technology for digital communication. IoT supports the concept of the smart city, which aims to provide different services for the administration of the city and for its citizens. An important application of IoT is Structural Strength Recognition (SSR), an approach that is becoming popular for increasing the safety of buildings and human life. Proper maintenance of historical buildings requires continuous monitoring of their current condition, and sensor nodes are used to collect data from these historical buildings or large structures. SSR covers a huge geographical area and requires continuous monitoring, which consumes considerable energy. Hence, there is a need for an efficient energy management technique. Clustering is one of the important techniques for energy management in Wireless Sensor Networks (WSN); it helps in reducing the energy consumed in wireless data transmission. In this paper, an SSR system is designed with an efficient clustering algorithm for a wide network, and the optimum number of clusters is also determined.

Sumedha Sirsikar, Manoj Chandak
Trajectory Outlier Detection for Traffic Events: A Survey

With the advent of the Global Positioning System (GPS) and the extensive use of smartphones, trajectory data for moving objects is available easily and at low cost. Moreover, GPS devices in vehicles now make it possible to keep track of moving vehicles on the road, and to identify anomalous behavior of a vehicle from this trajectory data. In the field of trajectory mining, outlier detection for trajectories has become an important topic that can be used to detect anomalies in trajectories. In this paper, existing issues and challenges of trajectory data are identified and a future research direction is discussed. The paper proposes a potential use of outlier detection to identify irregular events that cause traffic congestion.

Kiran Bhowmick, Meera Narvekar
A Privacy-Preserving Approach to Secure Location-Based Data

There are a number of sites that provide location-based services. These sites use the current location of the user through web applications or Wi-Fi devices. Sometimes, these sites obtain permission to access the user's private information and resources on the web without providing clear policies and disclosure strategies. This can be exploited by malicious sites, servers, or adversaries to breach the sensitive data and confidentiality of the user: when a user shares the original location context, an adversary can learn from it. Due to the lack of secure privacy-preserving policies, users are exposed to various hazards. In order to preserve user privacy, a new privacy-preserving technique called FakeIt is proposed. In FakeIt, the system is built around privacy and security requirements, and the user decides on the context before sharing: if the current location context is sensitive, the user shares a fake location context with the location-based service instead of the original one. The system thus restricts adversaries from learning from the user's shared sensitive location context.

Jyoti Rao, Rasika Pattewar, Rajul Chhallani
Analysis of Blind Image Watermarking Algorithms

This paper presents an overview of blind image watermarking algorithms. We analyze these algorithms against criteria like robustness, security, and imperceptibility, and compare the pros and cons of using a blind method for embedding and extraction of watermarks. Most of these algorithms are implemented using MATLAB 2011 and tested on a standard image dataset. We used true-color images of size 256 × 256 and binary watermarks of size 32 × 32 for testing. This paper will help watermarking researchers choose particular algorithms depending on the needs of the application they are working on.

Chhaya S. Gosavi, Suresh N. Mali
ClustMap: A Topology-Aware MPI Process Placement Algorithm for Multi-core Clusters

Many high-performance computing applications use MPI (Message Passing Interface) for communication, and the performance of the MPI library affects the performance of MPI applications. Various techniques, like communication latency reduction, increasing bandwidth, and increasing scalability, are available for improving the performance of message passing. In a multi-core cluster environment, communication latency can be further reduced by topology-aware process placement. This technique involves three steps: finding the communication pattern of the MPI application (application topology), finding the architecture details of the underlying multi-core cluster (system topology), and mapping processes to cores. In this paper, we propose a novel "ClustMap" algorithm for the third step, which maps the application topology onto the system topology; a generic sketch of such a mapping is shown below. The experimental results show that the proposed algorithm outperforms existing process placement techniques.
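
The paper's ClustMap algorithm itself is not reproduced in the abstract; the following is only a minimal sketch of a generic greedy topology-aware mapping of the kind the third step performs, assuming a process communication matrix `comm` and a core hop-distance matrix `dist` (both hypothetical inputs) and one process per core:

```python
import numpy as np

def greedy_placement(comm, dist):
    """Generic greedy topology-aware mapping sketch (not the paper's
    ClustMap): seed with the most communication-heavy process, then
    repeatedly place the process that talks most to already-placed
    ones onto the free core closest to its partners."""
    n = comm.shape[0]
    placed = {}                        # process -> core
    free_cores = set(range(n))
    order = [int(np.argmax(comm.sum(axis=1)))]   # heaviest talker first
    while len(placed) < n:
        if order:
            p = order.pop()
        else:
            p = max((q for q in range(n) if q not in placed),
                    key=lambda q: sum(comm[q][r] for r in placed))
        # pick the free core minimizing traffic-weighted distance
        best = min(free_cores, key=lambda c: sum(
            comm[p][q] * dist[c][placed[q]] for q in placed))
        placed[p] = best
        free_cores.remove(best)
    return placed

# toy example: 4 processes, 4 cores on a line (hop distance |i - j|)
comm = np.array([[0, 9, 1, 0], [9, 0, 1, 0], [1, 1, 0, 5], [0, 0, 5, 0]])
dist = np.abs(np.arange(4)[:, None] - np.arange(4)[None, :])
print(greedy_placement(comm, dist))
```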

K. B. Manwade, D. B. Kulkarni
Mobile Agent-Based Frequent Pattern Mining for Distributed Databases

In today's world of globalization, business organizations produce information at many branch offices while operating across the globe, leading to large, distributed databases. There is an innate need to analyze this distributed information in a way that leverages the past, monitors the present, and predicts the future with accuracy. Mining large distributed databases using the client–server model is time-consuming and sometimes impractical because it requires huge databases to be transferred over very long distances. Mobile agent technology is a promising alternative that addresses the issues of the client–server computing model. In this paper, we propose an algorithm called MADFPM for frequent pattern mining of distributed databases using mobile agents. We show that the performance of MADFPM is better than that of the conventional client–server approach.

Yashaswini Joshi, Shashikumar G. Totad, R. B. Geeta, P. V. G. D. Prasad Reddy
A Hybrid Approach for Preprocessing of Imbalanced Data in Credit Scoring Systems

During the last few years, the classification task in machine learning has been commonly used in various real-life applications. One common application is credit scoring, where the ability to accurately predict creditworthy or non-creditworthy applicants is critically important because incorrect predictions can cause major financial losses. In this paper, we focus on the skewed data distribution issue faced by credit scoring systems. To reduce the imbalance between the classes, we preprocess the dataset using a combination of random re-sampling and dimensionality reduction. Experimental results on the Australian and German credit datasets show that the presented preprocessing technique yields significant performance improvement in terms of AUC and F-measure.
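
A minimal sketch of the described preprocessing, assuming random oversampling as the re-sampling step and PCA as the dimensionality reduction (the abstract does not fix either choice; the component count below is illustrative):

```python
import numpy as np
from sklearn.utils import resample
from sklearn.decomposition import PCA

def rebalance_and_reduce(X, y, n_components=10, random_state=0):
    """Oversample minority classes to the majority size, then project."""
    X, y = np.asarray(X), np.asarray(y)
    classes, counts = np.unique(y, return_counts=True)
    target = counts.max()
    X_parts, y_parts = [], []
    for c in classes:
        Xc = X[y == c]
        # random oversampling with replacement up to the majority size
        Xc = resample(Xc, replace=True, n_samples=target,
                      random_state=random_state)
        X_parts.append(Xc)
        y_parts.append(np.full(target, c))
    Xb, yb = np.vstack(X_parts), np.concatenate(y_parts)
    # dimensionality reduction on the balanced data
    Xr = PCA(n_components=n_components).fit_transform(Xb)
    return Xr, yb
```

In practice the PCA would be fit on the training split only and applied to the test split, to avoid information leakage.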

Uma R. Salunkhe, Suresh N. Mali
Vision-Based Target Tracking Intelligent Robot Using NI myRIO with LabVIEW

Robots are built to do tasks that are risky for people, for example, defusing bombs, finding survivors in unstable environments, and exploration. The emerging research field of miniaturized automatic target-following robots is of significance in hazardous, military, and industrial areas for navigation, observation, safety, and goal recognition by image processing. A real-time vision-based technique is used for target tracking. The objective of this system is to develop a prototype robot capable of following a target using image processing. Normally, the path of a mobile robot is controlled by pre-identified data about objects in the environment. In the developed system, the robot tracks the desired destination in a systematic way.

Anita Gade, Yogesh Angal
Detection of Misbehaviors Nodes in Wireless Network with Help of Pool Manager

Although wireless networks have advanced considerably in recent times, they still suffer from malicious nodes in the network. Such nodes drop packets at the receiver side and mount various routing attacks. With the help of Dijkstra's algorithm, the network can escape the problem of dropped packets, with nodes calculating the shortest path on the basis of fuzzy logic. The biggest downside of the existing proposal is that it cannot handle the occurrence of blackhole and grayhole attacks in the network, and nodes can easily befool each other. To avoid such attacks, the network can use a scheme that creates a pool manager at the start of communication, which reduces the load on the nodes and allows routing attacks to be detected easily. During further communication, a detected malicious node is banned and is not engaged for delivering packets from source to destination. By ejecting malicious nodes with the help of the pool manager, the network can deliver data more reliably by comparing hash functions.

Asha Chaudhary, Pournima More
Evaluation of Multi-label Classifiers in Various Domains Using Decision Tree

One of the commonly used tasks in mining is classification, which can be performed using a supervised learning approach. Because of digitization, a lot of documents are available and need proper organization, a task termed text categorization. Sometimes documents may reflect multiple semantic meanings, which leads to multi-label learning: the method of associating a set of predefined classes with an unseen object depending on its properties. Methods for multi-label classification are divided into two groups, namely data transformation and algorithm adaptation. This paper focuses on the evaluation of eight multi-label learning algorithms based on nine performance metrics using eight multi-label datasets, with the evaluation performed on the results of the experimentation. For all the multi-label classifiers used in the experimentation, a decision tree is used as the base classifier whenever required. The performance of the different classifiers varies according to the size, label cardinality, and domain of the dataset.

V. S. Tidake, S. S. Sane
An Effective Multilabel Classification Using Feature Selection

Multilabel classification has received significant attention during the past years. A multilabel classification approach called the coupled k-nearest neighbors algorithm for multilabel classification (referred to here as CK-STC) reported in the literature exploits coupled label similarities between the labels and provides improved performance [Liu and Cao in A Coupled k-Nearest Neighbor Algorithm for Multi-label Classification, pp. 176–187, 2015]. A multilabel feature selection method, called FSVIG here, is presented in Li et al. [Multi-label Feature Selection via Information Gain, pp. 346–355, 2014]. FSVIG uses information gain and shows better performance when used with ML-NB, ML-kNN, and RandSvm compared with existing multilabel feature selection algorithms. This paper investigates the performance of FSVIG when used with CK-STC and compares it with other multilabel feature selection algorithms available in MULAN using standard multilabel datasets. Experimental results show that FSVIG used with CK-STC provides better performance in terms of average precision and one-error.

S. S. Sane, Prajakta Chaudhari, V. S. Tidake
High-Performance Pipelined FFT Processor Based on Radix-2² for OFDM Applications

This paper introduces a high-performance pipelined 256-point FFT processor based on Radix-2² for OFDM communication systems. The method uses the Radix-2 butterfly structure and the Radix-2² CFA algorithm. The Radix-2 butterfly's complexity is very low, and the Radix-2² CFA algorithm reduces the number of twiddle factors compared to Radix-4 and Radix-2. The proposed design is implemented in VHDL, synthesized using XST of Xilinx ISE 14.1, and successfully simulated using ModelSim PE Student Edition 10.4a. MATLAB code has also been written and simulated with the MATLAB R2012a tool. The computation speed of the proposed design is observed to be 129.214 MHz after synthesis, and the SQNR is 50.95 dB.

Manish Bansal, Sangeeta Nakhate
An Image Processing Approach to Blood Spatter Source Reconstruction

Blood spatter analysis is a part of the criminal justice system and a subpart of forensic science. Traditional blood spatter analysis has a problem with crime scene contamination, which can lead to evidence being ruled unacceptable. Blood spatter analysis is a process that relies heavily on the expertise of the forensic scientist, and this human intervention creates the problem of errors and misjudgments. Using image processing automates the process and removes the human factor. The proposed method takes an image of the blood spatter and reconstructs the source of the blood using image processing. The proposed methodology uses Otsu's method for thresholding and the Hough transform for edge detection.
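
A minimal sketch of the two named image-processing steps using OpenCV's Otsu thresholding and Hough line transform; the file name is hypothetical, and the paper's full source-reconstruction geometry is not reproduced:

```python
import cv2
import numpy as np

img = cv2.imread("spatter.jpg", cv2.IMREAD_GRAYSCALE)  # hypothetical image

# Otsu's method picks the binarization threshold from the histogram
_, mask = cv2.threshold(img, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)

# Edges feed the Hough transform; lines through the stain tips can be
# intersected to estimate the spatter origin.
edges = cv2.Canny(mask, 50, 150)
lines = cv2.HoughLines(edges, 1, np.pi / 180, 80)
print(0 if lines is None else len(lines), "candidate directions")
```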

Abhijit Shinde, Ashish Shinde, Deepali Sale
Overlapping Character Recognition for Handwritten Text Using Discriminant Hidden Semi-Markov Model

The field of handwritten character recognition has always attracted a large number of researchers. The proposed methodology uses a Discriminant Hidden Semi-Markov Model (HsMM) to tackle the problem of recognizing handwritten characters. Preprocessing of the input image, such as denoising and adaptive thresholding, is done for input conditioning, followed by segmentation to find the area containing text. The text image is then passed through a second stage of segmentation, which separates overlapping characters. These segmented characters are digitized using feature extraction based on the Discriminant HsMM. For feature matching and character extraction, the proposed methodology uses a KNN classifier. The training feature library, consisting of 180 samples of each character in both capital and small forms, is processed using the training algorithm of the Discriminant HsMM. Paragraphs of 80–120 characters are processed in the recognition module. An average accuracy rate of 86% is achieved for a large set of characters.

Ashish Shinde, Abhijit Shinde
A Literature Survey on Authentication Using Behavioural Biometric Techniques

With technological advancements and the increasing use of computers and the internet in our day-to-day lives, the issue of security has become paramount. The rate of cybercrime has increased tremendously in the internet era. Out of the numerous crimes, identity theft is perhaps the one that poses the greatest danger to an individual. More and more voices strongly declare that the password is no longer a reliable IT security measure and must be replaced by more efficient systems for protecting computer contents. Behavioural biometrics is an emerging technology that resolves some of the major flaws of the previous scheme. This paper is the first stage of a project which aims to develop a novel authentication system using behavioural biometrics. It presents a comprehensive survey of various techniques and recent works in the respective fields.

Rajvardhan Oak
Number System Oriented Text Steganography in English Language for Short Messages: A Decimal Approach

Information or data transfer between nodes is a necessary part of computing, but the biggest need is to secure the data from malicious sources. Many techniques exist, but they are insufficient against information larceny. Our number-system-oriented text steganography approach for short messages is the outcome of work toward preventing such theft. A combination of mathematical and computational rules hides the data from view. The data-hiding approach is innovative among data transfer protocols, as a word-to-word, or rather character-to-character, mapping is derived, covering numbers and special characters. This approach makes data invisible when moving through any channel such as SMS, WhatsApp, Email, or Facebook Messenger. A good mixture of numbers and characters forms a pair of sets that can be used to hide information. Security agencies like the navy, army, or air force can use such techniques when transferring data from one node to another for the sake of protecting their native soil.

Kunal Kumar Mandal, Santanu Koley, Saptarshi Mondal
Novel Robust Design for Reversible Code Converters and Binary Incrementer with Quantum-Dot Cellular Automata

In this work, we employ computing with quantum-dot cellular automata (QCA) to construct architectures for reversible code converters and a binary incrementer. The code converters and binary incrementer are made up of the Feynman gate and the Peres gate, respectively. We present a robust design of an Ex-OR gate in QCA, which is used for the construction of the code converters and the binary incrementer. The layouts of the proposed circuits are made using the primary elements, namely the majority gate, the inverter, and the binary wire. The novel binary-to-gray converter design offers a 59% cell count reduction and a 36% area reduction in primitives over the benchmark designs. By pipelining Peres gates to construct the 1-bit, 2-bit, and 3-bit binary incrementers, the robust layout can be reused in the QCA implementation of the binary incrementer. Comparative results show that the 1-bit, 2-bit, and 3-bit binary incrementers achieve 60.82%, 60.72%, and 64.79% improvements in cell count over their counterparts.
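
The logical function these reversible converters realize is the standard binary/gray mapping, which a Feynman (CNOT) gate computes as an XOR. A small sketch of the truth relation itself:

```python
def bin_to_gray(b: int) -> int:
    # each gray bit is the XOR of adjacent binary bits: g = b ^ (b >> 1)
    return b ^ (b >> 1)

def gray_to_bin(g: int) -> int:
    # inverse: fold the gray code back down with cascaded XORs
    b = 0
    while g:
        b ^= g
        g >>= 1
    return b

assert all(gray_to_bin(bin_to_gray(n)) == n for n in range(16))
print([format(bin_to_gray(n), "03b") for n in range(8)])
```

The QCA layouts realize these XOR relations in hardware; the code above only checks the truth table.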

Bandan Kumar Bhoi, Neeraj Kumar Misra, Manoranjan Pradhan
Routing with Secure Alternate Path Selection for Limiting the Sink Relocation and Enhanced Network Lifetime

The main challenge of a wireless sensor network is its lifetime. In this type of network, a single static sink node is present, and sensor nodes require more energy to relay information packets, especially those located in the area around the sink node. Such nodes deplete their energy fastest due to the many-to-one traffic pattern, and in the end they die. This uneven phenomenon, named the hotspot issue, becomes more serious as the number of sensor nodes increases. Generally, replacement of such energy sources is not a feasible or cost-effective solution. One solution relates to distance: if the distance between sensor and sink node is minimized, energy consumption is effectively reduced. This paper presents a solution for enhancing network lifetime while saving the energy of sensor nodes, and also discusses the limitations and advantages of previous methods. The sensor nodes at minimum distance from the sink node consume the most battery power, so their energy is quickly exhausted and the lifetime of the network is reduced. To overcome this drawback, we propose an alternate shortest path technique, used to enhance energy efficiency along with network lifespan. Furthermore, we develop a novel technique known as Energy Aware Sink Relocation (EASR) for the remote base station in a WSN, applied when the energy of the alternate path is about to be exhausted. This scheme exploits information about the remaining battery energy of sensor nodes to adjust the transmission range of sensor nodes and to relocate the sink node in the network. Numerical and theoretical calculations demonstrate that the EASR strategy substantially increases the network lifetime of the remote system. Our system also proposes secure data sending using the ECC algorithm, further increasing network lifetime.

Renuka Suryawanshi, Kajal Kapoor, Aboli Patil
Gene Presence and Absence in Genomic Big Data for Precision Medicine

Twenty-first-century precision medicine aims at using a systems-oriented approach to find the root cause of disease specific to an individual by including molecular pathology tests. The challenges of genomic data analysis for precision medicine are multifold: a combination of big data, high dimensionality, and often multimodal distributions. Advanced investigations use techniques such as Next Generation Sequencing (NGS), which rely on complex statistical methods for gaining useful insights. Analysis of exome and transcriptome data allows for in-depth study of the 22 thousand genes in the human body, many of which relate to phenotype and disease state. Not all genes are expressed in all tissues, and in the disease state some genes are even deleted from the genome. Therefore, as part of knowledge discovery, exome and transcriptome big data need to be analyzed to determine whether a gene is actually absent (deleted/not expressed) or present. In this paper, we present a statistical technique to identify the genes that are present or absent in exome or transcriptome data (big data) to improve the accuracy of precision medicine.

Mohamood Adhil, Mahima Agarwal, Krittika Ghosh, Manas Sule, Asoke K. Talukder
A Survey on Service Discovery Mechanism

A wide range of web services is available, along with the deployment of a huge number of Internet of Things (IoT) applications, e.g., smart homes, smart cities, intelligent transport, e-health, and many more. In the IoT paradigm, almost every device around us will be on the Internet to provide specific services to end users. A large number of services will be available to help end users; however, providing and selecting an appropriate service from a massive number of available services is a challenge in the IoT. Service discovery plays a vital role in the web service architecture. Traditionally, keyword search has been used for service discovery. The traditional approaches to service discovery cannot be applied as-is to web services in the IoT environment, as the devices providing and accessing services are resource-constrained in nature, e.g., limited size, limited energy source, limited computational capability, and limited memory. Hence, there is a need for a lightweight, semantic service discovery approach for the IoT. This work analyzes different service discovery approaches, focusing on the key requirements of a service discovery mechanism for the IoT environment.

Gitanjali Shinde, Henning Olesen
Analysis of Multiple Features and Classifier Techniques Combination for Image Pattern Recognition

Automatic visual pattern recognition is a complex and highly researched area of image processing. This research studies various pattern recognition algorithms, with cloth pattern recognition presented as the research problem, to find the combination best suited to it. The dataset is collected from the CCNY clothing pattern dataset and contains 150 samples of each category (Patternless, Striped, Plaid, and Irregular). The presented study compares all combinations of three feature extraction techniques and three classifiers. The feature extraction techniques used are Radon feature extraction, projection of rotated gradients, and quantized histograms of gradients. The classifiers used are KNN, neural network, and SVM. The highest recognition rate, 93.7% accuracy, is achieved using the combination of the Radon Signature feature and the KNN classifier.
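
A minimal sketch of the best-performing combination, pairing a simplified Radon-projection feature with a KNN classifier (the paper's exact Radon Signature construction, angle count, and k are not specified here, so the values below are illustrative):

```python
import numpy as np
from skimage.transform import radon
from sklearn.neighbors import KNeighborsClassifier

def radon_feature(gray_img, n_angles=36):
    """Variance of the Radon projection at each angle: directional
    textures (stripes, plaid) produce peaks at their orientations."""
    theta = np.linspace(0.0, 180.0, n_angles, endpoint=False)
    sinogram = radon(gray_img, theta=theta)
    return sinogram.var(axis=0)

def train_cloth_classifier(images, labels, k=3):
    # images: list of grayscale cloth patches, labels: pattern classes
    feats = np.array([radon_feature(im) for im in images])
    return KNeighborsClassifier(n_neighbors=k).fit(feats, labels)
```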

Ashish Shinde, Abhijit Shinde
Smart and Precision Polyhouse Farming Using Visible Light Communication and Internet of Things

Recently, polyhouse farming has taken the place of traditional farming. This technique reduces dependency on rainfall and on dramatically changing environmental conditions, and makes optimum use of land and water resources in a controlled environment. A manually controlled system requires a lot of attention and care. To overcome this limitation, novel technologies like the Internet of Things (IoT) and sensor networks (SN) can be used. Case studies show that use of the electromagnetic radio spectrum for communication degrades the growth and quality of crops. Visible light communication (VLC) provides an effective solution to this problem by using the visible light spectrum for communication. VLC outperforms traditional communication techniques in terms of available frequency spectrum, reliability, security, and energy efficiency. The LED source used in VLC acts as the communication element and also provides illumination. This paper proposes the use of a sensor network to sense data like temperature, humidity, soil moisture, and luminosity, with VLC as the medium of communication from node to network gateway. IoT performs its part in automating polyhouse countermeasures and fertilizer suggestions based on the data processed in the cloud for business intelligence and analytics. Thus, the proposed system will ensure automated and sustainable polyhouse farming using VLC and IoT.

Krishna Kadam, G. T. Chavan, Umesh Chavan, Rohan Shah, Pawan Kumar
Acceleration of CNN-Based Facial Emotion Detection Using NVIDIA GPU

Emotions often mediate and facilitate interactions among human beings and are conveyed by speech, gesture, face, and physiological signals. Facial expression is a form of nonverbal communication, and failure to interpret emotion correctly may cause interpersonal and social conflict. Automatic facial expression recognition (FER) is an active research area with extensive scope in the medical field, crime investigation, marketing, etc. Classical machine learning techniques used for emotion detection do not perform well when applied directly to images, as they do not consider the structure and composition of the image. To address the gaps in traditional machine learning techniques, convolutional neural networks (CNNs), a deep learning algorithm, are used. This paper comprises results and analysis of facial expression recognition for seven basic emotions using multiscale feature extractors, namely CNNs. The maximum accuracy obtained using one CNN is 96.5% on the JAFFE database. The implementation exploited Graphics Processing Unit (GPU) computation on a GeForce 920M to expedite the training process of the CNN. As future scope, detection of non-basic expressions can be done using CNNs and GPU processing.

Bhakti Sonawane, Priyanka Sharma
Research Issues for Energy-Efficient Cloud Computing

Everyone is using highly capable computing devices nowadays and is indirectly a part of cloud computing, IoT, and virtual machines, either as a client or as a server. All this was made possible by virtualization techniques and the upgradation of technologies from networking to ubiquitous computing. Today, however, there is a need to consider the cost of computing devices versus the cost of the power they consume. Everybody requires a mobile device with more computing capacity and, along with that, more battery backup. The same must be considered for cloud computing, as energy consumption by data centers has increased year after year, leaving CO2 footprints behind. According to Gartner, 2% of the world's CO2 emission is due to the IT industry alone. This paper investigates such research issues for energy-efficient cloud computing.

Nitin S. More, Rajesh B. Ingle
Implementation of REST API Automation for Interaction Center

AVAYA Interaction Center is developed for communication between customers and the company. The Interaction Center provides business-class control of contact center communications across several channels, like voice, email, and web chat. Chat is an increasingly popular communication channel due to the growth of the Internet. The existing chat generator tool supports only the IE browser, so there is a need for the tool to be browser independent. This paper proposes a tool based on REST APIs. In the proposed system, the REST API of the chat media of CSPortal is captured through the SOAPUI tool. The request–response of the REST API is captured for different resources, like /csportal/cometd/, of the CSPortal chat media. The results show that the proposed approach is more efficient than the existing tool and that the chat stays live for more than 2 minutes.

Prajakta S. Marale, Anjali A. Chandavale
Predictive Analysis of E-Commerce Products

For the past few years, there has been an increasing trend for people to buy products online through e-commerce sites. Despite the user-friendly platforms, there is a loophole: satisfaction of the customers is not guaranteed. Customers habitually read the reviews given by other customers in order to choose the right product. Due to the high number of reviews, with a mixture of good and bad ones, it is confusing and time-consuming to determine the quality of a product. Through these reviews, the vendors would also want to know the future trend of the product. In this paper, a predictive analysis scheme is implemented to detect the hidden sentiments in customer reviews of a particular product from an e-commerce site on a real-time basis. This serves as feedback for drawing inferences about the quality of the product, with the help of various graphs and charts generated by the scheme. An opinion is then drawn about the product on the basis of the polarity exhibited by the reviews. Finally, prediction of the success or failure of the product at regular intervals of the timestamp is done using a time series forecasting method. A case study for the iPhone 5s is also presented in this paper, highlighting the results of rating generation, sentiment classification, and rating prediction.
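
A minimal sketch of the two stages described, with TextBlob standing in as a polarity scorer and a simple moving average standing in for the paper's unspecified forecasting model; the sample reviews and ratings are hypothetical:

```python
from textblob import TextBlob

reviews = ["Battery life is great", "Screen cracked in a week"]  # sample data
polarities = [TextBlob(r).sentiment.polarity for r in reviews]   # in [-1, +1]
print(polarities)

def moving_average_forecast(series, window=3):
    """Predict the next value as the mean of the last `window` points."""
    recent = series[-window:]
    return sum(recent) / len(recent)

weekly_avg_rating = [4.1, 4.0, 3.8, 3.9, 3.5]   # hypothetical rating series
print(moving_average_forecast(weekly_avg_rating))
```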

Jagatjyoti G. Tuladhar, Ashish Gupta, Sachit Shrestha, Ujjen Man Bania, K. Bhargavi
User Privacy and Empowerment: Trends, Challenges, and Opportunities

Today, service providers are capable of assembling a huge measure of user information using big data techniques. For service providers, user information has become a vital asset. Present business models are geared toward collecting extensive user information to extract knowledge useful to the service providers. With business models slanted towards service providers, privacy has become a crucial issue in today's fast-growing digital world. Hence, this paper elaborates the flow of personal information between users, service providers, and data brokers. We also discuss significant privacy issues like present business models, user awareness about privacy, and user control over personal data. To address such issues, this paper identifies challenges that comprise the unavailability of effective privacy awareness or protection tools and of an effortless way to study and see the flow of personal information and its management. Empowering users and enhancing awareness are essential to comprehending the value of secrecy. This paper also introduces the latest advances in the domain of privacy, like User-Managed Access (UMA), which can state suitable requirements for user empowerment and help redefine the trust relationship between service providers and users. The paper concludes with suggestions for providing empowerment to the user and developing user-centric, transparent business models.

Prashant S. Dhotre, Henning Olesen, Samant Khajuria
Evolution Metrics for a BPEL Process

A Business Process Execution Language (BPEL) process in a Service-Oriented Architecture (SOA) evolves over time. The study of evolution helps in analyzing the development and enhances the maintenance of a BPEL process. In this paper, we study evolution using metrics which measure the changes and the quantity in which the changes have occurred. A process can evolve on its own or because of evolution in its partner services. In both cases, the changes can be to the internal logic of the process or involve changes in the interaction with the partner services. The evolution metrics proposed in this paper are the BPEL Internal Evolution Metric (BEM_I) and the BPEL External Evolution Metric (BEM_E). The time complexity for computing the metrics is linear. The metrics are theoretically validated using the Zuse framework and are found to be above the ordinal scale. In our previous work, metrics were proposed for a single service under evolution. The cohesiveness between the changes of an evolving service and an evolving BPEL process which uses this service is demonstrated using these metrics.

N. Parimala, Rachna Kohar
Development of Performance Testing Suite Using Apache JMeter

Testing a product has become one of the most important tasks for any organization, be it small-scale or large-scale. A product is not delivered to the customer without testing, and testing is an ongoing activity from the beginning of a product's development. In this work, a performance testing suite is developed using Apache JMeter. Apache JMeter, a 100% pure Java application, is used to perform performance testing on client- and server-type software. Apache JMeter is not a browser; it works at the protocol level. Performance testing of both static and dynamic resources can be done using JMeter. A high-level performance testing suite is developed, capturing aspects of performance at the UI and system levels. Developing the testing suite helps in saving the time and cost of the organization. The discussion that follows describes the benefits of performance testing and of the performance testing suite.

Jidnyasa Agnihotri, Rashmi Phalnikar
Characterizing Network Flows for Detecting DNS, NTP, and SNMP Anomalies

Network security can never be fully assured, as new attacks are reported every day. Characterizing such new attacks is a challenging task. For detecting anomalies tied to specific services, it is desirable to find characteristic features for those service-specific anomalies. In this paper, real-time flow-based network traffic captured from a university campus is studied to determine whether the traditional volume-based analysis of aggregated flows and service-specific aggregated flows is useful in detecting service-specific anomalies. Two existing techniques are also evaluated to find characteristic features of these anomalies. The service-specific anomalies considered for study in this paper are DNS, NTP, and SNMP.

Rohini Sharma, Ajay Guleria, R. K. Singla
Periocular Region Based Biometric Identification Using the Local Descriptors

Biometric systems have become a vital part of present-day automated systems. Every individual has unique biometric features in terms of the face, iris, and periocular regions. Identification and recognition of a person using these biometric features have been studied extensively over the last decade to build robust systems. The periocular region has become a powerful alternative for unconstrained biometrics, with better robustness and high discrimination ability. In this paper, various local descriptors are used for the extraction of discriminative features from the full face and periocular regions, and the city block distance is used as the classifier. The local descriptors used in the present work are Local Binary Patterns (LBP), Local Phase Quantization (LPQ), Histogram of Oriented Gradients (HOG), and the Weber Local Descriptor (WLD). The FRGC database is used for experimentation to compare the performance of the periocular and face biometric modalities; the results show that the periocular region achieves a level of performance similar to that of the face region using only 25% of the data of the complete face.
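
A minimal sketch of one descriptor-plus-matcher pairing from the abstract, LBP histograms compared with the city block (L1) distance; the parameters are illustrative, and `gallery` is a hypothetical dict mapping identity labels to stored histograms:

```python
import numpy as np
from skimage.feature import local_binary_pattern
from scipy.spatial.distance import cityblock

def lbp_histogram(gray_img, P=8, R=1):
    codes = local_binary_pattern(gray_img, P, R, method="uniform")
    # "uniform" LBP produces P + 2 distinct code values
    hist, _ = np.histogram(codes, bins=P + 2, range=(0, P + 2), density=True)
    return hist

def identify(probe_img, gallery):
    """Return the gallery label whose stored histogram is nearest in L1."""
    q = lbp_histogram(probe_img)
    return min(gallery, key=lambda label: cityblock(q, gallery[label]))
```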

K. Kishore Kumar, P. Trinatha Rao
Model-Based Design Approach for Software Verification Using Hardware-in-Loop Simulation

Increasing demand for high-quality products with high safety requirements and reduced time-to-market are the challenges faced during the development of embedded products. These products, irrespective of domain (consumer, automotive, medical, aerospace), incorporate multidisciplinary (electrical, mechanical, electronic) systems as part of the hardware, along with complex software that controls the hardware. Late integration of these multidisciplinary systems in the development cycle, followed by software verification and validation, may lead to expensive redesigns and delayed time-to-market. The model-based design (MBD) approach can be used to overcome these challenges. Hardware-in-loop (HIL) verification is an effective method that can be used to verify the control software, and plant modeling is the crucial part of HIL verification. This paper provides a review of one of the steps of model-based design, plant modeling, as used for software testing, along with the impact of model fidelity on verification.

Pranoti Joshi, N. B. Chopade
Silhouette-Based Human Action Recognition by Embedding HOG and PCA Features

Human action recognition has become a vital aspect of video analytics. This study explores methods for the classification of human actions by extracting the silhouette of the object and then applying feature extraction. The proposed method integrates HOG and PCA features to form a feature descriptor, which is then used to train a KNN classifier. HOG gives local shape-oriented variations of the object, while PCA gives global information about frequently moving parts of the human body. Experiments conducted on the Weizmann and KTH datasets show results comparable with existing methods.
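
A minimal sketch of the described descriptor, concatenating HOG features (local shape) with PCA projections of the silhouette frames (global motion) to train a KNN classifier; the dimensions and k are illustrative:

```python
import numpy as np
from skimage.feature import hog
from sklearn.decomposition import PCA
from sklearn.neighbors import KNeighborsClassifier

def build_classifier(silhouettes, labels, n_pca=20, k=5):
    """silhouettes: array of equal-sized binary frames, shape (N, H, W)."""
    hog_feats = np.array([hog(s) for s in silhouettes])   # local shape
    flat = silhouettes.reshape(len(silhouettes), -1)
    pca = PCA(n_components=n_pca)
    pca_feats = pca.fit_transform(flat)                   # global motion
    feats = np.hstack([hog_feats, pca_feats])             # embed both
    clf = KNeighborsClassifier(n_neighbors=k).fit(feats, labels)
    return clf, pca
```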

A. S. Jahagirdar, M. S. Nagmode
Unified Algorithm for Melodic Music Similarity and Retrieval in Query by Humming

Query by humming (QBH) has been an active research area for a decade, with limited commercial success. Challenges include partial, imperfect queries from users, query representation and matching, and fast, accurate generation of results. Our work focuses on query representation and matching algorithms to reduce the effective computational time and improve accuracy. We propose a unified algorithm for measuring melodic music similarity in QBH. It involves two different approaches for similarity measurement: a novel mode-normalized frequency algorithm using edit distance, and an n-gram precomputed inverted index method. The proposed algorithm is based on the study of melody representation in the form of note strings and of user query variations. Queries from four non-singers with no formal singing training are used for initial testing. The preliminary results, with 60 queries against a 50-song database, are encouraging for further research.
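
A minimal sketch of the edit-distance stage on note strings; mode normalization is reduced here to subtracting the first note, which may differ from the paper's exact scheme:

```python
def normalize(notes):
    # crude transposition invariance: shift so the first note is 0
    return [n - notes[0] for n in notes]

def edit_distance(a, b):
    """Classic dynamic-programming Levenshtein distance."""
    m, n = len(a), len(b)
    d = [[0] * (n + 1) for _ in range(m + 1)]
    for i in range(m + 1):
        d[i][0] = i
    for j in range(n + 1):
        d[0][j] = j
    for i in range(1, m + 1):
        for j in range(1, n + 1):
            cost = 0 if a[i - 1] == b[j - 1] else 1
            d[i][j] = min(d[i - 1][j] + 1,         # deletion
                          d[i][j - 1] + 1,         # insertion
                          d[i - 1][j - 1] + cost)  # substitution
    return d[m][n]

query = normalize([60, 62, 64, 62])  # hummed MIDI-like pitches
song  = normalize([67, 69, 71, 69])  # same contour, different key
print(edit_distance(query, song))    # 0: melodies match after normalization
```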

Velankar Makarand, Kulkarni Parag
Predict Stock Market Behavior: Role of Machine Learning Algorithms

The prediction of a dynamic, volatile, and unpredictable stock market has been a challenging issue for researchers over the past few years. This paper discusses stock market-related technical indicators, mathematical models, the algorithms most preferred in data science industries, an analysis of various types of machine learning algorithms, and an overall summary of solutions. The paper is an attempt to analyze various issues pertaining to dynamic stock market prediction, based on the fact that minimization of stock market investment risk is strongly correlated with minimization of forecasting errors.

Uma Gurav, Nandini Sidnal
Stability of Local Information-Based Centrality Measurements Under Degree Preserving Randomizations

Node centrality is one of the integral measures in network analysis, with a wide range of applications from socioeconomics to personalized recommendation. We argue that an effective centrality measure should remain stable even under information loss or noise introduced in the network. With six local information-based centrality metrics, we investigate the effect of varying assortativity while keeping the degree distribution unchanged, using networks with scale-free and exponential degree distributions. This model provides a novel scope for analyzing the stability of centrality metrics, which can further find many applications in social science, biology, information science, community detection, and so on.
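
A minimal sketch of such a stability test: rewire a scale-free graph with degree-preserving double-edge swaps and measure how a local centrality ranking shifts. A simple neighbor-degree score stands in for the paper's six metrics, which are not reproduced here:

```python
import networkx as nx
from scipy.stats import spearmanr

def local_centrality(G):
    # a simple local information-based score: total degree of neighbors
    return {v: sum(G.degree(u) for u in G[v]) for v in G}

G = nx.barabasi_albert_graph(500, 3, seed=1)   # scale-free test graph
before = local_centrality(G)

H = G.copy()
# rewire while preserving every node's degree
nx.double_edge_swap(H, nswap=1000, max_tries=10000, seed=1)
after = local_centrality(H)

nodes = sorted(G.nodes())
rho, _ = spearmanr([before[v] for v in nodes], [after[v] for v in nodes])
print(f"rank stability (Spearman rho): {rho:.3f}")
```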

Chandni Saxena, M. N. Doja, Tanvir Ahmad
Hybrid Solution for E-Toll Payment

E-toll systems have brought a huge improvement in decreasing the traffic jams that have become a big problem in metro cities nowadays, and they are the best method to handle heavy traffic. Travelers passing through a traditional toll plaza have to pay the toll by waiting in a long queue, which results in loss of petrol, loss of time, pollution, and the tension of carrying cash. By using our E-toll payment system, travelers avoid the tension of waiting in the queue to make the payment, which decreases fuel consumption and removes the need to carry cash.

Ajinkya R. Algonda, Rewati R. Sonar, Saranga N. Bhutada
Enhancing Distributed Three Hop Routing Protocol in Hybrid Wireless Network Through Data Weight-Based Scheme

The reason for improving our concept is to involve multiple nodes in data transmission, which diminishes the delay in routing across a network. In hybrid wireless networks, most of the existing routing performs in a linear way, so at times a maximum number of nodes remain in the idle state. For that reason, distributed three-hop routing is used, which reduces the load burden on the network. Dividing the data and the load across nodes is an effective method of transmission, and finding the shortest path using a fuzzy classifier makes the routing approach faster. One of the biggest flaws of the existing scheme, however, is that it does not perform as expected: it has higher overhead and fails to achieve good congestion control. This paper promotes a weight-based assignment technique on top of the distributed three-hop routing protocol. By considering the immediate one hop evolved in the shortest path and analyzing the response delay of the two nodes, the data is fragmented using the weight-based assignment technique. To intensify the efficiency of the routing protocol in hybrid wireless networks, the weight-based data assignment technique is used for data allocation in the distributed routing protocol, applying least-delay detection to keep data congestion in the network low.

Neha S. Rathod, Poonam Gupta
VLSI-Based Data Hiding with Transform Domain Module Using FPGA

In this rapidly growing internet era, researchers are giving more and more attention to robust, secure, and fast communication channels for hiding sensitive data. The concealment steps can be performed in the spatial domain or the transform domain. This paper proposes a data hiding system with an adaptive Very Large-Scale Integration (VLSI) module to enhance the security and robustness of embedded data. The Field Programmable Gate Array (FPGA) implementation of the data hiding technique not only provides pipelined and parallel operations, but also gives a strong guard against malicious attacks. The proposed algorithm is implemented on a Xilinx Virtex 5 FPGA board. Further, the transform domain technique optimizes memory space and reduces execution time through pipelining. The performance of the implemented system is measured using different parameters like resource utilization, Mean Squared Error (MSE), and Peak Signal-to-Noise Ratio (PSNR).

Latika R. Desai, Suresh N. Mali
A Novel Approach of Frequent Itemset Mining Using HDFS Framework

Frequent itemset extraction is a very important task in data mining applications, useful in association rule mining and correlation analysis. Algorithms such as Apriori and FP-Growth are used to extract the frequent itemsets, but they are inefficient at load balancing, load distribution, and automatic parallelization at good speed. Data partitioning and fault tolerance are also difficult because of the excessive data. Hence, there is a need to develop algorithms that remove these issues. Here, a novel approach is used to extract frequent itemsets using MapReduce. The system is based on a Modified Apriori, called Frequent Itemset Mining using Modified Apriori (FIMMA). To automate data parallelization, balance the load well, and reduce execution time, FIMMA works concurrently and independently using three mappers, employing a decomposing strategy.
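
A minimal sketch of one MapReduce-style candidate-counting pass, the core pattern behind Apriori on MapReduce; FIMMA's modified pruning and three-mapper scheduling are specific to the paper and are not reproduced here:

```python
from collections import Counter
from itertools import combinations

def mapper(transactions, k):
    """Emit (itemset, 1) pairs for every k-itemset in each transaction."""
    for t in transactions:
        for itemset in combinations(sorted(t), k):
            yield itemset, 1

def reducer(pairs, min_support):
    """Sum the counts and keep itemsets meeting minimum support."""
    counts = Counter()
    for itemset, c in pairs:
        counts[itemset] += c
    return {s: n for s, n in counts.items() if n >= min_support}

db = [{"bread", "milk"}, {"bread", "beer"}, {"bread", "milk", "beer"}]
# simulate three independent mappers over partitions of the database
pairs = [p for part in (db[0:1], db[1:2], db[2:3]) for p in mapper(part, 2)]
print(reducer(pairs, min_support=2))  # both frequent pairs reach support 2
```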

Prajakta G. Kulkarni, S. R. Khonde
Issues of Cryptographic Performance in Resource-Constrained Devices: An Experimental Study

Many gadgets, toys, sensors, instruments, and other devices are mushrooming in the electronics market these days, running different operating systems and application frameworks. In mobile phones, too, many platforms are available, each with its own stack from the hardware level to the application level. So it is difficult for all mobile phones or devices to have the same characteristics, performance, and features. The decisive success of a platform ultimately depends on the security it provides to user data, which shapes its global market. We present a comparative study of four mobile operating system architectures and the security perspectives of each. Moreover, experiments are carried out with Windows and Android smartphones to study the cryptographic performance of these devices, which serves as a decisive parameter for future application and system developers implementing secure devices and applications. We show that although algorithm characteristics are fixed, the same efficiency and performance are not guaranteed by cryptographic or other data structure algorithms, due to the different architectures and dependencies of different devices with different operating systems.

Balaso Jagdale, Jagdish Bakal
Assessment of Object Detection Using Deep Convolutional Neural Networks

Detecting objects in images and videos has always been an active research area for applications of computer vision and artificial intelligence, namely robotics, self-driving cars, automated video surveillance, crowd management, home automation and manufacturing industries, activity recognition systems, medical imaging, and biometrics. Recent years witnessed the boom of deep learning technology owing to its effective performance on image classification and detection challenges in visual recognition competitions like PASCAL VOC, Microsoft COCO, and ImageNet. Deep convolutional neural networks have provided promising results for object detection by alleviating the need for human expertise in manually handcrafting features for extraction. They allow the model to learn automatically by training the neural network on large-scale image data using powerful and robust GPUs in a parallel way, thus reducing training time. This paper highlights the state-of-the-art approaches based on deep convolutional neural networks designed especially for object detection in images.

Ajeet Ram Pathak, Manjusha Pandey, Siddharth Rautaray, Karishma Pawar
Implementation of Credit Card Fraud Detection System with Concept Drifts Adaptation

A large number of credit card payments are targeted by fraudulent activities. Companies responsible for the processing of electronic transactions need to detect fraudulent activity efficiently to maintain customers' trust and the continuity of their own business. In this paper, the developed algorithm detects credit card fraud. The prediction is based on certain attributes, like the customer's buying behavior, the network of merchants the customer usually deals with, the location of the transaction, and the amount of the transaction. But these attributes change over time, so the algorithmic model needs to be updated periodically to reduce such errors. The proposed system provides two solutions for handling concept drift: an active one and a passive one. The active solution triggers mechanisms by explicitly detecting a change in the statistics; the passive solution updates the model continuously in order to consider newly added records. The proposed and developed system filters 80% of fraudulent transactions and acts as a support system for society at large.
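
A minimal skeleton of the two drift-handling modes described, with a logistic regression model and an error-rate trigger standing in for the paper's unspecified detector and threshold:

```python
from sklearn.linear_model import LogisticRegression

class FraudModel:
    """Hypothetical wrapper showing active vs. passive drift handling."""

    def __init__(self, threshold=0.10):
        self.clf = LogisticRegression(max_iter=1000)
        self.base_error = None
        self.threshold = threshold          # allowed rise in error rate

    def fit(self, X, y):
        self.clf.fit(X, y)
        self.base_error = 1.0 - self.clf.score(X, y)

    def active_update(self, X_new, y_new):
        """Retrain only if the error statistic drifted noticeably."""
        err = 1.0 - self.clf.score(X_new, y_new)
        if err - self.base_error > self.threshold:
            self.fit(X_new, y_new)          # drift detected: rebuild model

    def passive_update(self, X_new, y_new):
        """Always fold in the newest records."""
        self.fit(X_new, y_new)
```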

Anita Jog, Anjali A. Chandavale
Intelligent Traffic Control by Multi-agent Cooperative Q Learning (MCQL)

Traffic crises frequently happen because of the traffic demands of the large number of vehicles on the road. Increasing transportation throughput and decreasing the average waiting time of each vehicle are the objectives of a cooperative intelligent traffic control system. Each signal wishes to achieve better traffic flow. In the process, signals form a strategy of cooperation, along with restrictions for neighboring signals, to exploit their individual benefit. A superior traffic signal scheduling strategy is useful for resolving the difficulty. Several parameters may influence the traffic control model, so it is hard to learn the best possible result, and the inability of traffic light controllers to learn from past practice leaves them incapable of incorporating uncertain changes in traffic flow. By defining instantaneous features of the real traffic scenario, a traffic control model based on a reinforcement learning algorithm can be used to obtain fine timing rules. The proposed real-time traffic control optimization model is able to maintain the traffic signal scheduling rules successfully. The model expands the traffic value of the vehicle, which consists of the delay time, the number of vehicles stopped at the signal, and the newly arriving vehicles, to learn and establish the optimal actions. The experimental outcome illustrates a major enhancement in traffic control, demonstrating that the proposed model is capable of enabling real-time dynamic traffic control.
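
The abstract does not state the exact update rule; cooperative Q-learning schemes of this kind typically build on the standard Q-learning update, with each signal maintaining its own table:

```latex
Q(s,a) \leftarrow Q(s,a) + \alpha \left[ r + \gamma \max_{a'} Q(s',a') - Q(s,a) \right]
```

Here r is the reward derived from the measured traffic value (delay time, queue length at the signal, and newly arriving vehicles), alpha is the learning rate, and gamma is the discount factor.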

Deepak A. Vidhate, Parag Kulkarni
Digital Tokens: A Scheme for Enabling Trust Between Customers and Electronic Marketplaces

In electronic marketplaces, when the supply of a particular product is limited and there is huge demand for it, questions of transparency and integrity prop up. We propose Digital Tokens, defined using proven cryptographic techniques, as a mechanism to assure trust for customers, issued by a reliable, transparent, third-party intermediary called a digital token service provider (DTSP). The digital tokens are issued to a customer on behalf of a vendor and can be authenticated by both the vendor and the DTSP. This paper details the architecture involving the DTSP, the protocols for communication, implementation details, the potential uses and benefits of the system, and the performance evaluation of such a system.

Balaji Rajendran, Mohammed Misbahuddin, S. Kaviraj, B. S. Bindhumadhava
BMWA: A Novel Model for Behavior Mapping for Wormhole Adversary Node in MANET

The wormhole attack has received very little attention in the research community with respect to mobile ad hoc networks (MANETs). The majority of security techniques target other forms of wireless networks, and fewer target MANETs. Therefore, we introduce a model for mapping the behavior of a wormhole attacker, considering the unknown and uncertain behavior of a wormhole node. The core idea is to find the malicious nodes and statistically confirm that their communication behavior is very distinct from that of normal nodes, by formulating a novel strategic approach to construct effective decisions. Our study outcome shows enhanced throughput and minimal overhead and latency with an increasing number of wormhole nodes.

S. B. Geetha, Venkanagouda C. Patil
AWGN Suppression Algorithm in EMG Signals Using Ensemble Empirical Mode Decomposition

Surface electromyogram (EMG) signals are often contaminated by background interference or noise, imposing difficulties for myoelectric control. Among these, a major concern is the effective suppression of Additive White Gaussian Noise (AWGN), whose spectral components coincide with the spectrum of EMG signals, making their analysis problematic. This paper presents an algorithm for the minimization of AWGN in EMG signals using Ensemble Empirical Mode Decomposition (EEMD). In this methodology, EEMD is first applied to the corrupted EMG signals to decompose them into various Intrinsic Mode Functions (IMFs), followed by morphological filtering. Herein, a square-shaped structuring element is employed for the requisite filtering of each of the IMFs. The outcomes of the proposed methodology are improved when compared with those of conventional EMD- and EEMD-based approaches.
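
A minimal sketch of the described pipeline, assuming the PyEMD package for EEMD and approximating the morphological filtering of each IMF with an opening/closing average using a flat structuring element; the element width and toy signal are illustrative:

```python
import numpy as np
from PyEMD import EEMD                     # assumed dependency (EMD-signal)
from scipy.ndimage import grey_opening, grey_closing

def denoise_emg(signal, size=5):
    eemd = EEMD()
    imfs = eemd.eemd(signal)               # decompose into IMFs
    filtered = []
    for imf in imfs:
        # average of opening and closing approximates a morphological
        # smoother with a flat structuring element of width `size`
        o = grey_opening(imf, size=size)
        c = grey_closing(imf, size=size)
        filtered.append((o + c) / 2.0)
    return np.sum(filtered, axis=0)        # reconstruct the denoised EMG

t = np.linspace(0, 1, 1000)
emg = np.sin(2 * np.pi * 80 * t) + 0.5 * np.random.randn(1000)  # toy AWGN
clean = denoise_emg(emg)
```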

Ashita Srivastava, Vikrant Bhateja, Deepak Kumar Tiwari, Deeksha Anand
Visible-Infrared Image Fusion Method Using Anisotropic Diffusion

In this paper, visible and infrared sensors are used to take complementary images of a targeted scene. Image fusion aims to integrate the two images so that maximum information and fewer artifacts are introduced in the fused image. The merging of two different multisensor images using the combination of Anisotropic Diffusion (AD) and a max–min approach is carried out in this paper. Herein, each of the registered source images is decomposed into approximation and detail layers using an AD filter. Later, the max and min fusion rules are applied to the detail and approximation layers, respectively, to preserve both spectral and structural information. Image-quality assessment of the fused images is made using the structural similarity index (SSIM), fusion factor (FF), and entropy (E), which justifies the effectiveness of the proposed method.

Ashutosh Singhal, Vikrant Bhateja, Anil Singh, Suresh Chandra Satapathy
Fast Radial Harmonic Moments for Invariant Image Representation

The main objective of this paper is to reduce the reconstruction error and speed up the calculation of Radial Harmonic Fourier Moments (RHFMs). In the proposed work, the fast RHFM is applied to the original gray image for reconstruction. Before applying RHFMs to the grayscale image, the image is partitioned into radial and angular sectors. Results are compared with traditional methods: the proposed approach yields a lower reconstruction error, and moments can be calculated at high speed.
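
A direct (non-fast) RHFM computation by polar sampling could look like the sketch below; the radial kernel follows the commonly cited RHFM definition and the normalization is simplified, so both should be checked against the paper before reuse.

```python
# Direct RHFM of order (n, m) by sampling the image over the unit disc.
import numpy as np

def rhfm(img, n, m, n_r=64, n_t=128):
    r = (np.arange(n_r) + 0.5) / n_r                 # radial sample points
    t = 2 * np.pi * np.arange(n_t) / n_t             # angular sample points
    R, T = np.meshgrid(r, t, indexing="ij")
    h, w = img.shape
    ys = ((R * np.sin(T) + 1) / 2 * (h - 1)).astype(int)
    xs = ((R * np.cos(T) + 1) / 2 * (w - 1)).astype(int)
    f = img[ys, xs]                                  # image sampled in polar form
    if n == 0:                                       # radial harmonic kernel T_n(r)
        Tn = np.sqrt(1 / R)
    elif n % 2:
        Tn = np.sqrt(2 / R) * np.sin((n + 1) * np.pi * R)
    else:
        Tn = np.sqrt(2 / R) * np.cos(n * np.pi * R)
    integrand = f * Tn * np.exp(-1j * m * T) * R
    return integrand.sum() / (n_r * n_t)             # discrete double integral

img = np.random.rand(64, 64)
print(abs(rhfm(img, 2, 3)))
```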

Shabana Urooj, Satya P. Singh, Shevet Kamal Maurya, Mayank Priyadarshi
A Wide-Area Network Protection Method Using PMUs

This paper proposes the idea of utilizing data from PMUs (Phasor Measurement Units) to detect different types of faults in transmission lines. Data measured by the PMUs is collected at a system control center. The proposed method compares the positive-sequence voltages of different buses to detect the faulted bus. With the help of synchronized phasor measurements, bus voltages are estimated using different paths and then matched to detect the faulted line as well. The proposed protection scheme is verified for balanced and unbalanced faults. Simulation has been done in MATLAB/SIMULINK for a 230 kV IEEE 9-bus system.
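
The underlying symmetrical-components computation is standard; below is a small sketch, with hypothetical per-unit phasors, of how positive-sequence magnitudes could flag a faulted bus.

```python
# Positive-sequence voltage from three-phase phasors, compared across buses.
import numpy as np

a = np.exp(2j * np.pi / 3)  # 120-degree rotation operator

def positive_sequence(va, vb, vc):
    """V1 = (Va + a*Vb + a^2*Vc) / 3 for complex phasors."""
    return (va + a * vb + a * a * vc) / 3

# Hypothetical PMU phasors (per unit) at two buses; the sagging bus is suspect.
v1_healthy = positive_sequence(1.0, 1.0 * a**2, 1.0 * a)      # balanced set -> |V1| = 1
v1_faulted = positive_sequence(0.4, 0.95 * a**2, 0.9 * a)
faulted_bus = np.argmin([abs(v1_healthy), abs(v1_faulted)])
print(abs(v1_healthy), abs(v1_faulted), "-> faulted bus index:", faulted_bus)
```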

Namita Chandra, Shabana Urooj
Analysis and Prediction of the Effect of Surya Namaskar on Pulse of Different Prakruti Using Machine Learning

“Surya Namaskar” is the key to good health! Today’s social life can be made easier and healthier using the mantra of “YOGA”. Nadi Parikshan is a diagnostic technique based on the ancient Ayurvedic principles of wrist pulse analysis. The Nadi describes the mental and physical health of a person in great depth; practitioners can use this information to prevent, detect, and treat ailments. Surya Namaskar is a Yoga exercise which has multiple health benefits and a direct impact on the pulse. The Prakruti of a person is a metaphysical characteristic, a combination of the three doshas in Ayurveda, viz. Vatta, Pitta, and Kapha, which remains constant for a lifetime. Experimentation was carried out to analyze the effect of the Surya Namaskar exercise on the pulse of different Prakruti. The pulse was recorded for a group of young students aged between 19 and 23 years with different Prakruti, before and after Surya Namaskar, over a period of 4 days. This paper analyzes the effect of Surya Namaskar on the human pulse and proposes a framework to predict the pulse after Surya Namaskar. The changes that Surya Namaskar causes in the pulse are studied and used to predict the pulse after performing the exercise. This analysis helps understand how Surya Namaskar benefits a person's health. Performing Surya Namaskar as part of the daily routine would improve the health of society as a whole, making the subjects energetic and active.
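
A minimal sketch of such a prediction framework, with entirely illustrative features, encodings, and values (the paper's actual model and data may differ):

```python
# Learn post-exercise pulse from pre-exercise pulse and Prakruti class.
import numpy as np
from sklearn.ensemble import RandomForestRegressor

# columns: [pulse_before (bpm), prakruti: 0=Vatta, 1=Pitta, 2=Kapha]
X = np.array([[72, 0], [80, 1], [68, 2], [75, 0], [82, 1], [70, 2]])
y = np.array([88, 98, 80, 90, 101, 84])     # pulse_after (illustrative values)

model = RandomForestRegressor(n_estimators=50, random_state=0).fit(X, y)
print(model.predict([[74, 1]]))             # predicted post-Surya-Namaskar pulse
```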

Jayshree Ghorpade-Aher, Abhishek Girish Patil, Eeshan Phatak, Sumant Gaopande, Yudhishthir Deshpande
Cognitive Depression Detection Methodology Using EEG Signal Analysis

This paper illustrates a new method for depression detection using the EEG recordings of a subject. It is meant to be used as a computerized aid by psychiatrists to provide objective and accurate diagnosis of a patient. First, data from the occipital and parietal regions of the brain is extracted, and the different channels are fused to form one wave. Then the DFT, computed via the FFT, is applied to the occipito-parietal wave to perform spectral analysis, and the fundamental (the component with the maximum amplitude in the spectrum) is selected. Classification of the subject is then made based on the frequency of the fundamental using a rule-based classifier. A detailed analysis of the output has been carried out. It has been noted that a lower frequency of the fundamental tends to indicate hypoactivation of the lobes; moreover, low-frequency characteristics have also been observed in depressed subjects. In this research, 37.5% of the subjects showed Major Depressive Disorder (MDD), and in all 80% of the subjects showed some form of depression.
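
The spectral step could be sketched as below on surrogate data; the sampling rate, channel count, and the 8 Hz rule are assumptions, not the paper's calibrated classifier.

```python
# Average occipito-parietal channels, take the FFT, pick the spectral peak
# ("fundamental"), and apply a simple frequency-band rule.
import numpy as np

fs = 256                                     # sampling rate (assumption)
t = np.arange(0, 4, 1 / fs)
channels = [np.sin(2 * np.pi * 6 * t) + 0.1 * np.random.randn(t.size)
            for _ in range(4)]               # surrogate O/P channels
fused = np.mean(channels, axis=0)            # channel fusion into one wave

spectrum = np.abs(np.fft.rfft(fused))
freqs = np.fft.rfftfreq(fused.size, 1 / fs)
fundamental = freqs[np.argmax(spectrum[1:]) + 1]   # skip the DC bin

# Rule: low-frequency dominance (below ~8 Hz) as a hypoactivation marker
label = "possible depression marker" if fundamental < 8 else "normal"
print(fundamental, label)
```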

Sharwin P. Bobde, Shamla T. Mantri, Dipti D. Patil, Vijay Wadhai
Biogas Monitoring System Using DS18B20 Temperature Sensor and MQTT Protocol

Nonrenewable energy resources such as coal, petroleum, and natural gas are being depleted, and thus there is a need for renewable energy resources in the long run. One of the most important renewable energy resources is biogas, the gas produced by the anaerobic digestion of organic matter by micro-organisms; it mainly contains methane (about 60%). A number of factors affect the production of biogas, one of them being temperature: the temperature of the biogas plant should be held constant, with variation below 1 °C, within the range 30–55 °C. Thus, a proper monitoring system for the biogas plant is needed. In this study, we develop a monitoring system for the biogas plant using the DS18B20 temperature sensor, the MQTT protocol, the Mosquitto broker, and a Raspberry Pi. A web-based system is proposed, where the temperature sensor values are uploaded periodically and an end user can monitor the temperature of the biogas plant remotely.
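
A plausible publisher-side sketch for the Raspberry Pi, using the 1-Wire sysfs interface for the DS18B20 and the paho-mqtt client (the broker host and topic names are assumptions):

```python
# Read the DS18B20 over 1-Wire sysfs and publish the temperature over MQTT.
import glob
import time
import paho.mqtt.client as mqtt             # pip install paho-mqtt

def read_ds18b20() -> float:
    device = glob.glob("/sys/bus/w1/devices/28-*/w1_slave")[0]
    with open(device) as f:
        lines = f.readlines()
    # second line ends with "t=<millidegrees C>"
    return int(lines[1].split("t=")[1]) / 1000.0

client = mqtt.Client()                       # paho-mqtt 1.x style; v2 needs
                                             # mqtt.Client(mqtt.CallbackAPIVersion.VERSION1)
client.connect("broker.example.local", 1883) # Mosquitto broker (assumed host)
while True:
    client.publish("biogas/plant1/temperature", f"{read_ds18b20():.2f}")
    time.sleep(60)                           # periodic upload
```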

Suruchi Dedgaonkar, Aakankssha Kaalay, Nitesh Biyani, Madhuri Mohite
Time-Efficient and Attack-Resistant Authentication Schemes in VANET

VANETs (Vehicular Ad hoc Networks) bring evolution to the transportation system but are vulnerable to different kinds of attacks, and authentication is the first line of defense against security problems. Previous researchers have proposed cryptographic, trust-based, ID-based, and signature-based authentication schemes; considering their performance, processing time and efficiency still need improvement. Faster authentication helps establish communication in a short time, and an RSU (Road Side Unit) can then serve more vehicles in a shorter time span. We present AECC (Adaptive Elliptic Curve Cryptography) and EECC (Enhanced Elliptic Curve Cryptography) schemes to improve the speed and security of authentication. In AECC, the key size is adaptive, i.e., keys of different sizes are generated during the key generation phase, with three size ranges specified: small, medium, and large. In EECC, we add an extra parameter during the transmission of information from the vehicle to the RSU for key generation; this additional parameter gives the RSU and other vehicles information about the vehicle's ID and location.
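
The adaptive key-size idea might be sketched with the Python cryptography package as below; the load-based selection rule is an illustrative assumption, not the paper's criterion.

```python
# Pick the elliptic-curve key size adaptively from three ranges.
from cryptography.hazmat.primitives.asymmetric import ec

CURVES = {"small": ec.SECP192R1(), "medium": ec.SECP256R1(), "large": ec.SECP384R1()}

def generate_adaptive_key(load: float):
    """Choose a key size by current RSU load: lighter load can afford larger keys."""
    size = "large" if load < 0.3 else "medium" if load < 0.7 else "small"
    return ec.generate_private_key(CURVES[size])

key = generate_adaptive_key(load=0.5)
print(key.curve.name, key.curve.key_size)    # e.g. secp256r1 256
```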

Sachin Godse, Parikshit Mahalle
Inferring User Emotions from Keyboard and Mouse

This chapter focuses on retrieving user emotions from the keyboard and mouse using different parameters, such as the user's keyboard typing style and mouse movements; some physiological sensors are also used. Retrieving emotions from machines falls under the field of affective computing.
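
Two classic keystroke-dynamics features, dwell time and flight time, can be extracted as in this sketch over a hypothetical event log:

```python
# events: (key, press_timestamp_ms, release_timestamp_ms) - hypothetical log
events = [("h", 0, 95), ("i", 180, 260), ("!", 410, 520)]

dwell = [up - down for _, down, up in events]                  # key hold time
flight = [events[i + 1][1] - events[i][2]                      # release-to-press gap
          for i in range(len(events) - 1)]
print("dwell(ms):", dwell, "flight(ms):", flight)
# Short dwell/flight times (fast typing) are commonly associated with arousal.
```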

Taranpreet Singh Saini, Mangesh Bedekar
An Overview of Automatic Speaker Verification System

Biometrics is used as a form of identification in many access control systems; examples include fingerprint, iris, face, speech, and retina. Speech biometrics is used for speaker verification. Speech is the most convenient way to communicate with people and machines, so it plays a vital role in signal processing. Automatic speaker verification is the authentication of individuals by analyzing their speech utterances; it is essentially a pattern-matching problem. Many technologies are used for processing and storing voice prints, among them frequency estimation, Hidden Markov Models, Gaussian Mixture Models, neural networks, Vector Quantization, and decision trees. Speaker verification depends mainly on speaker modeling, and this paper presents a brief overview of the speaker verification system, covering feature extraction and speaker modeling. The Bob spear toolkit, an open-source toolkit for speech processing, is used for the evaluation, experiments, and analysis. For evaluation purposes, three algorithms are compared (GMM, ISV, and JFA) with the same preprocessing and feature extraction techniques.
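
A much-simplified GMM verifier, a stand-in for the Bob spear pipeline with hypothetical file names, could look like:

```python
# MFCC features + per-speaker GMM scored against a universal background model.
import librosa                               # pip install librosa
from sklearn.mixture import GaussianMixture

def mfcc(path):
    y, sr = librosa.load(path, sr=16000)
    return librosa.feature.mfcc(y=y, sr=sr, n_mfcc=20).T   # frames x coeffs

enroll, ubm_data, test = mfcc("alice.wav"), mfcc("background.wav"), mfcc("probe.wav")

ubm = GaussianMixture(n_components=8).fit(ubm_data)   # background model
spk = GaussianMixture(n_components=8).fit(enroll)     # claimed-speaker model

llr = spk.score(test) - ubm.score(test)      # average log-likelihood ratio
print("accept" if llr > 0.0 else "reject")   # threshold tuned on dev data
```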

Ravika Naika
Topic Modeling on Online News Extraction

News media includes print media, broadcast news, and the Internet (online newspapers, news blogs, etc.). The proposed system intends to collect news data from such diverse sources, capture the varied perceptions, summarize, and present the news. It involves identifying topics from real-time news extractions and then clustering the news documents based on those topics. Previous approaches, like LDA, identify topics efficiently for long news texts but fail to do so for short news texts, where acute sparsity and irregularity are prevalent. In this paper, we present a solution for topic modeling, i.e., a word co-occurrence network-based model named WNTM, which works for both long and short news by overcoming these shortcomings, and does so with modest time and space complexity. Further, we intend to create a news recommendation system, which would recommend news to users according to their preferences.
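
The first WNTM step, building the word co-occurrence network with a sliding window, might be sketched as:

```python
# Sliding-window word co-occurrence network; topics are later inferred on
# this structure rather than on the sparse short documents themselves.
from collections import Counter
from itertools import combinations

docs = ["stocks fall as rates rise", "rates rise hit tech stocks"]
window = 3
edges = Counter()
for doc in docs:
    words = doc.split()
    for i in range(len(words)):
        for w1, w2 in combinations(words[i:i + window], 2):
            if w1 != w2:
                edges[tuple(sorted((w1, w2)))] += 1
print(edges.most_common(5))   # heavily weighted edges anchor topic communities
```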

Aashka Sahni, Sushila Palwe
Spam Mail Detection Using Classification Techniques and Global Training Set

Email is an Internet-based service used for various purposes, such as sharing data and sending notices and memos. Spam mails are emails sent in bulk to a large number of people simultaneously; while this can be useful for distributing the same data to many recipients for legitimate purposes, it is mostly used for advertising or scams. These spam mails are expensive for companies and consume a huge amount of resources. They are also inconvenient for users, as spam occupies a lot of inbox space and makes it difficult to find useful and important emails when needed. To counter this problem, many solutions have come into effect, but spammers stay well ahead of them. This paper discusses these solutions and identifies their strengths and shortcomings. It also covers a solution to spam emails that combines classification techniques with knowledge engineering to achieve better spam filtering. It discusses classification techniques like Naïve Bayes, SVM, k-NN, and Artificial Neural Networks and their respective dependencies on the training set. At the end of the paper, the global training set is introduced as a way to optimize these training sets, and an algorithm is proposed for the same.
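
A minimal Naïve Bayes filter over a toy corpus illustrates the classification side; a global training set, in the paper's sense, would aggregate such labeled data across many users and servers.

```python
# Bag-of-words Naive Bayes spam classifier on a toy training set.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.naive_bayes import MultinomialNB
from sklearn.pipeline import make_pipeline

mails = ["win a free prize now", "meeting agenda attached",
         "cheap pills free offer", "lunch tomorrow?"]
labels = [1, 0, 1, 0]                      # 1 = spam, 0 = ham

clf = make_pipeline(CountVectorizer(), MultinomialNB()).fit(mails, labels)
print(clf.predict(["free offer just for you"]))   # -> [1]
```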

Vishal Kumar Singh, Shweta Bhardwaj
Smart Gesture Control for Home Automation Using Internet of Things

The Internet of Things (IoT) is a system in which the machines involved in a system or an infrastructure can be monitored, and certain activities controlled, with the help of sensors and actuators. This research presents a model for controlling remote devices with hand gestures. The gestures are recognized using template matching algorithms, and remote things are controlled accordingly through the Internet of Things. The microcontrollers and remote units involved in the architecture are connected to the Internet either via LAN or a Wi-Fi module. The proposed system will help, for example, those who tend to forget to switch off the power when not in use. It is a contribution to the body of knowledge in the field of home automation using a microcontroller.
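
The recognition step could be sketched with OpenCV template matching as below; the template files, threshold, and action names are assumptions.

```python
# Match the current camera frame against stored gesture templates; the best
# match above a confidence threshold drives an IoT action.
import cv2

frame = cv2.imread("hand_frame.png", cv2.IMREAD_GRAYSCALE)   # hypothetical frame
templates = {"lights_on": "palm.png", "lights_off": "fist.png"}

best, best_score = None, -1.0
for action, path in templates.items():
    tpl = cv2.imread(path, cv2.IMREAD_GRAYSCALE)
    score = cv2.matchTemplate(frame, tpl, cv2.TM_CCOEFF_NORMED).max()
    if score > best_score:
        best, best_score = action, score

if best_score > 0.8:                       # confidence threshold (tuned)
    print("send command to remote unit:", best)
```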

Sunil Kumar Khatri, Govind Sharma, Prashant Johri, Sachit Mohan
A Proposed Maturity Model for Himachal Pradesh Government e-Services

With new leaps in the advancement of technology on a daily basis, it becomes quite essential that basic and necessary governmental services are provided through the Internet as e-services; these are small steps toward fulfilling the dream of a “digitalized India”. This paper aims to provide a maturity model for the existing e-services of the Himachal Pradesh (HP) government. Its central focus is on how to make the services easily accessible to the masses and improve their functionality through a detailed and sophisticated maturity model. The paper also lists the major information security concerns in the e-service portal of the Himachal Pradesh (HP) government, and an attempt has been made to provide a solution for the same. The proposed solution will help the government improve the level of its e-services and the security of the portal.

Alpana Kakkar, Seema Rawat, Piyush Gupta, Sunil Kumar Khatri
Malaria Detection Using Improved Fuzzy Algorithm

Malaria is one of the most life-threatening diseases and needs serious attention in today’s scenario. The disease is estimated to account for 3–6 billion infected cases worldwide annually, with a mortality of 1–3 million people. Malaria should be diagnosed on time and treated precisely, as it can lead to death. The main objective of this paper is to design and describe an algorithm that can diagnose malaria at an early stage, so that a patient does not progress to the serious and hazardous stages or complications, and so that the mortality rate of malaria is reduced. Fuzzy logic is an approach to implementing expert systems that are portable and can diagnose malaria accurately compared to other systems.
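
The fuzzy-scoring idea can be sketched with triangular membership functions; the shapes, inputs, and max–min combination below are illustrative, not the paper's calibrated rule base.

```python
# Triangular memberships over symptoms combined into a malaria-risk degree.
def tri(x, a, b, c):
    """Triangular membership function rising from a, peaking at b, falling to c."""
    return max(0.0, min((x - a) / (b - a), (c - x) / (c - b)))

fever = tri(39.2, 37.0, 40.0, 42.0)        # degree of "high fever" for 39.2 C
chills = 0.8                               # expert-graded symptom in [0, 1]
parasite_density = tri(2500, 0, 5000, 10000)

risk = max(min(fever, chills), parasite_density)   # Mamdani-style max-min rule
print("malaria risk degree:", round(risk, 2))
```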

Mukul Sharma, Rajat Mittal, Tanupriya Choudhury, Suresh Chand Satapathy, Praveen Kumar
Feature Extraction Techniques Based on Human Auditory System

Feature extraction plays a central role in the performance of an ASR system. A good technique not only removes irrelevant characteristics but also represents the important attributes of a speech signal. This paper concentrates on the comparison of feature extraction techniques for speech signals based on the human auditory system, for better understanding and to enhance their further applications. In this review, we explain three different techniques for feature extraction, with the main emphasis on how they are useful in processing signals and extracting features from unprocessed signals. The human auditory system, which ties this study together, is also explained. The techniques described are Zero Crossing with Peak Amplitude, Perceptual Linear Prediction, and Mel Frequency Cepstral Coefficients. As each method has its own merits and demerits, we discuss some of the most important features of these techniques.
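
The simplest of the three, zero crossings with peak amplitudes (the core of ZCPA), can be computed as in this sketch:

```python
# Zero-crossing intervals and per-interval peak amplitudes of a speech frame.
import numpy as np

fs = 8000
t = np.arange(0, 0.02, 1 / fs)
x = np.sin(2 * np.pi * 440 * t)            # surrogate speech frame

sign = np.signbit(x).astype(int)
crossings = np.where(np.diff(sign) != 0)[0]          # zero-crossing indices
intervals = np.diff(crossings)                       # samples between crossings
peaks = [np.abs(x[a:b]).max() for a, b in zip(crossings[:-1], crossings[1:])]
freq_estimates = fs / (2 * intervals)                # per-interval frequency
print(freq_estimates[:3], peaks[:3])
```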

Sanya Jain, Divya Gupta
Movie Recommendation System: Hybrid Information Filtering System

The movie recommendation system is a hybrid filtering system that performs both collaborative and content-based filtering of data to provide movie recommendations to users. The system takes a combined approach: it seeks the similarity of a user to others clustered around the various genres, and it uses the user's preference for movies, based on their content in terms of genres, as the deciding factor in recommending movies. The system is based on the belief that a user rates movies in a fashion similar to other users in the same state, and is also affected by the other rating activity the user performs on other movies. It follows the hypothesis that a user can be accurately recommended media on the basis of others' interests (collaborative filtering) and of the movies themselves (content-based filtering).
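
A compact sketch of the hybrid score, blending cosine user-user similarity with a genre-profile similarity (the weights and toy data are assumptions):

```python
# Hybrid recommendation: collaborative neighbour votes + content genre match.
import numpy as np

ratings = np.array([[5, 3, 0], [4, 0, 4], [1, 5, 5]], float)  # users x movies
genres = np.array([[1, 0], [1, 1], [0, 1]], float)            # movies x genres

def cosine(u, v):
    return u @ v / (np.linalg.norm(u) * np.linalg.norm(v) + 1e-9)

user = 0
sims = [cosine(ratings[user], ratings[v]) for v in range(len(ratings))]
collab = np.average(ratings, axis=0, weights=sims)            # neighbour votes
profile = ratings[user] @ genres                              # user's genre taste
content = genres @ profile / (np.linalg.norm(profile) + 1e-9)
score = 0.6 * collab + 0.4 * content                          # hybrid blend
print(np.argsort(score)[::-1])                                # ranked movie ids
```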

Kartik Narendra Jain, Vikrant Kumar, Praveen Kumar, Tanupriya Choudhury
Eco-Friendly Green Computing Approaches for Next-Generation Power Consumption

The term Green Computing essentially means using electronic devices in an efficient way. Green computing is a recent trend and field of study concerned with designing, building, and operating electronic devices to maximize energy efficiency. It covers the manufacturing and disposal of electronic devices, including computer resources such as the monitor, storage devices, CPU, and associated subsystems, without any ill effect on the environment. One of its important aspects is the recycling of devices used worldwide in the IT industry. This research paper highlights the advantages of green computing and its associated approaches through a survey of power consumption conducted at Amity University.

Seema Rawat, Richa Mishra, Praveen Kumar
Implementing Test Automation Framework Using Model-Based Testing Approach

The software testing lifecycle comprises various stages, starting with the requirement review process, proceeding through test planning, test designing, test case development, and test case execution, and ending with the test reporting phase. This paper emphasizes the application of model-based testing with test automation frameworks to automate the test designing and test development phases of the software testing lifecycle. For the test development phase, it describes using model-based testing to develop the test automation scripts for test execution, and it lays out the steps involved in implementing a test automation framework with the model-based testing approach. Both test designing and test development can be automated to decrease the human effort and cost involved in a project; the development of test scripts, both automated and manual, can be automated using model-based testing. In this approach, a model is first created to capture the behaviour of the system under test. The model-based testing tool then parses this model to create the manual testing scripts, additionally generates the automated test scripts for automated test execution, and can be integrated with popular tools and test automation frameworks. Thus, using a model to automate the creation of both automated and manual test scripts not only saves cost and effort but also increases coverage and significantly reduces the product's time to market.
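
The core step, deriving tests from a behavioural model, can be sketched with a toy finite-state machine and transition coverage (the model and naming are hypothetical):

```python
# Model the system under test as an FSM and derive one test per transition.
transitions = {                            # hypothetical login-flow model
    ("logged_out", "login_ok"): "logged_in",
    ("logged_out", "login_fail"): "logged_out",
    ("logged_in", "logout"): "logged_out",
}

def all_transition_tests():
    """One test script per modeled transition (transition coverage)."""
    tests = []
    for (state, event), target in transitions.items():
        tests.append([f"reach state '{state}'", f"fire '{event}'",
                      f"assert state == '{target}'"])
    return tests

for i, steps in enumerate(all_transition_tests(), 1):
    print(f"test_{i}:", " -> ".join(steps))
```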

Japneet Singh, Sanjib Kumar Sahu, Amit Prakash Singh
Improved Exemplar-Based Image Inpainting Approach

Image inpainting is a widely used technique for restoring an image that has been damaged, or for performing a morphological operation on an image; in essence, it fills in the pixels of a particular region. The task is similar to that of a skilled painter who has to draw an object into, or remove one from, a painting on canvas. Applications of inpainting include object removal, object replacement, etc.; the aim is to change the image's content by removing or adding objects. It has also been used to recover pixels, or blocks of pixels, lost during the transmission of an image through a noisy channel, and it is a useful method for red-eye removal and for removing the default stamped dates from photographs taken with primitive cameras. In this paper, we build an exemplar-based image inpainting tool using basic functions of MATLAB. We discuss the results by comparing the output generated by the image inpainting tool with Adobe Photoshop results, in order to check how efficient the tool is.
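
For contrast with the exemplar-based tool the paper builds, OpenCV ships a diffusion-based inpainting routine; the sketch below shows only the generic mask-then-fill workflow (a different algorithm family from Criminisi-style exemplar inpainting, with hypothetical file names and mask region):

```python
# Mask the unwanted region, then fill it with OpenCV's Telea inpainting.
import cv2
import numpy as np

img = cv2.imread("photo.png")                      # image with unwanted object
mask = np.zeros(img.shape[:2], np.uint8)
mask[100:150, 200:260] = 255                       # region to remove (assumed)

restored = cv2.inpaint(img, mask, inpaintRadius=3, flags=cv2.INPAINT_TELEA)
cv2.imwrite("restored.png", restored)
```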

Hitesh Kumar, Shilpi Sharma, Tanupriya Choudhury
The Hidden Truth Anonymity in Cyberspace: Deep Web

The main objective of this paper is to study and illustrate the workings of the invisible Internet and to practically determine the Onion Routing relays, bridges, and exit nodes. Detailed research has been done on the working of exit nodes and on privacy over the deep web through vulnerable nodes. The paper also presents a practical study of the depth of data available on the deep web. Emphasis is laid on safe access to the deep web without compromising one's privacy. A survey was conducted to gauge awareness among the technically sound public, and the results are shown as pie charts.

Saksham Gulati, Shilpi Sharma, Garima Agarwal
Test Case Optimization and Prioritization of Web Service Using Bacteriologic Algorithm

In regression testing, the changes made to existing software are tested to check whether the software still works properly after the changes; retesting is performed to detect newly introduced faults, and this testing is repeated after every change to the pre-existing software. Various methods are used for test case reduction and optimization for a web service. Regression testing creates a large number of test suites, which consumes a lot of testing time and causes many other problems; some technique is therefore needed to reduce the number of test cases and to prioritize them, keeping time and budget constraints in mind. Test case reduction and prioritization need to be achieved with respect to various parameters, such as branch coverage and fault coverage. This paper therefore discusses the analysis of the code of a web service; the technique used to analyze the web service, based on branch (code) coverage and fault detection, for test case reduction and prioritization is the bacteriologic algorithm (BA). The generated test cases and other requirements are mapped onto the branch coverage and fault coverage of the web service's code.
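
The prioritization objective can be pictured with a greedy additional-coverage ordering, which the bacteriologic algorithm approaches through its evolutionary loop; the suite below is a toy example.

```python
# Greedily order test cases by how much *additional* branch coverage each adds.
def prioritize(tests: dict) -> list:
    """tests: name -> set of covered branch ids; returns prioritized order."""
    covered, order = set(), []
    remaining = dict(tests)
    while remaining:
        best = max(remaining, key=lambda t: len(remaining[t] - covered))
        if not remaining[best] - covered:
            order.extend(remaining)        # the rest add nothing new
            break
        covered |= remaining.pop(best)
        order.append(best)
    return order

suite = {"t1": {1, 2, 3}, "t2": {2, 3}, "t3": {4, 5}, "t4": {1, 5}}
print(prioritize(suite))                   # -> ['t1', 't3', 't2', 't4']
```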

Gaurav Raj, Dheerendra Singh, Ishita Tyagi
Backmatter
Metadaten
Titel: Intelligent Computing and Information and Communication
herausgegeben von: Subhash Bhalla, Prof. Dr. Vikrant Bhateja, Dr. Anjali A. Chandavale, Dr. Anil S. Hiwale, Prof. Dr. Suresh Chandra Satapathy
Copyright-Jahr: 2018
Verlag: Springer Singapore
Electronic ISBN: 978-981-10-7245-1
Print ISBN: 978-981-10-7244-4
DOI: https://doi.org/10.1007/978-981-10-7245-1
