2018 | Book

Data Engineering and Intelligent Computing

Proceedings of IC3T 2016

Editors: Suresh Chandra Satapathy, Vikrant Bhateja, K. Srujan Raju, B. Janakiramaiah

Publisher: Springer Singapore

Book Series: Advances in Intelligent Systems and Computing

About this book

The book is a compilation of high-quality scientific papers presented at the 3rd International Conference on Computer & Communication Technologies (IC3T 2016). The individual papers address cutting-edge technologies and applications of soft computing, artificial intelligence and communication. In addition, a variety of further topics are discussed, which include data mining, machine intelligence, fuzzy computing, sensor networks, signal and image processing, human-computer interaction, web intelligence, etc. As such, it offers readers a valuable and unique resource.

Table of Contents

Frontmatter
Analysis of Genetic Algorithm for Effective Power Delivery and with Best Upsurge

A wireless sensor network (WSN) may consist of hundreds or thousands of nodes, where each node is connected to one or sometimes more sensors. WSNs integrate sensor circuits, embedded systems, networking and wireless communication for the dissemination of information, and recent developments push toward miniaturization and low power consumption. Gateway nodes usually forward the sensed data to a server, while separate routing components compute and distribute routing tables across the network. This paper discusses energy-balanced routing in wireless sensor networks and develops a genetic algorithm as the optimization solution, with a selection procedure proposed for constructing the cluster heads. In this study, the proposed algorithm is simulated and the results are evaluated on parameters such as the number of dead nodes, the number of bits transmitted to the base station, the number of packets sent to the cluster heads, and the energy consumption per round; the comparison shows that the proposed algorithm extends the lifetime of the network relative to existing schemes.

Azeem Mohammed Abdul, Srikanth Cherukuvada, Annaram Soujanya, R. Srikanth, Syed Umar
Edge Detection of Degraded Stone Inscription Kannada Characters Using Fuzzy Logic Algorithm

Digital India is an initiative by the Government of India that encourages digitization and analysis in all walks of life. Digitization preserves historical documents, and the preserved information can then be accessed by any individual, at a fingertip, from any place. Stone inscriptions are one of the key historical evidences of the literature and culture of a region through the passage of time. Recognition and analysis of stone inscriptions play a pivotal role in deciding the era/age they belong to and in understanding their content, so a proper digitization and recognition technique is a prerequisite. In this work, digitization of characters has been done using an ordinary digital camera, and the captured images are pre-processed in order to extract features. The proposed algorithm carries out gradient analysis at every pixel in the x and y directions and, based on the result, defines an edge using a Fuzzy Inference System. The experiment was conducted on twenty sets of analogous degraded stone-inscription Kannada characters, and the results obtained were excellent, with better time efficiency compared to prior methods.

B. K. RajithKumar, H. S. Mohana, J. Uday, B. V. Uma
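
The gradient-plus-fuzzy-inference step described in the abstract above can be illustrated with a minimal sketch. This is not the authors' inference system: the triangular membership function, its width, and the 0.5 cut-off are illustrative assumptions.

    import numpy as np

    def fuzzy_edges(img, width=0.2):
        """Toy fuzzy edge map for a grayscale image scaled to [0, 1]."""
        ix, iy = np.gradient(img)  # per-pixel gradients in x and y
        # Triangular membership for "gradient is near zero" (assumed shape).
        mu_zero = lambda g: np.clip(1.0 - np.abs(g) / width, 0.0, 1.0)
        # A pixel is background when BOTH gradients are near zero
        # (fuzzy AND = min); its edge degree is the complement of that.
        return 1.0 - np.minimum(mu_zero(ix), mu_zero(iy))

    # edges = fuzzy_edges(gray.astype(float) / 255.0) > 0.5
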
Design of Alpha/Numeric Microstrip Patch Antenna for Wi-Fi Applications

Microstrip Patch Antennas (MPAs) find use in several day-to-day applications. They hold several advantages, such as a low-profile structure and low fabrication cost, and they support both circular and linear polarization. This paper proposes an MPA for Wi-Fi applications in the 2.4 GHz range of frequencies (IEEE 802.11). Although various antenna designs are currently prevalent to support existing systems, Alpha-Numeric (alphabet- and number-shaped) Microstrip Patch Antennas are proposed in this paper in order to overcome bandwidth limitations. The MPAs are designed with 2.4 GHz as their resonant frequency. Simulation results for the essential antenna performance parameters, such as return loss, gain, radiated power and directivity, are discussed. The antennas are designed with a thickness of 1.5 mm, a height of 70 mm and a width of 60 mm; the substrate material is Flame Retardant 4 (FR4) with a relative permittivity of 4.4. The designs for all the Microstrip Patch Antennas are simulated using Advanced Design System 2009 (ADS) software.

R. Thandaiah Prabu, R. Ranjeetha, V. Thulasi Bai, T. Jayanandan
Performance Evaluation of Leaf Disease Measure

Leaf disease is a state in which an abnormality is observed in the growth of a plant. Most diseases can easily be found by observing the condition of the leaves at regular intervals of time; if a disease is found at an early stage, the plant can be saved from further growth of the disease by taking the necessary actions. Here, an attempt is made to find a leaf disease measure. Local Binary Patterns, median filtering, morphological operations and edge detection are used for analyzing the disease in leaf images, and a rank table is used for comparing disease levels. Finally, the execution time for measuring the leaf disease is calculated for various edge detection techniques. The work can be further extended using other techniques such as hierarchical clustering.

D. Haritha
Ear Recognition Using Self-adaptive Wavelet with Neural Network Classifier

We present a novel approach to ear recognition that utilizes the ring-projection method to reduce a two-dimensional image into one-dimensional information, consisting of the summation of all pixels that lie on the boundary of a circle with radius r centered at the centroid of the image. As the 2-D image is transformed into a 1-D signal, less memory is required and recognition is faster than with existing 2-D descriptors; the ring-projection technique is also rotation-invariant. The 1-D information obtained by ring projection is normalized so as to form a new wavelet, named the self-adaptive wavelet, and features are extracted with this wavelet through decomposition. Neural-network classifiers, namely the Back Propagation Neural Network (BPNN) and the Probabilistic Neural Network (PNN), are used to obtain the recognition rate. A survey of various other techniques is also included in this paper.

Jyoti Bhardwaj, Rohit Sharma
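
The ring-projection transform described above reduces to a few lines of NumPy. In the following sketch, rounding distances to integer ring indices is our own assumption; the sums per ring around the intensity centroid follow the abstract's definition.

    import numpy as np

    def ring_projection(img):
        """Collapse a 2-D image into a rotation-invariant 1-D signal:
        entry r holds the sum of pixels on the ring of radius r
        around the intensity centroid."""
        ys, xs = np.indices(img.shape)
        total = img.sum()
        cy, cx = (ys * img).sum() / total, (xs * img).sum() / total
        r = np.rint(np.hypot(ys - cy, xs - cx)).astype(int)  # ring index
        return np.bincount(r.ravel(), weights=img.ravel())

    # sig = ring_projection(gray); sig /= sig.max()  # normalize before wavelet design
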
An Efficient Algorithm for Mining Sequential Patterns

The temporal component of spatio-temporal databases is the key factor that leads to a large accumulation of data; it can be said that the continuous collection of spatial data leads to spatio-temporal databases. An event-type sequence is called a sequential pattern, and extracting spatio-temporal sequential patterns from spatio-temporal event datasets paves the way to defining causal relationships between event types. In this paper, a data structure is proposed to support efficient mining of sequential patterns.

Gurram Sunitha, M. Sunil Kumar, B. Sreedhar, K. Jeevan Pradeep
Comparison of SVM Kernel Effect on Online Handwriting Recognition: A Case Study with Kannada Script

The proposed research work aims at investigating the issues specific to online Kannada handwriting recognition and at designing an efficient writer-independent online handwriting recognizer. The proposed system accepts continuous Kannada online handwriting from a pen tablet and produces recognized Kannada text as output. The system comprises pre-processing, segmentation, feature extraction and character recognition units. An SVM classifier is implemented to test its efficiency on Kannada handwritten characters, and the recognition rates are analyzed for different SVM kernels.

S. Ramya, Kumara Shama
Feature Extraction of Cervical Pap Smear Images Using Fuzzy Edge Detection Method

In the medical field, segmentation of medical images is significant for disease diagnosis. Image segmentation divides an image into regions precisely, which helps to identify the abnormalities in cancer cells for accurate diagnosis. Edge detection is the basic tool for image segmentation: it identifies the discontinuities in an image and locates the changes in image intensity. In this paper, an improved edge detection method with a fuzzy approach is proposed to segment cervical Pap smear images into nucleus and cytoplasm. Four important features of cervical Pap smear images are extracted using the proposed edge detection method, and the accuracy of the extracted features is analyzed and compared with other popular image segmentation techniques.

K. Hemalatha, K. Usha Rani
Minimizing Frequent Itemsets Using Hybrid ABCBAT Algorithm

The expansion of the information technology field has led to an increase in the amount of data collected, and huge amounts of data are stored in databases, data warehouses and repositories. Data mining is the process of analyzing these databases, extracting the required information and finding relationships among the items of datasets using association rule mining. Apriori is a familiar algorithm for association rule mining which generates frequent itemsets. In this paper, we propose a new algorithm called hybrid ABCBAT which minimizes the generation of frequent itemsets and also reduces time, space and memory requirements. In the proposed algorithm, ABC is hybridized with the random walk of the BAT algorithm; the random walk is used in place of the onlooker bee phase in order to increase exploration. The hybrid ABCBAT algorithm is applied over the frequent itemsets gathered from the Apriori algorithm to minimize the frequent itemsets. Different datasets from the UCI repository are considered for the experiments. The proposed algorithm shows better optimization accuracy, convergence rate and robustness.

Sarabu Neelima, Nallamothu Satyanarayana, Pannala Krishna Murthy
Hybrid Models for Offline Handwritten Character Recognition System Without Using any Prior Database Images

In this paper a new method of classification is proposed by building hybrid models from three different techniques. The first is the correlation method, which uses a statistical template matching technique. The second is principal component analysis, in which each image (character) is described by principal components, namely eigenvalues and eigenvectors. The third is the Hough line detection technique, with whose help the number of line segments in a character can be found. The experiments show that a mixture of two or more different methods gives better results. In this paper, we have implemented the above techniques without using any prior database of character images and obtain 94.8% accuracy.

Kamal Hotwani, Sanjeev Agarwal, Roshan Paswan
A Novel Cluster Algorithms of Analysis and Predict for Brain Derived Neurotrophic Factor (BDNF) Using Diabetes Patients

Brain-Derived Neurotrophic Factor (BDNF) is involved in diabetes, a disease associated with metabolic syndrome; the parameters of Type-2 Diabetes Mellitus (T2DM) in particular are related to BDNF. Today many people suffer from diabetes mellitus, a metabolic disorder. The current research performs cluster analysis of T2DM BDNF data with a view to predicting diabetes and identifying patients. In this paper, a clustering method is evaluated that classifies the T2DM BDNF dataset into several clusters; clustering is one of the primary methods of data mining. The method examines measurements by computing minimum, maximum and average values to predict patients. The algorithms are applied to the dataset, the data are normalized, and similarity measures are evaluated to identify accurate results. Identification of the BDNF gene (Korley et al., J Neurotrauma, 33(2):215–225, 2015, [1]) helps in understanding neurological effects, behavioral change and depression.

Dharma Dharmaiah Devarapalli, Panigrahi Srikanth
Image Frame Mining Using Indexing Technique

Data mining is a technique that brings out hidden information effectively from an available dataset. Most such extraction works well when performed on binary and character information; mining information from images is a challenge today for many researchers. Creating images and videos is easy, as it does not require any domain knowledge, but extracting the required knowledge from them is difficult. For this reason, video data mining is an interesting area for many researchers today, and much effort is directed at finding effective retrieval and indexing techniques. This paper presents a new technique for video content retrieval using hierarchical clustering. The objective of this work is to extract image key frames from the trained image set and use them as the image input query. The experiments show that the proposed technique provides better results than existing video retrieval and indexing techniques.

D. Saravanan
A Novel Adaptive Threshold and ISNT Rule Based Automatic Glaucoma Detection from Color Fundus Images

Glaucoma is an eye disease recognized to be the second leading cause of blindness worldwide. Early detection and subsequent treatment of glaucoma are important, as the damage done by glaucoma is irreversible. Large-scale manual screening for glaucoma is a challenging task, as skilled manpower in ophthalmology is scarce; hence much work has gone into automated glaucoma detection from color fundus images (CFI). In this paper, we propose a novel method of automated glaucoma detection from CFI using color-channel adaptive thresholding and the ISNT rule. Structural features such as the cup-to-disk ratio (CDR) and the neuro-retinal rim (NRR) area of the optic nerve head (ONH) are extracted from CFI using color-channel adaptive thresholding and morphological processing, in order to segment the Optic Disk (OD) and Optic Cup (OC) required for calculating the CDR value. The results obtained by the proposed methodology are very promising, yielding an overall efficiency of 99%.

Sharanagouda Nawaldgi, Y. S. Lalitha, Manjunath Reddy
Comparison of Classification Techniques for Feature Oriented Sentiment Analysis of Product Review Data

With the rapid increase in popularity of e-commerce services over the years, all varieties of products are sold online today. Posting online reviews has become a common means for people to express their impressions of a product, while serving as a recommendation for others. To enhance customer satisfaction and the buying experience, sellers often provide a platform for customers to express their views. Due to the explosion of these opinion-rich sites, where numerous opinions about a product are expressed, a potential customer finds it difficult to read all the reviews and form an informed opinion about the product. In this research, a new framework built from the inbuilt packages of Python is designed which mines many customers' opinions about a product and groups them according to their sentiments, helping potential buyers form a consolidated view of the product. Classification of the reviews is done using three different classification algorithms, namely the Naïve Bayes algorithm, the Maximum Entropy classifier and the SVM (Support Vector Machine), and their performance is compared. The methodology showcased in this work can be extended easily to all domains.

Chetana Pujari, Aiswarya, Nisha P. Shetty
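
A three-classifier comparison of this kind can be prototyped quickly in Python with scikit-learn. The sketch below is a stand-in for the paper's framework, with logistic regression serving as the usual implementation of a maximum entropy classifier and a six-review toy corpus as placeholder data.

    from sklearn.feature_extraction.text import TfidfVectorizer
    from sklearn.linear_model import LogisticRegression  # maximum entropy
    from sklearn.model_selection import cross_val_score
    from sklearn.naive_bayes import MultinomialNB
    from sklearn.pipeline import make_pipeline
    from sklearn.svm import LinearSVC

    reviews = ["great battery life", "loved the screen", "works perfectly",
               "arrived broken", "poor quality", "stopped working in a week"]
    labels = [1, 1, 1, 0, 0, 0]  # 1 = positive, 0 = negative (toy data)

    for name, clf in [("Naive Bayes", MultinomialNB()),
                      ("MaxEnt", LogisticRegression(max_iter=1000)),
                      ("SVM", LinearSVC())]:
        pipe = make_pipeline(TfidfVectorizer(), clf)
        acc = cross_val_score(pipe, reviews, labels, cv=3).mean()
        print(f"{name}: mean accuracy {acc:.2f}")
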
Review of Texture Descriptors for Texture Classification

Texture classification is a process of distinguishing or classifying different textures into separate classes. Finding an efficient texture descriptor is a vital step for performing accurate texture classification. The research area of texture classification is widely investigated in several computer vision and pattern recognition problems. In this paper, texture descriptors applied for the texture classification in the literature are summarized. A general framework for texture classification and significance of texture descriptors are also presented in this paper.

Philomina Simon, V. Uma
Improvisation in HEVC Performance by Weighted Entropy Encoding Technique

Nowadays multimedia applications are growing rapidly, and at the same time the volume of video transactions is rising exponentially. This demands an efficient technique to encode video and reduce congestion in the transmission channel. This paper presents an improvement technique, weighted encoding, for High Efficiency Video Coding (HEVC). The method optimizes the spatial and temporal redundancy during motion compensation through the optimal choice of code block; the blocks are chosen on the basis of weights assigned to them using the firefly algorithm. The encoding reduces the size of the video while delivering perceptually better video quality, or Peak Signal-to-Noise Ratio (PSNR).

B. S. Sunil Kumar, A. S. Manjunath, S. Christopher
Improvement in PMEPR Reduction for OFDM Radar Signal Using PTS Algorithm

In this paper we suggest a method to address the complication of variable amplitude in multicarrier signals. The Multifrequency Complementary Phase-Coded (MCPC) signal has fluctuations in amplitude because it is the sum of carriers with different frequencies. To avoid nonlinear operation of the power amplifiers at the transmitter, it is desirable to reduce the Peak-to-Mean Envelope Power Ratio (PMEPR) of the signal. We have tried two algorithms to reduce PMEPR and sidelobes: the clipping technique, which is a signal distortion technique, and a signal scrambling technique called the Partial Transmit Sequence (PTS) algorithm.

C. G. Raghavendra, I. M. Vilas, N. N. S. S. R. K. Prasad
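
For reference, a minimal PTS loop looks as follows. The interleaved partition into V = 4 sub-blocks and the binary phase set {+1, -1} are illustrative assumptions, not the paper's exact configuration, and the plain peak-to-average measure stands in for PMEPR.

    import itertools
    import numpy as np

    def pts_min_papr(X, V=4, phases=(1, -1)):
        """Partial Transmit Sequences: split the frequency-domain symbol X
        into V disjoint sub-blocks, weight each block by a phase factor,
        and keep the combination with the lowest peak-to-average ratio."""
        N = len(X)
        blocks = np.zeros((V, N), complex)
        for v in range(V):                      # interleaved partitioning
            blocks[v, v::V] = X[v::V]
        t = np.fft.ifft(blocks, axis=1)         # IFFT is linear: do it once
        best = (np.inf, None)
        for b in itertools.product(phases, repeat=V):
            x = np.dot(b, t)
            papr = np.max(np.abs(x) ** 2) / np.mean(np.abs(x) ** 2)
            best = min(best, (papr, b), key=lambda p: p[0])
        return best                             # (ratio, winning phases)

    # X = np.exp(1j * np.pi / 2 * np.random.randint(0, 4, 64))  # QPSK-like
    # print(pts_min_papr(X))
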
Design and Implementation of High Speed VLSI Architecture of Online Clustering Algorithm for Image Analysis

A novel architecture is proposed for computing online clustering using the moving-average method, handling data of up to eight dimensions. The proposed architecture can perform the clustering operation in a single clock cycle for any given dimension. A new method for division is proposed that uses a parallel multiplier architecture and powers of two, computing the division operation in a single clock cycle. The architecture is tested using the Xilinx ISim tool, and the design is implemented on a Spartan-3A FPGA.

M. G. Anuradha, L. Basavaraj
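
The single-cycle division idea — multiply by a precomputed, power-of-two-scaled reciprocal, then shift — can be checked in software. Here is a small sketch under our own parameter choices (16-bit dividends, a doubled-width scale factor to keep the result exact):

    def make_divider(d, nbits=16):
        """Return f(x) == x // d for 0 <= x < 2**nbits using one multiply
        and one shift: the multiply-by-reciprocal trick that lets hardware
        replace a divider with a parallel multiplier and a power of two."""
        k = 2 * nbits                   # doubled precision keeps it exact
        m = ((1 << k) + d - 1) // d     # m = ceil(2**k / d), precomputed
        return lambda x: (x * m) >> k

    div10 = make_divider(10)
    assert all(div10(x) == x // 10 for x in range(1 << 16))
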
Entity Level Contextual Sentiment Detection of Topic Sensitive Influential Twitterers Using SentiCircles

Sentiment analysis, when combined with the vast amounts of data present in the social networking domain like Twitter data, becomes a powerful tool for opinion mining. In this paper we focus on identifying ‘the most influential sentiment’ for topics extracted from tweets using Latent Dirichlet Allocation (LDA) method. The most influential twitterers for various topics are identified using the TwitterRank algorithm. Then a SentiCircle based approach is used for capturing the dynamic context based entity level sentiment.

Reshma Sheik, Sebin S. Philip, Arjun Sajeev, Sinith Sreenivasan, Georgin Jose
Prediction of Human Ethnicity from Facial Images Using Neural Networks

This work attempts to solve the problem of predicting the ethnicity of humans based on their facial features. Three major ethnicities were considered for this work: Mongolian, Caucasian and Negroid. A total of 447 image samples were collected from the FERET database, and several geometric features and color attributes were extracted from the images and used for the classification problem. The accuracy of the model obtained using an MLP approach was 82.4%, whereas the accuracy obtained using a convolutional neural network was a significant 98.6%.

Sarfaraz Masood, Shubham Gupta, Abdul Wajid, Suhani Gupta, Musheer Ahmed
Glyph Segmentation for Offline Handwritten Telugu Characters

Segmentation plays a crucial role in the recognition of offline handwritten characters from digitized document images. In this paper, the authors propose a glyph segmentation method for offline handwritten Telugu characters. The method efficiently segments the top vowel ligature glyph, main glyph, bottom vowel ligature glyph and consonant conjunct glyph from offline handwritten Telugu character images. It identifies the small glyphs that are related to the unconnected main glyphs or consonant conjuncts, and also segments the connected top vowel ligature from the main glyph. This approach to segmentation efficiently reduces the training data size for an offline handwritten Telugu character recognition system, and the results show the efficiency of the proposed method.

C. Naga Manisha, Y. K. Sundara Krishna, E. Sreenivasa Reddy
A Roadmap for Agility Estimation and Method Selection for Secure Agile Development Using AHP and ANN

The modern software industry is expected to provide fast software delivery, and because of the dynamic environment, customer requirements change very rapidly, which has led to an inclination towards agile development approaches over traditional ones. Agile development has advantages like fast releases and simplified documentation, which eventually lead to maximized profit and productivity. However, it is a mammoth task to make a calculated decision about whether or not to use an agile approach for a given project, because of the lack of any empirical decision-making process. This paper provides a roadmap for making this decision using the Analytic Hierarchy Process (AHP) and an Artificial Neural Network (ANN) with an Agility Indicator; if agile is selected, it further suggests which agile development method is better suited from among popular methods like Feature-Driven Development (FDD), Lean development, Scrum, Crystal Clear, Extreme Programming (XP) and the Dynamic Software Development Method (DSDM). It also addresses the major concern of security requirements, enhancing security by integrating activities from security engineering processes without degrading the agility of the agile process.

Amit Sharma, R. K. Bawa
Hadoop Framework for Entity Recognition Within High Velocity Streams Using Deep Learning

Social media such as Twitter and Facebook are the sources of stream data. They generate unstructured informal text on various topics, containing emotions expressed about persons, organizations, locations, movies, etc. Characteristics of such stream data are its velocity and volume, and text that is incomplete, often incorrect, cryptic and noisy. In our earlier work, a Hadoop framework was proposed for recognising and resolving entities within semi-structured data such as e-catalogs. This paper extends the framework to recognising and resolving entities in unstructured data such as tweets. Such a system can be used in data integration, de-duplication, event detection and sentiment analysis. The proposed framework recognizes pre-defined entities from streams using Natural Language Processing (NLP) to extract local context features, and uses MapReduce for entity resolution. Test results show that the proposed entity recognition system can identify pre-defined entities such as locations, organizations and persons with an accuracy of 72%.

S. Vasavi, S. Prabhakar Benny
A Novel Random Forest Approach Using Specific Under Sampling Strategy

In data mining, knowledge is discovered from existing real-world datasets, whose categories vary dynamically in real-time scenarios. One of the emerging categories is class-imbalance data, in which the percentage of instances in one class far exceeds that of the other class. Traditional data mining algorithms are well suited to knowledge discovery from balanced datasets, but efficient knowledge discovery is hampered in the case of class-imbalance datasets. In this paper, we propose a novel approach dubbed Under-Sampling using Random Forest (USRF) for efficient knowledge discovery from imbalanced datasets. The proposed USRF approach is verified on 11 benchmark datasets from the UCI repository. The experimental observations show that improved accuracy and AUC are achieved with the proposed USRF approach, with a good reduction in RMS error.

L. Surya Prasanthi, R. Kiran Kumar, Kudipudi Srinivas
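
The under-sampling-plus-random-forest combination can be sketched with scikit-learn. The specific strategy below (shrinking the majority class to the minority size, on synthetic data) is our illustrative assumption, not necessarily USRF's exact sampling scheme.

    import numpy as np
    from sklearn.datasets import make_classification
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.metrics import roc_auc_score
    from sklearn.model_selection import train_test_split

    # Imbalanced toy data: roughly 5% positives.
    X, y = make_classification(n_samples=4000, weights=[0.95], random_state=0)
    Xtr, Xte, ytr, yte = train_test_split(X, y, stratify=y, random_state=0)

    # Randomly under-sample the majority class down to the minority size.
    rng = np.random.default_rng(0)
    pos, neg = np.where(ytr == 1)[0], np.where(ytr == 0)[0]
    keep = np.concatenate([pos, rng.choice(neg, len(pos), replace=False)])

    clf = RandomForestClassifier(n_estimators=200, random_state=0)
    clf.fit(Xtr[keep], ytr[keep])
    print("AUC:", roc_auc_score(yte, clf.predict_proba(Xte)[:, 1]))
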
Improved Whale Optimization Algorithm Case Study: Clinical Data of Anaemic Pregnant Woman

The Whale Optimization Algorithm (WOA) is a meta-heuristic algorithm with proven potential for solving complex numerical function optimization problems. It works well, but its convergence is poor in the exploration and exploitation phases. In order to enhance the convergence performance of WOA, a novel constitutional-appraising-strategy-based WOA is set forth in this paper. In this scenario, constituent states are fully utilized in each iteration to guide the subsequent search process and to balance local exploration against global exploitation. We describe the mechanism together with the convergence properties of the enhanced algorithm. Comparative investigations are conducted on various mathematical benchmark function optimization problems, and the simulation results confirm, with statistical significance, that the proposed scheme improves the convergence performance of WOA. In addition, we applied the same technique to a clinical dataset of anaemic pregnant women and obtained optimized clusters and cluster heads, securing a clear comprehension and meaningful insights for the clinical decision-making process.

Ravi Kumar Saidala, Nagaraju Devarakonda
A Novel Edge Detection Algorithm for Fast and Efficient Image Segmentation

Edge detection determines the boundaries of objects in an image and is a vital concept in object recognition and image analysis. This paper evaluates existing edge detection methods and proposes a new edge detection algorithm which uses morphological operations, the Sobel operator, Gaussian smoothing and masking. The novelty of the proposed algorithm lies in extracting continuous edges in the image and removing spurious edges using m-connectivity. The paper introduces performance parameters for edge detection to determine which method gives good results; a parameter named Human Perception Clarity (HPC) is mathematically modeled and experimentally proves the efficacy of the proposed algorithm.

P. Prathusha, S. Jyothi
Real Time Audio Steganographic Countermeasure

Steganographic techniques are used to embed data into a cover file using different algorithms. In this paper an audio steganography countermeasure is discussed which uses a technique called double stegging, or steganographic jamming: variations of LSB embedding algorithms are applied to prevent audio steganography, and this can be done in real time under acceptable information loss. We then proceed to show that this method renders hidden embedded data unrecoverable. The audio quality after steganographic jamming is evaluated; the Mean Opinion Score and the Signal-to-Noise Ratio are used to measure the quality of the output audio file, which shows the effectiveness of the described technique.

T M Srinivas, P P Amritha
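
The core of LSB jamming is simple enough to show directly: re-randomize the least significant bits of every sample so that any payload hidden there is destroyed. A minimal sketch for 16-bit PCM follows; the one-bit depth and the NumPy representation are our assumptions.

    import numpy as np

    def jam_lsb(samples, bits=1, seed=None):
        """Overwrite the lowest `bits` of each PCM sample with random bits,
        destroying any LSB-embedded payload while changing every sample by
        at most 2**bits - 1 quantization steps."""
        rng = np.random.default_rng(seed)
        mask = (1 << bits) - 1
        noise = rng.integers(0, mask + 1, size=samples.shape,
                             dtype=samples.dtype)
        return (samples & ~mask) | noise

    # pcm = np.frombuffer(frames, dtype=np.int16)  # e.g. from the wave module
    # jammed = jam_lsb(pcm, bits=1, seed=42)
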
A Hybrid Method for Extraction of Events from Natural Language Text

Event extraction is a significant and interesting task in the field of Natural Language Processing (NLP). Basically, events are dynamic occurrences: specific happenings, causes or things. Events play a vital role in the narrative of a text and are important for many NLP applications. This paper presents a hybrid/composite way of extracting events from natural language text. Earlier work on event extraction was built on either rule-based approaches or machine learning methods; the proposed hybrid makes use of both machine learning approaches and hand-coded rules to extract events. Experiments were conducted on the SemEval-2010 dataset, and the results obtained show better precision and recall when compared with existing methods.

Vanitha Guda, S. K. Sanampudi
A Proposal on Application of Nature Inspired Optimization Techniques on Hyper Spectral Images

Hyperspectral images are used in various applications such as geological systems, geosciences and astronomy. These images are acquired using remote sensing, the process of getting information about an object without making any physical contact with it. Satellite images, referred to as hyperspectral images, are the most used images in remote sensing, and the classification of objects in these images is of particular interest: the classification can yield important factors like vegetation, buildings, roads and more. Satellite images can assist in monitoring the effects of natural disasters, recognizing mining areas hidden from human view, examining biodiversity, and detecting rural and urban environments for analysis. However, the acquired satellite images can occasionally be affected by unforeseen distortions and unwanted artificial structures called artifacts, formed by the instrument itself or by the diverse pre-processing procedures involved. Optimization algorithms in combination with image processing methods are used to classify the objects in satellite images for easy perception and analysis. In this paper, various optimization techniques, such as Particle Swarm Optimization (PSO), DPSO, HSO and the proposed MFA optimization algorithm, are compared to obtain an optimal classification of objects in a satellite image.

M. Venkata dasu, P. VeeraNarayana Reddy, S Chandra Mohan Reddy
Dual Notched Band Compact UWB Band Pass Filter Using Resonator Loaded SIR for Wireless Applications

In this paper a compact and simple Ultra Wide Band (UWB) Band Pass Filter (BPF) with dual notch bands is proposed. The structure of the proposed filter comprises a Stepped Impedance Resonator (SIR) loaded with an interdigital resonator. The SIR is coupled with the input and output ports using interdigital coupling to create the UWB pass band of the filter, and interdigital resonators are loaded on the SIR to create notch bands that reject frequencies within the pass band. The pass band of the filter covers 3.1 to 10.6 GHz, with two notch bands at 5.2 and 7.8 GHz to avoid interference from Wi-Fi and satellite communication. The filter is designed on Rogers RT/Duroid 6010, a dielectric substrate with dielectric constant εr = 10.8 and thickness h = 1.27 mm. No via or defected ground structure is used in the filter, which makes its fabrication easier and cost-effective.

Arvind Kumar Pandey, R. K. Chauhan
Frequent SubGraph Mining Algorithms: Framework, Classification, Analysis, Comparisons

Graphs and trees are non-linear data structures used to organise, model and solve many real-world problems, and they are becoming more popular in both scientific and commercial domains. They have a wide range of applications spanning telephone networks, the Internet, social networks, program flow, chemical compounds, bioinformatics, XML data, terrorist networks, etc. Graph mining is used for finding useful and significant patterns. Frequent subgraph mining (FSM) mines for frequent patterns and subgraphs, and these form the basis for graph clustering, graph classification and graph-based anomaly detection. In this paper, a classification of FSM algorithms is presented and popular frequent subgraph mining algorithms are discussed. A comparative study of the algorithms is carried out on a chemical compounds dataset. Further, the paper provides a framework which acts as a strong foundation for understanding any frequent subgraph mining algorithm.

Sirisha Velampalli, V. R. Murthy Jonnalagedda
Facial Expression Classification Using Machine Learning Approach: A Review

Automatic facial expression analysis has attracted increasing attention in the research community for over two decades and is expedient in many applications such as face animation, customer satisfaction studies, human-computer interaction and video conferencing. Precisely classifying different emotions is an essential problem in facial expression recognition research, and several machine learning algorithms have been applied to it. In this paper, we survey three machine learning approaches, the Bayesian Network, the Hidden Markov Model and the Support Vector Machine, and attempt to answer the following questions: How does each classification algorithm use its characteristics for emotion recognition? How are the various parameters of each learning algorithm devoted to better classification? What are the robust features used for training? Finally, we examine how advances in machine learning techniques are used for facial expression recognition.

A. Baskar, T. Gireesh Kumar
Heuristic Approach for Nonlinear n × n (3 ≤ n ≤ 7) Substitution-Boxes

Substitution boxes are meant to enact nonlinear transformations of n-bit input streams into n-bit output streams. A highly nonlinear character is imperative to induce the obligatory confusion of data and to mitigate potential linear cryptanalysis. It is well known that cryptographically potent S-boxes are credited for the success of modern block encryption systems. This paper suggests an approach to frame a generic design that has the efficacy of synthesizing highly nonlinear balanced n × n S-boxes for 3 ≤ n ≤ 7. The proposed approach is based on heuristic optimization that seeks local and global best S-box candidates on each iteration. The resultant optimized S-boxes are provided and tested for nonlinearity soundness. The performance outcomes and assessment analysis justify that the generic approach is consistent for contriving highly nonlinear key-dependent S-boxes.

Musheer Ahmad, M. Alauddin, Hamed D. AlSharari
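
A heuristic S-box search of this kind needs a nonlinearity score to optimize. The standard measure, which we assume here, is NL = 2^(n-1) - max|W|/2, with W ranging over the Walsh-Hadamard spectra of all nonzero linear combinations of the S-box output bits:

    import numpy as np

    def nonlinearity(sbox, n):
        """Nonlinearity of an n x n S-box (a list of 2**n outputs), the
        usual fitness in heuristic S-box search: 2**(n-1) - max|WHT|/2
        over all nonzero output masks b."""
        size = 1 << n
        worst = 0
        for b in range(1, size):
            # Component function <b, S(x)> as a +/-1 sequence.
            w = np.array([(-1) ** bin(b & sbox[x]).count("1")
                          for x in range(size)])
            h = 1                  # in-place fast Walsh-Hadamard transform
            while h < size:
                for i in range(0, size, 2 * h):
                    a, c = w[i:i + h].copy(), w[i + h:i + 2 * h].copy()
                    w[i:i + h], w[i + h:i + 2 * h] = a + c, a - c
                h *= 2
            worst = max(worst, int(np.abs(w).max()))
        return (size >> 1) - worst // 2

    print(nonlinearity(list(range(8)), 3))  # identity S-box is linear -> 0
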
Bi-Temporal Versioning of Schema in Temporal Data Warehouses

The temporal design of a data warehouse (DW), an extension of the multidimensional model, provides a way to handle time-varying information in dimensions. The dimension data is time-stamped with valid time (VT) to maintain a complete data history in temporal data warehouses (TDWs). Thus, TDWs manage the evolution of the schema over a period of time by using versioning of schemas, together with the evolution of data described under the various schema versions. But schema versioning in TDWs has not been covered in full detail: mainly, approaches that handle schema versions using valid time have been proposed so far. This paper proposes an approach for bitemporal versioning of schema in a temporal DW model that allows retroactive and proactive schema modifications and, in addition, helps in tracking them.

Anjana Gosain, Kriti Saroha
An Improved Mammogram Classification Approach Using Back Propagation Neural Network

Mammograms are generally contaminated by quantum noise, which degrades their visual quality and thereby the performance of the classifier in Computer-Aided Diagnosis (CAD). Hence, enhancement of mammograms is necessary to improve the visual quality and detectability of the anomalies present in the breasts. In this paper, a sigmoid-based non-linear function is applied for contrast enhancement of mammograms. The enhanced mammograms are used to describe the texture of the detected anomaly using Gray Level Co-occurrence Matrix (GLCM) features. A Back Propagation Artificial Neural Network (BP-ANN) is then used as a classification tool for labeling the mammogram as abnormal or normal. The proposed classifier approach is reported to achieve considerably better accuracy than other existing approaches.

Aman Gautam, Vikrant Bhateja, Ananya Tiwari, Suresh Chandra Satapathy
Performance Comparison of Pattern Search, Simulated Annealing, Genetic Algorithm and Jaya Algorithm

In this paper, we show a performance comparison of four powerful global optimization algorithms, namely Pattern Search, Simulated Annealing, the Genetic Algorithm and the Jaya Algorithm. All of these algorithms are used to find an optimum solution, and standard benchmark functions are utilized for the implementation. The results are collected and analyzed, which helps to classify the algorithms according to their computational capability to solve optimization problems.

Hari Mohan Pandey, Manjul Rajput, Varun Mishra
Maven Video Repository: A Visually Classified and Tagged Video Repository

WEB 2.0's accelerated growth has paved the way for the emergence of social video-sharing platforms. These video-sharing communities produce videos at an exponential rate; unfortunately, the videos are incongruously tagged, leaving a minimal amount of metadata with which to retrieve them. Categorizing and indexing these videos has become a pressing problem for such communities: the videos they generate depend on users to tag them, and thus end up loosely tagged. An innovative and novel application is presented to classify and tag these large volumes of user-generated videos. The proposed content-based automatic tagging application tags the videos, which further helps in indexing and classifying them. The application first recognizes the person in the video, then discerns their emotions, and then creates an MPEG-7 XML file to store the metadata. This application will drastically reduce human effort and radically increase the efficiency of video searching.

Prashast Sahay, Ijya Chugh, Ridhima Gupta, Rishi Kumar
A Comprehensive Comparison of Ant Colony and Hybrid Particle Swarm Optimization Algorithms Through Test Case Selection

The focus of this paper is on comparing the performance of two metaheuristic algorithms, namely Ant Colony and Hybrid Particle Swarm Optimization. The domain of enquiry is test case selection, which has great relevance in software engineering and requires careful treatment for the effective utilization of software. Extensive experiments are performed using the standard flex object from the SIR repository. The experiments are conducted in MATLAB, with execution time and fault coverage considered as the quality measures used for the analysis. The underlying motivation of this paper is to create awareness in two respects: comparing the performance of metaheuristic algorithms, and demonstrating the significance of test case selection in software engineering.

Arun Prakash Agrawal, Arvinder Kaur
Parameter Estimation of Software Reliability Model Using Firefly Optimization

This paper presents an effective parameter estimation technique for software reliability growth models using the firefly algorithm. The software failure rate with respect to time has always been a foremost concern in the software industry; every second organization aims to achieve defect-free software products, which makes software reliability prediction a burning research area. Software reliability prediction techniques generally use numerical estimation methods for parameter estimation, which is certainly not the best choice: local optimization, bias and the model's parameter initialization are some foremost limitations, which eventually hamper the finding of optimal model parameters. Firefly optimization overcomes these limitations and provides optimal solutions for parameter estimation of software reliability growth models. The Goel-Okumoto model and the Vtub-based fault detection rate model are selected to validate the results. Seven real-world datasets were used to compare the proposed technique against the Cuckoo search technique and the CASRE tool. The results indicate the superiority of the proposed approach over existing numerical estimation techniques.

Ankur Choudhary, Anurag Singh Baghel, Om Prakash Sangwan
Prediction of Crime Trends Using Mk-MC Technique

Day by day the quantum of data has been increasing, not only in terms of user-generated content in social media but also outside social media, due to which data has gone from scarce to superabundant, conveying new advantages to users. This explosion of data has made it difficult to handle and analyze huge datasets. The techniques of data mining therefore assist in exploring and analyzing enormous datasets and help in discovering meaningful patterns. Clustering is one such data mining task: it gathers all the data and partitions it into various groups on the basis of a similarity or closeness measure. Clustering in the field of social science is used in the identification, analysis and detection of various crime patterns. This paper proposes a modified k-means clustering technique which is applied to fictitious crime data in order to identify various crime patterns or trends and to make a variety of predictions from the analysis of the different crime patterns.

B. M. Vidyavathi, D. Neha
A Novel Map-Reduce Based Augmented Clustering Algorithm for Big Text Datasets

Text clustering is a well-known technique for improving quality in information retrieval. In today's real world, data is not organized in the manner essential for precise mining; given a large unstructured text document collection, it is essential to organize it into clusters of related documents, and it is a contemporary challenge to extract compact and meaningful insights from such collections. Although many frequent-item mining algorithms have been discovered, most do not scale for "Big Data" and also take more processing time. This paper presents a highly scalable, speedy and efficient MapReduce-based augmented clustering algorithm, based on bivariate n-gram frequent items, to reduce high dimensionality and derive high-quality clusters for big text documents. A comparative analysis on sample text datasets is also shown: with stop-word removal the proposed algorithm performs better than without stop-word removal.

K. V. Kanimozhi, M. Venkatesan
Scrutiny of Data Sets Through Procedural Algorithms for Categorization

This paper evaluates selected classification algorithms for classifying thyroid datasets. The classification algorithms considered here are the principal component analysis method and the partial least squares regression method from machine learning. After successful prediction of disease levels, the resulting output levels of these algorithms are compared, and the analysis suggests the best classifiers for predicting the exact levels of thyroid disease. This work is a comparative study of the above algorithms on a thyroid dataset collected from the UCI Machine Learning Repository.

Prasad Vadamodula, M. Purnachandra Rao, V Hemanth Kumar, S Radhika, K Vahini, Ch. Vineela, Ch. Sravani, Srinivasa Rao Tamada
Effective Security in Social Data Security in OSNs

Privacy is a growing concern that intensifies as transactions are mediated in Online Social Networks (OSNs). Diverse groups of researchers have framed the "OSN privacy problem" as one of surveillance, institutional privacy or social privacy, and in tackling these problems they have also treated them as if they were independent. We argue that the distinct privacy problems are entangled, and that research on privacy in Online Social Networks would benefit from a more holistic approach. Nowadays, information systems form a critical part of organizations, and failures in security cost these organizations a great deal. The central purpose of information security (content privacy) is risk management. There is a considerable body of research and practice on information security risk management (ISRM), for example NIST 800-30 and ISO/IEC 27005; however, only a few research works focus on information security risk reduction, while the standards describe general specifications and recommendations and do not give concrete usage guidance concerning ISRM. In fact, reducing information security risks under uncertain conditions is delicate. Consequently, this paper applies genetic algorithms (GA) to information security risk reduction for vulnerabilities, and finally the performance of the applied system is analyzed through simulation.

A. S. V. Balakrishna, N. Srinivasu
Analysis of Different Pattern Evaluation Procedures for Big Data Visualization in Data Analysis

Data visualization is the central concept in big data analysis for processing and analyzing multivariate data, given the rapid growth in data size and complexity. Data visualization must address three main problems: (1) structured and unstructured pattern evaluation in big data analysis; (2) shrinking the attributes in indexed big data analysis; and (3) rearranging attributes in parallel index-based data storage. In this paper we analyze different techniques for solving the above three problems, considering the feasibility of each client requirement, for visualization of real-time data streams based on indexed data arrangement. We analyze different prototypes of available parallel-coordinate approaches and evaluate quantitative expert reviews in real-time configurations for data visualization. We report data visualization analysis results for large scientific data created by numerical simulation.

Srinivasa Rao Madala, V. N. Rajavarman, T. Venkata Satya Vivek
Analysis of Visual Cryptographic Techniques in Secret Image Data Sharing

In today's expanding digital world, security has become a crucial task in transmitting images. A number of methods have been introduced for security in digital images, to protect them from passive or active attacks in network communication environments. Visual Cryptography (VC) is one such modern technique, used to share a secret image securely while maintaining its confidentiality. To address the continuing challenges of security in digital image data sharing, in this paper we analyze various VC security mechanisms for digital image data sharing with respect to the confidentiality of the secret data. Our analysis provides effective security solutions for secret digital image data sharing in real-time communication environments.

T. Venkata Satya Vivek, V. N. Rajavarman, Srinivasa Rao Madala
Wavelet Based Saliency Detection for Stereoscopic Images Aided by Disparity Information

In the field of computer vision, reliable estimation of visual saliency allows appropriate processing of images without prior knowledge of their content, and thus serves as an important step in many tasks including segmentation, object identification and compression. In this paper, we present a novel saliency detection model for 3D images based on feature contrast from luminance, color, surface texture and depth. The disparity of the stereo pair is extracted using a sliding-window method. We then present a contrast-based saliency detection method that evaluates global contrast differences and spatial coherence at the same time. The algorithm is simple and efficient, and produces full-resolution saliency maps by fusing all of the extracted features. Our algorithm consistently performed better than existing saliency detection methods, yielding higher accuracy. We also show how the extracted saliency map can be used to create high-quality segmentation masks for subsequent image processing.

Y. Rakesh, K. Sri Rama Krishna
Dynamic Load Balancing Environment in Cloud Computing Based on VM Ware Off-Loading

We present a novel framework to improve the performance of mobile applications and save battery usage. The framework offloads only the compute-intensive routines; the offloading strategy depends on a module called the dynamic off-loader, which decides at runtime whether an application's routines will run locally on the mobile device or be offloaded to the cloud. Green Cloud Computing (GCC) has drawn significant research attention as the popularity and capabilities of mobile phones have improved in recent years. In this paper, we present a system that uses virtualization technology to allocate data center resources dynamically, monitoring application demands and supporting green computing by optimizing the number of servers in use. We introduce the concept of "skewness" to measure the unevenness in the multi-dimensional resource utilization of a server. By minimizing skewness, we can combine different types of workloads effectively and improve the overall utilization of server resources. We develop a set of heuristics that prevent overload in the system while saving the energy used. Trace-driven simulation and experimental results demonstrate that our algorithm achieves good performance.

C. Dastagiraiah, V. Krishna Reddy, K. V. Pandurangarao
Dynamic Secure Deduplication in Cloud Using Genetic Programming

Cloud data storage relieves customers of the burden of local storage, but outsourced data raise new issues with respect to data duplicates in the cloud. Some earlier systems deal with cloud security and performance with respect to de-duplication by applying file-signature identification methods using standard Hash-based Message Authentication Codes (HMAC). With hash algorithms such as SHA-1 and MD5, the cost of computing file-integrity values is large, reducing the throughput of the de-duplication step, and the storage model requires precomputed integrity hash codes, leading to performance issues. In this paper, we propose a Genetic Programming approach to record deduplication that combines several distinct pieces of evidence extracted from the data content to find a deduplication function able to recognize whether two entries in a repository are duplicates or not. As shown by our experiments, our approach outperforms an existing state-of-the-art method found in the literature. Moreover, the proposed functions are computationally less demanding, since they use less evidence. In addition, our genetic programming approach is able to automatically adapt these functions to a given fixed duplicate identification boundary, freeing the user from the burden of choosing and tuning this parameter.

K. V. Pandu Ranga Rao, V. Krishna Reddy, S. K. Yakoob
Synthesis of Sector Beams from Array Antennas Using Real Coded Genetic Algorithm

In this paper, flat-top far-field radiation patterns known as sector beams are generated from linear arrays using a Real Coded Genetic Algorithm. The pattern synthesis is carried out for all specified angular sectors, and involves the determination of complex excitation functions for the linear array for different beam widths. The patterns are numerically generated with the designed excitation levels. Controlling the ripple in the trade-in region while maintaining the side lobe levels within acceptable limits in the trade-off region is made possible in the present work.

Sudheer Kumar Terlapu, G. S. N. Raju
Performance Analysis of Encrypted Data Files by Improved RC4 (IRC4) and Original RC4

In cryptography, RC4 has been one of the best-known stream ciphers of the last two decades, and it earned the trust of many organizations after its code became public. Eminent researchers such as Roos et al. established that RC4 contains weaknesses and biases in its internal stages; many researchers hold that the KSA component of RC4 contains the most key-related bias, and that if the KSA part generates strong output as the input to the PRGA, the resulting stream becomes more random. We have already shown and published that Improved RC4 (IRC4) gives better results than the original RC4 in the generation of key streams for encrypting plaintext files. In this paper we generate two types of output stream files, from the IRC4 and original RC4 algorithms, and use both streams to encrypt the same plaintext files, generating the corresponding ciphertext files. The two sets of encrypted files are tested with the NIST statistical package, which shows that IRC4 returns better results for the randomness of the ciphertext files; we conclude that the IRC4 algorithm generates a more random stream and thereby increases security.

Hemanta Dey, Uttam Kumar Roy
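
For readers who want the baseline the paper improves on, here is a reference RC4 in Python (the standard algorithm; IRC4's modified key scheduling is not shown, as its details live in the authors' earlier work):

    def rc4_keystream(key, n):
        """Reference RC4: the KSA scrambles S with the key, then the PRGA
        emits n keystream bytes; IRC4 modifies the scheduling so that the
        permutation handed to the PRGA carries less key bias."""
        S = list(range(256))
        j = 0
        for i in range(256):                     # Key-Scheduling Algorithm
            j = (j + S[i] + key[i % len(key)]) % 256
            S[i], S[j] = S[j], S[i]
        i = j = 0
        out = bytearray()
        for _ in range(n):                       # Pseudo-Random Generation
            i = (i + 1) % 256
            j = (j + S[i]) % 256
            S[i], S[j] = S[j], S[i]
            out.append(S[(S[i] + S[j]) % 256])
        return bytes(out)

    cipher = bytes(p ^ k for p, k in zip(b"plaintext", rc4_keystream(b"Key", 9)))
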
A Novel Context and Load-Aware Family Genetic Algorithm Based Task Scheduling in Cloud Computing

With the advent of web technologies and efficient networking capabilities, desktop applications are increasingly getting amalgamated with the touch of cloud computing. Most of the recent developments are dominated by a consumer-centric market, ensuring the best quality of service and hence a greater customer base, leading to rising peaks in the profit charts. However, certain challenges in the field of cloud computing need to be dealt with before peak performance is achieved, and resource scheduling is one of them. This paper presents a context- and load-aware methodology for efficient task scheduling using a modified genetic algorithm known as the family genetic algorithm. Based on an analysis of user characteristics, user requests are fulfilled by the right type of resource. Such a classification helps attain efficient scheduling and improved load balancing, and will prove advantageous for the future of the cloud. Results show that the proposed technique is efficient under various circumstances.

Kamaljit Kaur, Navdeep Kaur, Kuljit Kaur
Indian Stock Market Analysis Using CHAID Regression Tree

Data mining is the technique utilized to extract concealed "analytical" and "predictive" facts and figures from big sets of datasets and databases. It is applied by data scientists and analysts in various research areas such as mathematics, marketing, genetics and cybernetics. In this paper, a Chi-Squared Automatic Interaction Detection (CHAID) regression tree model is proposed to infer the volatility of Stock Exchange Sensitive Index (SENSEX) data while explicitly accounting for dependencies between multiple derived attributes. Using real stock market data, dynamic time-varying graphs are constructed to further analyze how the volatility depends on various factors such as Lok Sabha elections, domestic riots, the Union Budget of India, the Indian monsoon and global factors. The factors are analyzed to understand their role in the fluctuations seen in the market over time and how the SENSEX behaves in response to them.

Udit Aggarwal, Sai Sabitha, Tanupriya Choudhury, Abhay Bansal
A Novel Approach to Improve the Performance of Divisive Clustering-BST

The traditional way of searching data has many disadvantages. In this context we propose a divisive hierarchical clustering method with quantitative measures of similarity among objects that keeps not only the structure of categorical attributes but also the relative distance of numeric values. For numeric clustering, the number of clusters can be arrived at through geometric shapes or density distributions. In the proposed Divclues-T, the arithmetic mean is calculated and becomes the root node; objects smaller than the root node fall into the left subtree, the others into the right subtree, and this process is repeated until we reach singleton objects.

P. Praveen, B. Rama
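
The mean-split recursion described above maps directly onto a small Python function. A minimal sketch for one-dimensional numeric data follows; the dictionary tree representation is our own choice.

    def divclues_t(values):
        """Divclues-T split: the arithmetic mean becomes the root; values
        below the mean go to the left subtree, the rest to the right,
        recursing until every node holds a single object."""
        if len(values) <= 1:
            return values[0] if values else None
        mean = sum(values) / len(values)
        left = [v for v in values if v < mean]
        right = [v for v in values if v >= mean]
        if not left:                    # all values equal: stop splitting
            return values[0]
        return {"root": mean,
                "left": divclues_t(left),
                "right": divclues_t(right)}

    print(divclues_t([2, 4, 9, 15, 16, 30]))
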
Prediction and Analysis of Pollution Levels in Delhi Using Multilayer Perceptron

Air pollution is a major problem faced by humans worldwide and is placed among the top ten health risks. Particulate Matter (PM10) is one of the major parameters used to measure the air quality of an area: it consists of particulate matter of size 10 μm or less suspended in the air. PM10 arises naturally from volcanoes, forest fires, dust storms, etc., as well as from human activities like coal combustion and the burning of fossil fuels. The PM10 value is predicted with the multilayer perceptron algorithm, an artificial neural network, as well as with the Naive Bayes algorithm and the Support Vector Machine algorithm. A total of 9 meteorological factors are considered in constructing the prediction model, including temperature, wind speed, wind direction and humidity. We then construct an analysis model to find the correlation between the different meteorological factors and the PM10 value. The results for the different algorithms are compared, and they show the MLP to be the best.

Aly Akhtar, Sarfaraz Masood, Chaitanya Gupta, Adil Masood
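
A regression variant of such a model is easy to set up with scikit-learn. The sketch below uses synthetic stand-ins for the nine meteorological features and an assumed single hidden layer, since the paper's exact network shape is not given here:

    import numpy as np
    from sklearn.model_selection import train_test_split
    from sklearn.neural_network import MLPRegressor
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler

    # X: rows of 9 meteorological readings (temperature, wind speed,
    # wind direction, humidity, ...); y: observed PM10. Synthetic here.
    rng = np.random.default_rng(0)
    X = rng.normal(size=(500, 9))
    y = 60 + 15 * X[:, 0] - 10 * X[:, 1] + rng.normal(scale=5, size=500)

    Xtr, Xte, ytr, yte = train_test_split(X, y, random_state=0)
    model = make_pipeline(StandardScaler(),
                          MLPRegressor(hidden_layer_sizes=(32,),
                                       max_iter=2000, random_state=0))
    model.fit(Xtr, ytr)
    print("R^2 on held-out data:", model.score(Xte, yte))
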
An Architectural View Towards Autonomic Cloud Computing

Cloud computing is causing significant transformations in the world of information technology and continues to be the hot favorite of people both within and outside the IT industry. One of the key factors for which the cloud is known is its access to seemingly never-ending resources. This is a perception the cloud has been able to maintain for a long time, but with extensive user involvement and humongous amounts of data, the perception is beginning to fade. At present the cloud faces challenges with over-utilization of resources, fault tolerance, and dynamic monitoring and management of services. In order to address these problems, human intervention continues to increase, causing a negative impact on the QoS and on the generic nature of the cloud. To overcome these challenges we propose an Autonomic Cloud Computing Environment which provides dynamic allocation and monitoring of resources along with orchestration of cloud services based upon VM migration. The proposed system is SLA-compliant and automates the user experience according to the clauses mentioned in the SLA.

Ravi Tomar, Abhirup Khanna, Ananya Bansal, Vivudh Fore
Implementation and Analysis of TL-SMD Cryptographic Hash Algorithm

With the advent of innovative tools and technologies, data security has become a major challenge in today's world. The solution to this challenge comes in the form of cryptographic hash functions, which are used in various security applications. A one-way hash function is designed to provide data security: it cannot be inverted, i.e., the input or the actual message bits cannot be recovered from the hexadecimal output value. TL-SMD is a cryptographic hash function having two layers of encryption. This paper is an extension of the TL-SMD work; here the algorithm is implemented using MATLAB, the results are analysed, and various steps for further improving data security are discussed.
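
TL-SMD itself is specified in the cited work; the following sketch merely demonstrates the one-way property described above, using SHA-256 from Python's standard library as a stand-in hash.

    # One-way property: easy to compute forwards, infeasible to invert.
    import hashlib

    message = b"transfer 100 INR to account 42"
    print(hashlib.sha256(message).hexdigest())
    # Recovering `message` from the digest alone is computationally
    # infeasible; even a one-character change yields an unrelated digest:
    print(hashlib.sha256(b"transfer 101 INR to account 42").hexdigest())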

Manish Grewal, Nihar Ranjan Roy, Twinkle Tiwari
Contrast Enhancement of an Image by DWT-SVD and DCT-SVD

In this paper a novel contrast stretching technique is proposed that is based on two methods: (a) Discrete Wavelet Transform (DWT) followed by SVD and (b) Discrete Cosine Transform (DCT) followed by SVD, where SVD refers to Singular Value Decomposition. In the DWT-SVD technique, DWT is applied to an image, converting the entire image into four distinct frequency subbands (LL, LH, HL and HH), after which SVD is applied to the LL subband of the DWT-processed image (because the LL subband contains the illumination coefficients). In this way, the values of the illumination coefficients are normalized and the LL subband is reformed using the updated coefficients. Afterwards, the image is reconstructed using the inverse DWT. In the second method, the image is processed with DCT followed by SVD and, after reconstruction in the frequency domain, the image is finally recovered by taking the inverse DCT. This paper provides a modification of the DWT-SVD technique, since DWT-SVD alone cannot produce appreciable results for some low-contrast images. Depending on the quality of contrast within an image, either DWT-SVD or DCT-SVD can be used.
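
A minimal sketch of the DWT-SVD branch described above, assuming PyWavelets and a grayscale test image; scaling the LL subband's singular values against a histogram-equalized reference follows the common formulation of this technique and may differ in detail from the paper's modification.

    # DWT-SVD contrast enhancement: scale the LL subband's singular values
    # against those of an equalized reference, then reconstruct with inverse DWT.
    import numpy as np
    import pywt
    from skimage import data, exposure

    img = data.camera().astype(float) / 255.0
    ref = exposure.equalize_hist(img)              # equalized reference image

    LL, (LH, HL, HH) = pywt.dwt2(img, "haar")      # four frequency subbands
    LLr, _ = pywt.dwt2(ref, "haar")

    U, S, Vt = np.linalg.svd(LL, full_matrices=False)
    _, Sr, _ = np.linalg.svd(LLr, full_matrices=False)

    xi = Sr.max() / S.max()                        # illumination correction factor
    LL_new = U @ np.diag(xi * S) @ Vt              # normalized LL subband

    enhanced = pywt.idwt2((LL_new, (LH, HL, HH)), "haar")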

Sugandha Juneja, Rohit Anand
A Fuzzy Logic Based Approach for Data Classification

In this paper, we develop a new algorithm to handle the classification of data by applying fuzzy rules to a real-world data set. Our proposed algorithm helps banks decide whether to grant a loan to customers by classifying them into three clusters: accepted, rejected and those who have some probability of getting a loan. To handle the third cluster, a fuzzy-logic-based approach is appropriate. We have implemented our proposed algorithm on the standard Bank of England data set. Our algorithm predicts loan eligibility on the basis of various attributes such as job status, whether the applicant is the chief loan applicant, source of income, weight factor, etc. Fuzzy rules generated from the numerical data give output in linguistic terms. We have compared our algorithm with state-of-the-art algorithms such as K-Means and Fuzzy C-Means, and it proves to be more efficient in terms of performance.
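
A minimal sketch of the fuzzy idea above, assuming triangular membership functions over a single hypothetical applicant score; the thresholds and rule are illustrative, not the paper's actual rule base.

    # Triangular fuzzy memberships mapping an applicant score to the three
    # clusters described above (rejected / probable / accepted).
    def tri(x, a, b, c):
        """Triangular membership with feet at a and c and peak at b."""
        if x <= a or x >= c:
            return 0.0
        return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

    def classify(score):
        memberships = {
            "rejected": tri(score, -1, 0, 50),
            "probable": tri(score, 30, 50, 70),
            "accepted": tri(score, 50, 100, 101),
        }
        return max(memberships, key=memberships.get), memberships

    print(classify(62))  # a borderline score lands in the "probable" cluster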

Shweta Taneja, Bhawna Suri, Sachin Gupta, Himanshu Narwal, Anchit Jain, Akshay Kathuria
Facebook Like: Past, Present and Future

As a social networking website, Facebook has a huge advantage over other sites: the emotional investment of its users. However, such investments are meaningful only if others respond to them. Facebook provides a way for its users to respond to posts by writing comments or by pressing a Like button to express their reactions. Since its activation on February 9, 2009, the Facebook Like button has evolved into an essential part of users' daily Facebook routines and a popular tool for expressing their social presence. However, the inadequacy of the Like button in expressing the original sentiments of a user towards a post has raised serious discussions among users. It is an apparent deduction that the Facebook Like fails to address the wide spectrum of emotions that online human communication entails: it does not let the post creator ascertain that the sentiment behind a post has been perceived in its true essence. Even after being combined with emotions, the Like button still has a wide range of issues that need to be addressed. The paper considers the pros and cons associated with the current Facebook Like button. It also proposes a novel technique to improve the efficiency of the Like feature by associating it with an intelligent engine that generates recommendations for users, which in turn should improve the user-posted content on Facebook.

Kumar Gaurav, Akash Sinha, Jyoti Prakash Singh, Prabhat Kumar
Criminal Policing Using Rossmo’s Equation by Applying Local Crime Sentiment

The paper discusses criminal policing by applying Rossmo's equation together with a local crime sentiment approach, in which nine types of crimes are grouped into four categories. A weight for each category is calculated from the criminal database based on each offender's crimes. The RapidMiner tool is used to generate graphs from the results of Rossmo's equation and the local crime sentiment approach; the resultant graph is then analyzed to predict the most probable criminal. The Gurgaon proclaimed-offender case is used as a case study. The experiments show that the approach presented in the paper gives accurate results.
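
For reference, a minimal sketch of Rossmo's formula itself, computing a probability surface over a grid from known crime sites; the buffer radius B and decay exponents f and g are illustrative values, not the paper's calibration.

    # Rossmo's formula: each grid cell accumulates distance-decayed
    # contributions from every crime site, with a buffer zone of radius B
    # around the offender's likely anchor point (Manhattan distances).
    import numpy as np

    def rossmo_surface(crimes, width, height, B=2, f=1.2, g=1.2):
        surface = np.zeros((height, width))
        for i in range(height):
            for j in range(width):
                for (xn, yn) in crimes:
                    d = abs(j - xn) + abs(i - yn)
                    if d > B:                      # outside the buffer zone
                        surface[i, j] += 1.0 / d**f
                    else:                          # inside the buffer zone
                        surface[i, j] += B**(g - f) / (2 * B - d)**g
        return surface / surface.sum()             # normalize to probabilities

    crimes = [(3, 4), (7, 2), (5, 8)]
    p = rossmo_surface(crimes, width=10, height=10)
    print(np.unravel_index(p.argmax(), p.shape))    # most probable anchor cell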

Fuzail Ahmad, Simran Syal, Mandeep Singh Tinna
Moving Object Tracking and Detection Based on Kalman Filter and Saliency Mapping

Many applications, such as video surveillance and object detection and tracking, require processing a video to extract the desired result from it. In this paper, we use saliency mapping to extract the regions of interest of a video after successful detection and tracking using a Kalman filter. The approach combines temporal and spatial saliency mapping to distinguish between the various regions of a video: high-motion regions are detected with the help of temporal mapping, while regions of regular movement are identified by spatial mapping. The effective saliency map of the salient object is created by combining the spatial and temporal saliency maps. The experimental results, obtained using a public dataset, show that our method performs well in the detection, tracking and saliency mapping of objects.
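
A minimal sketch of the constant-velocity Kalman filter commonly used for such tracking, assuming a state (x, y, vx, vy) and position-only measurements; the noise covariances are illustrative, and the saliency-mapping stage of the paper is not reproduced here.

    # Constant-velocity Kalman filter for 2-D object tracking.
    import numpy as np

    dt = 1.0
    F = np.array([[1, 0, dt, 0], [0, 1, 0, dt],
                  [0, 0, 1, 0], [0, 0, 0, 1]])   # state transition
    H = np.array([[1, 0, 0, 0], [0, 1, 0, 0]])   # measure position only
    Q = 0.01 * np.eye(4)                         # process noise
    R = 1.0 * np.eye(2)                          # measurement noise

    x = np.zeros(4)                              # initial state
    P = np.eye(4)                                # initial covariance

    for z in [np.array([1.0, 1.2]), np.array([2.1, 1.9]), np.array([2.9, 3.1])]:
        x, P = F @ x, F @ P @ F.T + Q            # predict
        K = P @ H.T @ np.linalg.inv(H @ P @ H.T + R)   # Kalman gain
        x = x + K @ (z - H @ x)                  # update with measurement
        P = (np.eye(4) - K @ H) @ P
        print("estimated position:", x[:2])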

Priyanka Prasad, Ashutosh Gupta
XUBA: An Authenticated Encryption Scheme

In this paper, we propose XUBA, a stream-cipher-based authenticated encryption scheme capable of achieving standard security. It is a bit-based stream cipher with a key and initialization vector of 128 bits each. Authenticity is provided by generating a 128-bit tag irrespective of the input message. The new cipher is resistant to algebraic, differential and time-memory-data trade-off attacks, and the 128-bit tag makes it resistant to forgery attacks and MAC guessing.
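
XUBA's internals are specified in the paper; as a generic illustration of the interface a stream-cipher authenticated encryption scheme exposes (keystream encryption plus a fixed-size tag), here is a sketch using ChaCha20-Poly1305 from the `cryptography` package as an off-the-shelf analogue. Note its parameters differ from XUBA's: a 256-bit key and 96-bit nonce, though the tag is likewise 128 bits.

    # Generic stream-cipher AEAD interface: encrypt returns ciphertext
    # plus a 128-bit tag; decrypt rejects any forgery.
    import os
    from cryptography.hazmat.primitives.ciphers.aead import ChaCha20Poly1305

    key = ChaCha20Poly1305.generate_key()
    nonce = os.urandom(12)
    aead = ChaCha20Poly1305(key)

    msg = b"attack at dawn"
    ct = aead.encrypt(nonce, msg, None)            # ciphertext || tag
    print(len(ct) - len(msg), "tag bytes")         # 16 bytes = 128 bits
    print(aead.decrypt(nonce, ct, None))           # raises on tampering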

R. Neethu, M. Sindhu, Chungath Srinivasan
Secure Speech Enhancement Using LPC Based FEM in Wiener Filter

Speech enhancement is a process that improves the quality of a speech signal in a noisy environment. It refers to removing or reducing the background noise in order to obtain an improved version of the original speech signal. Degradation of the speech signal is the most common problem in speech communication, so enhancement plays a vital role in improving speech quality. A number of methods are used for speech enhancement; here we use an LPC-based FEM in a Wiener filter. This method is compared with several existing speech enhancement algorithms using the NOIZEUS speech database. The experimental results show that our proposed method provides better speech quality, with no loss of information from the original speech signal.
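
A minimal sketch of the frequency-domain Wiener gain at the heart of such filters, assuming the noise spectrum is estimated from a speech-free leading segment; the paper's LPC-based estimation would replace the simple averaging used here.

    # Frame-wise Wiener filter: gain = SNR / (1 + SNR) per frequency bin,
    # with the noise power spectrum estimated from a noise-only lead-in.
    import numpy as np

    fs, n_fft = 8000, 256
    rng = np.random.default_rng(0)
    t = np.arange(fs) / fs
    clean = np.sin(2 * np.pi * 440 * t)                    # toy "speech"
    noise_only = 0.3 * rng.normal(size=4 * n_fft)          # speech-free lead-in
    noisy = np.concatenate([noise_only,
                            clean + 0.3 * rng.normal(size=fs)])

    noise_psd = np.mean(np.abs(np.fft.rfft(
        noise_only.reshape(4, n_fft), axis=1)) ** 2, axis=0)

    out = []
    for start in range(0, len(noisy) - n_fft + 1, n_fft):
        frame = np.fft.rfft(noisy[start:start + n_fft])
        snr = np.maximum(np.abs(frame) ** 2 / noise_psd - 1.0, 0.0)
        gain = snr / (1.0 + snr)                           # Wiener gain
        out.append(np.fft.irfft(gain * frame, n=n_fft))
    enhanced = np.concatenate(out)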

Kavita Bhatt, C. S. Vinitha, Rashmi Gupta
Backmatter
Metadata
Title: Data Engineering and Intelligent Computing
Editors: Suresh Chandra Satapathy, Vikrant Bhateja, K. Srujan Raju, B. Janakiramaiah
Copyright Year: 2018
Publisher: Springer Singapore
Electronic ISBN: 978-981-10-3223-3
Print ISBN: 978-981-10-3222-6
DOI: https://doi.org/10.1007/978-981-10-3223-3
