
2016 | Book

Proceedings of 3rd International Conference on Advanced Computing, Networking and Informatics

ICACNI 2015, Volume 1

Editors: Atulya Nagar, Durga Prasad Mohapatra, Nabendu Chaki

Publisher: Springer India

Book Series: Smart Innovation, Systems and Technologies


About this book

Advanced Computing, Networking and Informatics are three distinct disciplines of knowledge with little apparent sharing or overlap among them. However, their convergence is observed in many real-world applications, including cyber-security, internet banking, healthcare, sensor networks, cognitive radio and pervasive computing, among many others. These two-volume proceedings explore the combined use of Advanced Computing and Informatics in next-generation wireless networks and security, signal and image processing, ontology and human-computer interfaces (HCI). The two volumes together include 132 scholarly articles, accepted for presentation from over 550 submissions to the Third International Conference on Advanced Computing, Networking and Informatics, held in Bhubaneswar, India during June 23–25, 2015.

Table of Contents

Frontmatter

Soft Computing Techniques for Advanced Computing

Frontmatter
Fuzzy Logic Based UPFC Controller for Voltage Stability and Reactive Control of a Stand-Alone Hybrid System

This paper focuses on the implementation of a fuzzy logic based UPFC controller in an isolated wind-diesel-micro hydro system for management of reactive power and enhancement of voltage stability. For better analysis, a linearised small signal transfer function model is considered for different load inputs. The fuzzy based UPFC controller has been tuned to improve the reactive power of the off-grid system. Simulations in the MATLAB environment have been carried out and the effectiveness of the fuzzy tuned controller is established.

Asit Mohanty, Meera Viswavandya, Sthitapragyan Mohanty, Dillip Mishra
APSO Based Weighting Matrices Selection of LQR Applied to Tracking Control of SIMO System

This paper employs an adaptive particle swarm optimization (APSO) algorithm to solve the weighting matrices selection problem of the linear quadratic regulator (LQR). One of the important challenges in the design of LQR for real-time applications is the optimal choice of the state and input weighting matrices (Q and R), which play a vital role in determining the performance and optimality of the controller. Commonly, a trial-and-error approach is employed to select the weighting matrices, which not only burdens the design but also results in a non-optimal response. Hence, to choose the elements of the Q and R matrices optimally, an APSO algorithm is formulated and applied to the tracking control of an inverted pendulum. One of the notable changes introduced in APSO over conventional PSO is that an adaptive inertia weight parameter (AIWP) is incorporated in the velocity update equation of PSO to increase its convergence rate. The efficacy of the APSO tuned LQR is compared with that of the PSO tuned LQR. Statistical measures computed for the optimization algorithms to assess consistency and accuracy show that the precision and repeatability of APSO are better than those of conventional PSO.

S. Karthick, Jovitha Jerome, E. Vinodh Kumar, G. Raaja
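The AIWP idea described in the abstract above can be sketched minimally as follows. The particular adaptation rule (shrinking the inertia weight for fitter particles) and the gain values are common conventions from the PSO literature, not necessarily the exact formulation used by the authors.

```python
import random

def aiwp(w_max, w_min, fitness, f_avg, f_min):
    """Adaptive inertia weight: particles with better-than-average fitness
    get a smaller inertia weight, so they exploit rather than explore.
    This is one common AIWP formulation, assumed here for illustration."""
    if fitness <= f_avg:
        return w_min + (w_max - w_min) * (fitness - f_min) / max(f_avg - f_min, 1e-12)
    return w_max

def velocity_update(v, x, pbest, gbest, w, c1=2.0, c2=2.0):
    """Standard PSO velocity update with the supplied inertia weight w."""
    r1, r2 = random.random(), random.random()
    return [w * vi + c1 * r1 * (pi - xi) + c2 * r2 * (gi - xi)
            for vi, xi, pi, gi in zip(v, x, pbest, gbest)]
```

For LQR tuning, each particle's position would encode the diagonal entries of Q and R, with the tracking-error cost as the fitness to minimize.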
Application of Fuzzy Soft Multi Sets in Decision-Making Problems

Alkhazaleh and Salleh presented a fuzzy soft multi set theoretic approach to solving decision-making problems using the Roy-Maji algorithm, which has some limitations. In this research work, we propose an algorithm for solving fuzzy soft multi set based decision-making problems using Feng's algorithm, which is more stable and more feasible than the Alkhazaleh–Salleh algorithm. The feasibility of our proposed algorithm in practical applications is illustrated by a numerical example.

Anjan Mukherjee, Ajoy Kanti Das
Fuzzy kNN Adaptation to Learning by Example in Activity Recognition Modeling

Activity recognition is a complex task in the Human Computer Interaction (HCI) domain. k-Nearest Neighbors (kNN), a non-parametric classifier, mimics human decision making by using past experiences to classify a new object. Fuzzy logic mimics human intelligence in decision making, but suffers from requiring domain expertise to propose rules. In this paper a novel technique is proposed that derives efficient fuzzy rules from the training data. The kNN classifier is modified by incorporating fuzzification of the feature space learned from the data, rather than relying solely on domain experts to draw fuzzy rules. An additional novelty is the efficient use of fuzzy similarity relations and fuzzy implicators for hybridization of the kNN classifier. The proposed hybridized fuzzy kNN classifier is shown to perform 5.6 % better than its classical kNN counterpart.

Vijay Borges, Wilson Jeberson
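The classical fuzzy kNN that the abstract above builds on can be sketched as below. This follows the well-known Keller-style formulation with crisp training labels and inverse-distance weighting; the paper's actual hybridization via similarity relations and implicators goes further, so treat this only as the baseline idea.

```python
def fuzzy_knn_memberships(query, train, k=3, m=2.0):
    """Fuzzy kNN sketch: return class membership degrees for `query`.
    `train` is a list of (feature_vector, label) pairs; `m` is the
    fuzzifier (m = 2 gives plain inverse-squared-distance weights)."""
    nearest = sorted(
        (sum((a - b) ** 2 for a, b in zip(x, query)) ** 0.5, label)
        for x, label in train
    )[:k]
    weights, total = {}, 0.0
    for d, label in nearest:
        w = 1.0 / max(d, 1e-12) ** (2.0 / (m - 1.0))  # closer -> heavier
        weights[label] = weights.get(label, 0.0) + w
        total += w
    return {label: w / total for label, w in weights.items()}
```

The memberships sum to one, so the output is directly usable as a fuzzy decision rather than a hard vote.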
Plant Leaf Recognition Using Ridge Filter and Curvelet Transform with Neuro-Fuzzy Classifier

The current work proposes an innovative methodology for the recognition of plant species using a combination of shape and texture features from leaf images. The leaf shape is modeled using curvelet coefficients and invariant moments, while texture is modeled using a ridge filter and some statistical measures derived from the filtered image. As the features are sensitive to the geometric orientation of the leaf image, a pre-processing step is performed to make the features invariant to geometric transformations. To classify images into pre-defined classes, a neuro-fuzzy classifier is used. Experimental results show that the method achieves acceptable recognition rates for images varying in texture, shape and orientation.

Jyotismita Chaki, Ranjan Parekh, Samar Bhattacharya
An Effective Prediction of Position Analysis of Industrial Robot Using Fuzzy Logic Approach

Industrial robots have been extensively used by many industries and organizations for different applications. This paper introduces some qualitative parameters to find the best predictive value in comparison to the experimental value. In the fuzzy-based method, the weight of each criterion and the rating of each alternative are described using different membership functions and linguistic terms. Using four techniques, triangular, trapezoidal, Gaussian and hybrid membership functions, the effective prediction of the robot angle and position is determined with respect to the robot's workspace. The paper compares the four techniques and finds that the hybrid membership function is the most effective for prediction, as it shows good agreement with the experimental results. With this approach, the user can easily reach any point of the workspace; fuzzy logic systems thus prove very effective for workspace analysis.

P. Kesaba, B. B. Choudhury, M. K. Muni
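The three standard membership functions named in the abstract above have simple closed forms, sketched below; a "hybrid" function, as the paper uses, would combine such shapes, though its exact construction is not given in the abstract.

```python
import math

def triangular(x, a, b, c):
    """Triangular MF with feet a, c and peak b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def trapezoidal(x, a, b, c, d):
    """Trapezoidal MF with feet a, d and plateau [b, c]."""
    if x <= a or x >= d:
        return 0.0
    if b <= x <= c:
        return 1.0
    return (x - a) / (b - a) if x < b else (d - x) / (d - c)

def gaussian(x, mean, sigma):
    """Gaussian MF centred at `mean` with spread `sigma`."""
    return math.exp(-((x - mean) ** 2) / (2 * sigma ** 2))
```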
Pitch Angle Controlling of Wind Turbine System Using Proportional-Integral/Fuzzy Logic Controller

Blade pitch angle and tip speed ratio (TSR) control approaches are very important in a Wind Turbine (WT) to achieve a constant output torque for stability of WT operation. Therefore, accurate and effective control techniques are investigated and designed to supply constant output torque to the Permanent Magnet Synchronous Generator (PMSG). Two control approaches, Proportional-Integral (PI) and Fuzzy Logic (FL) based blade pitch angle and TSR control, are implemented, and their performance is investigated in terms of torque stability and response time. The complete system is modeled in the MATLAB/Simulink environment and its performance under these control techniques is investigated under variable wind speed conditions. The performance of both systems is found satisfactory, but the system with the FL controller performs better than the one with the PI controller.

Rupendra Kumar Pachauri, Harish Kumar, Ankit Gupta, Yogesh K. Chauhan
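The two quantities controlled in the abstract above can be sketched minimally: the tip speed ratio λ = ωR/v and a discrete-time PI law acting on the torque (or pitch) error. The incremental PI form and any gain values are illustrative assumptions, not the paper's tuning.

```python
def tip_speed_ratio(omega, radius, wind_speed):
    """TSR: lambda = omega * R / v (rotor speed, blade radius, wind speed)."""
    return omega * radius / wind_speed

def make_pi(kp, ki, dt):
    """Discrete incremental PI controller: u[k] = u[k-1]
    + Kp*(e[k] - e[k-1]) + Ki*dt*e[k]."""
    state = {"e_prev": 0.0, "u": 0.0}
    def step(error):
        state["u"] += kp * (error - state["e_prev"]) + ki * dt * error
        state["e_prev"] = error
        return state["u"]
    return step
```

The FL controller in the paper would replace `step` with a rule base over the error and its rate of change.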
Recognition of Repetition and Prolongation in Stuttered Speech Using ANN

This paper focuses on repetition and prolongation detection in stuttered speech signals. Acoustic and pitch-related features such as Mel-frequency cepstral coefficients (MFCCs), formants, pitch, zero crossing rate (ZCR) and energy are used to test the effectiveness in recognizing repetitions and prolongations in stuttered speech. An Artificial Neural Network (ANN) is used as the classifier. The results are evaluated using combinations of different features and show that the ANN classifier trained using MFCC features achieves an average accuracy of 87.39 % for repetition and prolongation recognition.

P. S. Savin, Pravin B. Ramteke, Shashidhar G. Koolagudi

Computational Intelligence: Algorithms, Applications and Future Directions

Frontmatter
Primary Health Centre ATM: PHCATM See Healthcare Differently

PHCATM is a product that can accurately and efficiently find relief for common symptoms by interactively querying the user and providing real-time suggestions. In addition, it has the ability to connect with the nearest medical officer via live chat whenever an online health check-up is demanded. The user can discuss problems with the medical officer and be involved in deciding the best care solution. Finally, the PHCATM module can dispense the prescribed medications and print out prescriptions written electronically. Too many people in India die due to lack of diagnosis in the first place, and lack of timely or affordable medicine in the second.

Priyabrata Sundaray
Hybrid Model with Fusion Approach to Enhance the Efficiency of Keystroke Dynamics Authentication

In this paper we propose a novel technique to enhance the performance of keystroke dynamics authentication using a hybrid model with four fusion approaches. First, keystroke features are extracted from our database. A template, which is a compact form of the keystroke feature data, is then generated from the extracted features. A hybrid model based on a combination of the Gaussian probability density function (GPDF) and a Support Vector Machine (SVM) converts test features into scores. Finally, four fusion rules are applied to the hybrid model to fuse the GPDF and SVM scores and improve the final result. Experimental results show that the proposed hybrid model brings an obvious improvement, with an error rate of 1.612 %.

Ramu Thanganayagam, Arivoli Thangadurai
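Score-level fusion of two matchers, as in the abstract above, can be sketched with the four classic rules below. Which four rules the paper actually uses is not stated in the abstract; sum, product, min and max are the standard choices assumed here.

```python
def fuse(scores_a, scores_b, rule="sum"):
    """Fuse two matchers' score lists element-wise with a classic rule.
    Scores are assumed normalized to a common range beforehand."""
    rules = {
        "sum": lambda a, b: (a + b) / 2.0,   # mean of the two scores
        "product": lambda a, b: a * b,
        "min": min,
        "max": max,
    }
    f = rules[rule]
    return [f(a, b) for a, b in zip(scores_a, scores_b)]
```

A threshold on the fused score then yields the accept/reject decision.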
Multi-class Twin Support Vector Machine for Pattern Classification

In this paper, we propose a novel algorithm for multi-class classification, called Multi-class Twin Support Vector Machine (MTWSVM), which is an extension of the binary Twin Support Vector Machine (TWSVM). MTWSVM is based on the "one-against-one" strategy, in which the patterns of each class are trained against the patterns of another class. To speed up the training phase, the optimization problems are solved by the Successive Over Relaxation (SOR) technique. Experiments are performed on eight benchmark datasets and the performance of the proposed approach is compared with existing multi-class approaches based on Support Vector Machines and Twin Support Vector Machines.

Divya Tomar, Sonali Agarwal
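The one-against-one aggregation used in the abstract above is independent of the underlying binary learner; a sketch of the voting step, with the pairwise classifier abstracted behind a callback, is:

```python
from itertools import combinations

def one_vs_one_predict(x, classes, binary_predict):
    """One-against-one aggregation: evaluate a binary classifier for every
    class pair and take a majority vote.  `binary_predict(x, i, j)` must
    return the winning label of the pair (i, j) -- in the paper this would
    be a trained TWSVM for that pair."""
    votes = {c: 0 for c in classes}
    for i, j in combinations(classes, 2):
        votes[binary_predict(x, i, j)] += 1
    return max(votes, key=votes.get)
```

With C classes this evaluates C(C-1)/2 pairwise models, which is the usual cost of the one-against-one strategy.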
Multi-label Classifier for Emotion Recognition from Music

Music is one of the important media for expressing emotions such as anger, happiness, sadness, amazement and quietness. In this paper, we treat emotion recognition from music as a multi-label classification task, because a piece of music may convey more than one emotion at the same time. This research work proposes a Binary Relevance (BR) based Least Squares Twin Support Vector Machine (LSTSVM) multi-label classifier for emotion recognition from music. The performance of the proposed classifier is compared with eight existing multi-label learning methods using fourteen evaluation measures in order to evaluate it from different points of view. The experimental results suggest that the proposed multi-label emotion recognition system is more efficient and gives satisfactory outcomes compared with the other existing multi-label classification approaches.

Divya Tomar, Sonali Agarwal
Visibility Enhancement in a Foggy Road Along with Road Boundary Detection

Images and videos of outdoor scenes suffer from reduced clarity due to the presence of fog, haze or mist, which makes it difficult to drive in bad weather conditions. Several methods have already been proposed to improve images acquired in foggy weather. In this paper a novel dehazing method using the dark channel prior along with masking of the sky regions is proposed; the output improves considerably due to clear separation of surrounding edges from the sky as well as reduced artifacts. This work also emphasizes road edge detection, which together with dehazing leads to prominent visibility in foggy conditions.

Dibyasree Das, Kyamelia Roy, Samiran Basak, Sheli Sinha Chaudhury
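The dark channel prior used in the abstract above rests on one simple statistic: per pixel, the minimum intensity over the colour channels and a local patch. A minimal sketch (the sky-masking refinement the paper adds is not shown):

```python
def dark_channel(img, patch=3):
    """Dark channel of an H x W x 3 image given as nested lists in [0, 1]:
    for each pixel, the minimum over the RGB channels and over a
    patch x patch neighbourhood.  Haze-free regions have values near 0."""
    h, w = len(img), len(img[0])
    r = patch // 2
    chan_min = [[min(img[y][x]) for x in range(w)] for y in range(h)]
    out = [[0.0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            out[y][x] = min(
                chan_min[yy][xx]
                for yy in range(max(0, y - r), min(h, y + r + 1))
                for xx in range(max(0, x - r), min(w, x + r + 1))
            )
    return out
```

Bright sky regions violate the prior (their dark channel is not near zero), which is exactly why the paper masks the sky before dehazing.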
Artificial Neural Network Based Prediction Techniques for Torch Current Deviation to Produce Defect-Free Welds in GTAW Using IR Thermography

In recent years, on-line weld monitoring has become a potential area of research. In this work, torch current deviation prediction systems are developed with Artificial Neural Networks to produce welds free from lack of penetration. Lack of penetration is deliberately introduced by varying the torch current. Thermographs are acquired during welding, and hotspots are extracted using Euclidean distance based segmentation and quantitatively characterized using the second order central moments. Exemplars are then created with the central moments as input parameters and the deviation in torch current as the output parameter. Radial Basis Networks (RBN) and Generalized Regressive Neural Networks (GRNN) are then trained and tested to assess their suitability for torch current prediction. GRNN outperforms RBN in predicting the torch current deviation with 98.95 % accuracy.

N. M. Nandhitha
A Study on Cloud Based SOA Suite for Electronic Healthcare Records Integration

Reliability, security and cost-effectiveness are three important factors on which the healthcare industry has to focus in order to exchange healthcare information. Versatile platforms for enterprise-wide information sharing are needed by payers and providers. Clinicians require accurate information through networks to provide quality care to patients, and administrators need integrated information for business operations. Both sides of an organization require access to information from various systems, such as innovation, performance improvement, demand and monetary systems. These organizations must also share data externally: application data, patients' medical records, medicinal data, chemistry reports and symptomatic information from various intermediary bodies, while strictly following the policies and procedures for storing, modifying and disseminating electronic health records (EHR). In this era, most healthcare industries are moving to cloud services for processing and storing healthcare data, so there is a need to build a cloud-based SOA suite for EHR integration. Such a suite will provide healthcare organizations with widespread integration capabilities and a fused middleware platform. In this paper we study a cloud-based SOA suite for EHR integration that empowers transparent, extensible and protected methods for distributing real and secured information only to intended recipients.

Sreekanth Rallapalli, R. R. Gondkar
Hap: Protecting the Apache Hadoop Clusters with Hadoop Authentication Process Using Kerberos

Hadoop is a distributed framework that provides a distributed file system and MapReduce batch job processing on large clusters of commodity servers. Although Hadoop is used on private clusters behind an organization's firewalls, it is often provided as a multi-tenant service and is used to store sensitive data; as a result, strong authentication and authorization are necessary to protect private data. Adding security to Hadoop is challenging because not all of its interactions follow the classic client-server pattern: the file system is partitioned and distributed, requiring authorization checks at multiple points; a submitted batch job is executed at a later time on nodes different from the node on which the client authenticated and submitted the job; and job tasks from different users are executed on the same compute node. In this paper, to address these challenges, the base Kerberos authentication mechanism is supplemented by delegation and capability-like access tokens and the notion of trust for secondary services.

V. Valliyappan, Parminder Singh
Image Analysis for Efficient Surface Defect Detection of Orange Fruits

This work portrays a novel approach for the development of a real-time computer vision based model for automatic orange fruit peel defect detection. First, different filtering methods and a wavelet based method are used to denoise the given input image, and a comparative study of them is performed. Based on this study, the wavelet based approach is used for smoothing the images and removing the higher energy regions, for better defect detection and to make the defects more retrievable. Finally, orange fruit skin color defects are identified using the RGB and HSI color spaces. The experimental results indicate that the designed algorithm is scalable, computationally effective and robust for the identification of orange fruit surface defects.

Thendral Ravi, Suhasini Ambalavanan
Segregation of Rare Items Association

Many applications nowadays involve rare itemsets. This paper concentrates on associations of rare itemsets. Association rule mining is considered one of the most important data mining techniques and is utilized for frequent itemset mining in areas such as market basket data analysis and stock data analysis; it is also applied to rare itemset mining in applications like intrusion detection and medical science, since rare items have the special characteristic of appearing only a small number of times. This paper categorizes existing methods according to the basic approach used, the storage structure, the item mining strategy, the number of database scans and the threshold(s) used; it then proposes an approach to segregate rare items, drawing on a study of the research works in this area, and analyzes the results.

Dipti Rana, Rupa Mehta, Prateek Somkunwar, Naresh Mistry, Mukesh Raghuwanshi
Local Gabor Wavelet-Based Feature Extraction and Evaluation

Feature extraction is an essential step in many image processing and computer vision applications. It is quite desirable that the extracted features effectively represent an image, and that the dominant information visually perceived by human beings be efficiently captured by them. Over the last few decades, different algorithms have been proposed to address the major issues of image representation by efficient features. The Gabor wavelet is one of the most widely used filters for image feature extraction. Existing Gabor wavelet-based feature extraction methodologies unnecessarily use both the real and the imaginary coefficients, which are subsequently processed by dimensionality reduction techniques such as PCA, LDA etc. This procedure ultimately affects the overall performance of the algorithm in terms of memory requirement and computational complexity. To address this particular issue, we propose a local image feature extraction method using a Gabor wavelet. In our method, an image is divided into overlapping image blocks, and each image block is separately filtered by the Gabor wavelet. Finally, the extracted coefficients are concatenated to form the proposed local feature vector. The efficacy and effectiveness of the proposed feature extraction method are evaluated by reconstructing the original image from the extracted features and comparing it with the original input image using the mean square error (MSE), peak signal-to-noise ratio (PSNR) and correlation coefficient (CC). All these performance evaluation measures clearly show that the real coefficients of the Gabor filter alone can effectively represent an image, as compared to methods which utilize either the imaginary coefficients or both. The major novelty of our method lies in this claim: the capability of the real coefficients of a Gabor filter alone for image representation.

T. Malathi, M. K. Bhuyan
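The real (cosine) part of a 2-D Gabor kernel, the quantity the abstract above argues is sufficient on its own, can be computed directly from the standard formula. Parameter names follow the common convention (sigma: envelope width, theta: orientation, lam: wavelength, gamma: aspect ratio); the paper's actual filter-bank settings are not given in the abstract.

```python
import math

def gabor_real_kernel(size, sigma, theta, lam, gamma=0.5):
    """Real part of a Gabor kernel:
    exp(-(x'^2 + gamma^2 * y'^2) / (2 sigma^2)) * cos(2 pi x' / lam),
    where (x', y') are the coordinates rotated by theta."""
    r = size // 2
    kernel = []
    for y in range(-r, r + 1):
        row = []
        for x in range(-r, r + 1):
            xp = x * math.cos(theta) + y * math.sin(theta)
            yp = -x * math.sin(theta) + y * math.cos(theta)
            env = math.exp(-(xp * xp + gamma * gamma * yp * yp) / (2 * sigma * sigma))
            row.append(env * math.cos(2 * math.pi * xp / lam))
        kernel.append(row)
    return kernel
```

Convolving each overlapping image block with such kernels (at several orientations and wavelengths) and concatenating the responses yields a local feature vector of the kind the paper describes.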
Human Interaction Recognition Using Improved Spatio-Temporal Features

Human Interaction Recognition (HIR) plays a major role in building intelligent video surveillance systems. In this paper, a new interaction recognition mechanism is proposed to recognize the activity/interaction of a person with improved spatio-temporal feature extraction techniques robust against occlusion. In order to identify the interaction between two persons, tracking is a necessary step to follow the movement of each person. After tracking, local spatio-temporal interest points are detected using a corner detector, and the motion of each corner point is analysed using optical flow. The feature descriptor provides the motion information and the location of the body parts where the motion is exhibited in the blobs. The action is predicted from the pose information and the temporal information from the optical flow. A Hierarchical SVM (H-SVM) is used to recognize the interaction, and occlusion of blobs is determined based on the intersection of the regions lying in the path. The performance of this system has been tested over different data sets and the results seem promising.

M. Sivarathinabala, S. Abirami
Common Coupled Fixed Point Results in Fuzzy Metric Spaces Using JCLR Property

The present study is devoted to the notion of the joint common limit (shortly, JCLR) property for coupled maps, and utilizes this concept to prove a common coupled fixed point theorem on fuzzy metric spaces. In this paper, we also prove some fixed point theorems using the CLR property, the E.A. property and an integral-type contractive condition in fuzzy metric spaces. Illustrative examples supporting the main results have been given.

Vishal Gupta, Ashima Kanwar, Naveen Gulati
Speech Based Arithmetic Calculator Using Mel-Frequency Cepstral Coefficients and Gaussian Mixture Models

In recent years, speech based computer interaction has become one of the most challenging and demanding applications in the field of human computer interaction. Speech based human computer interaction offers a more natural way to interact with computers and does not require special training. In this paper, we have made an attempt to build a human computer interaction system by developing a speech based arithmetic calculator using Mel-Frequency Cepstral Coefficients (MFCC) and Gaussian Mixture Models (GMM). The system receives an arithmetic expression in the form of isolated speech command words. Acoustic features, namely Mel-Frequency Cepstral Coefficients, are extracted from these speech commands and used to train a Gaussian mixture model. The model created after iterative training is used to predict an input speech command as either a digit or an operator. After successful recognition of the operators and digits, the arithmetic expression is evaluated and the result is converted into an audio wave. Our system is tested with a speech database consisting of single digit numbers (0–9) and 5 basic arithmetic operators (+, −, ×, / and %). The recognition accuracy of the system is around 86 %. Our speech based HCI system can provide the great benefit of interacting with machines through multiple modalities, and also assists visually impaired and physically challenged people.

Moula Husain, S. M. Meena, Manjunath K. Gonal
Comparative Analysis on Optic Cup and Optic Disc Segmentation for Glaucoma Diagnosis

Glaucoma is an eye disease which causes a continuous increase in the size of the optic cup and finally permanent vision loss due to damage to the optic nerve. It is the second most prevalent disease worldwide causing irreversible vision loss or blindness. It is caused by increased pressure in the eye, which enlarges the optic cup, blocks the flow of fluid to the optic nerve and deteriorates vision. The cup to disc ratio, the ratio of the size of the optic cup to that of the disc, is the major indicator used to detect glaucoma. The aim of this analysis is to study the performance of the various segmentation approaches used so far by different researchers for the optic cup and optic disc, for the timely detection of glaucoma.

Niharika Thakur, Mamta Juneja
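Once cup and disc segmentations are available, the indicator the abstract above describes reduces to a ratio of the two regions. The area-based definition below is one common convention (diameter-based variants also exist), so treat it as a sketch rather than the measure used by any particular surveyed method.

```python
def cup_to_disc_ratio(cup_mask, disc_mask):
    """Area-based CDR from binary segmentation masks given as nested
    lists of 0/1: (cup pixel count) / (disc pixel count)."""
    cup = sum(sum(row) for row in cup_mask)
    disc = sum(sum(row) for row in disc_mask)
    return cup / disc if disc else 0.0
```

A CDR that grows over successive examinations is the sign of cup enlargement that the surveyed segmentation methods aim to detect.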

Advanced Image Processing Methodologies

Frontmatter
A Secure Image Encryption Algorithm Using LFSR and RC4 Key Stream Generator

The increasing importance of the security of multimedia data has prompted greater attention towards secure image encryption algorithms. In this paper, the authors propose a highly secure encryption algorithm with a permutation-substitution architecture. In the permutation step, the pixels of the plain image are shuffled using a Linear Feedback Shift Register (LFSR). The output of this step is an intermediary cipher image of the same size as the plain image. In the substitution step, a sequence of random numbers is generated using the RC4 key stream generator and XORed with the pixel values of the intermediary cipher image to produce the final cipher image. Experimental results and security analysis show that the proposed scheme is efficient and secure.

Bhaskar Mondal, Nishith Sinha, Tarni Mandal
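The permutation-substitution pipeline in the abstract above can be sketched on a flat pixel list as follows. RC4 is implemented per its standard KSA/PRGA definition; the LFSR width, tap positions and the sort-based way of turning its output into a permutation are illustrative assumptions, since the abstract does not specify them.

```python
TAPS = (15, 13, 12, 10)  # maximal-length 16-bit Fibonacci LFSR (assumed config)

def lfsr_sequence(seed, n, taps=TAPS):
    """Generate n successive LFSR states from a nonzero 16-bit seed."""
    state, out = seed, []
    for _ in range(n):
        fb = 0
        for t in taps:
            fb ^= (state >> t) & 1       # feedback = XOR of tapped bits
        state = ((state << 1) | fb) & 0xFFFF
        out.append(state)
    return out

def rc4_keystream(key, n):
    """Standard RC4: key scheduling (KSA) then keystream generation (PRGA)."""
    S, j = list(range(256)), 0
    for i in range(256):
        j = (j + S[i] + key[i % len(key)]) % 256
        S[i], S[j] = S[j], S[i]
    i = j = 0
    ks = []
    for _ in range(n):
        i = (i + 1) % 256
        j = (j + S[i]) % 256
        S[i], S[j] = S[j], S[i]
        ks.append(S[(S[i] + S[j]) % 256])
    return ks

def encrypt(pixels, seed, key):
    n = len(pixels)
    perm = sorted(range(n), key=lfsr_sequence(seed, n).__getitem__)  # permutation
    ks = rc4_keystream(key, n)
    return [pixels[p] ^ k for p, k in zip(perm, ks)]                 # substitution

def decrypt(cipher, seed, key):
    n = len(cipher)
    perm = sorted(range(n), key=lfsr_sequence(seed, n).__getitem__)
    shuffled = [c ^ k for c, k in zip(cipher, rc4_keystream(key, n))]
    plain = [0] * n
    for dst, src in enumerate(perm):
        plain[src] = shuffled[dst]       # invert the permutation
    return plain
```

Decryption simply runs the two steps in reverse order, which is why the scheme is lossless.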
An Improved Reversible Data Hiding Technique Based on Histogram Bin Shifting

In this paper, we propose a novel reversible data hiding scheme which can exactly recover the original cover image after extraction of the watermark. The proposed scheme is based on the histogram bin shifting technique. It utilizes the peak point (maximum point) but, unlike reported algorithms based on histogram bin shifting, uses the second peak point instead of the zero point (minimum point) to minimize the distortion created by the shifting of pixels. The proposed scheme has low computational complexity. It has been successfully applied to various standard test images, and experimental results along with a comparison with an existing scheme show its effectiveness. Higher Peak Signal to Noise Ratio (PSNR) values indicate that the proposed scheme gives better results than existing reversible watermarking schemes.

Smita Agrawal, Manoj Kumar
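The baseline peak/zero-point histogram shifting that the abstract above improves upon can be sketched as below. The paper's contribution, shifting only up to the second peak instead of the zero point, shortens the shifted range; this sketch shows the classic variant it is measured against, and assumes the zero point lies above the peak.

```python
def embed(pixels, bits):
    """Classic histogram-bin-shifting embedding (peak/zero variant).
    Returns (marked pixels, peak, zero); capacity = count of peak pixels."""
    hist = [0] * 256
    for p in pixels:
        hist[p] += 1
    peak = max(range(256), key=lambda v: hist[v])
    zero = next(v for v in range(peak + 1, 256) if hist[v] == 0)
    out, it = [], iter(bits)
    for p in pixels:
        if peak < p < zero:
            out.append(p + 1)            # shift bins between peak and zero
        elif p == peak:
            out.append(p + next(it, 0))  # embed one bit in the peak bin
        else:
            out.append(p)
    return out, peak, zero

def extract(pixels, peak, zero):
    """Recover the bits and restore the original pixels exactly."""
    bits, out = [], []
    for p in pixels:
        if p == peak:
            bits.append(0); out.append(p)
        elif p == peak + 1:
            bits.append(1); out.append(peak)
        elif peak + 1 < p <= zero:
            out.append(p - 1)            # undo the shift
        else:
            out.append(p)
    return out, bits
```

Every shifted pixel changes by at most one grey level, which is the source of the high PSNR values these schemes report.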
A Novel Approach Towards Detection and Identification of Stages of Breast Cancer

A robust and efficient CAD system to detect and classify breast cancer at its early stages is an essential requirement of radiologists. This paper proposes a system that detects, classifies and also recognizes the stage of the detected tumor, which helps radiologists reduce false positive predictions. An MRM image is preprocessed using histogram equalization and a dynamic thresholding approach. Segmentation of the preprocessed image is carried out using a novel hybrid approach, a hybridization of PSO and K-Means clustering. Fourteen textural features are extracted from the segmented region in order to classify the tumor using an Artificial Neural Network. If the tumor is classified as malignant, its stage is identified using size as a key parameter.

M. Varalatchoumy, M. Ravishankar
Analysis of Local Descriptors for Human Face Recognition

Facial image analysis is an important and profound research area in the field of computer vision. The prime issue in face recognition is to develop robust descriptors that discriminate facial features. In recent years, the local binary pattern (LBP) has attracted considerable attention from biometric researchers for facial image analysis, due to the robustness it has shown on challenging databases. This paper presents a novel method for facial image representation using the local binary pattern, called the augmented local binary pattern (A-LBP), which works on the consolidation of the principle of locality of uniform and non-uniform patterns. It replaces the non-uniform patterns with the mode value of the neighbouring uniform patterns and extracts pertinent information from the local descriptors. The experimental results prove the efficacy of the proposed method over LBP on publicly available face databases, such as AT&T-ORL, extended Yale B and Yale A.

Radhey Shyam, Yogendra Narain Singh
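The basic LBP code and the uniform/non-uniform distinction that A-LBP builds on can be sketched as follows; the consolidation step of A-LBP itself is not reproduced here, only the primitives it operates on.

```python
def lbp_code(img, y, x):
    """Basic 8-neighbour LBP code for the pixel at (y, x): each neighbour
    is thresholded against the centre, read clockwise from the top-left."""
    c = img[y][x]
    offs = [(-1, -1), (-1, 0), (-1, 1), (0, 1),
            (1, 1), (1, 0), (1, -1), (0, -1)]
    code = 0
    for bit, (dy, dx) in enumerate(offs):
        if img[y + dy][x + dx] >= c:
            code |= 1 << (7 - bit)
    return code

def is_uniform(code):
    """A pattern is 'uniform' if its circular bit string has at most two
    0/1 transitions -- the patterns A-LBP keeps and consolidates around."""
    bits = [(code >> i) & 1 for i in range(8)]
    return sum(bits[i] != bits[(i + 1) % 8] for i in range(8)) <= 2
```

Of the 256 possible 8-bit codes, only 58 are uniform, which is why grouping the rest (as A-LBP does) compacts the descriptor.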
Implementation and Comparative Study of Image Fusion Methods in Frequency Domain

Image fusion combines complementary multi-focus and/or multi-modal data from two or more different images into one new image. The main objective is to decrease vagueness and minimize redundancy in the output while enhancing relevant task-specific information. Medical images coming from different sources may often give different data, so it is a challenging task to merge two or more medical images; the fused images are very useful in medical diagnosis. In this paper, image fusion is performed in the discrete wavelet transform (DWT) and Contourlet transform (CNT) domains. As fusion rules, spatial techniques such as averaging, maximum selection and PCA are used. Experiments are performed on CT and MRI medical images, and a set of standard performance measures is used for evaluation and comparative analysis of the methods. The results show that the Contourlet method performs well in medical image fusion, because it provides parabolic scaling and vanishing moments.

Keyur N. Brahmbhatt, Ramji M. Makwana
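The transform-domain fusion pipeline in the abstract above (decompose, apply a fusion rule per sub-band, invert) can be sketched in one dimension with a single-level Haar transform. The paper works with 2-D DWT and Contourlet decompositions; the 1-D Haar version below is a deliberate simplification showing only the maximum-selection rule on detail coefficients.

```python
def haar_1d(signal):
    """One-level 1-D Haar DWT as average/difference pairs (even length)."""
    approx = [(signal[i] + signal[i + 1]) / 2 for i in range(0, len(signal), 2)]
    detail = [(signal[i] - signal[i + 1]) / 2 for i in range(0, len(signal), 2)]
    return approx, detail

def ihaar_1d(approx, detail):
    """Exact inverse of haar_1d."""
    out = []
    for a, d in zip(approx, detail):
        out += [a + d, a - d]
    return out

def fuse_max_rule(sig_a, sig_b):
    """Transform-domain fusion: average the approximation coefficients,
    keep the detail coefficient of larger magnitude (maximum selection)."""
    aa, da = haar_1d(sig_a)
    ab, db = haar_1d(sig_b)
    approx = [(x + y) / 2 for x, y in zip(aa, ab)]
    detail = [x if abs(x) >= abs(y) else y for x, y in zip(da, db)]
    return ihaar_1d(approx, detail)
```

Averaging the approximations preserves overall intensity while the max rule keeps the sharper edges from whichever source image has them.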
Iris Recognition Using Textural Edgeness Features

A method for feature extraction from an iris image based on the concept of textural edgeness is presented in this paper. Here, for authentication purposes, we have used two textural edgeness features, namely (1) a modified version of Gray Level Auto Correlation (GLAC) and (2) Scale Invariant Feature Transform (SIFT) descriptors over dense grids in the image domain. Extensive experimental results using the MMU1 and IITD iris databases demonstrate the effectiveness of the proposed system.

Saiyed Umer, Bibhas Chandra Dhara, Bhabatosh Chanda
Z-Transform Based Digital Image Watermarking Scheme with DWT and Chaos

Digital Image Watermarking has recently been used widely to address issues concerning authentication and copyright protection. Chaos is one of the promising techniques implemented in image watermarking schemes. In this paper, a new watermarking algorithm based on chaos along with discrete wavelet transform and z-transformation is proposed. The host image is decomposed into 3-level Discrete Wavelet Transform (DWT). The HH3 and HL3 sub-bands are then converted to z-domain using z-transformation (ZT). Arnold Cat Map (ACM), a chaotic map is applied to the watermark image and it is divided into two equal parts. The sub-bands HH3 and HL3 are chosen for embedding the watermark image. One part of the watermark is embedded in the z-transformed HH3 sub-band, and the other part is embedded in the z-transformed HL3 sub-band. The watermarked image is obtained by taking the inverse of the ZT and the inverse of DWT. The experimental results and the performance analysis show that the proposed method is efficient and can provide practical invisibility with additive robustness.

N. Jayashree, R. S. Bhuvaneswaran
Panoramic Image Mosaicing: An Optimized Graph-Cut Approach

Panoramic images have numerous important applications in the area of computer vision, video coding/enhancement, virtual reality and surveillance. The process of building panorama using mosaicing technique is a challenging task as it requires consideration of various factors such as camera motion, sensor noise and illumination difference, that deteriorate the quality of the mosaic. This paper proposes a feature based mosaicing technique for creation of high quality panoramic image. The algorithm mitigates the problem of seam by aligning the images employing Scale Invariant Feature Transform (SIFT) features and blending the overlapping region using optimized graph-cut. The results show the efficacy of the proposed algorithm.

Achala Pandey, Umesh C. Pati
Spatial Resolution Assessment in Low Dose Imaging

Computed tomography has been reported as a most beneficial modality for effective diagnosis, planning, treatment and follow-up of clinical cases. However, there is a potential risk of cancer among recipients who undergo repeated computed tomography screening. This is mainly because the immunity of any living tissue can naturally repair radiation damage only up to a certain level; beyond it, the repair effort can lead to cancerous cells. Consequently, most computed tomography developers have equipped the modality with radiation dose management features, working on the principle of "as low as reasonably achievable". This article addresses the issue of low dose imaging and focuses on enhancing the spatial resolution of images acquired at low dose to improve image quality for acceptability, and proposes a system model and mathematical formulation of Highly Constrained Back Projection.

Akshata Navalli, Shrinivas Desai
A Probabilistic Patch Based Hybrid Framework for CT/PET Image Reconstruction

Statistical image reconstruction for computed tomography and positron emission tomography (CT/PET) plays a significant role in image quality by using spatial regularization that penalizes image intensity differences between neighboring pixels. The most commonly used quadratic membrane (QM) prior, which smooths both high-frequency noise and edge details, tends to produce an unfavourable result, while edge-preserving non-quadratic priors tend to produce blocky piecewise regions. Moreover, these edge-preserving priors mostly depend on local smoothness or edges; they do not consider the basic fine-structure information of the desired image, such as gray levels, edge indicators, dominant direction and frequency. To address these issues of conventional regularizations/priors, this paper introduces and evaluates a hybrid approach to the regularized ordered subset expectation maximization (OSEM) iterative reconstruction technique, an accelerated version of EM, with Poisson variability. Regularization is achieved by penalizing OSEM with a probabilistic patch-based regularization (PPB) filter to form a hybrid method (OSEM+PPB) for CT/PET image reconstruction that uses neighborhood patches instead of individual pixels in computing the non-quadratic penalty. The aim is to provide an effective edge-preserving and noise-removing framework that optimizes the quality of CT/PET reconstructed images. A comparative analysis of the proposed model with other existing standard methods is presented both qualitatively and quantitatively using a simulated test phantom and a standard digital image. Experimental results indicate that the proposed method yields significant improvements in the quality of images reconstructed from projection data, justifying its applicability.
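The EM update that OSEM accelerates can be made concrete. Below is a minimal, unregularized MLEM sketch on a toy system matrix; it is not the paper's PPB-penalized OSEM, and the matrix and counts are purely illustrative:

```python
def mlem(A, y, iterations=100):
    """Maximum-likelihood EM reconstruction for emission tomography:
    x_j <- x_j / sum_i A_ij * sum_i A_ij * y_i / (A x)_i.
    A: list of detector rows, y: measured counts, x: image estimate."""
    m, n = len(A), len(A[0])
    x = [1.0] * n                                           # flat positive start
    sens = [sum(A[i][j] for i in range(m)) for j in range(n)]  # sensitivity
    for _ in range(iterations):
        # forward-project the current estimate
        proj = [sum(A[i][j] * x[j] for j in range(n)) for i in range(m)]
        # back-project the measured/estimated count ratios
        back = [sum(A[i][j] * y[i] / proj[i] for i in range(m)) for j in range(n)]
        x = [x[j] * back[j] / sens[j] for j in range(n)]
    return x
```

OSEM applies the same multiplicative update over ordered subsets of the rows of `A`, which is where its acceleration comes from; the PPB penalty would enter as an extra factor in the update.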

Shailendra Tiwari, Rajeev Srivastava, Arvind Kumar Tiwari
Low Cost Eyelid Occlusion Removal to Enhance Iris Recognition

Iris recognition systems are claimed to perform with very high accuracy in given constrained acquisition scenarios. However, partial occlusion due to the presence of the eyelids can hinder the functioning of such systems. State-of-the-art algorithms assume that the inner and outer iris boundaries are circular and thus do not take into account the occlusion posed by the eyelids. In this paper, a novel low-cost approach for detecting and removing eyelids from the annular iris is proposed. The proposed scheme employs an edge detector to identify strong edges, and subsequently retains only the horizontal ones. A 2-means clustering technique separates the upper and lower eyelid edges by maximizing class separation. Once the two classes of edges are formed, one contributing to the upper eyelid and the other to the lower, a quadratic curve is fitted to each set of edge points. The area above the quadratic curve indicating the upper eyelid, and below that indicating the lower eyelid, can be suppressed, so that only non-occluded iris data are fed to the subsequent biometric system. The proposed localization method is tested on the publicly available BATH and CASIAv3 iris databases, and has been found to yield a very low mislocalization rate.
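The 2-means step that splits the eyelid edges into upper and lower sets can be sketched on the edge pixels' row coordinates. This is a minimal deterministic version; initializing the two centers at the extreme values is our assumption, not necessarily the paper's:

```python
def two_means_1d(values, iterations=50):
    """Cluster scalar values (e.g. row coordinates of horizontal edge
    pixels) into two groups by alternating assignment and mean update."""
    c0, c1 = min(values), max(values)   # deterministic initialisation
    for _ in range(iterations):
        a = [v for v in values if abs(v - c0) <= abs(v - c1)]
        b = [v for v in values if abs(v - c0) > abs(v - c1)]
        if not a or not b:
            break
        c0, c1 = sum(a) / len(a), sum(b) / len(b)
    return sorted(a), sorted(b)
```

Each returned group would then be handed to a quadratic least-squares fit to trace one eyelid boundary.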

Beeren Sahu, Soubhagya S. Barpanda, Sambit Bakshi

Next Generation Optical Systems

Frontmatter
On Improving Static Routing and Wavelength Assignment in WDM All-Optical Mesh Networks

The way Routing and Wavelength Assignment (RWA) of connection requests is performed in optical WDM networks can appreciably affect resource consumption. The blocking probability for lightpath requests increases significantly with resource consumption, so RWA should be performed in a way that minimizes the consumption of network resources. RWA in all-optical networks is an NP-complete problem. This paper proposes six new heuristic algorithms for static RWA in all-optical mesh networks that are not only efficient in terms of resource conservation but can also solve the RWA problem effectively in polynomial time. Comparisons show that the proposed algorithms perform better than some earlier well-known strategies.
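As a point of reference for what a static RWA heuristic must do, here is the classic first-fit wavelength-assignment baseline under the wavelength-continuity constraint. It is not one of the paper's six heuristics, and the data structures are illustrative:

```python
def first_fit_rwa(links_on_path, usage, num_wavelengths):
    """Assign the lowest-index wavelength free on every link of the path
    (wavelength-continuity constraint); return None if blocked.
    usage: dict mapping link -> set of wavelengths already in use."""
    for w in range(num_wavelengths):
        if all(w not in usage.get(link, set()) for link in links_on_path):
            for link in links_on_path:
                usage.setdefault(link, set()).add(w)
            return w
    return None   # lightpath request is blocked
```

Once every wavelength is occupied on some link of the path, the request is blocked, which is exactly the event whose probability a good RWA heuristic tries to minimize.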

Abhishek Bandyopadhyay, Debdutto Chakraborty, Uma Bhattacharya, Monish Chatterjee
A Randomized N-Policy Queueing Method to Prolong Lifetime of Wireless Sensor Networks

The increasing interest in Wireless Sensor Networks (WSN) can be understood to be a result of their wide range of applications. However, due to uneven depletion of their batteries, the nodes often face premature failure. In this paper, we explore a queue-based method for the improvement of WSNs. We propose an N-threshold lifetime improvement method which considers the probability variation for the various states of a WSN: sleep, idle, start-up, and busy. The experimental results validate our theoretical analysis and prove that our approach is indeed an efficient method for improving the lifetime of Wireless Sensor Networks.
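The N-policy idea, letting packets accumulate to a threshold before the radio wakes, can be illustrated with a toy slotted simulation. The slot model and per-slot service rate below are our assumptions, not the paper's analytical queueing model:

```python
def count_startups(arrivals, n_threshold, service_per_slot=2):
    """Toy slotted simulation of an N-policy sensor node: the radio
    sleeps until n_threshold packets are queued, then wakes (one
    start-up), serves until the queue empties, and sleeps again.
    Returns (number of start-ups, packets served)."""
    queue, awake, startups, served = 0, False, 0, 0
    for a in arrivals:
        queue += a
        if not awake and queue >= n_threshold:
            awake = True
            startups += 1
        if awake:
            s = min(queue, service_per_slot)
            queue -= s
            served += s
            if queue == 0:
                awake = False
    return startups, served
```

A larger threshold serves the same traffic with far fewer energy-hungry start-up transitions, at the cost of extra queueing delay, which is the trade-off an N-policy tunes.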

Maneesha Nidhi, Veena Goswami
Overview on Location Management in PCS Network: A Survey

Location management is an important and key issue in any wireless communication network: to deliver services to a mobile user, the network needs to keep track of that user constantly. Two methods for location management exist, location update (LU) and paging. Both incur significant costs in terms of bandwidth, battery power, memory space, computing time, etc. More location updates reduce the cost of paging, and vice versa. Since the LU cost is much higher than the paging cost, this paper focuses on location management using LU only. A thorough study of different existing LU procedures in Personal Communication Service (PCS) networks and the related research work is presented in this paper.

Abantika Choudhury, Abhijit Sharma, Uma Bhattacharya
Self-Organized Node Placement for Area Coverage in Pervasive Computing Networks

In pervasive computing environments, it is often required to cover a certain service area by a given deployment of nodes or access points. In the case of large inaccessible areas, the node deployment is often random. In this paper, given a random uniform node distribution over a 2-D region, we propose a simple distributed solution for self-organized node placement to satisfy coverage of the given region of interest using the least number of active nodes. We assume that the nodes are identical and each of them covers a circular area. To ensure coverage we tessellate the area with regular hexagons, and attempt to place a node at each vertex and the center of each hexagon, termed target points. By the proposed distributed algorithm, unique nodes are selected to fill up the target points mutually exclusively with limited displacement. Analysis and simulation studies show that the proposed algorithm, with less neighborhood information and simpler computation, solves the coverage problem using the minimum number of active nodes, and with minimum displacement in 95 % of cases. Also, the process terminates in a constant number of rounds.
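A sketch of how such target points can be generated: the vertices and centers of a regular-hexagon tessellation together form a triangular lattice, and spacing that lattice at sqrt(3)·r guarantees every point of the plane lies within sensing radius r of some target. The bounding-box handling below is our assumption for the sketch:

```python
import math

def coverage_targets(width, height, r):
    """Triangular-lattice target points such that every point of the
    width x height region lies within sensing radius r of some target
    (spacing sqrt(3)*r gives the classic optimal disk-covering lattice)."""
    d = math.sqrt(3) * r          # spacing between neighbouring targets
    row_h = 1.5 * r               # vertical spacing between lattice rows
    targets, y, row = [], 0.0, 0
    while y <= height + row_h:    # overshoot so the border stays covered
        offset = d / 2 if row % 2 else 0.0
        x = offset
        while x <= width + d:
            targets.append((x, y))
            x += d
        y += row_h
        row += 1
    return targets
```

The distributed algorithm's job is then to move nearby sensor nodes onto these targets with as little displacement as possible.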

Dibakar Saha, Nabanita Das
SNR Enhancement of Brillouin Distributed Strain Sensor Using Optimized Receiver

This paper presents an improvement in the signal-to-noise ratio (SNR) of a long-range Brillouin distributed strain sensor (BDSS). The differential evolution (DE) algorithm is used for receiver (avalanche photodiode (APD)) optimization. We have extracted the strain information of the proposed sensor using a Fourier deconvolution algorithm and the Landau Placzek ratio (LPR). The SNR of the proposed system is realized using an Indium Gallium Arsenide (InGaAs) APD detector over a 50 km sensing range. We have achieved about 30 dB improvement in SNR using the optimized receiver compared to the non-optimized receiver at 25 km of sensing distance for a launched power of 10 mW. The strain resolution is observed as 1670 με at a sensing distance of 50 km. Simulation results show that the proposed strain sensor is a potential candidate for accurate measurement of strain in sharp strain variation environments.

Himansu Shekhar Pradhan, P. K. Sahu
Comparative Performance Analysis of Different Segmentation Dropping Schemes Under JET Based Optical Burst Switching (OBS) Paradigm

Optical burst switching (OBS) is a most promising technique to exploit the huge bandwidth offered by the optical fiber. In an OBS network, the sender does not wait for an acknowledgment of path reservation and uses only one-way reservation for data transmission. Despite being very promising, this one-way reservation policy may create contention during data transmission. Contention in the network leads to loss of packets, and as a result the efficiency of the optical network deteriorates. To achieve optimum performance of an OBS network, it is essential to employ proper contention resolution techniques. This article investigates the contention resolution capability of segmentation-based dropping schemes for JET-based OBS networks. The performance of three different segmentation burst dropping policies, head dropping, tail dropping and modified tail dropping, is discussed and compared for the above-mentioned networks. Both finite and infinite input cases are considered.

Manoj Kumar Dutta
Free-Space Optical Communication Channel Modeling

Free-space optical (FSO) systems have proved to be among the best technologies for communication and surveillance systems, and are merged with other technologies to provide robust applications. In FSO systems, when laser light is transmitted through the atmospheric channel, severe absorption is observed, especially because of fog, snow and heavy rain. Therefore, we discuss here different free-space optical channel modelling techniques for mitigating these effects.

G. Eswara Rao, Hara Prasana Jena, Aditya Shaswat Mishra, Bijayananda Patnaik
Wavelet Based RTL-SDR Real Time Signal Denoising in GNU Radio

Noise removal is an efficacious step in processing any kind of data. The proposed model deals with the removal of noise from aperiodic and piecewise constant signals using the wavelet transform, realized on the GNU Radio platform. We also deal with the replacement of the Universal Software Radio Peripheral with the RTL-SDR for a low-cost radio frequency receiver system without any compromise in efficiency. The wavelet transform analyzes the noise level separately at each wavelet scale in the time-scale domain, and the denoising algorithm is adapted especially for aperiodic and piecewise constant signals. GNU Radio Companion serves well in the analysis and synthesis of real-time signals.
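The wavelet-thresholding scheme the abstract outlines can be sketched with a one-level Haar transform and soft thresholding. This pure-Python sketch uses a single decomposition level and a fixed threshold as illustrative assumptions:

```python
def haar_denoise(signal, threshold):
    """One-level Haar wavelet soft-threshold denoising for an
    even-length signal: transform, shrink the detail coefficients
    toward zero by `threshold`, then inverse-transform."""
    s2 = 2 ** 0.5
    half = len(signal) // 2
    approx = [(signal[2 * i] + signal[2 * i + 1]) / s2 for i in range(half)]
    detail = [(signal[2 * i] - signal[2 * i + 1]) / s2 for i in range(half)]
    # soft threshold: shrink magnitude, keep sign
    shrunk = [max(abs(d) - threshold, 0.0) * (1 if d >= 0 else -1) for d in detail]
    out = []
    for a, d in zip(approx, shrunk):
        out += [(a + d) / s2, (a - d) / s2]   # inverse Haar step
    return out
```

Piecewise constant signals suit the Haar basis well: within flat regions the detail coefficients carry only noise, so shrinking them removes noise without blurring the jumps.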

U. Reshma, H. B. Barathi Ganesh, J. Jyothi, R. Gandhiraj, K. P. Soman

Emerging Techniques in Computing

Frontmatter
A New Way for Combining Filter Feature Selection Methods

This study investigates the issue of obtaining a stable ranking from the fusion of the results of multiple filtering methods. Rank aggregation is the process of performing multiple runs of feature selection and then aggregating the results into a final ranked list. A fundamental question, however, is how to aggregate the individual results into a single robust ranked feature list. There are a number of available methods, ranging from simple to complex. Hence we present a new rank aggregation approach composed of two stages: in the first, we evaluate the similarity and stability of the single filtering methods; in the second, we aggregate the results of the stable ones. The results obtained on the Australian and German credit datasets using support vector machines and decision trees confirm that ensemble feature ranking has a major impact on performance improvement.
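One simple, widely used way to aggregate several filter rankings is Borda counting; it is shown here only as an illustration of the aggregation stage, not as the paper's own combination rule:

```python
def borda_aggregate(rankings):
    """Combine several ranked feature lists (best first) into one:
    each list awards n-1 points to its top feature, n-2 to the next,
    and so on; features are re-ranked by total score."""
    scores = {}
    n = len(rankings[0])
    for ranking in rankings:
        for pos, feat in enumerate(ranking):
            scores[feat] = scores.get(feat, 0) + (n - 1 - pos)
    return sorted(scores, key=lambda f: -scores[f])
```

In the two-stage scheme above, only the rankings judged stable in the first stage would be passed into such an aggregator.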

Waad Bouaguel, Mohamed Limam
Design and Implementation of Interactive Speech Recognizing English Dictionary

Computer synthesis of natural speech is one of the major objectives for researchers as well as linguists, and the technology furnishes plenty of functional applications for human-computer interaction. This paper describes the procedure and logic for the implementation of a software application that uses speech-to-text conversion and text-to-speech synthesis. Basically, it is a software application which recognizes the word spoken by the user through the microphone and delivers the corresponding meaning through the computer speaker. We measured the performance of the application with different main memory sizes, processor clock speeds, and L1 and L2 cache line sizes. Further, this paper identifies the factors under which the application shows the highest efficiency and the suitable environment for it. Finally, the paper describes the major uses of the application and future work.

Dipayan Dev, Pradipta Banerjee
On Careful Selection of Initial Centers for K-means Algorithm

The K-means clustering algorithm is rich in literature, and its success stems from simplicity and computational efficiency. The key limitation of K-means is that its convergence depends on the initial partition; improper selection of initial centroids may lead to poor results. This paper proposes a method known as Deterministic Initialization using Constrained Recursive Bi-partitioning (DICRB) for the careful selection of initial centers. First, a set of probable centers is identified using recursive binary partitioning. Then, the initial centers for the K-means algorithm are determined by applying graph clustering to the probable centers. Experimental results demonstrate the efficacy and deterministic nature of the proposed method.
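DICRB itself is not reproduced here, but the value of deterministic seeding can be illustrated with a simpler, classic alternative, farthest-point initialization:

```python
def farthest_point_init(points, k):
    """Deterministic K-means seeding: start from the point closest to
    the data mean, then repeatedly add the point farthest from all
    centers chosen so far."""
    d = len(points[0])
    mean = [sum(p[i] for p in points) / len(points) for i in range(d)]

    def dist2(p, q):
        return sum((a - b) ** 2 for a, b in zip(p, q))

    centers = [min(points, key=lambda p: dist2(p, mean))]
    while len(centers) < k:
        centers.append(max(points, key=lambda p: min(dist2(p, c) for c in centers)))
    return centers
```

Like DICRB, this yields the same seeds on every run, so K-means becomes reproducible and avoids the worst random initializations.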

R. Jothi, Sraban Kumar Mohanty, Aparajita Ojha
Rule Based Machine Translation: English to Malayalam: A Survey

In this modern era, it is a necessity that people communicate with other parts of the world without any language barriers, and a machine translation system can be a very useful tool in this context. The main purpose of a machine translation system is to convert text in one natural language into a different natural language without losing its meaning. The translation of text can be done in several ways, mainly rule based and corpus based, each of which uses various approaches to obtain a correct translation. In this paper, we survey the various machine translation approaches with a core focus on English to Malayalam translation using the rule based approach.

V.C Aasha, Amal Ganesh
Time Series Forecasting Through a Dynamic Weighted Ensemble Approach

Time series forecasting has crucial significance in almost every practical domain. Over the past few decades, there has been ever-increasing research interest in fruitfully combining forecasts from multiple models. The existing combination methods are mostly based on time-invariant combining weights. This paper proposes a dynamic ensemble approach that updates the weights after each new forecast; the weight of each component model is changed on the basis of its past and current forecasting performance. Empirical analysis with real time series shows that the proposed method substantially improves forecasting accuracy. In addition, it outperforms each component model as well as various existing static weighted ensemble schemes.
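A minimal sketch of time-varying combination weights, assuming (as one plausible rule, not necessarily the paper's) that each component model is weighted by the inverse of its accumulated absolute error:

```python
def dynamic_weights(errors, eps=1e-12):
    """Recompute combining weights after each forecast round: a model's
    weight is proportional to the inverse of its accumulated absolute
    error, so the weights adapt as performance changes over time."""
    inv = [1.0 / (e + eps) for e in errors]
    total = sum(inv)
    return [w / total for w in inv]

def ensemble_forecast(model_forecasts, errors):
    """Weighted combination of the component models' current forecasts."""
    w = dynamic_weights(errors)
    return sum(wi * f for wi, f in zip(w, model_forecasts))
```

After each new observation arrives, the error totals are updated and the weights recomputed, which is what distinguishes this scheme from a static weighted ensemble.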

Ratnadip Adhikari, Ghanshyam Verma
A Novel Graph Clustering Algorithm Based on Structural Attribute Neighborhood Similarity (SANS)

Graph clustering techniques are widely used in detecting densely connected subgraphs in a graph network. Traditional algorithms focus only on topological structure and mostly ignore heterogeneous vertex properties. In this paper we propose a novel graph clustering algorithm, the Structural Attribute Neighbourhood Similarity (SANS) algorithm, which provides an efficient trade-off between topological and attribute similarities. First, the algorithm partitions the graph based on structural similarity; second, the degree of contribution of the vertex attributes to the vertices in each partition is evaluated and clustered. Extensive experimental results prove the effectiveness of SANS compared with other conventional algorithms.

M. Parimala, Daphne Lopez
Change-Point Detection in Enterprise Attack Surface for Network Hardening

Applications of change-point detection typically originate from the perspective of enterprise network security and network monitoring. With the ever-increasing size and complexity of enterprise networks and application portfolios, the network attack surface keeps changing. A change in the attack surface is detected by identifying an increase or decrease in the number of vulnerabilities at the network level. Vulnerabilities, when exploited successfully, either provide an entry point for an adversary into the enterprise network or can be used as a milestone for staging multi-stage attacks. In this paper, we propose an approach for change-point detection in an enterprise network attack surface: a sequence of static attack graphs is generated for the dynamic (time-varying) enterprise network, and successive graphs in the sequence are compared for dissimilarity to detect change points. We present a small case study to demonstrate the efficacy and applicability of the proposed approach in capturing a change in the network attack surface. Initial results show that our approach captures newly introduced vulnerabilities in the network and differentiates these vulnerabilities for efficient network hardening.

Ghanshyam S. Bopche, Babu M. Mehtre
Efficient Quality of Multimedia Experience Using Perceptual Quality Metrics

Very few objective metrics correlate well with the Human Visual System (HVS). We extensively studied the Video Quality Assessment (VQA) of 2D videos for enhancing the Quality of Experience (QoE) through better Quality of Service (QoS). We propose a solution that achieves a high correlation between the HVS and objective metrics using Perceptual Quality Metrics (PQM). The motive behind this work is to introduce an objective metric adequate to predict the Mean Opinion Score (MOS) of distorted video sequences based on the Full Reference (FR) method.

Rahul Gaurav, Hasan F. Ates
Unsupervised Machine Learning Approach for Gene Expression Microarray Data Using Soft Computing Technique

Machine learning is a burgeoning technology used for the extraction of knowledge from an ocean of data. It has robust bindings with optimization and artificial intelligence that deliver theory, methodologies and application domains to the fields of statistics and computer science. Machine learning tasks are broadly classified into two groups, namely supervised learning and unsupervised learning. The analysis of unsupervised data requires thorough computational activity using different clustering algorithms. Microarray gene expression data are taken into consideration for separating regulating genes from non-regulating genes by clustering. In our work, an optimization technique (Cat Swarm Optimization) is used to minimize the number of clusters by evaluating the Euclidean distance among the centroids. A comparative study is carried out by clustering the regulating genes before and after optimization. Principal component analysis (PCA) is incorporated for dimensionality reduction of the vast dataset to ensure qualitative cluster analysis.

Madhurima Rana, Prachi Vijayeeta, Utsav Kar, Madhabananda Das, B. S. P. Mishra
Phase Correlation Based Algorithm Using Fast Fourier Transform for Fingerprint Mosaicing

Fingerprint identification is a challenging task in criminal investigation due to the small area of interest (ridges and valleys) in the fingerprint. In criminal incidents, the obtained fingerprints are often partial, having a small area of interest. Therefore, it is required to combine such partial fingerprints into a complete one that can be compared with a stored fingerprint database for identification. The conventional phase correlation method is simple and fast, but the algorithm only works when the overlapping region is in the leftmost top corner of one of the two input images, which is not always the case for partial fingerprints obtained in forensic science. In total, there are six different possible positions of the overlapping region in the mosaiced fingerprint. The proposed algorithm solves the problem using mirror-image transformations of the inputs and gives correct results for all possible positions of the overlapping region.

Satish H. Bhati, Umesh C. Pati
An LSB Substitution with Bit Inversion Steganography Method

Several works have been proposed and implemented for image steganography using LSB substitution. In this paper, a module-based LSB substitution method is implemented and further improved using a novel bit inversion technique. The substitution method hides the message data after compressing smoother areas of the image in a lossless way, resulting in a smaller number of modified cover image pixels. After this, a bit inversion technique is applied, in which certain LSBs of the cover image pixels are inverted if they occur with a particular pattern. In this way, fewer pixels are modified and the PSNR of the stego-image is improved. For correct de-steganography, the bit patterns for which the LSBs have been inverted need to be stored within the stego-image.
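The baseline LSB substitution that the bit-inversion step then improves can be sketched as follows. The flat pixel list and one-bit-per-pixel layout are simplifying assumptions; the paper's compression and pattern-based inversion bookkeeping are omitted:

```python
def lsb_embed(pixels, bits):
    """Plain LSB substitution: write one message bit into the least
    significant bit of each cover pixel. Each pixel changes by at
    most 1, which is why the distortion is nearly invisible."""
    return [(p & ~1) | b for p, b in zip(pixels, bits)] + pixels[len(bits):]

def lsb_extract(pixels, n):
    """Read back the first n embedded bits from the stego pixels."""
    return [p & 1 for p in pixels[:n]]
```

The bit-inversion refinement would then flip the LSBs of pixels matching selected patterns whenever doing so reduces the total number of modified pixels, storing the chosen patterns for the extractor.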

Nadeem Akhtar
Image Quality Assessment with Structural Similarity Using Wavelet Families at Various Decompositions

The wavelet transform is one of the most active areas of research in image processing. This paper analyzes well-known objective image quality metrics, namely structural similarity (SSIM), MSE and PSNR, which measure the visual quality between two images, and presents a joint scheme of the wavelet transform with structural similarity for evaluating image quality automatically. In the first part of the algorithm, each distorted as well as original image is decomposed into three levels; in the second part, these coefficients are used to calculate the structural similarity index, MSE and PSNR. The predictive performance of image quality based on wavelet families such as db5, haar (db1) and coif1 with one, two and three levels of decomposition is evaluated. The performance analysis includes correlation measurements, namely the Pearson, Kendall and Spearman correlations between the objective evaluations and the subjective one.
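The SSIM index applied to the (sub-band) coefficients can be sketched with the standard single-window formula. This sketch uses global statistics over flat lists and the usual 8-bit constants; the per-level wavelet decomposition is omitted:

```python
def ssim(x, y, c1=6.5025, c2=58.5225):
    """Global SSIM between two equal-size grayscale images given as
    flat lists. c1 = (0.01*255)**2 and c2 = (0.03*255)**2 are the
    standard stabilizing constants for 8-bit data."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    vx = sum((a - mx) ** 2 for a in x) / n
    vy = sum((b - my) ** 2 for b in y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y)) / n
    return ((2 * mx * my + c1) * (2 * cov + c2)) / \
           ((mx * mx + my * my + c1) * (vx + vy + c2))
```

Unlike MSE and PSNR, which penalize raw intensity differences, this formula compares luminance, contrast and structure, which is why it tracks subjective scores more closely.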

Jayesh Deorao Ruikar, A. K. Sinha, Saurabh Chaudhury

Applications of Informatics

Frontmatter
Modelling the Gap Acceptance Behavior of Drivers of Two-Wheelers at Unsignalized Intersection in Case of Heterogeneous Traffic Using ANFIS

The gap acceptance concept is an important theory in the estimation of capacity and delay at unsignalized junctions. Most analyses have been carried out in developed countries where traffic is uniform, and priority rules as well as lane disciplines are willingly followed. However, in India, priority rules are less honored, which consequently creates more conflicts at intersections. Modeling such behavior is complex, as it is influenced by various traffic features and vehicles' as well as drivers' characteristics. Fuzzy models are broadly accepted for investigating such circumstances. This article describes the use of ANFIS to model the crossing behavior of through-movement vehicles at a four-legged uncontrolled median-separated intersection, located in a semi-urban region of Ahmedabad in the state of Gujarat. A video recording method was implemented, and five video cameras were employed concurrently to collect the various movements and motorists' as well as vehicles' characteristics. An ANFIS model has been developed to estimate the probabilities of acceptance and rejection by drivers of two-wheelers for a particular gap or lag size. Seven input parameters and one output parameter, i.e. the decision of the drivers, are considered. Eleven diverse combinations of variables are employed to construct eleven different models and to observe the impact of various attributes on the prediction accuracy of each model. 70 % of the observations are used to prepare the models and the remaining 30 % are used for validating them. The forecasting capability of the models has been compared with the observed data set and displays a good ability to replicate the observed behavior; the prediction accuracy of the ANFIS models ranges roughly between 77 and 90 %. The models introduced in this study can be implemented in the dynamic evaluation of drivers' crossing behavior.

Harsh Jigish Amin, Akhilesh Kumar Maurya
Optimizing the Objective Measure of Speech Quality in Monaural Speech Separation

Monaural speech separation based on computational auditory scene analysis (CASA) is a challenging problem in the field of signal processing. The Ideal Binary Mask (IBM) proposed by DeLiang Wang and colleagues is considered the benchmark in CASA. However, it introduces objectionable distortions called musical noise, and the perceived speech quality is very poor at low SNR conditions. The main reason for the degradation of speech quality is binary masking, in which some part of the speech is discarded during synthesis. In order to address the musical noise problem of the IBM and improve speech quality, this work proposes a new soft mask as the goal of CASA. The performance of the proposed soft mask is evaluated using the perceptual evaluation of speech quality (PESQ) measure. The IEEE speech corpus and NOISEX92 noises are used for the experiments. The experimental results indicate the superior performance of the proposed soft mask compared to the traditional IBM in the context of monaural speech separation.
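The contrast between the hard IBM decision and a soft mask can be sketched directly on per-unit SNR values. The sigmoid form and slope below are illustrative choices, not the paper's proposed mask:

```python
import math

def ideal_binary_mask(snr_db, lc=0.0):
    """IBM: keep a time-frequency unit iff its local SNR exceeds
    the local criterion lc (in dB); discard it otherwise."""
    return [1.0 if s > lc else 0.0 for s in snr_db]

def soft_mask(snr_db, slope=0.5):
    """Sigmoid soft mask: a smooth 0..1 gain instead of the hard
    0/1 decision. Avoiding abrupt on/off transitions between
    neighbouring units is what suppresses musical noise."""
    return [1.0 / (1.0 + math.exp(-slope * s)) for s in snr_db]
```

Units near the SNR criterion, the ones the IBM flips unpredictably between 0 and 1, receive intermediate gains under the soft mask, so the resynthesized speech loses the isolated spectral islands heard as musical noise.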

M. Dharmalingam, M. C. John Wiselin, R. Rajavel
Automated Segmentation Scheme Based on Probabilistic Method and Active Contour Model for Breast Cancer Detection

Mammography is one of the renowned techniques for the detection of breast cancer in the medical domain. The detection rate and accuracy of breast cancer in mammograms depend on the accuracy of image segmentation and the quality of the mammogram images. Most existing mammogram detection techniques suffer from inexact continuous boundary detection and poor estimation of the affected area. We propose an algorithm for the detection of deformities in mammographic images that uses a Gaussian probabilistic approach with maximum likelihood estimation (MLE), statistical measures for the classification of image regions, post-processing by morphological operations, and Freeman chain codes for contour detection. For the detected areas of abnormality, compactness is evaluated on the segmented mammographic images. The validation of the proposed method is established using mammogram images from different databases. The experimental results support the claim of superiority over other usual methods.
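The Freeman chain-code step used for contour detection can be sketched for an 8-connected contour; the direction numbering follows the common east-then-anticlockwise convention in Cartesian coordinates:

```python
# 8-connected Freeman directions: 0 = east, then anticlockwise.
DIRS = [(1, 0), (1, -1), (0, -1), (-1, -1), (-1, 0), (-1, 1), (0, 1), (1, 1)]

def freeman_chain_code(contour):
    """Encode a closed pixel contour (list of (x, y) points, each one
    8-connected step from the next) as a list of direction codes."""
    codes = []
    for (x0, y0), (x1, y1) in zip(contour, contour[1:] + contour[:1]):
        codes.append(DIRS.index((x1 - x0, y1 - y0)))
    return codes
```

The chain code gives a compact boundary representation from which perimeter, and hence the compactness measure mentioned above, can be computed directly.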

Biswajit Biswas, Ritamshirsa Choudhuri, Kashi Nath Dey
Disease Detection and Identification Using Sequence Data and Information Retrieval Methods

Current clinical methods base disease detection and identification heavily on the description of symptoms by the patient. This leads to inaccuracy because of the errors that may arise in the quantification of symptoms, and it does not give a complete picture of the presence of any particular disease. The prediction of cellular diseases is even more challenging, as there is no measure of their exact quantity, quality and extent, and the typical symptoms become visible only at a later stage, allowing the disease to progress silently. This paper provides an efficient and novel way of detecting and identifying pancreatitis and breast cancer using a combination of sequence data and information retrieval algorithms to provide the most accurate result. The developed system maintains a knowledge base of the mutations causing breast cancer and pancreatitis, and uses protein sequence scoring and information retrieval techniques to provide the best match of a patient's protein sequence with the stored mutations. The system has been tested with mutations available online and gives 98 % accurate results.

Sankranti Joshi, Pai M. Radhika, Pai M. M. Manohara
Advance Teaching–Learning Based Optimization for Global Function Optimization

Teaching–Learning Based Optimization (TLBO) is a powerful evolutionary algorithm for searching the space of optimal solutions, inspired by the teaching-learning phenomenon of a classroom. It is a novel population-based algorithm with fast convergence speed and no algorithm-specific parameters. The present work proposes an improved version of TLBO called Advance Teaching–Learning Based Optimization (ATLBO), which introduces a new weight parameter for higher accuracy and a faster convergence rate. The effectiveness of the method is compared against the original TLBO on many benchmark problems with different characteristics, and the results show the improvement in performance of ATLBO over traditional TLBO.
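A sketch of the TLBO teacher phase with an extra weight parameter, for minimization. Where exactly ATLBO's weight enters the update is our assumption; with w = 1 this reduces to the classic teacher phase:

```python
import random

def atlbo_teacher_phase(population, fitness, w=0.8, seed=0):
    """One teacher phase with a weight parameter w (illustrating the
    ATLBO idea): X_new = w*X + r*(teacher - TF*mean). A learner keeps
    the move only if it improves fitness (minimisation)."""
    rng = random.Random(seed)
    dim = len(population[0])
    teacher = min(population, key=fitness)            # best learner teaches
    mean = [sum(x[i] for x in population) / len(population) for i in range(dim)]
    new_pop = []
    for x in population:
        tf = rng.choice((1, 2))                       # teaching factor
        r = rng.random()
        cand = [w * x[i] + r * (teacher[i] - tf * mean[i]) for i in range(dim)]
        new_pop.append(cand if fitness(cand) < fitness(x) else x)
    return new_pop
```

Because a move is accepted only when it improves fitness, the population's best value can never worsen across phases, and a weight below 1 pulls learners toward the origin of the search step, which is one way a weight can speed convergence.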

Anand Verma, Shikha Agrawal, Jitendra Agrawal, Sanjeev Sharma
Analysis of Online Product Purchase and Predicting Items for Co-purchase

In recent years, online marketplaces have become popular among buyers. Over time, they have not only sustained the business model but also generated a large amount of profit, turning it into a lucrative business model. In this paper, we analyze a temporal dataset from one of the most successful online businesses to study the nature of the buying patterns of its users. Arguably, the most important purchase characteristic of such networks is a follow-up purchase by a buyer, otherwise known as a co-purchase. We also analyze the co-purchase patterns to build a knowledge base for recommending potential co-purchase items for every item.
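Building such a co-purchase knowledge base reduces, in its simplest form, to counting item pairs across purchase baskets. The basket format and the plain pair-count ranking below are assumptions for the sketch, not the paper's method:

```python
from collections import Counter
from itertools import combinations

def copurchase_recommender(baskets):
    """Count how often each pair of items is bought together; the
    returned function recommends, for an item, the partners that
    co-occur with it most often."""
    pair_counts = Counter()
    for basket in baskets:
        for a, b in combinations(sorted(set(basket)), 2):
            pair_counts[(a, b)] += 1

    def recommend(item, k=3):
        partners = Counter()
        for (a, b), c in pair_counts.items():
            if a == item:
                partners[b] = c
            elif b == item:
                partners[a] = c
        return [i for i, _ in partners.most_common(k)]

    return recommend
```

Real recommenders would normalize these counts (e.g. by item popularity) and exploit the temporal order of purchases, but the pairwise co-occurrence table is the core data structure.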

Sohom Ghosh, Angan Mitra, Partha Basuchowdhuri, Sanjoy Kumar Saha
Fingerprint Template Protection Using Multiple Spiral Curves

In this paper we propose a method for generating a cancelable fingerprint template using spiral curves. Contiguous right-angled triangles are constructed from the invariant distances between a reference minutia and every other minutia in the fingerprint image; the features are then projected onto a 4D space and transformed using the DFT. The proposed approach is evaluated on the FVC database and attains the primary requirements of a biometric system: diversity, revocability and security. Performance is measured using the GAR, FAR and EER metrics.

Munaga V. N. K. Prasad, Jaipal Reddy Anugu, C. R. Rao
Rhythm and Timbre Analysis for Carnatic Music Processing

In this work, an effort has been made to analyze rhythm- and timbre-related features to identify raga and tala from a piece of Carnatic music. Raga and tala classification is performed using both rhythm and timbre features: rhythm patterns and the rhythm histogram serve as rhythm features, while zero crossing rate (ZCR), spectral centroid, spectral roll-off, flux and entropy serve as timbre features. The music clips contain both instrumental and vocal passages. The T-test is used as a similarity measure between feature vectors, and classification is done using Gaussian Mixture Models (GMM). The results show that the rhythm patterns are able to distinguish different ragas and talas with average accuracies of 89.98 % and 86.67 % respectively.
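Two of the timbre features named above, ZCR and spectral centroid, are standard short-time measures; a minimal per-frame sketch:

```python
import numpy as np

def zero_crossing_rate(frame):
    """Fraction of adjacent sample pairs whose signs differ."""
    signs = np.sign(frame)
    signs[signs == 0] = 1            # treat exact zeros as positive
    return float(np.mean(signs[:-1] != signs[1:]))

def spectral_centroid(frame, sample_rate):
    """Magnitude-weighted mean frequency of the frame's spectrum (Hz)."""
    mag = np.abs(np.fft.rfft(frame))
    freqs = np.fft.rfftfreq(len(frame), d=1.0 / sample_rate)
    return float(np.sum(freqs * mag) / (np.sum(mag) + 1e-12))
```

For a pure tone, the ZCR is roughly twice the tone frequency divided by the sampling rate, and the centroid sits at the tone frequency.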

Rushiraj Heshi, S. M. Suma, Shashidhar G. Koolagudi, Smriti Bhandari, K. S. Rao
Repetition Detection in Stuttered Speech

This paper focuses on the detection of repetitions in stuttered speech. The stuttered speech signal is divided into isolated units based on energy. Mel-frequency cepstral coefficients (MFCCs), formants and shimmer are used as features for repetition recognition and are extracted from each isolated unit. Using Dynamic Time Warping (DTW), the features of each isolated unit are compared with those of subsequent units within a one-second interval of speech. A threshold is set based on the analysis of the DTW scores; if a score falls below this threshold, the corresponding units are identified as repeated events. The speech data used in this work, twenty-seven seconds in length, contains 50 repetition events. The results show that the combination of MFCCs, formants and shimmer can be used to recognize repetitions in stuttered speech: out of 50 repetitions, 47 are correctly identified.
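The DTW comparison between isolated units can be sketched as follows. This is a textbook implementation, not the authors' code; in the paper's setting the frame features would be MFCC vectors augmented with formants and shimmer.

```python
import numpy as np

def dtw_distance(seq_a, seq_b):
    """Dynamic Time Warping distance between two feature sequences
    (arrays of shape n_frames x n_features), with Euclidean local cost."""
    n, m = len(seq_a), len(seq_b)
    acc = np.full((n + 1, m + 1), np.inf)
    acc[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = np.linalg.norm(seq_a[i - 1] - seq_b[j - 1])
            acc[i, j] = cost + min(acc[i - 1, j],      # insertion
                                   acc[i, j - 1],      # deletion
                                   acc[i - 1, j - 1])  # match
    return float(acc[n, m])

def is_repetition(unit_a, unit_b, threshold):
    """Flag two isolated units as a repetition if their DTW score is low."""
    return dtw_distance(unit_a, unit_b) < threshold
```

Because DTW warps the time axis, a unit repeated at a slightly different speaking rate still yields a low score.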

Pravin B. Ramteke, Shashidhar G. Koolagudi, Fathima Afroz
Removal of Artifacts from Electrocardiogram Using Efficient Dead Zone Leaky LMS Adaptive Algorithm

The ability to extract high-resolution, valid ECG signals from contaminated recordings is an important subject in biotelemetry systems. During ECG acquisition, several artefacts strongly degrade signal quality. The dominant artefacts encountered in the ECG signal are power line interference (PLI), muscle artefacts, baseline wander, electrode motion (EM) artefacts and channel noise generated during transmission; these noises mask the tiny features of the ECG signal. An adaptive filter is used to track random variations in the noisy signal. In this paper, we propose the Dead Zone Leaky Least Mean Square (LMS), Leaky Least Mean Fourth and Median Leaky LMS algorithms to remove PLI and EM artefacts from ECG signals, and derive sign-based variants of these algorithms for lower computational complexity. Compared with the standard LMS algorithm, the proposed algorithms show better behaviour with respect to the weight drift problem, round-off error and steady-state error. The simulation results show that the Dead Zone Leaky LMS algorithm gives a good correlation factor and SNR.
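A plain-Python sketch of a dead zone leaky LMS noise canceller follows. The particular dead-zone nonlinearity and leakage placement here are common textbook choices, assumed rather than taken from the paper: leakage shrinks the weights each step (curbing weight drift), and adaptation is frozen whenever the error magnitude falls inside the dead zone (reducing reaction to round-off noise).

```python
import numpy as np

def dead_zone_leaky_lms(ref_noise, noisy_ecg, order=8, mu=0.05,
                        leak=1e-4, delta=1e-4):
    """Adaptive noise canceller.

    ref_noise : reference input correlated with the artefact
    noisy_ecg : desired signal (ECG plus artefact)
    Returns the error signal (cleaned ECG estimate) and final weights.
    """
    w = np.zeros(order)
    e = np.zeros(len(noisy_ecg))
    for n in range(order, len(noisy_ecg)):
        u = ref_noise[n - order:n][::-1]        # regressor, most recent first
        y = w @ u                               # estimated artefact
        e[n] = noisy_ecg[n] - y                 # error = cleaned sample
        g = e[n] if abs(e[n]) > delta else 0.0  # dead-zone nonlinearity
        w = (1.0 - mu * leak) * w + mu * g * u  # leaky LMS update
    return e, w
```

The sign-based variants the abstract mentions would replace `g * u` with cheaper quantities such as `np.sign(g) * u` or `g * np.sign(u)`.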

T. Gowri, P. Rajesh Kumar, D. V. R. Koti Reddy, Ur Zia Rahman
Detecting Tampered Cheque Images Using Difference Expansion Based Watermarking with Intelligent Pairing of Pixels

Image-based cheque clearing systems enable faster clearing of cheques in the banking system, but the paying bank must receive an unaltered image of a genuine cheque so that its decision does not support a case of fraud. The method proposed in this paper successfully detects tampered cheque images. It is based on a fragile and reversible watermarking technique, namely difference expansion based watermarking. A major problem with such reversible watermarking techniques is that pixel values in the watermarked image often fall outside the dynamic range of the image. This paper demonstrates how intelligent pairing of pixels can solve this problem; the revised difference expansion based watermarking (based on intelligent pixel pairing) thus successfully detects tampered cheque images.
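The core of difference expansion watermarking embeds one bit into a pixel pair by doubling their difference while preserving their average; a minimal sketch (overflow handling, which the paper's intelligent pairing addresses, is deliberately omitted here):

```python
def de_embed(p, q, bit):
    """Embed one bit into pixel pair (p, q) by expanding their difference."""
    avg, diff = (p + q) // 2, p - q
    diff2 = 2 * diff + bit                  # expanded difference carries the bit
    return avg + (diff2 + 1) // 2, avg - diff2 // 2

def de_extract(p2, q2):
    """Recover the original pair and the embedded bit (fully reversible)."""
    avg, diff2 = (p2 + q2) // 2, p2 - q2
    bit, diff = diff2 & 1, diff2 >> 1
    return avg + (diff + 1) // 2, avg - diff // 2, bit
```

Because the watermarked difference is twice the original, pairs whose values lie near the ends of the 0–255 range can overflow after embedding; pairing pixels so that this never happens is exactly the problem the paper's intelligent pairing solves.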

Mahesh Tejawat, Rajarshi Pal
Printed Odia Digit Recognition Using Finite Automaton

Odia digit recognition (ODR) is one of the intriguing research areas in the field of optical character recognition. This communication is an attempt to recognize printed Odia digits using their structural information as features and a finite automaton with output as the recognizer. A sample data set, named the Odia digit database (ODDB), is created for Odia digits. Each image is passed through several precompiled standard modules such as binarization, noise removal, segmentation and skeletonization, and the resulting image is normalized to a size of 32 × 32. Chain coding is applied to the skeletonised image to retrieve the number of end points, T-joints, X-joints and loops. It is observed that the finite automaton is able to classify the digits with a good accuracy rate, except for a few visually similar digits, which are distinguished using a correlation function. For our experiment we have considered some poor-quality, degraded printed documents. The simulation results record 96.08 % overall recognition accuracy.
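Counting end points and joints on a digit skeleton reduces to inspecting each skeleton pixel's 8-neighbourhood. This simplified sketch is an assumption for illustration; the paper derives these counts from chain codes rather than raw neighbourhood scans.

```python
import numpy as np

def count_ends_and_joints(skeleton):
    """skeleton: binary 2D array (1 = skeleton pixel).
    An end point has exactly one 8-neighbour on the skeleton;
    a junction (T- or X-joint) has three or more."""
    padded = np.pad(skeleton.astype(int), 1)
    ends = joints = 0
    for y, x in zip(*np.nonzero(padded)):
        neighbours = padded[y - 1:y + 2, x - 1:x + 2].sum() - 1
        if neighbours == 1:
            ends += 1
        elif neighbours >= 3:
            joints += 1
    return ends, joints
```

A straight stroke yields two end points and no joints, while a "+" shape yields four end points and at least one junction, which is the kind of structural signature the finite automaton consumes.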

Ramesh Kumar Mohapatra, Banshidhar Majhi, Sanjay Kumar Jena
Backmatter
Metadata
Title
Proceedings of 3rd International Conference on Advanced Computing, Networking and Informatics
Editors
Atulya Nagar
Durga Prasad Mohapatra
Nabendu Chaki
Copyright Year
2016
Publisher
Springer India
Electronic ISBN
978-81-322-2538-6
Print ISBN
978-81-322-2537-9
DOI
https://doi.org/10.1007/978-81-322-2538-6
