
About this Book

This volume contains 73 papers presented at CSI 2014 (Emerging ICT for Bridging the Future), the 49th Annual Convention of the Computer Society of India, held during 12-14 December 2014 at Hyderabad, Telangana, India. The papers mainly focus on Fuzzy Systems, Image Processing, Software Engineering, Cyber Security and Digital Forensics, E-Commerce, Big Data, Cloud Computing and ICT applications.



A Hybrid Approach for Image Edge Detection Using Neural Network and Particle Swarm Optimization

An edge in an image is a sudden change in intensity, and edge detection is the process of finding these edges. It is an image preprocessing technique that significantly reduces the amount of data and eliminates useless information while preserving the important structural properties of an image. Many traditional algorithms are used to detect edges, among them the Sobel, Prewitt, Canny and Roberts operators. The hybrid approach for image edge detection using neural networks and particle swarm optimization is a novel algorithm for finding the edges of an image. The neural network is trained by back propagation with particle swarm optimization as the weight-updating function. Sixteen visual patterns of four-bit length are used to train the network, and the optimized weights generated by training are then used in the testing process to obtain the edges of an image.

D. Lakshumu Naidu, Ch. Seshadri Rao, Sureshchandra Satapathy
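The PSO-as-weight-update idea above can be sketched minimally. The sixteen four-bit patterns follow the paper, but the edge labels, network size (a single sigmoid unit) and PSO constants below are illustrative assumptions, not the authors' settings:

```python
import numpy as np

rng = np.random.default_rng(0)

# Sixteen 4-bit training patterns (as in the paper); the edge/non-edge
# labels here are a hypothetical stand-in: a pattern is labelled "edge"
# if any two adjacent bits differ.
patterns = np.array([[(i >> b) & 1 for b in range(4)] for i in range(16)], float)
labels = (np.abs(np.diff(patterns, axis=1)).sum(axis=1) > 0).astype(float)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def fitness(w):
    # w = 4 weights + 1 bias for a single sigmoid unit; fitness is MSE
    out = sigmoid(patterns @ w[:4] + w[4])
    return np.mean((out - labels) ** 2)

# Standard PSO update: v <- inertia*v + c1*r1*(pbest - x) + c2*r2*(gbest - x)
n_particles, dim = 20, 5
x = rng.uniform(-1, 1, (n_particles, dim))
v = np.zeros((n_particles, dim))
pbest, pbest_f = x.copy(), np.array([fitness(p) for p in x])
gbest = pbest[pbest_f.argmin()].copy()

for _ in range(200):
    r1, r2 = rng.random((n_particles, dim)), rng.random((n_particles, dim))
    v = 0.7 * v + 1.5 * r1 * (pbest - x) + 1.5 * r2 * (gbest - x)
    x = x + v
    f = np.array([fitness(p) for p in x])
    improved = f < pbest_f
    pbest[improved], pbest_f[improved] = x[improved], f[improved]
    gbest = pbest[pbest_f.argmin()].copy()

print(round(fitness(gbest), 4))
```

The swarm's global best weight vector plays the role that the back-propagated weight update plays in a conventionally trained network.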

Studying Gene Ontological Significance of Differentially Expressed Genes in Human Pancreatic Stellate Cell

In this paper, we study and analyze, by means of gene ontology, the significant ontologies in which the differentially expressed genes (DEG) of the human pancreatic stellate cell participate. We identified up-regulated and down-regulated differentially expressed genes between dose-response and time-course gene expression data after retinoic acid treatment of human pancreatic stellate cells. We first perform a statistical t-test, calculate the false discovery rate (FDR), then compute the quantile value of the test and find the minimum FDR. Setting the p-value cutoff at 0.02 as the threshold, we obtain 213 up-regulated (increased in expression) and 99 down-regulated (decreased in expression) genes and analyze the significant GO terms.

Bandana Barman, Anirban Mukhopadhyay
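The first statistical step described above can be illustrated as follows. The toy expression matrix, group sizes, and the simplified Benjamini-Hochberg-style FDR estimate are assumptions; the paper does not specify its exact FDR procedure:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)

# Toy expression matrix: 100 genes x (5 control + 5 treated) samples.
# The first 10 genes get a shifted mean so they behave as "up-regulated".
n_genes = 100
control = rng.normal(0.0, 1.0, (n_genes, 5))
treated = rng.normal(0.0, 1.0, (n_genes, 5))
treated[:10] += 3.0

# Per-gene two-sample t-test, as in the paper's first step.
t, p = stats.ttest_ind(treated, control, axis=1)

# Simplified Benjamini-Hochberg-style FDR estimate per gene.
order = np.argsort(p)
ranks = np.empty(n_genes)
ranks[order] = np.arange(1, n_genes + 1)
fdr = np.minimum(1.0, p * n_genes / ranks)

# Threshold on the p-value, mirroring the paper's 0.02 cutoff,
# splitting by the sign of the t statistic (up vs. down regulation).
up = np.where((p < 0.02) & (t > 0))[0]
down = np.where((p < 0.02) & (t < 0))[0]
print(len(up), len(down))
```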

Software Effort Estimation through a Generalized Regression Neural Network

Management of large software projects requires estimating software development effort, yet the software industry struggles to provide proper estimates of effort, time and development cost. Though many estimation models exist for effort prediction, a novel model is required to obtain highly accurate estimations. This paper proposes a Generalized Regression Neural Network to obtain improved software effort estimation on the COCOMO dataset. The Mean Magnitude of Relative Error (MMRE) and Median Magnitude of Relative Error (MdMRE) are used as the evaluation criteria. The proposed Generalized Regression Neural Network is compared with various techniques such as M5, linear regression, and SMO with polynomial and RBF kernels.

Parasana Sankara Rao, Reddi Kiran Kumar
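The two evaluation criteria, MMRE and MdMRE, are straightforward to compute: the magnitude of relative error |actual - predicted| / actual per project, aggregated by mean and median. The effort values below are illustrative:

```python
import statistics

# Hypothetical actual vs. predicted effort values (person-months).
actual =    [120.0, 60.0, 300.0, 45.0]
predicted = [100.0, 75.0, 240.0, 50.0]

# Magnitude of relative error per project.
mre = [abs(a - p) / a for a, p in zip(actual, predicted)]
mmre = statistics.mean(mre)    # Mean Magnitude of Relative Error
mdmre = statistics.median(mre) # Median Magnitude of Relative Error
print(round(mmre, 4), round(mdmre, 4))
```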

Alleviating the Effect of Security Vulnerabilities in VANETs through Proximity Sensors

As the rate of road accidents increases day by day, an intelligent mechanism is essential to improve road safety. As a solution, current research focuses on sensors, which are cost effective and have led to tremendous improvement in Vehicular Ad hoc Networks (VANETs). A VANET is a subset of the Mobile Ad hoc Network (MANET); VANETs exchange information between vehicles and the Road Side Unit (RSU) to make intelligent decisions spontaneously. Accidents on roads not only put victims' lives at risk but also inconvenience the public through traffic jams or diversions. Because VANET nodes are mobile, GSM communication can fail when messages overlap due to interference. Beyond GSM communication, equipping vehicles and typical road junctions with sensors provides a cost-effective solution for reliable communication. In this paper, we propose a proximity sensor approach in VANETs to capture data, transmit it, and store it in a local database for future reference if required. The proximity sensors are mainly located at typical junctions and also in secure cars for an immediate response. The work is optimized using the Ant Colony metaheuristic optimization algorithm to trace the shortest path and thus handle the inconsistent situations that arise when accidents occur.

R. V. S. Lalitha, G. JayaSuma

A GIS Anchored System for Clustering Discrete Data Points – A Connected Graph Based Approach

Clustering is considered one of the most important unsupervised learning problems: it groups a set of data objects such that objects belonging to the same group (known as a cluster) are very similar to each other and dissimilar to objects in other clusters. Clustering has a wide variety of real-world applications. In data mining, it identifies groups of related records, serving as the basis for exploring more detailed relationships. In text mining it is heavily used for categorization of texts. In marketing management, it helps to group customers of similar behavior. The technique is also heavily used in GIS: in city planning it helps to identify groups of vacant lands, houses or other resources based on their type, value, location and so on, and it helps greatly in identifying dangerous zones based on earthquake epicenters. In this paper, a set of data objects is clustered using two connected-graph-based techniques: MST-based clustering and tree-based clustering. After considering many test cases, the second technique is found to be more suitable for clustering than the first.

Anirban Chakraborty, J. K. Mandal
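A minimal sketch of the MST-based variant: build the minimum spanning tree of the point set and delete the k-1 longest edges, leaving k connected components as clusters. The points and k below are illustrative, and the exact construction in the paper may differ:

```python
import numpy as np
from scipy.sparse.csgraph import minimum_spanning_tree, connected_components
from scipy.spatial.distance import cdist

# Three well-separated groups of 2-D points standing in for map locations.
pts = np.array([[0, 0], [0, 1], [1, 0],
                [10, 10], [10, 11], [11, 10],
                [20, 0], [20, 1], [21, 0]], float)

# Build the complete distance graph, take its MST, then delete the
# k-1 longest MST edges to obtain k clusters (classic MST clustering).
k = 3
dist = cdist(pts, pts)
mst = minimum_spanning_tree(dist).toarray()
edges = np.argwhere(mst > 0)
weights = mst[mst > 0]
keep = np.argsort(weights)[: len(weights) - (k - 1)]  # drop k-1 heaviest

adj = np.zeros_like(mst)
for i in keep:
    r, c = edges[i]
    adj[r, c] = 1
n_comp, cluster_labels = connected_components(adj, directed=False)
print(n_comp, cluster_labels)
```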

Comparative Analysis of Tree Parity Machine and Double Hidden Layer Perceptron Based Session Key Exchange in Wireless Communication

In this paper, a detailed analysis of Tree Parity Machine (TPM) and Double Hidden Layer Perceptron (DHLP) based session key exchange techniques is presented in terms of synchronization time, space complexity, variability of learning rules, Gantt chart, total number of threads and security. The TPM uses a single hidden layer in its architecture and participates in mutual learning to produce the tuned weights serving as a session key. The DHLP uses two hidden layers instead of one; the extra layer enhances the security of the key exchange protocol. Comparisons of the results of both techniques are presented along with a detailed analysis.

Arindam Sarkar, J. K. Mandal

Comparative Analysis of Compression Techniques and Feature Extraction for Implementing Medical Image Privacy Using Searchable Encryption

The secure preservation of biomedical image data is a primary concern in today's technology-enabled medical world. New advancements in technology push us to outsource our digital data to a third-party server and retrieve it as and when needed. In this regard, efficient storage and transmission of large medical data sets becomes an important concern. In this paper we study different compression techniques as a significant data preparation step for implementing searchable encryption for medical data privacy preservation. We also show texture-based feature extraction for enabling privacy-preserving query search. Simulation results are obtained using different modalities of CT and MRI images, with a performance comparison of the wavelet and contourlet transforms in peak signal-to-noise ratio for different compression ratios.

J. Hyma, P. V. G. D. Prasad Reddy, A. Damodaram

Offline Detection of P300 in BCI Speller Systems

The paper presents a framework for offline analysis of the P300 speller system using seeded k-means based ensemble SVM. Because small datasets are used for classifier training, performance deteriorates. The proposed framework emphasizes a semi-supervised clustering approach for training the SVM classifier with a large amount of data. Normalized mutual information (NMI) is used for cluster validation, giving a maximum of 88 clusters on a 10-fold cross-validation dataset with NMI approximately equal to 1. The framework is applied to EEG data acquired from two subjects and provided by the Wadsworth Center for brain-computer interface (BCI) competition III. The experimental results show an increase in SNR value and better accuracy than linear, polynomial or RBF kernel SVMs.

Mandeep Kaur, A. K. Soni, M. Qasim Rafiq

Nanorobotics Control Systems Design – A New Paradigm for Healthcare System

Nanorobotics is currently emerging as an attractive area of scientific research bridging biological and computational science along with mechanical science at the molecular level. We present a new approach to control machines at the nanometer or molecular scale (10⁻⁹ meter) from the perspective of the theory of cybernetics, the science of control, communication and computation, with integration into complex man-machine systems. The problem under study concentrates on nano-robot control systems design, including system causality, state notation and automata. We also describe the theory of nano-scale thermodynamically driven self-assembly for collaborative information processing in bio-nanorobotic systems. A fuzzy shape-based approach is described in the context of recognizing a single malignant cell, along with its stage, as a target for medical treatment. The synthesis and imaging of magnetic nanoparticles that can functionally bind with medicine and reach the affected regions for targeted drug delivery, such as in cancer treatment, is also presented.

Sankar Karan, Bhadrani Banerjee, Anvita Tripathi, Dwijesh Dutta Majumder

A Comparative Study on Subspace Methods for Face Recognition under Varying Facial Expressions

Face recognition is one of the most widely used research topics in biometrics and has been rigorously studied. Recognizing faces under varying facial expressions is still a very challenging task, because real-time expressions on a person's face cause a wide range of difficulties for recognition systems. Moreover, facial expression is a form of nonverbal communication: it reveals the sensation or passion of a person and can be used to reveal someone's mental views and psychosomatic aspects. Subspace analysis methods are among the most vital techniques for finding the basis vectors that optimally cluster the projected data according to their class labels; a subspace is a subset of a larger space that retains the properties of the larger space. The key contribution of this article is that we develop and analyze two state-of-the-art subspace approaches for recognizing faces under varying facial expressions using a common set of training and test images. This evaluation gives the exact face recognition rates of the two systems under varying facial expressions, and the exhaustive analysis should be a great asset for researchers working worldwide on this problem. The training and test images are taken from the standard public face databases ATT and JAFFE.

G. P. Hegde, M. Seetha

Recognizing Faces When Images Are Corrupted by Varying Degree of Noises and Blurring Effects

Most images are corrupted by various noises and blurring effects, and recognizing human faces in their presence is a challenging task. Appearance-based techniques are usually preferred to recognize faces under different degrees of noise. The two state-of-the-art techniques considered in our paper are Locality Preserving Projections (LPP) and Hybrid Spatial Feature Interdependence Matrix (HSFIM) based face descriptors. To investigate the performance of LPP and HSFIM, we simulate real-world scenarios by adding noise (Gaussian noise, salt-and-pepper noise) and blurring effects (motion blur and Gaussian blur) on six standard public face databases: IITK, ATT, JAFFE, CALTECH, GRIMACE, and SHEFFIELD.

Steven Lawrence Fernandes, Josemin G Bala

Functional Analysis of Mental Stress Based on Physiological Data of GSR Sensor

Stress plays a vital role in everyday life. It is a mental state accompanied by physiological changes, so monitoring these significant changes is important and can help to identify anxiety at an early stage, before it becomes serious. Various methods using various sensors have been adopted to detect stress; the GSR sensor is one of them, detecting stress at a particular time in different positions and moods. In this paper three positions (lying, sitting and standing) are considered together with three moods of human life: normal, tension, and physical exercise. It is observed that the GSR values, in terms of physiological data, vary consistently with the surface area in contact with the body, and maximum GSR values are observed during tension moods.

Rmesh Sahoo, Srinivas Sethi

Low Power Affordable and Efficient Face Detection in the Presence of Various Noises and Blurring Effects on a Single-Board Computer

Face detection remains a burning topic for researchers in areas like digital media, intelligent user interfaces, intelligent visual surveillance and interactive games. Various noises and blurring effects corrupt face images captured in real time. This paper introduces a single-board-computer-based face detection system that works well in the presence of Gaussian noise, salt-and-pepper noise, motion blur and Gaussian blur. A Raspberry Pi single-board computer is used for the experiments because it consumes less power and is available at an affordable price. The developed system is tested by introducing varying degrees of noise and blurring effects on the standard public face databases GRIMACE, JAFFE, INDIAN FACE, CALTECH, FACE 95, FEI-1 and FEI-2, and also, in the absence of noise and blurring effects, on GRIMACE, JAFFE, INDIAN FACE, CALTECH, FACE 95, FEI-1, FEI-2, HEAD POSE IMAGE, SUBJECT, and FGNET. The key advantage of the proposed system is excellent face detection rates in the presence of noise, blurring effects, varying facial expressions and age progression. Python scripts are developed for the system; results are shared on request.

Steven Lawrence Fernandes, Josemin G Bala

The Computational Analysis of Protein – Ligand Docking with Diverse Genetic Algorithm Parameters

The binding energy is the significant factor that elucidates the efficiency of docking between protein and ligand or protein and protein. To perform the docking process, a genetic algorithm with its standard parameters is very often used, furnishing the docking conformations, binding energies, interactions and so on. In this work, the parameters of the genetic algorithm are varied for the docking process, and we observe enhancements in binding energy, number of interactions and related measures, which play a substantial role in drug design.

S. V. G. Reddy, K. Thammi Reddy, V. Valli Kumari

An Auto Exposure Algorithm Using Mean Value Based on Secant Method

Auto exposure is a fundamental camera feature that ensures scenes are properly exposed. This paper proposes an automatic exposure algorithm based on the secant method, a numerical root-finding technique. The correct exposure values are determined using a center-weighted average metering technique, in which the center of the scene is mainly considered, and depend on the shutter speed and the gain. Within a particular range of the mean, the scene is said to be properly exposed. The algorithm is implemented using a Point Grey Research programmable camera.

V. Krishna Sameera, B. Ravi Kiran, K. V. S. V. N. Raju, B. Chakradhar Rao
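The core idea, driving the metered frame mean toward a target with secant iterations, can be sketched as below. The linear sensor response and the target mean of 128 are illustrative assumptions; a real camera would supply measured frame means:

```python
# A minimal sketch of mean-based exposure control with the secant method.
TARGET_MEAN = 128.0

def frame_mean(exposure):
    # Hypothetical sensor response: mean brightness grows linearly with
    # exposure and saturates at 255 (stand-in for a real metered frame).
    return min(255.0, 40.0 * exposure)

def error(exposure):
    return frame_mean(exposure) - TARGET_MEAN

def secant_exposure(x0, x1, tol=0.5, max_iter=20):
    # Secant iteration on the exposure error: each step fits a line
    # through the last two (exposure, error) points and jumps to its root.
    f0, f1 = error(x0), error(x1)
    for _ in range(max_iter):
        if abs(f1) < tol:
            break
        x0, x1 = x1, x1 - f1 * (x1 - x0) / (f1 - f0)
        f0, f1 = f1, error(x1)
    return x1

exp_val = secant_exposure(0.5, 6.0)
print(round(exp_val, 3), round(frame_mean(exp_val), 1))
```

Because the modelled response is linear, the iteration converges in a single secant step; a real camera response would take a few more.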

Image Authentication in Frequency Domain through G-Let Dihedral Group (IAFD-D3)

In this paper a G-Let based authentication technique is proposed to authenticate digital documents by embedding a secret image/message inside the cover image in the frequency domain. The cover image is transformed into the G-Let domain to generate 2n G-Lets, of which selected G-Let(s) are embedded with the secret message for authentication or copyright protection. The special feature of IAFD-D3 is the use of the dihedral group with n equal to three: six G-Lets in total are generated, of which only a single G-Let is used as a locker to lock the secret. Experimental results are computed and compared with existing authentication techniques such as GASMT, Li's method, STMDF and Region-Based, on the basis of Mean Square Error (MSE), Peak Signal-to-Noise Ratio (PSNR), Image Fidelity (IF), Universal Quality Image (UQI) and Structural Similarity Index Measurement (SSIM); IAFD-D3 shows better performance in terms of low computational complexity and better image fidelity.

Madhumita Sengupta, J. K. Mandal, Sakil Ahamed Khan
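The group-theoretic fact behind the "2n G-Lets" count can be checked directly: the dihedral group D3 has six elements, three rotations and three reflections. The sketch below builds them as 2x2 matrices and verifies group closure (this illustrates the group itself, not the G-Let transform):

```python
import numpy as np

def rot(theta):
    # Rotation by angle theta.
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c, -s], [s, c]])

def refl(theta):
    # Reflection across the line at angle theta/2.
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c, s], [s, -c]])

angles = [0, 2 * np.pi / 3, 4 * np.pi / 3]
elements = [rot(a) for a in angles] + [refl(a) for a in angles]
print(len(elements))  # 2n = 6 elements for n = 3

# Closure: the product of any two elements is again in the group.
def in_group(m):
    return any(np.allclose(m, e) for e in elements)

closed = all(in_group(a @ b) for a in elements for b in elements)
print(closed)
```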

Information Tracking from Research Papers Using Classification Techniques

Research area identification is a challenging problem: with millions of research papers available, papers must be classified by primary and secondary areas. Existing text mining techniques classify research documents in a static manner, so a framework is needed that can classify them dynamically. This paper describes a framework that classifies the research area of documents as they arrive in the repository. The proposed framework consists of two phases: the first constructs a word list for each research area, and the second continuously updates the word lists from the new stream of research documents. Experimental results compared with existing techniques are reported and satisfy the minimum requirement.

J. S. V. Sai Hari Priyanka, J. Sharmila Rani, K. S. Deepthi, T. Kranthi

Sentiment Analysis on Twitter Streaming Data

Twitter, an online social networking service, is designed to reveal what is happening at any moment in time, anywhere in the globe, and it generates data streams at a rapid pace. In the Twitter network all messages form a data stream and exhibit the highly dynamic behaviour of the actors in the network. Twitter offers an enormous collection of APIs that actors can use without registering. In this work, Twitter information streams are processed, categorization issues are addressed, and the streams are evaluated for sentiment analysis and opinion extraction. The automatic collection of a corpus and linguistic analysis of the collected corpus for sentiment analysis are shown. A sentiment classifier able to determine positive, negative and neutral sentiments for a document is built using the collected corpus. Using various learning algorithms, namely the Naive Bayes algorithm, Maximum Entropy algorithm, a baseline algorithm and Support Vector Machines, research on Twitter data streams is performed.

Santhi Chinthala, Ramesh Mande, Suneetha Manne, Sindhura Vemuri
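One of the listed learners, Naive Bayes, can be sketched from scratch for document sentiment. The tiny corpus and its labels below are hypothetical stand-ins for collected tweets, and only two classes are shown for brevity:

```python
from collections import Counter
import math

# Tiny labelled corpus standing in for collected tweets (hypothetical data).
train = [
    ("i love this phone great battery", "pos"),
    ("what a great happy day", "pos"),
    ("terrible service i hate waiting", "neg"),
    ("this movie was awful and sad", "neg"),
]

# Multinomial Naive Bayes with add-one (Laplace) smoothing.
class_docs = Counter(label for _, label in train)
word_counts = {c: Counter() for c in class_docs}
for text, label in train:
    word_counts[label].update(text.split())
vocab = {w for _, c in word_counts.items() for w in c}

def predict(text):
    scores = {}
    for c in class_docs:
        total = sum(word_counts[c].values())
        log_p = math.log(class_docs[c] / len(train))  # class prior
        for w in text.split():
            # Smoothed per-word likelihood under class c.
            log_p += math.log((word_counts[c][w] + 1) / (total + len(vocab)))
        scores[c] = log_p
    return max(scores, key=scores.get)

print(predict("i love this great day"))
print(predict("awful terrible day"))
```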

A Secure and Optimal Data Clustering Technique over Distributed Networks

Clustering is an automatic learning technique aimed at grouping a set of objects into subsets or clusters; the goal is to create clusters that are coherent internally but substantially different from each other. Privacy is an important factor when datasets from different data holders are integrated for mining over distributed networks. Secure and optimal data clustering in distributed networks plays an important role in many fields, such as information retrieval, data mining, knowledge and data engineering, and community-based clustering, and secure mining of data is required in open networks. In this paper we propose an efficient privacy-preserving and optimal data clustering technique over distributed networks.

M. Yogita Bala, S. Jayaprada

Enabling the Network with Disabled-User-Friendliness

Large-scale facilities are to be established to help the under-privileged population to make their way up the education ladder and to reach out for high level employment opportunities. Tapping the vast potentials of the Internet and tuning it for this special community can show quick and effective results. In this line, we introduce the concept of ‘Disabled-aware Network Infrastructure’ that can ease the access to Internet resources, specifically for the disabled users. We present a model based on Deep packet inspection techniques and content adaptation algorithms for applying at the intelligent Networking elements. Using a sample scenario we describe how this model, upon implementation, can deliver a more disabled-friendly Internet content to a differently-abled end-user.

T. K. S. Lakshmi Priya, S. Ananthalakshmi

Hierarchical Clustering for Sentence Extraction Using Cosine Similarity Measure

Clustering is an unsupervised learning technique grouping a set of objects into subsets or clusters. It forms clusters whose data points are internally similar but dissimilar to the data points in other clusters. Extracting data efficiently and effectively from datasets or data holders needs enhanced mechanisms, and extraction of relevant sentences based on a user query plays a big role in data mining, web mining and related areas. In this paper we propose an efficient and effective way to extract sentences by taking a query as input and performing hierarchical clustering with the cosine similarity measure. A threshold value is taken initially and clusters are divided depending on it; further clustering is done based on the previous threshold value.

D. Kavyasrujana, B. Chakradhara Rao
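The pipeline above can be sketched with standard average-linkage hierarchical clustering under cosine distance (1 - cosine similarity), cut at an initial threshold. The sentence vectors and the threshold value below are illustrative:

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster

# Toy sentence vectors (e.g., term-frequency vectors); two topical groups.
vecs = np.array([
    [3, 1, 0, 0],
    [2, 2, 0, 0],
    [4, 1, 1, 0],
    [0, 0, 3, 2],
    [0, 1, 2, 3],
], float)

# Average-linkage hierarchical clustering under cosine distance,
# then cut the dendrogram at the initial threshold as described.
Z = linkage(vecs, method="average", metric="cosine")
threshold = 0.3
sent_labels = fcluster(Z, t=threshold, criterion="distance")
print(sent_labels)
```

The sentences whose cluster is closest to the query vector would then be extracted as the relevant set.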

A Novel Encryption Using One Dimensional Chaotic Maps

This paper introduces a simple and efficient chaotic system using a one-dimensional (1D) chaotic map. For high security, a new encryption algorithm based on a uniform density function is proposed. The algorithm secures different types of images, such as gray-scale and color images, transforming them into noise-like encrypted images with excellent confusion and diffusion properties, and provides a different encrypted image for each set of security keys. For RGB color images, a scrambling system based on a hyper-chaotic system splits the three components and scrambles the original image using a hyper-chaotic sequence. Even with the same set of security keys, the algorithm is able to generate a completely different encrypted image each time it is applied to the same original image, and it has high sensitivity and good ability to resist statistical attack.

Saranya Gokavarapu, S. Vani Kumari
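A minimal sketch of 1D chaotic-map encryption, using the logistic map as the keystream source and XOR as the mixing step. The key values are illustrative, and the paper's actual construction (uniform density function, hyper-chaotic scrambling) is richer than this:

```python
import numpy as np

def logistic_keystream(x0, r, n):
    # Iterate the logistic map x <- r*x*(1-x) and quantize to bytes.
    x, out = x0, np.empty(n, dtype=np.uint8)
    for i in range(n):
        x = r * x * (1.0 - x)
        out[i] = int(x * 256) % 256
    return out

def xor_cipher(img_bytes, x0=0.3141, r=3.9999):
    # The secret key is (x0, r); XOR with the keystream both encrypts
    # and decrypts, since XOR is its own inverse.
    ks = logistic_keystream(x0, r, img_bytes.size)
    return img_bytes ^ ks

rng = np.random.default_rng(4)
img = rng.integers(0, 256, size=64, dtype=np.uint8)  # stand-in image bytes
enc = xor_cipher(img)
dec = xor_cipher(enc)
print(np.array_equal(dec, img))
```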

An Empirical Analysis of Agent Oriented Methodologies by Exploiting the Lifecycle Phases of Each Methodology

Agent-oriented methodologies illustrate the potential steps and offer a promising solution to analyze, design and build complex software systems, which makes them a significant means to improve current practices in software engineering. The agent paradigm exhibits exigent humanoid properties. The demand is to produce a wide range of enterprise and mission-critical applications that are autonomous, extensible, flexible, robust, reliable and capable of being remotely monitored and controlled. At this juncture, agent-oriented methodologies need to be analyzed and chosen based on the application's needs. Previous analyses were purely conceptual and attribute-based, and the phases of each methodology were not exposed in the analysis. This paper presents an empirical approach that compares each phase in the course of selecting the appropriate agent-oriented methodology for each application scenario. An agent-based system for online shopping is developed for the analysis. Since software engineering methodologies are a quantifiable approach, we argue that AOSE methodologies must also be quantifiable.

E. Ajith Jubilson, P. M. Joe Prathap, V. Vimal Khanna, P. Dhanavanthini, W. Vinil Dani, A. Gunasekaran

An Expeditious Algorithm for Random Valued Impulse Noise Removal in Fingerprint Images Using Basis Splines

In image forensics, the accuracy of biometric identification and authentication systems depends upon the quality of fingerprint images. This quality is compromised by the introduction of various kinds of noise, and fingerprint images may be corrupted by random-valued impulse noise, mainly during capture or transmission. To obtain noise-free fingerprint images, they are processed using noise removal methods and filters. In this paper, a novel and efficient two-stage algorithm for the suppression of random-valued impulse noise using basis-spline interpolation is proposed: the first stage removes noise from the image, and the second regularizes the edges deformed during the noise removal process.

Mohit Saxena

TERA: A Test Effort Reduction Approach by Using Fault Prediction Models

It is a known fact that testing consumes more than fifty percent of the development effort in the software development life cycle, so any organization benefits if the testing effort is reduced. Various fault prediction models have been proposed, but how these models reduce the test effort after prediction is rarely explored. We propose an approach that uses prediction models to estimate the reduction of test effort: first the number of faults is predicted using the models, and based on these predictions appropriate test effort is allocated to each module. The basic strategy is to let the test effort be proportional to the predicted number of faults in a module. The test effort can be reduced only if a suitable test strategy is used and the fault prediction is accurate.

Inayathulla Mohammed, Silpa Chalichalamala
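The proportional allocation strategy is simple arithmetic; the module fault predictions and the total budget below are illustrative:

```python
# Allocate a fixed testing budget across modules in proportion to the
# predicted fault counts, as the basic strategy above prescribes.
def allocate_effort(predicted_faults, total_effort):
    total_faults = sum(predicted_faults.values())
    return {m: total_effort * f / total_faults
            for m, f in predicted_faults.items()}

# Hypothetical per-module fault predictions and a 100-unit effort budget.
predicted = {"module_a": 8, "module_b": 2, "module_c": 10}
effort = allocate_effort(predicted, total_effort=100.0)
print(effort)
```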

A Multibiometric Fingerprint Recognition System Based on the Fusion of Minutiae and Ridges

Fingerprints have been widely used for more than 100 years for personal identification owing to their feasibility, permanence, distinctiveness, reliability, accuracy and acceptability. This paper proposes a multibiometric fingerprint recognition system based on the fusion of minutiae and ridges, as such systems render more efficiency, convenience and security than other means of identification; their increasing use will reduce identity theft and fraud and protect privacy. The fingerprint minutiae and ridge features, ridge bifurcations and ridge endings respectively, are combined to enhance the overall accuracy of the system. The existence of multiple sources boosts the dimensionality of the feature space and diminishes the overlap between the feature spaces of different individuals. The features are fused at the feature level, which provides better recognition performance than fusion at the matching or decision level because the feature set comprises more abundant source information.

Madhavi Gudavalli, D. Srinivasa Kumar, S. Viswanadha Raju
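Feature-level fusion as described reduces to normalizing each modality and concatenating the results into one vector; the feature values below are hypothetical:

```python
import numpy as np

def min_max(v):
    # Min-max normalise one modality so scales are comparable before fusion.
    lo, hi = v.min(), v.max()
    return (v - lo) / (hi - lo) if hi > lo else np.zeros_like(v)

# Hypothetical minutiae-based and ridge-based feature vectors
# for one fingerprint.
minutiae_feats = np.array([12.0, 45.0, 3.0, 27.0])
ridge_feats = np.array([0.8, 0.1, 0.5])

# Feature-level fusion: concatenate the normalised modalities.
fused = np.concatenate([min_max(minutiae_feats), min_max(ridge_feats)])
print(fused)
```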

Segment Based Image Retrieval Using HSV Color Space and Moment

This paper proposes a color image retrieval method based on the primitives of color space and moments (HSVCSM). The proposed HSVCSM analyses the visual properties of the HSV (Hue, Saturation, Value) color space and the human visual system, effectively generating color histograms for CBIR applications. HSVCSM introduces color features in HSV space, quantizing the color space into 15 non-uniform bins to calculate the color spatial feature. It divides the query image into three segments, extracts the color moments (mean, variance and skewness) for each segment, and clusters them into three classes for the HSV channels. The performance of HSVCSM is evaluated on the basis of precision and recall; experimental results show that the proposed method is more precise, better organized and quite comprehensible compared with existing retrieval algorithms.

R. Tamilkodi, R. A. Karthika, G. RoslineNesaKumari, S. Maruthuperumal
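The per-segment color moments, mean, variance and skewness over the HSV channels, can be computed as below; the random 8x8 patch is a stand-in for one of the three real query-image segments:

```python
import colorsys
import numpy as np

rng = np.random.default_rng(2)

# A random RGB "segment" (values in [0, 1]); the 8x8 size is arbitrary.
rgb = rng.random((8, 8, 3))

# Convert each pixel to HSV, then compute the three color moments
# extracted per channel: mean, variance, and skewness.
hsv = np.array([[colorsys.rgb_to_hsv(*px) for px in row] for row in rgb])

def color_moments(channel):
    mean = channel.mean()
    var = channel.var()
    std = np.sqrt(var)
    skew = ((channel - mean) ** 3).mean() / std ** 3 if std > 0 else 0.0
    return mean, var, skew

features = np.array([color_moments(hsv[..., c]) for c in range(3)]).ravel()
print(features.shape)  # 3 channels x 3 moments = a 9-dimensional descriptor
```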

Pedestrian with Direction Detection Using the Combination of Decision Tree Learning and SVM

In real-world automatic navigation scenarios, identifying the pose and direction of a pedestrian on busy roads is still a challenging job. Detecting pedestrians together with their direction involves a large number of categories or class types. This paper proposes a combination of two techniques: the system is trained on gradients, and a decision tree is used to generate the candidate list of confusing pairs with similar features. Taking this confusion matrix into consideration, an SVM is trained; the method reduces computational cost and generates appropriate results. The proposed work can be applied to classify and track pedestrian direction in still images.

G. Santoshi, S. R. Mishra

Application of Radon Transform for Image Segmentation on Level Set Method Using HKFCM Algorithm

In this paper, the HKFCM clustering algorithm is used to generate an initial contour curve that overcomes leaking at the boundary during curve propagation. First, HKFCM computes the fuzzy membership values for each pixel, and on this basis the edge indicator function is redefined. Using the edge indicator function, the biomedical image is processed to extract the regions of interest for further processing. The application of the Radon transform to the output image is shown with an experimental demonstration. This segmentation process shows a considerable improvement in the evolution of the level set function.

R. Nirmala Devi, T. Saikumar

Performance Analysis of Filters to Wavelet for Noisy Remote Sensing Images

In this paper, we use Linear Imaging Self Scanning Sensor (LISS-III) remote sensing image data sets with four bands covering the Aurangabad region. For empirical preprocessing work in the lab, an image is loaded, a band image of spectral reflectance values is taken, and median 3x3, median 5x5, sharp 14, sharp 18, smooth 3x3 and smooth 5x5 filters are applied; the quality is then measured. All filters give better results than the original noisy remote sensing image, so the quality is improved in every case. Moreover, to achieve high quality we use multilevel 2D wavelet decomposition based on the Haar wavelet filter alongside the above filters, and show that noise can be removed from remote sensing images to a large degree through this decomposition. This work thus plays a significant role as a preprocessing step in satellite image processing and remote sensing image analysis and their applications.

Narayan P. Bhosale, Ramesh R. Manza, K. V. Kale, S. C. Mehrotra

Singer Identification Using MFCC and LPC Coefficients from Indian Video Songs

Singer identification is one of the challenging tasks in the music information retrieval (MIR) category. Music in India generates 4-5% of the net revenue of a movie, and Indian video songs feature a variety of singers. The research presented in this paper identifies the singer using MFCC and LPC coefficients from Indian video songs. Initially, the audio portion is extracted from the video song and divided into segments. For each segment, 13 Mel-frequency cepstral coefficients (MFCC) and 13 linear predictive coding (LPC) coefficients are computed. Principal component analysis is used to reduce the dimensionality of the segments. Singer models are trained using a Naive Bayes classifier and a back propagation neural network. The proposed approach is tested using different combinations of coefficients with male and female Indian singers.

Tushar Ratanpara, Narendra Patel
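The LPC part of the feature pipeline above can be sketched as a Levinson-Durbin recursion that solves the Toeplitz normal equations for one audio frame. This is a generic pure-Python LPC computation, not the authors' implementation; segmentation, MFCC extraction and PCA are omitted.

```python
def autocorr(x, lags):
    """Autocorrelation r[0..lags] of one frame."""
    return [sum(x[i] * x[i - k] for i in range(k, len(x)))
            for k in range(lags + 1)]

def lpc(frame, order=13):
    """Levinson-Durbin recursion. Returns a[1..order] such that
    x[n] + a[1]x[n-1] + ... + a[order]x[n-order] is the prediction error,
    i.e. the predictor is x_hat[n] = -sum(a[j] * x[n-j])."""
    r = autocorr(frame, order)
    a = [0.0] * (order + 1)
    a[0] = 1.0
    err = r[0]
    for i in range(1, order + 1):
        acc = r[i] + sum(a[j] * r[i - j] for j in range(1, i))
        k = -acc / err                      # reflection coefficient
        new_a = a[:]
        for j in range(1, i):
            new_a[j] = a[j] + k * a[i - j]
        new_a[i] = k
        a = new_a
        err *= (1 - k * k)                  # residual prediction error
    return a[1:]
```

On a pure AR(1) signal `x[n] = 0.9 x[n-1]`, the order-1 coefficient recovered is -0.9, i.e. the predictor 0.9·x[n-1], which is a quick sanity check for any LPC routine.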

Role of Clinical Attributes in Automatic Classification of Mammograms

It has been established that the mammogram plays a vital role in early detection of breast diseases. Computational analysis of mammograms yields relevant and significant information, and several research groups are exploring various aspects of mammogram feature selection to develop effective automatic classification systems.

Mammographic attributes including textural, statistical and structural features have been used effectively to develop automatic classification systems. Several clinical trials have shown that attributes of a patient's clinical profile also play an important role in determining the class of a breast tumor. However, the use of patients' clinical attributes for automatic classification, and the results thereof, are not reported in the literature, and none of the existing standard mammogram datasets provides such additional patient-history information.

Our focus is to validate the observations revealed by clinical trials using automatic classification techniques. We have developed a dataset of mammogram images along with significant attributes of the patients' clinical profiles. In this paper, we discuss our experiments with standard mammogram datasets as well as with our extended, informative live dataset. Appropriate features are extracted from the mammograms to train a Support Vector Machine (SVM) classifier. The results obtained using mammographic features alone are compared with the results obtained using the extended feature set that includes clinical attributes.

Aparna Bhale, Manish Joshi, Yogita Patil

Quantifying Poka-Yoke in HQLS: A New Approach for High Quality in Large Scale Software Development

Improving the performance of software, web sites and services is a holy grail of the software industry. This paper proposes the implementation of the Poka-Yoke method, a mistake-proofing technique used in product design, in software performance engineering, and introduces HQLS: a new approach for high quality in large scale software development. The effectiveness of Poka-Yoke in software development was evaluated both quantitatively and qualitatively through a case study: a product-redesign mini-project given to six groups of students. The proposed mistake-proofing technique met its usability goals, and the results show that applying Poka-Yoke improves the software development process: improved UGAM and IOI scores showed linearity and justified the Poka-Yoke implementation. Our findings recommend mistake-proofing techniques for overall software performance, the main aim being to reduce errors in the software development process.

K. K. Baseer, A. Rama Mohan Reddy, C. Shoba Bindu

SIRIUS-WUEP: A Heuristic-Based Framework for Measuring and Evaluating Web Usability in Model-Driven Web Development

Nowadays websites provide all kinds of services to users, and the importance of the web in our society has led to a tremendous growth in the number of websites, which are now generally considered the most effective and efficient marketing channel. Usability plays an important role in the development of successful websites, and expert recommendations serve as guidelines for usable designs. However, there is a lack of empirically validated usability evaluation methods that can be applied to the models of model-driven web development. To evaluate such models, the WUEP (Web Usability Evaluation Process) method is applied, and its operationalization and empirical validation for another development method, WebML, are presented. The evaluation methods WUEP and HE (Heuristic Evaluation) were compared from the viewpoint of novice inspectors in terms of effectiveness, efficiency, perceived ease of use and satisfaction; inspectors were more satisfied when applying WUEP and found it easier to use than HE. Since usability measurement as part of the development process stands out among the experts' recommendations, we also define Sirius, an evaluation framework for performing expert evaluations.

S. Sai Aparna, K. K. Baseer

Implementation of Secure Biometric Fuzzy Vault Using Personal Image Identification

Biometrics has proved to be an exceptional tool for identifying an individual. Security of the biometric template is the most challenging aspect of a biometric identification system: storing the template in a database increases the chance of its being compromised, which may lead to misuse of the individual's identity. This paper proposes a novel and computationally simpler approach to storing a biometric sample as a template protected by cryptographic salts. The use of Personal Image Identification (PII) makes the proposed algorithm more robust and adds another level of security. Saltcrypted templates are created and stored instead of the actual sample, behaving as a fuzzy vault. The algorithm is analytically shown to be computationally simpler than existing template security mechanisms. The fuzzy structure of the saltcrypted template depends entirely on user interaction through PII, and the actual template is never stored at any point in time, which adds a new dimension of security to the individual identity.

Sarika Khandelwal, P. C. Gupta
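A minimal sketch of the saltcrypted-template idea, assuming the biometric features have already been quantized to a stable byte string (real biometric samples are noisy, so exact-match hashing like this only works after such quantization); the PII secret below stands in for the user's personal image choice, and the whole construction is illustrative rather than the paper's exact scheme.

```python
import hashlib
import hmac
import secrets

def enroll(biometric_features: bytes, pii_secret: bytes):
    """Create a 'saltcrypted' template: a random salt plus a keyed digest.
    The raw biometric sample itself is never stored."""
    salt = secrets.token_bytes(16)
    template = hmac.new(pii_secret + salt, biometric_features,
                        hashlib.sha256).digest()
    return salt, template

def verify(biometric_features: bytes, pii_secret: bytes,
           salt: bytes, template: bytes) -> bool:
    """Recompute the digest and compare in constant time."""
    candidate = hmac.new(pii_secret + salt, biometric_features,
                         hashlib.sha256).digest()
    return hmac.compare_digest(candidate, template)
```

Verification succeeds only when both the quantized biometric and the PII secret match, so a stolen template reveals neither.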

Robust Pattern Recognition Algorithm for Artifacts Elimination in Digital Radiography Images

In projection radiography stations, image quality is enhanced by anti-scatter grids that improve image contrast but form specific patterns which may be visible or cause a Moiré effect when the digital image is resized on a diagnostic monitor. In this paper a robust, efficient and fully automated grid pattern recognition and elimination algorithm is proposed for what remains an open problem, especially in computer-aided diagnosis. The pattern recognition is based on a statistical approach in both the spatial and frequency domains and provides the features used in the elimination stage. The pattern suppression is based on a 2-D filter approach that preserves the diagnostic quality of the image. Experimental results and advantages over existing approaches are discussed.

Igor Belykh

Hull Detection from Handwritten Digit Image

In this paper we propose a novel algorithm for detecting hulls in handwritten digits, i.e., the hull regions within a digit drawn by the user. The algorithm follows three steps: pre-processing, boundary extraction and, finally, hull detection, so as to attain the most relevant results. The detection of hull regions is mainly intended to increase machine learning capability in recognizing characters or digits, providing the system with a form of artificial intelligence through which it can identify digits or characters easily. The approach could also be extended to detect hull regions and their intensities in black holes in space exploration.

Sriraman Kothuri, Mattupalli Komal Teja

Learning Approach for Offline Signature Verification Using Vector Quantization Technique

Signature is a behavioral trait of an individual and forms a special class of handwriting in which legible letters or words may not be exhibited. Signature Verification Systems (SVS) can be classified as either offline or online [1]. In this paper, we use a vector quantization technique for offline signature verification. The data is captured at a later time by using an optical scanner to convert the signature image into a bit pattern, so the extracted features are static. Our system is designed using cluster-based features modeled by vector quantization, whose density-matching property provides improved results compared to statistical techniques. The classification ratio achieved using vector quantization is 67%.

Aarti Chugh, Charu Jain, Priti Singh, Preeti Rana
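The codebook training at the heart of such a system can be sketched as a toy k-means vector quantizer; the feature vectors, initialization and iteration count below are illustrative assumptions, not details from the paper.

```python
def dist2(a, b):
    """Squared Euclidean distance between two feature vectors."""
    return sum((x - y) ** 2 for x, y in zip(a, b))

def train_codebook(vectors, k, iters=20):
    """k-means style codebook training: alternate nearest-codeword
    assignment and centroid update. The codebook's density-matching
    property is what makes VQ attractive for signature features."""
    codebook = vectors[:k]                  # deterministic initialization
    for _ in range(iters):
        buckets = [[] for _ in range(k)]
        for v in vectors:
            i = min(range(k), key=lambda c: dist2(v, codebook[c]))
            buckets[i].append(v)
        for i, b in enumerate(buckets):
            if b:                            # recompute centroid
                codebook[i] = [sum(x) / len(b) for x in zip(*b)]
    return codebook

def quantize(v, codebook):
    """Index of the nearest codeword for feature vector v."""
    return min(range(len(codebook)), key=lambda c: dist2(v, codebook[c]))
```

Verification would then compare the codeword histogram (or quantization error) of a questioned signature against the enrolled model.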

Fuzzified Inverse S-Transform for Identification of Industrial Nonlinear Loads

In this paper a modified inverse Stockwell transform is proposed for the identification of industrial nonlinear loads. The proposed method is based on the maximum values of the unfiltered inverse Stockwell transform, termed MUNIST. It is well known that the Stockwell transform produces a time-frequency representation; since the proposed MUNIST technique is obtained from the inverse operation on time-frequency data, it gives only time resolution. The MUNIST technique is found to provide unique signatures for accurate identification of industrial loads. The results obtained with the proposed technique are then used as input to a fuzzy decision box, and with this fuzzy logic design the automatic identification of different nonlinear loads is carried out efficiently and accurately. Current measurements of different industrial loads are used as input to the proposed MUNIST algorithm, and the results are compared with existing techniques to show its efficacy.

Srikanth Pullabhatla, Chiranjib Koley

Bandwidth Allocation Scheme in Wimax Using Fuzzy Logic

WiMAX refers to the IEEE 802.16 standard for metropolitan area networks, which has inherent support for a variety of real-time and non-real-time applications. The quality of service mechanism in WiMAX has been left as an open issue for vendors. This paper proposes a bandwidth allocation scheme for WiMAX networks using fuzzy logic concepts. The system adaptively grants bandwidth to all traffic classes and helps satisfy the quality of service requirements of every service class. The results demonstrate that the proposed system fulfills the requirements of all classes and avoids starvation of low priority classes.
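One way such a fuzzy grant could look is sketched below with a triangular membership function and a starvation floor; the rule base, membership shapes and weighting are assumptions for illustration, not the paper's actual fuzzy controller.

```python
def tri(x, a, b, c):
    """Triangular membership function peaking at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def grant_shares(demands, priorities, capacity):
    """Fuzzy weight combines normalised demand ('high load') with the
    service-class priority in [0, 1]; each class receives a share of the
    capacity proportional to its weight, and a small floor on the weight
    keeps low priority classes from starving."""
    total_demand = sum(demands) or 1.0
    weights = []
    for d, p in zip(demands, priorities):
        load = d / total_demand
        high_load = tri(load, 0.0, 1.0, 2.0)      # degree of "high load"
        w = 0.5 * high_load + 0.5 * p             # illustrative rule mix
        weights.append(max(w, 0.05))              # anti-starvation floor
    s = sum(weights)
    return [capacity * w / s for w in weights]
```

With equal demands, a higher-priority class gets a strictly larger grant, yet every class keeps a non-zero share.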


GMM Based Indexing and Retrieval of Music Using MFCC and MPEG-7 Features

Audio, which includes voice, music and various kinds of environmental sounds, is an important type of media and a significant part of video. With the digital music databases in place these days, people have begun to realize the importance of effectively managing music databases through music content analysis. The goal of a music indexing and retrieval system is to provide the user with the capability to index and retrieve music data efficiently, and for efficient retrieval some measure of music similarity is desirable. In this paper, we propose a method for indexing and retrieval of classified music using Mel-frequency cepstral coefficients (MFCC) and MPEG-7 features. Music clip extraction, feature extraction, index creation and retrieval of the query clip are the major issues in automatic audio indexing and retrieval. Indexing is done for all music clips using Gaussian mixture models (GMM) built on the extracted features. For retrieval, the probability that the query feature vector belongs to each Gaussian is computed; the average probability density function is computed for each model, and retrieval is based on the highest probability.

R. Thiruvengatanadhan, P. Dhanalakshmi, S. Palanivel
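The retrieval step can be sketched as scoring a query feature vector under each clip's GMM and returning the highest-likelihood clip; the diagonal-covariance mixtures and toy parameters below are illustrative assumptions, not the paper's trained models.

```python
import math

def log_gauss(x, mean, var):
    """Log density of vector x under a diagonal Gaussian."""
    return sum(-0.5 * (math.log(2 * math.pi * v) + (xi - m) ** 2 / v)
               for xi, m, v in zip(x, mean, var))

def log_gmm(x, weights, means, variances):
    """Log p(x) under a Gaussian mixture, via log-sum-exp for stability."""
    terms = [math.log(w) + log_gauss(x, m, v)
             for w, m, v in zip(weights, means, variances)]
    mx = max(terms)
    return mx + math.log(sum(math.exp(t - mx) for t in terms))

def retrieve(query, clip_models):
    """Index of the clip whose GMM gives the query vector the
    highest likelihood. Each model is (weights, means, variances)."""
    return max(range(len(clip_models)),
               key=lambda i: log_gmm(query, *clip_models[i]))
```

In practice the score would be accumulated over all frames of the query clip rather than a single vector.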

Heart Disease Prediction System Using Data Mining Technique by Fuzzy K-NN Approach

Extensive investigation of data mining techniques on medical-history data has shown that prediction of heart disease is very important in medical science. Medical-history data is heterogeneous, and its various forms must be interpreted to predict the heart disease of a patient. Various data mining techniques have been applied to predict heart disease patients, but the uncertainty in the data was not removed by the available techniques. To remove this uncertainty, we introduce fuzziness into the measured data: a membership function was designed and incorporated with the measured values. Further, an attempt was made to classify patients based on attributes collected from the medical field, using a minimum-distance K-NN classifier to assign the data among the various groups. It was found that the fuzzy K-NN classifier performs well compared with other parametric classifiers.

V. Krishnaiah, G. Narsimha, N. Subhash Chandra
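The classification step can be sketched as a fuzzy K-NN in the spirit of Keller et al.: class memberships are distance-weighted votes of the k nearest neighbours, so the output is a membership vector rather than a hard label. The features and labels below are hypothetical, not the paper's medical attributes.

```python
def fuzzy_knn(x, train, k=3, m=2.0):
    """train: list of (feature_vector, label). Returns a dict mapping each
    class to a membership degree in [0, 1] that sums to 1. The fuzzifier
    m controls how strongly closer neighbours dominate."""
    # k nearest neighbours by squared Euclidean distance
    nearest = sorted((sum((a - b) ** 2 for a, b in zip(x, feat)), label)
                     for feat, label in train)[:k]
    classes = sorted({label for _, label in train})
    num = {c: 0.0 for c in classes}
    den = 0.0
    for d2, label in nearest:
        w = 1.0 / (d2 ** (1.0 / (m - 1)) + 1e-12)   # distance weight
        num[label] += w
        den += w
    return {c: num[c] / den for c in classes}
```

A hard decision, when needed, is just the class of maximum membership, while the membership values themselves convey the diagnostic uncertainty the abstract emphasizes.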

Applying NLP Techniques to Semantic Document Retrieval Application for Personal Desktop

Information retrieval in a semantic desktop environment is an important aspect to be stressed. Many past works have proposed improved information retrieval techniques for this environment, but most of them lack one thing: the involvement of the user in the retrieval process; in other words, the system should be an interactive information retrieval system. This paper proposes an interaction-based information retrieval system that interacts with the user to obtain hints and suggestions so as to return the results that best satisfy the user.

D. S. R. Naveenkumar, M. Kranthi Kiran, K. Thammi Reddy, V. Sreenivas Raju

An Adaptive Edge-Preserving Image Denoising Using Epsilon-Median Filtering in Tetrolet Domain

Image denoising is a well-studied problem in the field of image processing and computer vision. It is a challenge to preserve important image features, such as edges and corners, during the denoising process. The wavelet transform provides a suitable basis for suppressing noisy signals in an image. This paper presents a novel edge-preserving image denoising technique based on the tetrolet transform. Experimental results demonstrate that, compared to other approaches, the proposed method is especially suitable for natural images corrupted by Gaussian noise.

Paras Jain, Vipin Tyagi

A Comparison of Buyer-Seller Watermarking Protocol (BSWP) Based on Discrete Cosine Transform (DCT) and Discrete Wavelet Transform (DWT)

The Buyer-Seller Watermarking Protocol (BSWP) is used to preserve the rights of both the buyer and the seller. The choice of frequency-domain embedding, DCT or DWT, affects the robustness and imperceptibility of a watermarking algorithm. Digital watermarking is a key technology for embedding information as imperceptible signals in digital content, and buyer-seller watermarking protocols based on the Discrete Cosine Transform (DCT) and the Discrete Wavelet Transform (DWT) integrate watermarking algorithms with cryptographic techniques for copyright protection. In this paper we compare the two protocols, BSWP based on DCT and BSWP based on DWT, and conclude which one is better on the basis of several parameters. Both protocols use a Public Key Infrastructure (PKI), an arbitrator and a watermarking certificate authority (WCA) for better security. The paper reports the watermarked image quality in terms of peak signal-to-noise ratio (PSNR), mean square error (MSE) and similarity factor (SF).

Ashwani Kumar, S. P. Ghrera, Vipin Tyagi

Log Analysis Based Intrusion Prediction System

This paper proposes an intrusion prediction system in which network log file entries are used to predict attacks. Good filtering and classification techniques help to process huge amounts of data and find patterns of anomalies pertaining to network attacks. The techniques used in this paper are Naive Bayes and a cost-sensitive variant of the AdaBoost learning algorithm. The network log files obtained from network devices such as IDSs and firewalls are collected, normalized and correlated with the help of the AlienVault SIEM, and the fields important for classification are extracted. The training data is first classified with Naive Bayes, and misclassified entries are passed on to the cost-sensitive variant of AdaBoost, which improves the classification rate. With this training data the system builds an attack model with which it predicts whether an attack is about to happen.

Rakesh P. Menon
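The first classification stage can be sketched as a Laplace-smoothed categorical Naive Bayes over normalized log fields; the field names and records below are hypothetical, and the AdaBoost stage and SIEM correlation are omitted.

```python
import math
from collections import Counter, defaultdict

def train_nb(records):
    """records: list of (feature_dict, label). Count label frequencies
    and, per (label, field), the observed value frequencies."""
    labels = Counter(lab for _, lab in records)
    counts = defaultdict(Counter)            # (label, field) -> Counter
    for feats, lab in records:
        for f, v in feats.items():
            counts[(lab, f)][v] += 1
    return labels, counts

def predict(feats, labels, counts):
    """Pick the label maximizing log prior + sum of smoothed
    log likelihoods (add-one smoothing on each field)."""
    best, best_lp = None, float("-inf")
    total = sum(labels.values())
    for lab, n in labels.items():
        lp = math.log(n / total)
        for f, v in feats.items():
            c = counts[(lab, f)]
            lp += math.log((c[v] + 1) / (sum(c.values()) + len(c) + 1))
        if lp > best_lp:
            best, best_lp = lab, lp
    return best
```

Misclassified training records would then be re-weighted by the cost-sensitive boosting stage described in the abstract.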

Multi-objective k-Center Sum Clustering Problem

Given a set of n objects in the two-dimensional plane and a positive integer k (k ≤ n), we consider the problem of partitioning the objects into k clusters of circular shape so as to minimize the following two objectives: (i) the sum of the radii of these k circular clusters and (ii) the number of points covered by more than one circular cluster. A multi-objective genetic algorithm has been proposed to solve this problem.

Soumen Atta, Priya Ranjan Sinha Mahapatra
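Given a candidate set of cluster centres, the two objectives of the bi-objective problem can be evaluated as below; the centre-based decoding (nearest-centre assignment, radius = farthest assigned point) is an assumption for illustration, since the paper's exact chromosome encoding is not given here.

```python
import math

def objectives(points, centers):
    """Assign each point to its nearest centre, set each cluster's radius
    to the distance of its farthest assigned point, and return
    (i) the sum of radii and
    (ii) the number of points covered by more than one circle."""
    k = len(centers)
    radii = [0.0] * k
    for p in points:
        i = min(range(k), key=lambda c: math.dist(p, centers[c]))
        radii[i] = max(radii[i], math.dist(p, centers[i]))
    overlaps = sum(
        1 for p in points
        if sum(math.dist(p, c) <= r for c, r in zip(centers, radii)) > 1)
    return sum(radii), overlaps
```

A multi-objective GA would use these two values as the fitness components when ranking candidate solutions on the Pareto front.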

Reliability Aware Load Balancing Algorithm for Content Delivery Network

With the increasing use of the internet and data sharing, network traffic has grown beyond limits, and with it the number of requests made for a resource on a server. To maintain quality of service even when the number of requests becomes large, a CDN (Content Delivery Network) is used, whose main goal is to balance the load over the servers. But as the load on a server increases, even after balancing, the reliability of the server decreases with rising fault rate and processing time. Existing proposals for CDNs do not take into consideration real-time faults occurring in a server over time. To overcome this, a fault- and reliability-aware load balancing algorithm for CDNs is proposed in this paper to increase the scalability and reliability of the CDN.

Punit Gupta, Mayank Kumar Goyal, Nikhil Gupta

SQL Injection Detection and Correction Using Machine Learning Techniques

SQL is the database language used to interact with a database: with its help a database can be created, modified and deleted. Nowadays every organization maintains its own databases, which may keep important information that should not be shared publicly. The SQL injection technique is now one of the most common attacks on the Internet. This paper is about SQL injection, SQL injection attacks and, more importantly, how to detect and correct SQL injection. It proposes an algorithm that not only detects SQL injection attacks but also detects unauthorized users by maintaining an audit record using a machine learning technique (clustering).

Garima Singh, Dev Kant, Unique Gangwar, Akhilesh Pratap Singh

An Enhanced Ontology Based Measure of Similarity between Words and Semantic Similarity Search

Measuring the semantic similarity of two sets of words that describe two entities is an important problem in web mining. Semantic similarity measures are used in various applications in Information Retrieval (IR) and Natural Language Processing (NLP), such as Word Sense Disambiguation (WSD), synonym extraction, query expansion and automatic thesauri extraction. The computer, being a syntactic machine, cannot understand semantics. An ontology is the explicit specification of concepts, attributes and the relationships between them, and serves to provide relevant and accurate information to users for a particular domain. A new semantic similarity measure based on a domain ontology is proposed here; it brings out a more accurate relationship between two words. The main purpose of finding semantic similarity is to enhance the integration and retrieval of resources in a more meaningful and accurate way. A performance analysis in terms of precision and recall for traditional search and semantic similarity search is carried out; the precision of semantic similarity search is high compared with traditional search. The paper also discusses the approaches that differentiate semantic similarity research from other related areas.

M. Uma Devi, G. Meera Gandhi

Intelligent Traffic Monitoring Using Internet of Things (IoT) with Semantic Web

The sudden rise in population has brought a very heavy demand for vehicles, and hence the need to control them. The paper highlights the issue of intelligent traffic monitoring using technologies such as IoT, multi-agent systems and the Semantic Web. It links IoT sensors using the Zigbee protocol, and traffic movements are continuously monitored by a control centre using granular classification in an ontology.

Manuj Darbari, Diwakar Yagyasen, Anurag Tiwari

Development of a Real-Time Lane Departure Warning System for Driver Assistance

According to the World Health Organization's (WHO) global status report on road safety, the number of road accidents per day is increasing drastically. The majority of these accidents occur due to violation of safety norms by drivers. The main objective of this paper is to develop a real-time lane departure warning system based on the Beagle Board. The proposed system helps avoid road accidents by warning the driver when the vehicle begins to move out of its lane, addressing two main causes of accidents: driver error and drowsiness.

The proposed system runs on an embedded operating system called Angstrom. The system architecture covers the development of a device driver for the USB web camera on the OMAP3530 processor in the standard Linux kernel, and the installation of the OpenCV package to integrate USB web cameras with the navigation system.

N. Balaji, K. Babulu, M. Hema

Visual Simulation Application for Hardware In-Loop Simulation (HILS) of Aerospace Vehicles

Visual simulation, a basic form of virtual reality (VR), uses 3-D images and models of the real world which can be explored interactively by the user in a three-dimensional, realistic, immersive environment. Well-known applications of VR and visual simulation include flight simulators and interactive games, and VR is now extensively used for many practical and scientific applications including mining, oil & gas exploration, automobile design, business training, medical science and robotics, to name a few. The Advanced Simulation Centre (ASC) at RCI is primarily involved in carrying out Hardware In-Loop Simulation (HILS) of various aerospace vehicles (ASVs). HILS is the only tool for complete validation and performance analysis of the embedded systems, hardware, and control & guidance software of missile systems in an integrated manner, subjecting them to trajectory dynamics. HILS generates a lot of data which needs to be analyzed thoroughly before a test launch, and going through such a huge amount of data manually is quite a tedious task. Modern developments in VR-based man-machine interaction make it far easier to analyze and understand such data in a convenient and collaborative manner. In our work, we have exploited the strengths of this technology for analysis with fault injection and detection during the HILS of an ASV. A 3-channel distributed-rendering immersive projection system has been established in which the visual application runs on a high-end graphics cluster. The application can visualize the complete trajectory dynamics of an ASV from lift-off to impact point as per the mission sequence, and enables the simulation engineer to walk through the vehicle, visualize effects and troubleshoot defects caused by failures in hardware and software. It also enhances post-flight analysis capability.
This paper focuses on the visual simulation system and its application in the HILS of aerospace vehicles.

Pramod Kumar Jha, Chander Shekhar, L. Sobhan Kumar

Multiprotocol Label Switching Feedback Protocol for Per Hop Based Feedback Mechanism in MPLS Network

A Multiprotocol Feedback (MFB) protocol is suggested in this paper, providing a per-hop feedback mechanism in an MPLS network with piggybacking built into the algorithm, thus reducing the time needed for error detection and correction and making efficient use of bandwidth in the MPLS network. This is achieved by including a 7-bit packet that carries three different message types for error detection and correction, which increases the efficiency of the MPLS network.

Ankur Dumka, Hardwari Lal Mandoria

Towards Improving Automated Evaluation of Java Program

E-learning is gaining widespread use as an important method of education, particularly in higher education. Manual program evaluation by an instructor or expert is a time-consuming process subject to variation and error. Automatic assessment and grading of students' answers and assignments plays an important role in improving e-learning, relieving instructors of the lengthy task of manually evaluating and grading students' computer programs. To address this problem, we propose a model for automated evaluation and grading of Java programs submitted by students during term work submission or practical examinations. This work focuses on automated evaluation of Java programs using various parameters such as the number of compilation errors, correct output, lines of code, use of coding and naming conventions, and cyclomatic, time and space complexity. The experimental results obtained from the initial prototype implementation are encouraging and validate the effectiveness of the proposed model.

Aditya Patel, Dhaval Panchal, Manan Shah

Fault Tolerant Scheduling - Dual Redundancy in an Automotive Cruise Control System

Safety-critical real-time systems are required to meet high reliability requirements, stringent deadlines and temporal demands. Such demands are met with fault-tolerant mechanisms in applications like automotive, space and avionics systems, where various redundancy schemes are built into hard real-time systems to ensure their success. In this paper, a dual redundant scheme with an active hot-standby system is employed in a cruise control system (CCS). A framework based on a fault tolerance paradigm is proposed for adaptive fault-tolerant scheduling of the tasks in a DAG of the CCS. The scheme, when implemented, gives efficient offline task scheduling and adaptive online dynamic reconfiguration of resources for a single point of failure, and guarantees the functional and timing correctness of essential tasks. Efficient use of the redundant resources under fault-free conditions and a fail-safe mechanism under fault ensure full functionality and enhanced performance. A comparative evaluation against a typical traditional dual system using performance metrics highlights the enhanced performance and the importance of this work for the automotive industry.

Manne Lakshmisowjanya, Annam Swetha, V. Radhamani Pillay

A Simple Mathematical Model for Performance Evaluation of Finite Buffer Size Nodes in Non- Saturated IEEE 802.11 DCF in Ad Hoc Networks

Analytical models help in predicting how results change once the input parameters of a network are changed. A lot of modeling work has already been carried out to evaluate the performance of IEEE 802.11 DCF for both saturated and non-saturated networks, but most work considering arbitrary buffer-size nodes in non-saturated conditions uses a mathematically complex approach. In this paper, we present a flexible and practical transform-free approach for evaluating the performance of the IEEE 802.11 MAC protocol. The simplicity and speed of our algorithm come from simple closed-form expressions for the idle probability and blocking probability of a buffer of size K, modeling each node as a discrete M/G/1/K queue. The proposed model gives accurate results, validated through ns-2 simulations.

Neeraj Gupta, C. S. Rai
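For intuition about the buffer quantities involved, the classical M/M/1/K special case has a closed-form blocking probability, shown below; the paper's transform-free M/G/1/K expressions are more general, so this is only the exponential-service instance, not the authors' formula.

```python
def mm1k_blocking(rho, K):
    """Blocking probability of an M/M/1/K queue with offered load rho:
    P_block = (1 - rho) * rho**K / (1 - rho**(K+1)), with the rho -> 1
    limit equal to 1/(K+1). A packet arriving when the buffer holds K
    customers is dropped with this probability."""
    if abs(rho - 1.0) < 1e-12:
        return 1.0 / (K + 1)
    return (1 - rho) * rho ** K / (1 - rho ** (K + 1))
```

For example, a half-loaded node (rho = 0.5) with a two-slot buffer blocks 1/7 of arrivals, which is the kind of quantity the node model feeds back into the MAC-layer analysis.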

An Efficient Algorithm and a Generic Approach to Reduce Page Fault Rate and Access Time Costs

Memory management systems are generally custom-made, and the replacement algorithm deals only with the number of memory hits; the more page faults, the lower the performance. To reduce the page fault rate, we propose a page replacement algorithm called Farthest Page Replacement (FPR). There are, however, other parameters to consider, such as access time costs in terms of time and energy. Access time is the time required by a processor to read data from or write data to memory, and higher performance can be achieved by reducing access time costs. So, to decrease access time and energy consumption, we also propose an approach for selecting the page to be replaced.

Riaz Shaik, M. Momin Pasha
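If "farthest page" is read as the page whose next reference lies farthest in the future — a Belady-style rule, which is one possible reading rather than necessarily the authors' exact definition — the fault count can be sketched as:

```python
def fpr_faults(refs, frames):
    """Simulate replacement where the victim is the resident page whose
    next use lies farthest ahead in the reference string; pages never
    referenced again are evicted first. Returns the page fault count."""
    mem, faults = [], 0
    for i, page in enumerate(refs):
        if page in mem:                      # hit
            continue
        faults += 1
        if len(mem) < frames:                # free frame available
            mem.append(page)
            continue
        future = refs[i + 1:]
        victim = max(mem, key=lambda p: future.index(p)
                     if p in future else len(future) + 1)
        mem[mem.index(victim)] = page
    return faults
```

This look-ahead rule needs the full reference string, so a practical kernel policy would have to approximate it from access history.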

A Cryptography Scheme through Cascaded Session Based Symmetric Keys for Ubiquitous Computing

A cryptography scheme is proposed based on the cascaded implementation of six session-based symmetric key techniques available in the literature. The scheme may introduce a new dimension for ensuring data security at the maximum possible level on the available infrastructure, and is suitable for securing systems in the ubiquitous computing paradigm. It is well suited to trading off security against performance for lightweight devices with very low processing capability or limited computing power.

Manas Paul, J. K. Mandal
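The cascading idea can be sketched with toy XOR layers keyed by per-session keys; the six actual techniques from the literature are not specified here, and the SHA-256 counter keystream is an illustrative stand-in, not a vetted cipher.

```python
import hashlib

def keystream(key: bytes, n: int) -> bytes:
    """Expand a session key into n keystream bytes via a hash counter.
    Toy construction for illustration only."""
    out, ctr = b"", 0
    while len(out) < n:
        out += hashlib.sha256(key + ctr.to_bytes(4, "big")).digest()
        ctr += 1
    return out[:n]

def xor_layer(data: bytes, key: bytes) -> bytes:
    """One symmetric layer: XOR the data with the key's keystream."""
    ks = keystream(key, len(data))
    return bytes(a ^ b for a, b in zip(data, ks))

def cascade_encrypt(plaintext: bytes, session_keys) -> bytes:
    """Apply each session-based layer in turn."""
    data = plaintext
    for k in session_keys:
        data = xor_layer(data, k)
    return data

def cascade_decrypt(ciphertext: bytes, session_keys) -> bytes:
    """Undo the layers in reverse order (the general cascade pattern)."""
    data = ciphertext
    for k in reversed(session_keys):
        data = xor_layer(data, k)
    return data
```

Only cheap byte operations are involved per layer, which is the property that makes a cascade attractive for low-power devices: security and cost scale with the number of layers applied.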

A Comparative Study of Performance Evaluation of Services in Cloud Computing

Cloud computing is a new service area in the information technology environment, with huge requirements on shared information, infrastructure, software, resources, devices and services. Performance evaluation is an important aspect of the cloud computing environment: efficient evaluation techniques are needed to assess all performance activities, because cloud users pay on demand in a pay-as-you-go model. In this paper, we analyze and evaluate cloud performance in different environments based on quality attributes, features, services, support specifications and access.

L. Aruna, M. Aramudhan

A Case Study: Embedding ICT for Effective Classroom Teaching & Learning

Information and Communication Technology (ICT) has a presence in all sectors, including education. Instructors, learners, administrators and researchers in the education field are striving for innovative teaching pedagogy to make the learning experience effectual. In this paper we discuss the use of current trends in ICT such as BYOD, LMSs and learning through cloud computing. We also summarize the responses of teachers and students on campus gathered through an online survey. The analysis shows significant improvement in class participation and in the students' examination results.

Sandeep Vasant, Bipin Mehta

Cloud Based Virtual Agriculture Marketing and Information System (C-VAMIS)

Today's agricultural marketing has to undergo a series of exchanges or transfers from one person to another before produce reaches the consumer. The challenges of traditional agriculture are addressed significantly by information and communication technologies (ICT), which play an important role in uplifting the livelihoods of the rural poor. The main objective of the proposed system is to provide an environment for farmers that facilitates delivering agricultural products to the marketplace in time, and enriches farmers with up-to-date farming technology by coordinating all transactions in a cloud-based e-commerce environment. The use of a cloud platform reduces the cost of maintenance compared with isolated environments, and the application of big data makes data analytics fast and cost effective, extracting the needed forecasts from very large volumes of data.

A. Satheesh, D. Christy Sujatha, T. K. S. Lakshmipriya, D. Kumar

An Adaptive Approach of Tamil Character Recognition Using Deep Learning with Big Data-A Survey

Deep learning is currently an extremely active research area in the machine learning and pattern recognition community. It has achieved huge successes in a broad range of applications such as speech recognition, computer vision and natural language processing. With the sheer size of data available today, big data brings big opportunities and transformative potential for various sectors; on the other hand, it also presents unprecedented challenges to harnessing data and information. As data keeps getting bigger, deep learning is coming to play a key role in providing big-data predictive analytics solutions. This paper presents a brief overview of deep learning and highlights how it can be effectively applied to optical character recognition in the Tamil language.

R. Jagadeesh Kannan, S. Subramanian

Tackling Supply Chain Management through Business Analytics: Opportunities and Challenges

Information and Communication Technology (ICT) tools and technologies are revolutionizing enterprise workflow and processes. Deployment of ICT tools for supply chain planning and execution has resulted in greater agility, robustness, collaboration, visibility and seamless integration of all stakeholders in the enterprise, as well as in the extended enterprise of suppliers and customers. Business analytics is emerging as a potent tool for enterprises to improve their profitability and competitive edge. Business analytics aims at building fresh perspectives and new insights into business performance using data, statistical methods, quantitative analysis and predictive modeling. Advanced analytics is being employed for several processes in supply chain planning and execution, such as demand forecasting, inventory management, and production and distribution planning. Enterprise case studies of successful deployments of business analytics for supply chain management are showcased. Some challenges faced in business analytics usage, such as high cost and the need for data aggregation from multiple sources, are also highlighted. Integrating business analytics with disruptive and game-changing technologies like social media, cloud computing and mobile technologies in the form of the SMAC (Social, Mobile, Analytics and Cloud) stack holds tremendous promise to be the next wave in enterprise computing, with wide-ranging advantages such as improved supply chain planning, collaboration, execution and stakeholder engagement.

Prashant R. Nair

A Multiple Search and Similarity Result Merging Approach for Web Search Improvisation

The increasing dependency on web information demands an improvisation in web search for accurate and highly similar results. Most search engines retrieve results from the web-crawled index database they maintain, which limits the scope of search. They also apply the same search technique to both short and long queries. To deliver consistently superior results, one must understand the exact intent of the query and the strength of each keyword in it. This paper proposes a multiple-search and similarity-based result-merging framework for web search improvisation that identifies the strength of the keywords in a user query. Based on the computed query strength, multiple searches are issued to different search engines to obtain multiple result sets. To merge the obtained results, a similarity-conversion method is proposed that yields highly accurate and similar results.
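As a hypothetical illustration of the result-merging step, the sketch below fuses ranked lists from several engines using a simple reciprocal-rank scheme; the scoring function and the inputs are assumptions for illustration, not the similarity-conversion method the authors propose.

```python
def merge_results(result_lists):
    """Merge ranked URL lists from several search engines.

    A URL's score is the sum of reciprocal ranks across engines
    (a simple rank-fusion scheme); ties keep first-seen order."""
    scores = {}
    for results in result_lists:
        for rank, url in enumerate(results, start=1):
            scores[url] = scores.get(url, 0.0) + 1.0 / rank
    # Highest combined score first
    return sorted(scores, key=scores.get, reverse=True)

engine_a = ["u1", "u2", "u3"]        # hypothetical ranked results
engine_b = ["u2", "u4", "u1"]
print(merge_results([engine_a, engine_b]))  # ['u2', 'u1', 'u4', 'u3']
```

A URL ranked highly by several engines (here `u2`) outranks one ranked first by only a single engine.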

Vijayalakshmi Kakulapati, Sudarson Jena, Rajeswara Rao

System for the Detection and Reporting of Cardiac Event Using Embedded Systems

Premature death and disability from sudden cardiac arrest continue to be a serious public health burden. Electrocardiography (ECG) is a ubiquitous vital-sign monitoring method used in healthcare systems. Early detection of abnormality in the ECG signal of cardiac disease leads to timely diagnosis and can prevent death. Normally, surgeons have to study a large amount of ECG data to search for abnormal beats, and because the abnormal cycles are few in number, the search is fatiguing and abnormalities may be missed. In the proposed research work, an intelligent digital ECG system is designed to interpret these abnormal signals, reducing the tedious work of interpreting ECG. The proposed work monitors the heart beat continuously, conveniently and with maximum accuracy for diagnosis. The ECG signals are acquired in real time and amplified through an instrumentation amplifier. The resulting signals are processed for noise cancellation through a low-pass filter and a notch filter. Further processing of these signals in the microcontroller detects the abnormalities of cardiac arrest. The result is communicated through GSM, which reduces the burden on doctors to a great extent. The signals are stored on a Secure Digital (SD) card to keep a complete history of the signals before and after the occurrence of the cardiac event. The proposed research work combines the capabilities of real-time ECG monitoring and abnormal-symptom reporting systems.
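The abnormality check performed in the microcontroller could, in spirit, resemble the following sketch, which classifies rhythm from R-peak timestamps; the 60/100 BPM thresholds and the input data are illustrative assumptions, not the paper's implementation.

```python
def classify_heart_rate(r_peaks, low=60.0, high=100.0):
    """Classify rhythm from R-peak timestamps in seconds.

    Computes beats per minute from the mean R-R interval and flags
    bradycardia (< low BPM) or tachycardia (> high BPM)."""
    intervals = [b - a for a, b in zip(r_peaks, r_peaks[1:])]
    bpm = 60.0 / (sum(intervals) / len(intervals))
    if bpm < low:
        return bpm, "bradycardia"
    if bpm > high:
        return bpm, "tachycardia"
    return bpm, "normal"

# R peaks 0.5 s apart correspond to 120 BPM
print(classify_heart_rate([0.0, 0.5, 1.0, 1.5]))  # (120.0, 'tachycardia')
```

An embedded version would run this on peaks detected from the filtered signal and trigger the GSM report only on an abnormal classification.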

Maheswari Arumugam, Arun Kumar Sangaiah

An Early Warning System to Prevent Human Elephant Conflict and Tracking of Elephant Using Seismic Sensors

Human Elephant Conflict (HEC) has been a major issue in forest border areas, where human habitats are disturbed by the entry of wild elephants. This makes HEC a major real-time environmental research problem. The aim of this paper is to reduce HEC by identifying the behaviour of elephants, as proposed by many ecology professors and researchers. The conflict varies depending on the terrain and the habitation of humans and elephants. Hence the objective is to survey elephant tracking using different methodologies and to help both humans and elephants. This article focuses on a field-based survey of conflicts caused by both humans and elephants, and on the technical and non-technical methodologies used for elephant tracking. The paper also proposes a methodology using seismic (vibration) sensors together with high-quality video cameras. These methodologies give a crystal clear view of elephant path tracking. The proposed methodology is expected to produce an early warning system that helps save the lives of both humans and elephants.

D. Jerline Sheebha Anni, Arun Kumar Sangaiah

Impact of ICT Infrastructure Capability on E-Governance Performance: Proposing an Analytical Framework

The impact of IT infrastructure capability on the performance of a firm has been addressed quite sufficiently in the past literature, but how these factors, coupled together, affect E-Governance performance has yet to receive the main focus of attention. Effective E-Government readiness is characterised by efficient deployment of ICT infrastructure capability, as has been recognized worldwide in various studies on E-Government. A two-phase study was proposed to meet the objectives defined for this research; this paper describes the work completed in the first phase. In the first phase, an exploratory study was conducted to identify, articulate and contextualize variables extracted from a literature survey and a field survey. Finally, based on the exploratory study, the authors' work lays the groundwork for the development of a conceptual framework, along with a suggested methodology for understanding how ICT infrastructure capability affects E-Governance performance, and gives direction for future work.

Deepak Dahiya, Saji K. Mathew

Towards Identifying the Knowledge Codification Effects on the Factors Affecting Knowledge Transfer Effectiveness in the Context of GSD Project Outcome

Global software development (GSD) is a knowledge-intensive process in which offshore/onsite teams plan and design a coherent software system to meet business needs. The aim of this research is to reveal the influence of GSD teams' (offshore/onsite) knowledge codification factors on knowledge transfer (KT) effectiveness with respect to the outcome of GSD projects. Knowledge codification is the process of converting tacit knowledge to explicit knowledge. In GSD projects, where offshore/onsite teams are distributed across various geographic locations, the process of knowledge codification is influenced by several factors. Thus, the objective of this paper is to address the effect of knowledge codification factors on knowledge transfer effectiveness in the context of GSD project outcome as perceived by GSD teams. Moreover, this study explores the integration of knowledge transfer effectiveness into the GSD project outcome relationship from the service provider perspective along the following dimensions: product success, successful collaboration, and personal satisfaction. The study employs survey methods to empirically validate the research model in Indian software companies for their view of GSD projects.

Jagadeesh Gopal, Arun Kumar Sangaiah, Anirban Basu, Ch. Pradeep Reddy

Formulation and Computation of Animal Feed Mix: Optimization by Combination of Mathematical Programming

The aim of this paper is to develop models for ruminant ration formulation for different weight classes. Least-cost ration and better shelf life are taken as the main objectives of the study. First, linear programming (LP) models are developed to obtain the least-cost ration. Then stochastic programming (SP) models are developed to incorporate nutrient variability.
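A least-cost ration model of the kind described can be sketched as below; the feed names, costs, protein fractions and the grid search are illustrative assumptions standing in for a full LP (and later SP) formulation over many feeds and nutrient constraints.

```python
def least_cost_ration(feeds, requirement):
    """Least-cost two-feed ration by grid search (illustration only).

    feeds: {name: (cost_per_kg, protein_fraction)}
    requirement: minimum protein fraction of a 1 kg mix.
    A real formulation would solve a linear program over many feeds
    and nutrient constraints."""
    (a, (ca, pa)), (b, (cb, pb)) = feeds.items()
    best = None
    for i in range(101):              # proportion of feed a, in 1% steps
        x = i / 100
        protein = x * pa + (1 - x) * pb
        cost = x * ca + (1 - x) * cb
        if protein >= requirement - 1e-9 and (best is None or cost < best[0]):
            best = (cost, {a: round(x, 2), b: round(1 - x, 2)})
    return best

feeds = {"maize": (0.20, 0.09), "soymeal": (0.45, 0.44)}
cost, mix = least_cost_ration(feeds, requirement=0.16)
print(round(cost, 4), mix)   # cheapest mix meeting the protein requirement
```

The stochastic extension would replace the fixed protein fractions with distributions and require the constraint to hold with a chosen probability.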

Pratiksha Saxena, Neha Khanna

Resource Grid Architecture for Multi Cloud Resource Management in Cloud Computing

Multi-cloud architecture is an emerging technology in cloud computing that improves the security of data and process management over single clouds. Dynamic resource allocation is a proven technique for efficient resource management in single-cloud architecture, mitigating resource overflow and underflow problems. Resource management in a single-cloud architecture is still an evolving area that has yet to address many problems, such as resource bottlenecks, and frequent resource underflow often forces the adoption of additional resources. In this paper we propose a Resource Grid Architecture for a multi-cloud environment to allocate and manage resources dynamically in a virtual manner. The architecture introduces a resource management layer with a logical resource grid and uses virtual machines to map resources onto the physical systems of each single cloud. Experiments show that our Resource Grid Architecture manages resources efficiently across multiple clouds and supports green computing.
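The dynamic allocation a logical resource grid performs might, in minimal form, look like the following sketch; the cloud names, capacities and the most-free-capacity policy are illustrative assumptions, not the proposed architecture's actual policy.

```python
def allocate(vm_demand, clouds):
    """Place a VM demand on the cloud with the most free capacity.

    clouds: {cloud_name: free_capacity_units}. Returns the chosen
    cloud (debiting its capacity) or None when no single cloud can
    host the demand, i.e. a resource underflow."""
    candidates = {c: free for c, free in clouds.items() if free >= vm_demand}
    if not candidates:
        return None
    chosen = max(candidates, key=candidates.get)
    clouds[chosen] -= vm_demand          # update the logical resource grid
    return chosen

grid = {"cloudA": 8, "cloudB": 5, "cloudC": 2}
print(allocate(4, grid))  # cloudA
print(grid)               # {'cloudA': 4, 'cloudB': 5, 'cloudC': 2}
```

A multi-cloud grid avoids the single-cloud underflow case by retrying the placement against the remaining clouds instead of immediately acquiring new hardware.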

Chittineni Aruna, R. Siva Ram Prasad

An Efficient Framework for Building Fuzzy Associative Classifier Using High-Dimensional Dataset

Association Rule Mining (ARM) combined with fuzzy logic is used to support further data mining tasks such as classification and clustering. Traditional fuzzy ARM algorithms fail to mine rules from high-dimensional data efficiently, since they are designed for relatively few attributes or dimensions; fuzzy ARM over high-dimensional data remains a challenging problem. This paper uses a fast and economical fuzzy ARM algorithm, FAR-HD, which processes frequent itemsets using a two-phase multiple-partition approach designed for large high-dimensional datasets. The proposed algorithm extends FAR-HD and improves accuracy in terms of associative soft category labels by building a fuzzy associative classifier framework that leverages the mined fuzzy association rules. Since fuzzy association rules represent latent and dominant patterns in the given dataset, such a classifier is expected to deliver excellent accuracy, particularly in terms of fuzzy support.
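Fuzzy support, the quantity in which the classifier's accuracy is discussed, can be computed as in this sketch; the item names, transactions and membership degrees are invented for illustration, and the min t-norm is one common choice rather than necessarily the one FAR-HD uses.

```python
def fuzzy_support(itemset, transactions):
    """Fuzzy support of an itemset over fuzzy transactions.

    Each transaction maps item -> membership degree in [0, 1]; a
    transaction contributes the minimum membership (min t-norm)
    over the itemset, averaged across all transactions."""
    total = sum(min(t.get(item, 0.0) for item in itemset)
                for t in transactions)
    return total / len(transactions)

transactions = [                      # invented membership degrees
    {"high_income": 0.9, "urban": 0.7},
    {"high_income": 0.4, "urban": 1.0},
    {"high_income": 0.0, "urban": 0.8},
]
print(fuzzy_support(["high_income", "urban"], transactions))
```

Unlike crisp support, partial memberships contribute fractionally, so an itemset can be frequent even when no transaction contains it fully.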

S. Naresh, M. Vijaya Bharathi, Sireesha Rodda

A Survey on Access Control Models in Cloud Computing

Cloud computing is an emerging computing paradigm that enables users to store their data remotely on a server and obtain services on demand. In cloud computing, cloud users and cloud service providers are almost certain to be from different trust domains. Cloud computing's multi-tenancy and virtualization features pose unique security and access privilege challenges because resources are shared among potentially untrusted tenants. Hence privacy, trust and access control are critical issues in cloud computing. Access control is of vital importance in a cloud computing environment, since it governs how a user may access various cloud resources. A secure, user-enforced data access control mechanism must be provided before cloud users can safely outsource sensitive data to the cloud for storage. The heterogeneity of services in a cloud computing environment demands varying degrees of granularity in the access control mechanism together with an efficient encryption system. In this paper, we analyze various access control models for cloud computing and possible solutions to their limitations.

RajaniKanth Aluvalu, Lakshmi Muddana

