2012 | Book

Advances in Computer Science, Engineering & Applications

Proceedings of the Second International Conference on Computer Science, Engineering and Applications (ICCSEA 2012), May 25-27, 2012, New Delhi, India, Volume 1

Edited by: David C. Wyld, Jan Zizka, Dhinaharan Nagamalai

Publisher: Springer Berlin Heidelberg

Book series: Advances in Intelligent and Soft Computing

About this book

The International conference series on Computer Science, Engineering & Applications (ICCSEA) aims to bring together researchers and practitioners from academia and industry to focus on understanding computer science, engineering and applications and to establish new collaborations in these areas. The Second International Conference on Computer Science, Engineering & Applications (ICCSEA-2012), held in Delhi, India, during May 25-27, 2012, attracted many local and international delegates, presenting a balanced mixture of intellect and research both from the East and from the West. Following a rigorous peer-review process, the best submissions were selected, leading to an exciting, rich and high-quality technical conference program featuring high-impact presentations on the latest developments in various areas of computer science, engineering and applications research.

Table of Contents

Frontmatter
Automatic FAPs Determination and Expressions Synthesis

This paper presents a novel method that automatically generates facial animation parameters (FAPs) according to the MPEG-4 standard from a frontal face image. The proposed method extracts facial features such as the eyes, eyebrows, mouth and nose; these 2D features are used to evaluate facial definition parameters (FDPs) using a generic 3D face model. We determine FAPs from the difference between the displacement of FDPs in a specific expression face model and in the neutral face model. These FAPs are used to generate the six basic expressions for any person from a neutral face image. The novelty of our algorithm is that when expressions are mapped to another person it also captures expression details such as wrinkles and creases. These FAPs can also be used for expression recognition. We have tested and evaluated our proposed algorithm using a standard database, namely BU-3DFE.

Narendra Patel, Mukesh A. Zaveri
Generation of Orthogonal Discrete Frequency Coded Waveform Using Accelerated Particle Swarm Optimization Algorithm for MIMO Radar

Designing orthogonal code sets with good correlation properties can effectively improve radar performance by transmitting specially designed orthogonal waveforms in Multiple Input Multiple Output (MIMO) radar. A novel particle swarm algorithm is proposed to numerically design orthogonal Discrete Frequency Coded Waveforms and Modified Discrete Frequency Coded Waveforms (DFCWs) with good correlation properties for MIMO radar. We employ the Accelerated Particle Swarm Optimization algorithm (ACC_PSO), in which particles of a swarm communicate good positions, velocities and accelerations to each other and dynamically adjust their own position, velocity and acceleration based on the best of all particles. The simulation results show that the proposed algorithm is effective for the design of DFCW signals used in MIMO radar.

B. Roja Reddy, M. Uttara Kumari
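
The abstract above names the accelerated PSO variant but gives no formula. The following Python sketch, written for this summary rather than taken from the paper, shows the core accelerated-PSO update, in which particles are pulled toward the global best under a decaying random perturbation; the sphere function stands in for the paper's correlation-based waveform cost, and all parameter values are illustrative assumptions.

import numpy as np

def accelerated_pso(cost, dim, n_particles=30, n_iter=200,
                    alpha0=0.5, beta=0.7, bounds=(-5.0, 5.0), seed=0):
    # APSO: no per-particle velocities or personal bests; particles are
    # drawn toward the global best with a decaying random perturbation.
    rng = np.random.default_rng(seed)
    lo, hi = bounds
    x = rng.uniform(lo, hi, size=(n_particles, dim))
    g = min(x, key=cost).copy()                   # global best so far
    for t in range(n_iter):
        alpha = alpha0 * 0.97 ** t                # decaying step size
        x = (1 - beta) * x + beta * g + alpha * rng.standard_normal(x.shape)
        x = np.clip(x, lo, hi)
        cand = min(x, key=cost)
        if cost(cand) < cost(g):
            g = cand.copy()
    return g, cost(g)

# The sphere function stands in for a correlation-based waveform cost.
best, val = accelerated_pso(lambda p: float(np.sum(p ** 2)), dim=10)

In the paper's setting, the cost function would instead evaluate the auto- and cross-correlation properties of a candidate DFCW set.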
Text Independent Speaker Recognition Model Based on Gamma Distribution Using Delta, Shifted Delta Cepstrals

In this paper, we present an efficient speaker identification system based on the generalized gamma distribution. This system comprises three basic operations, namely speech feature extraction, classification, and evaluation metrics. The features extracted using MFCC are passed to shifted delta cepstral coefficients (SDC) and then applied to linear predictive coefficients (LPC) for effective recognition. To demonstrate our method, a database was generated with 200 speakers for training and around 50 speech samples for testing. Accuracy above 90% is reported.

K. Suri Babu, Srinivas Yarramalle, Suresh Varma Penumatsa
Skin Segmentation Based Elastic Bunch Graph Matching for Efficient Multiple Face Recognition

This paper is aimed at developing and combining different algorithms for face detection and face recognition to generate an efficient mechanism that can detect and recognize the facial regions of an input image.

For the detection of faces in a complex region, skin segmentation isolates the face-like regions in a complex image, and subsequent operations of morphology and template matching reject false matches to extract the facial regions.

For the recognition of the face, the image database is first converted into a database of facial segments. Implementing the technique of Elastic Bunch Graph Matching (EBGM) after skin segmentation then generates Face Bunch Graphs that accurately represent the features of an individual face, enhancing the quality of the training set. This increases the matching probability significantly.

Sayantan Sarkar
A Study of Prosodic Features of Emotional Speech

Speech is a rich source of information that conveys not only what a speaker says, but also the speaker's attitude toward the listener and toward the topic under discussion, as well as the speaker's current state of mind. Recently, increasing attention has been directed to the study of the emotional content of speech signals, and many systems have been proposed to identify the emotional content of a spoken utterance.

The focus of this research is to enhance the man-machine interface by attending to the user's speech emotion. This paper gives the results of a basic analysis of prosodic features and compares the prosodic features of various types and degrees of emotional expression in Tamil speech, based on the auditory impressions of speakers and listeners of both genders. The speech samples consist of "neutral" speech as well as speech with three types of emotion ("anger", "joy", and "sadness") of three degrees ("light", "medium", and "strong"). A listening test was conducted using 300 speech samples uttered by students aged 19-22. The prosodic parameters of the emotional speech, classified according to the auditory impressions of the subjects, are analyzed. The results suggest that the prosodic features that identify emotions and their degrees depend not only on the speaker's gender, but also on the listener's gender.

X. Arputha Rathina, K. M. Mehata, M. Ponnavaikko
Gender Classification Techniques: A Review

Face is one of the most important biometric traits. By analyzing the face we obtain a lot of information, such as age, gender, ethnicity, identity, and expression. A gender classification system uses the face of a person in a given image to tell the gender (male/female) of that person. A successful gender classification approach can boost the performance of many other applications, including face recognition and smart human-computer interfaces. This paper illustrates the general processing steps for gender classification based on frontal face images. In this study, several techniques used in the various steps of gender classification, i.e. feature extraction and classification, are also presented and compared.

Preeti Rai, Pritee Khanna
Text Dependent Voice Based Biometric Authentication System Using Spectrum Analysis and Image Acquisition

Biometrics is concerned with identifying a person based on his or her physical or behavioral traits, such as face, fingerprints, voice and iris. With the pronounced need for robust human recognition techniques in critical applications such as secure access control, international border crossing and law enforcement, biometrics is a viable technology that can be used in large-scale identity management systems. Biometric systems work under the assumption that many of the physical or behavioral traits of humans are distinctive to an individual, and that they can be precisely acquired using sensors and represented in a numerical format that supports automatic decision-making in the context of authentication. In the presented approach, an effort has been made to design a voice-based biometric authentication system with the desired aspiration level.

Somsubhra Gupta, Soutrik Chatterjee
Interactive Investigation Support System Design with Data Mining Extension

Government and non-government investigation organizations (e.g., CID, CBI) are equipped with huge frameworks of databases constantly being operated and analyzed by high-end professional officials for updating and retrieving facts, which assists these organizations with the information requirements of investigations, investigation proceedings and, finally, solving the case. This Interactive Investigation Support System is designed to support crime investigations conducted under the jurisdiction of a local police station (including district-level authorization) and related issues of the administrative bureaucratic hierarchy. Within the scope of this support system, the fields of crime investigation have been streamlined to crimes based on vehicle theft. There is an option to search among old criminals who have already committed similar crimes in the same area using data mining techniques. It will not detect the criminals; it only gives additional support for decision making. The objective of the system is to encompass these dynamic features operating within a large framework of databases. It interacts through a usability-tested Graphical User Interface and provides a user-friendly searching method that is computationally less complex, interactive and largely bug-free.

Somsubhra Gupta, Saikat Mazumder, Sourav Mondal
Pattern Recognition Approaches to Japanese Character Recognition

Optical character recognition is a crucial step in document retrieval and analysis. However, this process can be error prone, especially in the Japanese language, where text is composed from over 3000 characters, which can be classified as syllabic characters, or Kana, and ideographic characters, called Kanji. Moreover, Japanese text does not have delimiters like spaces separating different words. Also, the fact that several characters can be homomorphic, i.e. have similar shape definitions, adds to the complexity of the recognition process. In this note, a survey is conducted of some of the approaches that have been attempted to address these issues and devise schemes for Japanese character recognition in texts. Our efforts to extract Japanese text using image processing techniques are also described, and some of the results are presented.

Soumendu Das, Sreeparna Banerjee
Fast Fingerprint Image Alignment

Fingerprints are one of the various modalities used in biometrics for authentication. An important issue when designing a fingerprint-based biometric system or application is the alignment of fingerprint images before feature extraction and matching. In this paper we present a fingerprint alignment algorithm based on Principal Component Analysis (PCA). The PCA-based method is compared with existing fingerprint alignment methods in terms of the average time taken for fingerprint image alignment. Experiments show that the PCA-based method is able to align the fingerprint images in FVC2002 DB1A accurately, and that the algorithm is robust and fast.

Jaspreet Kour, M. Hanmandlu, A. Q. Ansari
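
As a rough illustration of how PCA can align a fingerprint image (a sketch under our own assumptions, not the authors' algorithm), one can rotate the image so that the principal axis of its foreground pixels becomes vertical; the threshold and rotation convention below are assumptions.

import numpy as np
from scipy import ndimage

def pca_align(img, fg_threshold=128):
    # Rotate the print so the principal axis of its dark (ridge) pixels
    # is vertical; the sign convention may need flipping depending on
    # the image origin.
    ys, xs = np.nonzero(img < fg_threshold)
    pts = np.column_stack([xs, ys]).astype(float)
    pts -= pts.mean(axis=0)
    eigvals, eigvecs = np.linalg.eigh(np.cov(pts, rowvar=False))
    major = eigvecs[:, np.argmax(eigvals)]        # dominant direction
    angle = np.degrees(np.arctan2(major[1], major[0]))
    return ndimage.rotate(img, angle - 90, reshape=False, cval=255)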
Colour and Texture Feature Based Hybrid Approach for Image Retrieval

Content Based Image Retrieval (CBIR) is a technique that works on images and extracts relevant images in response to a query. A novel hybrid two-stage universal CBIR technique using both colour and texture feature extraction is proposed in this paper. In the first stage, for colour feature extraction, colour moments up to the fourth order are extracted and used to derive the respective histograms, which form the colour feature vector. In the second stage, for texture feature extraction, the CCM (Colour Co-occurrence Matrix) technique employed takes into account the correlation between the RGB colour bands in all eight directions while computing the texture features. In every stage the distance between the query image and each image in the database is calculated using a relative distance measure. The resultant distance between the query image and each database image is calculated using a weighted distance classifier. Thus, a hybrid fusion method is achieved that performs better than other colour-spatial based methods and promises to give more relevant output to the user.

Dipti Jadhav, Gargi Phadke, Satish Devane
Application of Software Defined Radio for Noise Reduction Using Empirical Mode Decomposition

Software Defined Radios (SDR) are aimed at reducing the effort required specifically in wireless communication. Many hardware devices are currently used for communicating via radio waves; the aim of SDR is to replace as much of this hardware as possible with software, which results in great flexibility and portability. This concept has opened new windows into the world of digital communication, and today there exist many flavors of SDR. This paper focuses on the open source GNU Radio and its capabilities. The GNU Radio project serves as a reference for experiments in the area of signal processing and communications. This paper deals with utilizing the capabilities of software radios to improve the quality of the incoming signal. Our objective was to improve the received signal by reducing noise and thus enhancing the overall communication quality. We propose the use of the Empirical Mode Decomposition (EMD) method embedded in GNU Radio. The idea presented here is to include EMD functionality in the GNU Radio toolkit so as to reduce errors for better communication. We have integrated Empirical Mode Decomposition into GNU Radio and found improvements in the simulated environment.

Sapan H. Mankad, S. N. Pradhan
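
For readers unfamiliar with EMD, the following minimal Python sketch shows the sifting idea behind it and a crude denoising step that subtracts the first, highest-frequency IMF; it ignores boundary effects and stopping criteria, and is our illustration rather than the GNU Radio block the authors built.

import numpy as np
from scipy.interpolate import CubicSpline
from scipy.signal import argrelextrema

def sift_once(x):
    # One sifting pass: subtract the mean of the upper and lower
    # cubic-spline envelopes (boundary effects ignored in this sketch).
    t = np.arange(len(x))
    maxima = argrelextrema(x, np.greater)[0]
    minima = argrelextrema(x, np.less)[0]
    if len(maxima) < 4 or len(minima) < 4:
        return x                                  # too few extrema: residue
    upper = CubicSpline(maxima, x[maxima])(t)
    lower = CubicSpline(minima, x[minima])(t)
    return x - (upper + lower) / 2.0

def emd_denoise(x, n_sift=8):
    # Crude denoising: repeatedly sift to estimate the first
    # (highest-frequency) IMF, then subtract it from the signal,
    # assuming that IMF is dominated by noise.
    h = x.copy()
    for _ in range(n_sift):
        h = sift_once(h)
    return x - h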
An Approach to Detect Hard Exudates Using Normalized Cut Image Segmentation Technique in Digital Retinal Fundus Image

Diabetic retinopathy is a disease commonly found in diabetes mellitus patients. It causes severe damage to the retina and may lead to complete or partial visual loss. As the changes caused by the disease are irreversible in nature, the disease must be detected in its early stages to prevent visual loss. One of the most important signs of the presence of diabetic retinopathy in diabetes mellitus patients is exudates, but detecting exudates in the early stages of the disease is extremely difficult by visual inspection alone. An efficient automated computerized system, however, can detect the disease at a very early stage. In this paper one such method is discussed.

Diptoneel Kayal, Sreeparna Banerjee
Latency Study of Seizure Detection

Epilepsy is a physical condition that occurs when there is a sudden, brief change in the normal working of the brain. At such times, the brain cells are unable to function properly and the level of consciousness, movement, etc. may be affected. These physical changes occur due to the hyper-synchronous firing of neurons within the brain. Most of the existing methods to analyze epilepsy depend on visual inspection of patients' EEG recordings by experts, who are very few in number. This approach is also time-consuming, since EEG recordings produce very lengthy data. This makes automatic seizure detection necessary. In this study a method to detect the onset of seizures is proposed in which the latency of onset detection has been greatly decreased. The proposed method detected the onset of seizures with a mean latency of 0.70 seconds when applied to the CHB-MIT scalp EEG database.

Yusuf U. Khan, Omar Farooq, Priyanka Sharma, Nidal Rafiuddin
Analysis of Equal and Unequal Transition Time Effects on Power Dissipation in Coupled VLSI Interconnects

Packing more circuits on a chip is achieved through aggressive device scaling and/or an increase in chip size. This results in complex geometry of the interconnect wires on-chip. High density chips have introduced problems like interconnect noise and power dissipation. The CMOS technology is best known for making ICs owing to its low static power dissipation and ease of integration compared to BJT technology. As CMOS technology has moved below sub-micron levels, the power consumption per unit chip area has increased manyfold. Low power consumption leads to longer battery life and less heat generation in the circuit. The overall performance of a chip is largely dependent on interconnects. This paper addresses the impact of equal and unequal input transition times on power consumption in coupled interconnects. To demonstrate the effects, a model of two distributed RLC lines coupled inductively and capacitively is considered. Each interconnect line is 4 mm long and terminated by a capacitive load of 30 fF. The power dissipation is measured for dynamically switching inputs.

Devendra Kumar Sharma, Brajesh Kumar Kaushik, Richa K. Sharma
Image Analysis of DETECHIP® – A Molecular Sensing Array

Several image analysis techniques were applied to a colorimetric chemical sensor array called DETECHIP®. Analytes such as illegal and over-the-counter drugs can be detected and identified by digital image analysis. JPEG images of DETECHIP® arrays with and without analytes were obtained using a camera and a simple flatbed scanner. Color information was obtained by measuring red-green-blue (RGB) values with image software such as GIMP, Photoshop, and ImageJ. Several image analysis methods were evaluated for the analysis of both photographs and scanned images of DETECHIP®. We determined that, compared to photographs, scanned images of DETECHIP® gave better results through the elimination of the parallax and shading that lead to inconsistent results. Furthermore, results using an ImageJ macro technique showed improved consistency versus the previous method, in which human eyesight was used as the detection method.

Marcus Lyon, Mark V. Wilson, Kerry A. Rouhier, David J. Symonsbergen, Kiran Bastola, Ishwor Thapa, Andrea E. Holmes, Sharmin M. Sikich, Abby Jackson
A Gaussian Graphical Model Based Approach for Image Inpainting

Digital image inpainting means the reconstruction of small damaged portions of an image. In this paper, we propose an algorithm for digital image inpainting that combines a pixel-diffusing technique with a user interaction mechanism. In our approach, the user manually specifies important missing structure information by extending a few curves or line segments from the known region into the unknown regions. Our approach synthesizes image patches along these user-specified curves in the unknown region using patches selected around the curves in the known region. We call this step structure propagation. After completing structure propagation, we fill in the remaining unknown regions using a Gaussian Graphical Model, which is MRF based. The experimental results show that our approach is reasonable and efficient. In addition, our method is very simple to implement and fast.

Krishnakant Verma, Mukesh A. Zaveri
A Survey on MRI Brain Segmentation

With the growing research on medical image segmentation, it is essential to categorize the research outcomes and provide readers with an overview of the existing segmentation techniques for medical images. In this paper, different image segmentation techniques applied to magnetic resonance brain images are reviewed. The selection of papers includes sources from image processing journals, conferences, books, dissertations and theses. The conceptual details of the algorithms are explained, and mathematical details are avoided for simplicity. Both broad and detailed categorizations of the reviewed segmentation techniques are provided. The state-of-the-art research is presented with emphasis on the developed algorithms and the image properties they use. The methods described are not always mutually independent, so their interrelationships are also stated. Finally, conclusions are drawn summarizing commonly used techniques and their complexities in application.

M. C. Jobin Christ, R. M. S. Parvathi
Can Ear and Soft-Biometric Traits Assist in Recognition of Newborn?

Missing, swapping, mixing, and illegal adoption of newborns is a global challenge, and research done to solve this issue is minimal and rarely reported in the literature. Most biometric systems developed are for adults, and very few of them address the issue of newborn identification. The ear of a newborn is a perfect source of data for passive identification, as newborns are highly non-cooperative users of biometrics. The four characteristics of ear biometrics (universality, uniqueness, permanence and collectability) make it a very promising biometric trait for the identification of newborns. Further, the use of soft-biometric data like gender, blood group, height and weight along with the ear enhances the accuracy of newborn identification. The objective of this paper is to demonstrate the concept of using ear and soft-biometric recognition for the identification of newborns. The main contributions of the research are: (a) the preparation of an ear and soft-biometric database of newborns, and (b) the fusion of ear and soft-biometric data for the identification of 210 newborns, which results in an improvement of approximately 5.59% over the primary biometric system, i.e. the ear.

Shrikant Tiwari, Aruni Singh, Sanjay Kumar Singh
Multi Segment Histogram Equalization for Brightness Preserving Contrast Enhancement

The histogram equalization (HE) method has proved to be a simple and highly effective technique for contrast enhancement of digital images, but it does not preserve the brightness and natural look of images. To overcome this problem, several bi- and multi-histogram equalization methods have been proposed. Among them, the Bi-HE methods significantly enhance the contrast and may preserve the brightness, but they destroy the natural look of the image. On the other hand, Multi-HE methods maintain the natural look of the image at the cost of either its brightness or its contrast. In this paper, we propose a Multi-HE method for contrast enhancement of natural images that preserves both their brightness and their natural look. The proposed method decomposes the histogram of an input image into multiple segments, and then HE is applied to each segment independently. Simulation results for several test images show that the proposed method enhances the contrast while preserving the brightness and natural look of the images.

Mohd. Farhan Khan, Ekram Khan, Z. A. Abbasi
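
A minimal Python sketch of the multi-segment idea (our illustration; the paper's segmentation rule and recombination details may differ): split the gray levels into equal-population segments and equalize each segment within its own range, which keeps the overall brightness closer to the input than global HE.

import numpy as np

def multi_segment_he(img, n_segments=4):
    # Assumes an 8-bit grayscale image; segment boundaries are placed
    # at population quantiles of the histogram.
    flat = img.ravel().astype(np.float64)
    edges = np.quantile(flat, np.linspace(0.0, 1.0, n_segments + 1))
    edges[0], edges[-1] = 0.0, 255.0
    out = np.zeros_like(flat)
    for lo, hi in zip(edges[:-1], edges[1:]):
        mask = (flat >= lo) & (flat <= hi)
        vals = flat[mask]
        if vals.size == 0 or hi <= lo:
            continue
        hist, _ = np.histogram(vals, bins=256, range=(lo, hi))
        cdf = np.cumsum(hist) / vals.size          # per-segment CDF
        idx = np.clip(((vals - lo) / (hi - lo) * 255).astype(int), 0, 255)
        out[mask] = lo + cdf[idx] * (hi - lo)      # equalize inside [lo, hi]
    return out.reshape(img.shape).astype(np.uint8)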
Various Implementations of Advanced Dynamic Signature Verification System

This paper presents research on various implementations of advanced dynamic signature verification, including error rates (the false rejection rate and false acceptance rate), the size of the signature verification engine, the size of the characteristic vectors of a signature, the ability to distinguish similar signatures, and so on. We suggest a comparison method and present the performance results of the signature verification system. We have also implemented the system as a web client/server with Java technology and on PC (MS Windows), PDA (WinCE) and smartphones.

Jin Whan Kim
Performance of Face Recognition Algorithms on Dummy Faces

Face recognition is becoming increasingly important in the contexts of computer vision, neuroscience, psychology, surveillance, credit card fraud detection, pattern recognition, neural networks, content-based video processing, assistive devices for the visually impaired, etc. The face is a strong biometric trait for identification, and hence criminals always try to hide their faces by different artificial means such as plastic surgery, disguises and dummies. The availability of a comprehensive face database is crucial to test the performance of face recognition algorithms. However, while existing publicly available face databases contain face images with a wide variety of covariates such as pose, illumination, gestures and face occlusions, no dummy face database is available in the public domain. The contributions of this paper are: i) preparation of a dummy face database of 50 subjects, ii) testing of face recognition algorithms on the dummy face database, and iii) critical analysis of four algorithms on the dummy face database.

Aruni Singh, Shrikant Tiwari, Sanjay Kumar Singh
Locally Adaptive Regularization for Robust Multiframe Super Resolution Reconstruction

Super resolution reconstruction (SRR) is a post-processing technique to correct the degradation of acquired images due to warping, blur, downsampling and noise. In this paper, the image is modeled as a Markov random field (MRF) and we propose a fuzzy logic filter based on gradient potential (FL) to distinguish between edge and noisy pixels. Based on the pixel classification, Tikhonov regularization (TR) or bilateral total variation (BTV) is adopted as a prior in maximum a posteriori (MAP) estimation. Such priors are imperative to obtain a stable solution. Tukey's biweight norm (TBN) is adopted for removing outliers. The proposed approach is demonstrated on standard test images. Experimental results indicate that the proposed approach performs quite well in terms of visual evaluation and quantitative measurements.

S. Chandra Mohan, K. Rajan, R. Srinivasan
Improved Watermark Extraction from Audio Signals by Scaling of Internal Noise in DCT Domain

Scaling of internal noise in the discrete cosine transform (DCT) domain is presented for the copyright protection of audio signals. A watermark logo is embedded into the most prominent peaks of the highest-energy segment of the audio DCT coefficients. Tuning the DCT coefficients of the watermarked signal by noise-induced resonance improves the authenticity of the watermarked signal. This scaling is produced by noise-induced resonance, generally known as dynamic stochastic resonance (DSR). DSR utilizes the noise introduced during different signal processing attacks and is induced here as an iterative process, due to which the effect of noise is suppressed and the hidden information is enhanced. The response of the proposed extraction scheme suggests increased robustness against various attacks such as noise addition, cropping, re-sampling, re-quantization, MP3 compression, and echo. Comparison with existing DCT, DWT and SVD techniques shows better performance in terms of the correlation coefficient and visual quality of the extracted watermark.

Rajib Kumar Jha, Badal Soni, Rajlaxmi Chouhan, Kiyoharu Aizawa
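
The embedding step described in the abstract can be pictured with the following Python sketch, which marks the most prominent DCT peaks of the highest-energy audio segment by scaling them per watermark bit; the DSR-based extraction stage is not shown, and the segment length and strength values are illustrative assumptions.

import numpy as np
from scipy.fft import dct, idct

def embed_watermark(audio, bits, seg_len=4096, strength=0.05):
    # seg_len and strength are assumed values, not the paper's.
    out = audio.astype(float)                       # work on a copy
    segs = out[:len(out) // seg_len * seg_len].reshape(-1, seg_len)
    k = int(np.argmax((segs ** 2).sum(axis=1)))     # highest-energy segment
    coeffs = dct(segs[k], norm='ortho')
    peaks = np.argsort(np.abs(coeffs))[-len(bits):] # most prominent peaks
    for bit, p in zip(bits, peaks):
        coeffs[p] *= (1 + strength) if bit else (1 - strength)
    segs[k] = idct(coeffs, norm='ortho')            # writes through to `out`
    return out, k, peaks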
Performance of Adders with Logical Optimization in FPGA

Addition is an indispensable operation for any high speed digital system, digital signal processing or control system; therefore, careful optimization of the adder is of the greatest importance. This optimization can be attained at two levels: circuit optimization or logic optimization. In circuit optimization the sizes of transistors are manipulated, whereas in logic optimization the Boolean equations are rearranged (or manipulated) to optimize speed, area and power consumption. In this work, technology-independent logic optimization is used to design a 1-bit full adder with 20 different Boolean expressions, and its performance is analyzed in terms of transistor count, delay and power dissipation using Tanner EDA with TSMC MOSIS 250nm technology. All the Boolean expressions are realized in CMOS logic. From this analysis the optimized equation is chosen to construct a multiplexer-based full adder circuit. These logic-optimized multiplexer-based adders are incorporated into selected existing adders, namely the ripple carry adder, carry look-ahead adder, carry skip adder, carry select adder, carry increment adder and carry save adder, and their performance is analyzed in terms of area (slices used) and maximum combinational path delay as a function of size. The target FPGA device chosen for the implementation of these adders was the Xilinx Spartan3E XC3S500-5FG320, using Xilinx ISE 12.1. Each adder type was implemented with bit sizes of 8, 16, 32 and 64 bits. This variety of sizes provides more insight into the performance of each adder in terms of area and delay as a function of size.

R. Uma, P. Dhavachelvan
Robust Iris Templates for Efficient Person Identification

Iris recognition is seen as a highly reliable biometric technology, but its performance is severely impacted when encountering irises captured in realistic conditions. The selection of the feature subset and the classification method is an important issue for iris biometrics. In this paper we propose new methods for feature extraction and template creation during enrollment to improve the performance of iris recognition systems. The experiments are based on storing: i) multiple templates (a template group) per user, ii) a single template obtained by averaging multiple templates, and iii) a single template calculated from multiple templates using Direct Linear Discriminant Analysis (DLDA). We used the CASIA Iris Interval database for our experiments. Experiments report a significant improvement in the performance of iris recognition.

Abhishek Gangwar, Akanksha Joshi, Renu Sharma, Zia Saquib
Thesaurus Based Web Searching

Search engine technology has become quite popular for helping users seek information available on the web. The success of a search system is determined by the quality and efficiency of the search results. There may be very good items on the search topic in other languages, but a search engine will generally retrieve items of only one language. Most of these search engines use pattern search, which is not efficient. In this paper we present a tool that addresses this problem. We discuss the work carried out in developing an efficient tool that retrieves all the items in the database relevant to the search term, not just exact term matches. This tool retrieves all the synonym matches from both Telugu and English.

K. V. N. Sunitha, A. Sharada
An Adaptive Design Pattern for Genetic Algorithm Based Autonomic Computing System

The need for adaptability in software is growing, driven in part by the emergence of pervasive and autonomic computing. In many cases, it is desirable to enhance existing programs with adaptive behavior, enabling them to execute effectively in dynamic environments. Increasingly, software systems should self-adapt to satisfy new requirements and environmental conditions that may arise after deployment. Due to their high complexity, adaptive programs are difficult to specify, design, verify, and validate. In this paper we propose an approach for a Genetic Algorithm based autonomic system using design patterns. The proposed pattern satisfies the properties of an autonomic system. This pattern is an amalgamation of different design patterns: Observer, Strategy, Singleton, Adapter and Thread-per-Connection. For monitoring we use the Observer pattern; for decision making, the Strategy pattern. The Adapter, Singleton and Thread-per-Connection patterns are used for executing the Genetic Algorithm population. The proposed pattern distributes the Genetic Algorithm population to different clients for execution; the main objective of the proposed system is to reduce the load on the server. This design pattern solves multi-objective optimization problems using a Genetic Algorithm. The pattern is described using a Java-like notation for the classes and interfaces, and simple UML class and sequence diagrams are depicted.

B. Naga Srinivas Repuri, Vishnuvardhan Mannava, T. Ramesh
Cross-Layer Design in Wireless Sensor Networks

The literature on cross-layer protocols, protocol improvements, and design methodologies for wireless sensor networks (WSNs) is reviewed and a taxonomy is proposed. The communication protocols devised for WSNs that focus on cross-layer design techniques are reviewed and classified based on the network layers they aim to replace in the classical open system interconnection (OSI) network stack. Furthermore, systematic methodologies for the design of cross-layer solutions for sensor networks, formulated as resource allocation problems in the framework of non-linear optimization, are discussed. Open research issues in the development of cross-layer design methodologies for sensor networks are discussed and possible research directions are indicated. Finally, possible shortcomings of cross-layer design techniques, such as lack of modularity, decreased robustness, difficulty in system enhancement, and risk of instability, are discussed, and precautionary guidelines are presented.

S. Jagadeesan, V. Parthasarathy
Semantic Based Category-Keywords List Enrichment for Document Classification

In this paper we present a text categorization technique that extracts semantic features of documents to generate a compact set of keywords and uses the information obtained from those keywords to perform text classification. The algorithm reduces the dimensionality of the document representation using overlapping semantics. A keyword-category relationship matrix then computes the extent of membership of the documents in various predefined input categories. The category of the document is derived from these membership metrics. Wikipedia is also used for the purpose of category-list enrichment. The proposed work points in a new direction for document classification in web applications.

Upasana Pandey, S. Chakraverty, Richa Mihani, Ruchika Arya, Sonali Rathee, Richa K. Sharma
Selection of Fluid Film Journal Bearing: A Fuzzy Approach

This paper presents a selection procedure for fluid film journal bearings that incorporates a fuzzy approach. A fuzzy-based selection model is proposed which can be applied to the selection of hydrostatic, hydrodynamic and hybrid journal bearings. Selection criteria are formulated for the choice of space requirement, cost of the bearing and load carrying capacity of the corresponding journal bearing. This approach provides a third dimension to the existing method of bearing selection; it is dynamic and may be further refined to address individual needs. Therefore a fuzzy logic approach is used for decision making.

V. K. Dwivedi, Satish Chand, K. N. Pandey
Implementation of New Biorthogonal IOFDM

The proposed biorthogonal interleaved OFDM system is a multiuser, multicarrier technique that has been recognized as an excellent method for high-speed bidirectional wireless mobile communication. In a conventional interleaved OFDM system, a convolutional encoder is used as the channel encoder, but this leads to bandwidth inefficiency and also reduces the throughput of transmission and reception. The proposed biorthogonal interleaved OFDM system has a baud rate of 9600 kbps. The system is ultimately designed for bandwidth optimization, and it also supports multiuser transmission and reception in an interleaved OFDM system.

A. V. Meenakshi, R. Kayalvizhi, S. Asha
Application of Real Valued Neuro Genetic Algorithm in Detection of Components Present in Manhole Gas Mixture

This article deals with the implementation of an intelligent system for the detection of the components present in a manhole gas mixture. The detection of manhole gas is important because the mixture contains many poisonous gases, namely hydrogen sulfide (H2S), ammonia (NH3), methane (CH4), carbon dioxide (CO2), nitrogen oxides (NOx), and carbon monoxide (CO). Even a short exposure to any of these components endangers human life. A gas sensor array is used for the recognition of multiple gases simultaneously. At any instant the manhole gas mixture may contain many hazardous gas components, so it is wise to use a specific gas sensor for each gas component in the sensor array. The use of multiple gas sensors and the presence of multiple gases together result in cross-sensitivity. We implement a real-valued neuro-genetic algorithm to unravel this multiple gas detection issue.

Varun Kumar Ojha, Paramarta Dutta, Hiranmay Saha, Sugato Ghosh
Concept Adapting Real-Time Data Stream Mining for Health Care Applications

Developments in sensors, the miniaturization of low-power microelectronics, and wireless networks are becoming a significant opportunity for improving the quality of health care services. Vital signals like ECG, EEG, SpO2, BP, etc. can be monitored through wireless sensor networks and analyzed with the help of data mining techniques. These real-time signals are continuous in nature and change abruptly, hence there is a need for efficient, concept-adapting, real-time data stream mining techniques for making intelligent health care decisions online. Because of the high speed and huge volume of the data in data streams, traditional classification technologies are no longer applicable. The most important challenge is to solve the real-time data stream mining problem with 'concept drift' efficiently. This paper presents the state of the art in this field of growing vitality, introduces methods for detecting concept drift in data streams, and then gives a significant summary of existing approaches to the problem of concept drift. The work is focused on applying these real-time stream mining algorithms to vital signals of the human body in a health care environment.

Dipti D. Patil, Jyoti G. Mudkanna, Dnyaneshwar Rokade, Vijay M. Wadhai
Efficient Domain Search for Fractal Image Compression Using Feature Extraction Technique

Fractal image compression is a lossy compression technique developed in the early 1990s. It makes use of the local self-similarity property existing in an image and finds a contractive affine mapping (fractal transform) T such that the fixed point of T is close to the given image in a suitable metric. It has generated much interest due to its promise of high compression ratios with good decompression quality. Its other advantage is its multi-resolution property, i.e. an image can be decoded at higher or lower resolutions than the original without much degradation in quality. However, the encoding time is computationally intensive [8].

Image encoding based on the fractal block-coding method relies on the assumption that image redundancy can be efficiently exploited through block self-transformability. It has shown promise in producing high-fidelity, resolution-independent images, and the low complexity of the decoding process also suggests its use in real-time applications. The high encoding time, in combination with patents on the technology, has unfortunately discouraged its adoption.

In this paper, we propose an efficient domain search technique using feature extraction for fractal image encoding, which reduces the encoding-decoding time and improves the quality of the compressed image.

Amol G. Baviskar, S. S. Pawale
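
The key idea, pruning the domain pool by cheap block features before the expensive affine fit, can be sketched in Python as follows (our illustration, with mean and standard deviation assumed as the features; the paper's feature set may differ):

import numpy as np

def block_features(block):
    # Cheap descriptors used only to prune the search.
    return np.array([block.mean(), block.std()])

def best_domain(range_block, domains, tol=10.0):
    # Compare the range block only against domain blocks whose features
    # are within `tol`, instead of scanning the whole pool.
    rf = block_features(range_block)
    best, best_err = None, np.inf
    for i, d in enumerate(domains):        # d: domain block, same size
        if np.abs(block_features(d) - rf).max() > tol:
            continue                       # pruned without a full fit
        s, o = np.polyfit(d.ravel(), range_block.ravel(), 1)  # r ~ s*d + o
        err = float(((s * d + o - range_block) ** 2).sum())
        if err < best_err:
            best, best_err = i, err
    return best, best_err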
Case Study of Failure Analysis Techniques for Safety Critical Systems

Safety-critical systems are built upon complex software and are difficult to maintain. These systems must effectively deal with the defects identified by analyzing their failures in order to keep the system free from hazards. Any chance of human injury or death can be avoided by thoroughly verifying the safety of the critical software embedded in any safety system. In this paper, an analysis of different failure analysis techniques, namely Failure Modes and Effects Analysis (FMEA), Failure Modes, Effects and Criticality Analysis (FMECA) and Fault Tree Analysis (FTA), is carried out with dependability as the critical parameter. The risk involved in a safety-critical system is analyzed with a case study of the remote monitoring of a patient with a pacemaker. The main observations are: i) failure mode classification of the software at every stage, ii) safety-critical parameter evaluation, iii) indication of defensive measures against the severity of hazards, iv) correlation of FMEA, FMECA and FTA with the computed critical data, and v) recommendation of an appropriate failure analysis method for pacemaker operation to ensure safety.

Aiswarya Sundararajan, R. Selvarani
Implementation of Framework for Semantic Annotation of Geospatial Data

A framework is used to provide effective development, exchange, and use of geospatial as well as non-geospatial data. In a previous paper on the design of a framework for semantic annotation of geospatial data, the architecture of the framework and its services was discussed briefly. This paper gives a detailed explanation of the implementation of some of the services provided by the framework.

Preetam Naik, Madhuri Rao, S. S. Mantha, J. A. Gokhale
Optimized Large Margin Classifier Based on Perceptron

A larger margin of the separating hyperplane reduces the generalization error of a classifier. The proposed linear classification algorithm implements the classical perceptron algorithm with a margin, reducing generalization error by maximizing the margin of the separating hyperplane. The algorithm shares the same update rule with the perceptron and converges in a finite number of updates to solutions possessing any desirable fraction of the margin. This solution is then further optimized to obtain the maximum possible margin. The algorithm takes advantage of data that are linearly separable. Experimental results show a noticeable increase in the margin; some preliminary experimental results are briefly discussed.

Hemant Panwar, Surendra Gupta
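
The update rule the abstract refers to is the classical perceptron step, triggered whenever a point's functional margin falls below a target value rather than only on misclassification. A minimal Python sketch, assuming labels in {-1, +1} and a fixed margin parameter (both assumptions of this illustration):

import numpy as np

def margin_perceptron(X, y, gamma=0.5, lr=1.0, epochs=100):
    # Update whenever the functional margin is below gamma, not only on
    # misclassification; this pushes toward a larger-margin separator.
    w = np.zeros(X.shape[1])
    b = 0.0
    for _ in range(epochs):
        updated = False
        for xi, yi in zip(X, y):
            if yi * (w @ xi + b) <= gamma:        # inside the margin
                w += lr * yi * xi
                b += lr * yi
                updated = True
        if not updated:                           # every point clears gamma
            break
    return w, b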
Study of Architectural Design Patterns in Concurrence with Analysis of Design Pattern in Safety Critical Systems

Real-time safety-critical systems are the focus here, as they play a pivotal role in rendering software safety, which strengthens hardware reliability to prevent hazardous failures. These problems can be addressed through the creation of quality design patterns that effectively place the safety-critical software parameters at the architectural level. This paper highlights the functionality of each design pattern and its quantitative effect on the safety quality factors. We work on design patterns that are compliant with safety-critical systems with respect to safety parameters and the quality attributes of various design patterns. Furthermore, we anticipate the additional quality attributes and features needed to formulate a distinguished pattern for real-time safety-critical systems. The design patterns used in various applications are emphasized and characterized. More specifically, the existing design patterns supporting safety-critical systems are correlated to formulate patterns based on the safety factors and quality attributes.

Feby A. Vinisha, R. Selvarani
A Novel Scheme to Hide Identity Information in Mobile Captured Images

A new research area concerning the application of steganography to hiding source identity information in mobile camera captured images is emerging. The work reported so far has focused on using MMS [Multimedia Messaging Service] and SMS [Short Message Service] as carrier media for covert communication and data security, and the techniques employed for data hiding lack a suitable combination of cryptography and steganography. The present paper proposes a novel scheme to hide source identity information (the IMEI [International Mobile Equipment Identity] and IMSI [International Mobile Subscriber Identity] numbers) in mobile camera captured images. To achieve this, an application is developed that captures images with the mobile camera and hides the IMEI and IMSI numbers in the raw images before compressing them in PNG [Portable Network Graphics] format. The application encrypts the IMEI and IMSI numbers before hiding them. It uses a custom key-based algorithm for hiding these numbers randomly inside an image, which ensures high security of the hidden data. Runtime performance analysis of the above technique reveals that the computational lag due to encryption and steganography is minuscule. The technique is found feasible to run on actual mobile devices and can help identify the source of an anonymous mobile captured image.

Anand Gupta, Ashita Dadlani, Rohit Malhotra, Yatharth Bansal
A License Plate Detection Algorithm Using Edge Features

License Plate Detection (LPD) is a main step in an intelligent traffic management system. Many techniques have been proposed to extract the license plates of different vehicles under different conditions, but finding a technique that provides good accuracy with a good response time is difficult. In this paper, we propose a technique for LPD. The proposed technique uses edge features to find the row positions of the license plate. To find the column positions, we locate the small blue color part at the left of the license plate within the determined row positions. Our experimental results for different image databases show that an accuracy of 96.6 percent can be achieved.

Mostafa Ayoubi Mobarhan, Asadollah Shahbahrami, Saman Parva, Mina Naghash Asadi, Atefeh Ahmadnya Khajekini
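
The row-localization step can be illustrated with a short Python sketch (ours, with an assumed plate-band height): license plate rows are dense in vertical edges, so the horizontal projection of a vertical-edge map peaks at the plate.

import numpy as np
from scipy import ndimage

def plate_rows(gray, band=40):
    # `band` is an assumed plate height in pixels.
    edges = np.abs(ndimage.sobel(gray.astype(float), axis=1))  # vertical edges
    profile = edges.sum(axis=1)                   # per-row edge energy
    profile = ndimage.uniform_filter1d(profile, size=band)
    r = int(np.argmax(profile))                   # densest edge band
    return max(r - band // 2, 0), min(r + band // 2, gray.shape[0])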
Human and Automatic Evaluation of English to Hindi Machine Translation Systems

Machine Translation Evaluation is the most formidable activity in Machine Translation Development. We present the MT evaluation results of some of the machine translators available online for English-Hindi machine translation. The systems are measured on automatic evaluation metrics and human subjectivity measures.

Nisheeth Joshi, Hemant Darbari, Iti Mathur
A Novel Genetic Algorithm Based Method for Efficient QCA Circuit Design

In this paper we propose an efficient method based on Genetic Algorithms (GAs) to design quantum cellular automata (QCA) circuits with the minimum possible number of gates. The basic gates used to design these circuits are 2-input and 3-input NAND gates in addition to the inverter gate. Due to the use of these two types of NAND gates and their contradictory effects, a new fitness function has been defined. In addition, in this method we use a type of mutation operator that can significantly help the GA avoid local optima. The results show that the proposed approach is very efficient in deriving NAND-based QCA designs.

Mohsen Kamrani, Hossein Khademolhosseini, Arman Roohi, Poornik Aloustanimirmahalleh
Adaptation of Cognitive Psychological Framework as Knowledge Explication Strategy

The differences between the way an indigenous expert and a scientific expert understand a natural resource, each incomplete in itself, can become complementary and a major strength in achieving sustainability. An integrated system approach is the best practice for managing natural resources. In knowledge management, integration is viewed in terms of horizontal and vertical dimensions. The study presents the possibility of knowledge sharing and integration between indigenous and scientific experts in a multi-level, multi-criteria decision making environment using a cognitive psychological model of knowledge discovery, the Johari Window model, for knowledge sharing in the management of natural resources. Knowledge integration facilitates a higher level of knowledge explication, and the Johari Window presents a framework for it. The advantage of this model is that it takes into account the problem of 'the fourth quadrant', where very large, totally unexplored, unpredicted outliers lie. The knowledge system becomes more robust, efficient and sustainable by narrowing the knowledge gap between the experts.

S. Maria Wenisch, A. Ramachandran, G. V. Uma
Comparing Fuzzy-C Means and K-Means Clustering Techniques: A Comprehensive Study

Clustering techniques are unsupervised learning methods for grouping similar data and separating dissimilar data, and are therefore popular for various data mining and pattern recognition purposes. However, their performance is data dependent, so choosing the right clustering technique for a given dataset is a research challenge. In this paper, we have tested the performance of a soft clustering technique (Fuzzy C-means, or FCM) and a hard clustering technique (K-means, or KM) on the Iris (150 x 4), Wine (178 x 13) and Lens (24 x 4) datasets. The distance measure is the heart of any clustering algorithm for computing the similarity between any two data points. Two distance measures, Manhattan (MH) and Euclidean (ED), are used to note how they influence the overall clustering performance. The performance has been compared based on seven parameters: (i) sensitivity, (ii) specificity, (iii) precision, (iv) accuracy, (v) run time, (vi) average intra-cluster distance (i.e. compactness of the clusters) and (vii) inter-cluster distance (i.e. distinctiveness of the clusters). Based on the experimental results, the paper concludes that both KM and FCM perform well; however, KM outperforms FCM in terms of speed. The FCM-MH combination produces the most compact clusters, while KM-ED yields the most distinct clusters.

Sandeep Panda, Sanat Sahu, Pradeep Jena, Subhagata Chattopadhyay
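
For reference, here are minimal NumPy versions of the two compared algorithms with pluggable Manhattan/Euclidean distances (a sketch; the initialization scheme, empty-cluster handling and the seven evaluation parameters from the paper are omitted):

import numpy as np

def manhattan(a, b):
    return np.abs(a - b).sum(axis=-1)

def euclidean(a, b):
    return np.sqrt(((a - b) ** 2).sum(axis=-1))

def kmeans(X, k, dist=euclidean, iters=100, seed=0):
    # Hard clustering: each point belongs to exactly one cluster.
    rng = np.random.default_rng(seed)
    C = X[rng.choice(len(X), size=k, replace=False)]
    for _ in range(iters):
        labels = dist(X[:, None, :], C[None, :, :]).argmin(axis=1)
        # (assumes no cluster goes empty, which holds for small sets)
        C = np.array([X[labels == j].mean(axis=0) for j in range(k)])
    return labels, C

def fcm(X, k, dist=euclidean, m=2.0, iters=100, seed=0):
    # Soft clustering: U[i, j] is the membership of point i in cluster j.
    rng = np.random.default_rng(seed)
    U = rng.random((len(X), k))
    U /= U.sum(axis=1, keepdims=True)
    for _ in range(iters):
        W = U ** m
        C = (W.T @ X) / W.sum(axis=0)[:, None]        # fuzzy centroids
        D = dist(X[:, None, :], C[None, :, :]) + 1e-12
        U = 1.0 / D ** (2.0 / (m - 1.0))              # membership update
        U /= U.sum(axis=1, keepdims=True)
    return U.argmax(axis=1), C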
Comparative Analysis of Diverse Approaches for Air Target Classification Based on Radar Track Data

Air target classification in a hostile scenario is a decisive factor for threat evaluation and weapon assignment. Stealth technology denies any high-frequency-based regime for such classification, but it is observed that the kinematics of an air target is one thing that cannot be deceived. The present study attempts to ascertain an appropriate classification algorithm. On the basis of certain significant feature vectors, the classifier assigns the data set of an air target to a target class. The feature vectors are derived from the radar track data using Matlab code. The work presented here aims to compare the predictive importance of the features using different classification algorithms.

Manish Garg, Upasna Singh
Retracted: Speed Optimization in an Unplanned Lane Traffic Using Swarm Intelligence and Population Knowledge Base Oriented Performance Analysis

Comparative analysis of speed optimization techniques in unplanned traffic is a very promising research problem. The search for an efficient optimization method to increase the degree of speed optimization, and thereby increase the traffic flow in an unplanned zone, is a widely concerning issue. However, there has been limited research effort on the optimization of lane usage together with speed optimization. This paper presents a novel technique to solve the problem optimally using knowledge-base analysis of vehicle speeds and a partial modification of Swarm Intelligence, which in turn acts as a guide for the optimal design of lanes to provide better-optimized traffic with fewer transitions between lanes.

Prasun Ghosal, Arijit Chakraborty, Sabyasachee Banerjee
Automation of Regression Analysis: Methodology and Approach

Test automation is widely used in different practices all over the world, often to save time or to reduce manual effort. Regression analysis, in particular, consists of mundane tasks that are performed by software engineers on a daily basis. Automation of regression analysis raises our hopes by promising a reduction in time and effort, yet at the same time it creates as many problems as it solves. Thus the solution has to take into account the limitations of automation while reaping the maximum benefits. This paper focuses on the paradigm to be followed while developing automation, and the advantages and limitations that accompany the process of automating regressions.

Kumar Abhishek, Prabhat Kumar, Tushar Sharad
A New Hybrid Binary Particle Swarm Optimization Algorithm for Multidimensional Knapsack Problem

In this paper, we present a New Hybrid Binary Particle Swarm Optimization (NHBPSO) algorithm. The hybridization consists of combining some principles of Particle Swarm Optimization (PSO) with the crossover operation of the Genetic Algorithm (GA). The proposed algorithm is used to solve the NP-hard combinatorial optimization Multidimensional Knapsack Problem (MKP). To assess the efficiency and performance of our NHBPSO algorithm, we have tested it on some benchmarks from the OR-Library and compared our results with those obtained by the standard binary Particle Swarm Optimization with penalty function technique (PSO-P) and the quantum version (QICSA) of the new Cuckoo Search metaheuristic. The experimental results show good and promising solution quality obtained by the proposed algorithm, which outperforms the PSO-P and QICSA algorithms.

Amira Gherboudj, Said Labed, Salim Chikhi
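
One plausible reading of the hybridization, shown as a Python sketch rather than the authors' exact operator ordering: a standard sigmoid-based binary PSO step followed by one-point crossover with the global best; all coefficient values here are assumptions.

import numpy as np

rng = np.random.default_rng(1)

def sigmoid(v):
    return 1.0 / (1.0 + np.exp(-v))

def nhbpso_step(X, V, pbest, gbest, w=0.7, c1=1.5, c2=1.5, pc=0.5):
    # Binary-PSO velocity update, then a sigmoid draw to get bits.
    n, d = X.shape
    r1 = rng.random((n, d))
    r2 = rng.random((n, d))
    V = w * V + c1 * r1 * (pbest - X) + c2 * r2 * (gbest - X)
    X = (rng.random((n, d)) < sigmoid(V)).astype(int)
    for i in range(n):                    # GA ingredient: one-point crossover
        if rng.random() < pc:
            cut = rng.integers(1, d)
            X[i, cut:] = gbest[cut:]
    return X, V

For the MKP, fitness would be the total profit with a penalty (or a repair step) for violated weight constraints, as in the PSO-P baseline.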
A Cooperative Multi-Agent System for Traffic Congestion Management in VANET

Vehicular ad hoc networks (VANETs) are attracting the interest of a great number of academic and industrial researchers. One of the most interesting features is the possibility of using a spontaneous and inexpensive wireless ad hoc network between vehicles to exchange useful information, such as warning drivers of an accident or a danger. Very recent research on vehicular ad hoc networks presents novel approaches that combine multi-agent technology with transportation systems. In this paper we focus on how to solve the problem of congestion and traffic management through the application of different agent technologies. With cars equipped with sensors in a VANET, we propose an approach based on multi mobile agent technology. The empirical results show the impact of agents and intelligent communications on vehicular ad hoc networks in reducing congestion in VANETs.

Mohamed EL Amine Ameur, Habiba Drias
An Aspectual Feature Module Based Service Injection Design Pattern for Unstructured Peer-to-Peer Computing Systems

Adaptability in software is the main fascinating concern for which today's software architects are interested in providing autonomic computing. Different programming paradigms have been introduced for enhancing the dynamic behavior of programs. Among them are Aspect-Oriented Programming (AOP) and Feature-Oriented Programming (FOP), both of which have the ability to modularize crosscutting concerns, the former through aspects and advice, the latter through collaboration designs and refinements. In this paper we propose a Service Injection design pattern for unstructured peer-to-peer networks, designed with aspect-oriented design patterns. The proposed pattern is an amalgamation of the Worker Object pattern, the Case-Based Reasoning pattern and the Reactor pattern, and can be used to design self-adaptive systems. Every peer node in the network provides services to many clients, so each client request must be executed in a separate thread, which is the functionality provided by the Worker Object aspect-oriented design pattern. After a service request is accepted by a peer server, the server assigns the task of serving the request with the help of the Reactor pattern, which handles service requests that are delivered concurrently to a peer server by one or more clients. The Case-Based Reasoning pattern is used for the composition of services provided at different peer servers, with the help of JUDDI, to serve clients' complex service requests. We study the amalgamation of the feature-oriented and aspect-oriented software development methodologies and their use in developing a design pattern for peer-to-peer networks. With the help of the Aspectual Feature Module technique, we can introduce a new service into the peer system at run-time without disturbing the code currently running on the server. In the process of development we also use Java Aspect Components (JAC). A simple UML class diagram is depicted.

Vishnuvardhan Mannava, T. Ramesh, B. Naga Srinivas Repuri
A Novel Hybrid Approach to N-Queen Problem

These days computers deal with highly intricate problems. This paper discusses one such class of complex problems: Constraint Satisfaction Problems (CSPs). These problems have a set of variables, domains from which the variables take their values, and a set of constraints on these variables. For example, the N-Queen problem, timetabling problems and scheduling problems are CSPs. In practical scenarios, it is unlikely to obtain a solution that satisfies all or even most of the constraints; such a solution is an exact solution. Even if an exact algorithm can be developed, its time or space complexity may turn out to be unacceptable. In reality, it is often sufficient to find an approximate or partial solution to such NP problems using heuristic algorithms. Heuristic methods are used to speed up the process of finding a satisfactory solution where an exhaustive search is impractical, yielding approximate solutions quickly. This paper proposes an efficient hybrid solution for the standard N-Queen problem using Ant Colony Optimization and a Genetic Algorithm. It also compares the performance of the classical backtracking and brute force methods with heuristic methods, simulated annealing and a genetic algorithm, on the N-Queen problem.

Kavishi Agarwal, Akshita Sinha, M. Hima Bindu
Testing for Software Security: A Case Study on Static Code Analysis of a File Reader Java Program

The high-level contribution of this paper is to illustrate the use of automated tools to conduct static code analysis of a software program and mitigate the vulnerabilities associated with the program. We present a case study of static code analysis conducted on a File Reader program (developed in Java) using the Source Code Analyzer and Audit Workbench automated tools developed by Fortify, Inc. Specifically, the following software vulnerabilities are discovered, analyzed and mitigated: (i) Denial of Service, (ii) System Information Leak, (iii) Unreleased Resource (in the context of Streams) and (iv) Path Manipulation. We describe the potential risks of having each of these vulnerabilities in a software program and provide solutions (including code snippets in Java) to mitigate them. The proposed solutions for each of these vulnerabilities are generic and could be used to correct such vulnerabilities in software developed in any other programming language.
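The paper's mitigations are given as Java snippets, which are not reproduced in this abstract. Purely as an illustration of the Path Manipulation mitigation it describes, here is a minimal Python sketch of the same idea: canonicalize a user-supplied file name and verify that it stays inside an allowed base directory before opening it (the directory path is a made-up example).

```python
from pathlib import Path

BASE_DIR = Path("/srv/reader/files").resolve()   # hypothetical allowed directory

def safe_open(user_supplied_name: str):
    """Open a file only if its canonical path stays inside BASE_DIR.

    Canonicalizing first defeats '../' sequences and symlink tricks that
    a raw string check on the file name would miss.
    """
    candidate = (BASE_DIR / user_supplied_name).resolve()
    if BASE_DIR not in candidate.parents:
        raise PermissionError(f"Path escapes allowed directory: {user_supplied_name}")
    return candidate.open("r")

# A request such as safe_open('../../etc/passwd') is rejected
# instead of leaking a file outside the allowed directory.
```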

Natarajan Meghanathan, Alexander Roy Geoghegan
Vital Signs Data Aggregation and Transmission over Controller Area Network (CAN)

An Intensive Care Unit (ICU) consists of patient bedsides equipped with a set of sensors to monitor the condition of the patient. These sensor nodes periodically send the patients' vital signs data to the central monitoring system for storage, analysis and emergency treatment requirements. The medical staff uses this data to ascertain the condition of the patient and to provide the necessary medication in case of abnormality. In this paper a design of a Controller Area Network (CAN) based ICU monitoring system is presented. CAN provides high-speed and reliable transmission of distributed sensor network data. The key feature of this design is the proposed aggregation scheme, which is adopted to utilize the bandwidth effectively and decrease bus load in order to accommodate a larger number of patients within the existing network.
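The abstract does not specify the frame layout of the proposed aggregation scheme, so the following sketch only illustrates the general idea of aggregation on CAN: packing several vital-sign readings into a single 8-byte data field instead of sending one frame per signal, saving the per-frame arbitration and header overhead. The chosen signals, scales and byte layout are assumptions.

```python
import struct

def pack_vitals_frame(patient_id, heart_rate, spo2, temp_c, systolic, diastolic):
    """Pack one patient's vital signs into a single 8-byte CAN data field.

    Assumed layout (8 bytes): id(1) hr(1) spo2(1) temp*10(2) sys(1) dia(1) pad(1).
    One aggregated frame replaces five separate ones, lowering bus load.
    """
    return struct.pack("<BBBHBBx",
                       patient_id, heart_rate, spo2,
                       int(temp_c * 10), systolic, diastolic)

frame = pack_vitals_frame(12, 78, 97, 36.8, 120, 80)
assert len(frame) == 8          # fits the classic CAN 8-byte payload limit
print(frame.hex())
```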

Nadia Ishaque, Noveel Azhar, Atiya Azmi, Umm-e-laila, Ammar Abbas
A Comparative Study on Different Biometric Modals Using PCA

Multimodal biometrics is a rising domain in biometric technology in which more than one biometric trait is combined to improve performance, and it is well established that multimodal biometrics offers much higher efficiency than unimodal biometrics. In this research, a comparative study of multimodal biometrics has been carried out with four biometric traits: three static (face, ear and palm) and one behavioral (gait). The paper focuses on four cases, unimodal, bimodal, 3-modal and 4-modal biometrics, using PCA as the base feature extraction method in all cases, with the training and testing algorithms embedded in PCA itself. The results of all cases are compared: the multimodal cases show much higher efficiency than the unimodal one, while among the multimodal cases (bimodal, 3-modal and 4-modal) there was not much difference, showing that increasing the number of biometrics does not make much difference.
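As a rough sketch of a PCA-based pipeline of the kind compared here (the paper's own embedded training and testing procedure is not detailed in the abstract), the code below learns a projection from training samples and matches a probe by nearest neighbour in the subspace; the matrix shapes and component count are assumptions.

```python
import numpy as np

def pca_fit(X, n_components=20):
    """X: (n_samples, n_features) matrix of flattened biometric images."""
    mean = X.mean(axis=0)
    # SVD of the centered data gives the principal axes in Vt.
    _, _, Vt = np.linalg.svd(X - mean, full_matrices=False)
    W = Vt[:n_components].T            # projection matrix (n_features, k)
    return mean, W

def pca_project(x, mean, W):
    return (x - mean) @ W

def classify(probe, gallery_feats, gallery_labels, mean, W):
    """Nearest-neighbour match in the PCA subspace."""
    f = pca_project(probe, mean, W)
    dists = np.linalg.norm(gallery_feats - f, axis=1)
    return gallery_labels[int(np.argmin(dists))]

# Fusing modalities at feature level can be as simple as concatenating the
# per-modality PCA features before the nearest-neighbour step.
```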

G. Pranay Kumar, Harendra Kumar Ram, Naushad Ali, Ritu Tiwari
Methodology for Automatic Bacterial Colony Counter

An area of increased focus in microbiology is the automation of counting methods, which face several obstacles: how to handle confluent growth or growth of colonies that touch or overlap other colonies, and how to identify each colony as a unit in spite of differing shapes, sizes, textures, colors, light intensities, etc. Counting bacterial colonies is a complex task for the microbiologist. Furthermore, in industry thousands of such samples are produced per day, and when the colonies on each sample are counted manually this becomes a time-consuming, hectic and error-prone job. We propose a method to count these colonies that saves time, produces accurate results and speeds delivery to customers. The proposed work counts the colonies 6 to 8 hours earlier, saving considerably more time, and is more cost-efficient, with a market cost of about 10,000 compared to prior systems.

Surbhi Gupta, Priyanka Kamboj, Sumit Kaushik
Sorting of Decision Making Units in Data Envelopment Analysis with Intuitionistic Fuzzy Weighted Entropy

Data envelopment analysis (DEA) is a method for measuring and evaluating the efficiency of decision making units; it is basically a linear programming based technique. It accommodates a number of inputs and outputs in the analysis and takes account of the relationship between inputs and outputs, which gives it clear advantages over competing approaches. In the present paper, we propose a new algorithm for sorting decision making units in the context of intuitionistic fuzzy weighted entropy, in order to rank decision making units in data envelopment analysis.

Neeraj Gandotra, Rakesh Kumar Bajaj, Nitin Gupta
Reliability Quantification of an OO Design -Complexity Perspective-

Object oriented design and development are popular conceptions in today's software development scenario. Object oriented design supports design principles such as inheritance, coupling, cohesion and encapsulation. The proposed research work delivers a mechanism for reliability estimation of an object oriented design from the complexity perspective. Four OO design metrics, namely the inheritance metric complexity perspective (IMC), the coupling metric complexity perspective (CMC), the cohesion metric complexity perspective (COMC) and the encapsulation metric complexity perspective (EMC), are proposed, one for each of the object oriented design constructs inheritance, coupling, cohesion and encapsulation respectively. The paper also proposes complexity and reliability estimation models. On the basis of the proposed metrics, a multiple regression equation has been established for computing the complexity of design hierarchies. Complexity inversely affects the reliability of object oriented designs, so a further multiple regression equation has been established to compute reliability in terms of complexity. A comparative analysis among metric and model values is also presented.

A. Yadav, R. A. Khan
A New Hybrid Algorithm for Video Segmentation

Video segmentation has become popular and important in digital storage media. In this video segmentation technique, similar shots are first segmented; subsequently, the frames within every shot are sorted using the objects extracted from each frame, which greatly reduces processing time. Effective video segmentation is a challenging problem in digital storage media. The hybrid video segmentation technique yields effective segmentation results by intersecting the segmented results produced by both the frame difference method and the consecutive frame intersection method. The frame difference method takes the key frame as background and segments the dynamic objects, whereas the consecutive frame intersection method segments both static and dynamic objects by intersecting the objects in consecutive frames. The new hybrid technique is evaluated on various video sequences, and its efficiency is analyzed by calculating statistical measures and the kappa coefficient.

K. Mahesh, K. Kuppusamy
Using Modularity with Rough Information Systems

We propose a novel technique that depends on modular techniques and the integration of fuzzy set concepts with rough set theory in mining rough systems. In this research we propose a set of algorithms for a novel model that introduces a modularity mechanism, using a decision grouping mechanism to obtain the optimal decision. This approach provides flexibility in decision making, verifies all decision standards and determines decision requirements, through modularizing the rough information system, extracting rough association rules and developing mechanisms for decision grouping.

Ahmed T. Shawky, Hesham A. Hefny, Ashraf H. Abd Elwhab
Cost Optimized Approach to Random Numbers in Cellular Automata

In this research work, we put emphasis on a cost-effective approach to generating high quality random numbers using a one dimensional cellular automaton. Maximum length cellular automata with higher cell counts have already been established for the generation of the highest quality pseudo random numbers. The randomness quality of the generated sequences has been assessed using DIEHARD tests, reflecting the varying linear complexity compared to maximum length cellular automata. The mathematical approach of the proposed methodology, compared with existing maximum length cellular automata, emphasizes flexibility and a cost-efficient generation procedure for pseudo random pattern sets.
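The paper works with maximum-length cellular automata, whose hybrid rule configuration is not given in the abstract; the sketch below only demonstrates the basic mechanics of a one-dimensional CA bit generator, using elementary rule 30 as a common low-cost stand-in. Register width, seed and tap cell are illustrative.

```python
def ca_prng_bits(width=64, seed=0x9E3779B97F4A7C15, rule=30, nbits=32):
    """Generate pseudo-random bits from a 1-D elementary cellular automaton.

    Each step updates every cell from its left/center/right neighbours
    (cyclic boundary) via the rule's 3-bit truth table, then emits the
    state of a fixed tap cell.
    """
    cells = [(seed >> i) & 1 for i in range(width)]   # arbitrary nonzero seed
    table = [(rule >> i) & 1 for i in range(8)]       # rule number as lookup
    out = []
    for _ in range(nbits):
        cells = [table[(cells[(i - 1) % width] << 2) |
                       (cells[i] << 1) |
                       cells[(i + 1) % width]]
                 for i in range(width)]
        out.append(cells[width // 2])                 # tap the middle cell
    return out

print("".join(map(str, ca_prng_bits())))
```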

Arnab Mitra, Anirban Kundu
Selection of Views for Materializing in Data Warehouse Using MOSA and AMOSA

By saving or materializing a set of derived relations or intermediate results from the base relations of a data warehouse, query processing can be made more efficient, as it avoids repeated generation of these temporary views while generating query responses. But since a data warehouse may face a large number of queries, each containing an even larger number of views, it is not possible to save each and every query due to the constraints of space and maintenance costs. Therefore, an optimum set of views is to be selected for materialization, and hence there is a need for a good technique for selecting views to materialize. Several approaches have been proposed so far to achieve a good solution to this problem. In this paper an attempt has been made to solve it using the Multi-Objective Simulated Annealing (MOSA) and Archived Multi-Objective Simulated Annealing (AMOSA) algorithms.

Rajib Goswami, D. K. Bhattacharyya, Malayananda Dutta
Comparison of Deterministic and Probabilistic Approaches for Solving 0/1 Knapsack Problem

The purpose of this paper is to analyze algorithm design paradigms applied to a single problem: the 0/1 Knapsack Problem. The Knapsack Problem is a combinatorial optimization problem where one has to maximize the benefit of objects in a knapsack without exceeding its capacity. It is an NP-complete problem that can be solved by exact as well as heuristic techniques.

The objective is to analyze how techniques such as Dynamic Programming and the Genetic Algorithm affect performance on the Knapsack Problem. Our experimental results show that the genetic algorithm is the more promising approach, as it produces results in optimal time.
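For reference, the exact technique compared in the paper can be sketched as the textbook dynamic program below (O(n·W) time, O(W) space); the item data in the example is made up. The GA, by contrast, samples candidate subsets and trades this optimality guarantee for speed on large instances.

```python
def knapsack_01(values, weights, capacity):
    """Exact 0/1 knapsack via dynamic programming.

    dp[w] holds the best value achievable with capacity w using the items
    processed so far; iterating w downwards keeps each item usable at most once.
    """
    dp = [0] * (capacity + 1)
    for v, wt in zip(values, weights):
        for w in range(capacity, wt - 1, -1):
            dp[w] = max(dp[w], dp[w - wt] + v)
    return dp[capacity]

print(knapsack_01([60, 100, 120], [10, 20, 30], 50))  # -> 220
```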

Ritika Mahajan, Sarvesh Chopra, Sonika Jindal
Comparison of Content Based Image Retrieval System Using Wavelet Transform

The large number of images has posed increasing challenges to computer systems to store and manage data effectively and efficiently. This paper implements a CBIR system that uses different image features through four different methods: two based on the analysis of color features, and two based on the analysis of combined color and texture features using the wavelet coefficients of an image. To extract the color feature from an image, one of the standard approaches, the color histogram, was used in the YCbCr and HSV color spaces. Daubechies and Symlet wavelet transforms were performed to extract the texture feature of an image. From the experimental results it has been inferred that the wavelet based methods gave better performance than the color based methods.
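A minimal sketch of the texture half of such a CBIR pipeline, assuming the PyWavelets library is available: decompose a grayscale image with a Daubechies or Symlet wavelet and use subband energies as the feature vector. The similarity measure and database loop are omitted, and the energy feature itself is an assumed, commonly used choice rather than the paper's exact descriptor.

```python
import numpy as np
import pywt

def wavelet_texture_features(gray_image, wavelet="db2", level=2):
    """Energy of each wavelet subband as a compact texture descriptor.

    'db2' (Daubechies) or 'sym4' (Symlet) can be passed, matching the two
    transform families compared in the paper.
    """
    coeffs = pywt.wavedec2(gray_image, wavelet, level=level)
    feats = [np.mean(np.square(coeffs[0]))]          # approximation energy
    for (cH, cV, cD) in coeffs[1:]:                  # detail subbands
        feats += [np.mean(np.square(c)) for c in (cH, cV, cD)]
    return np.asarray(feats)

# Query and database images would then be compared by, e.g., Euclidean
# distance between their feature vectors.
img = np.random.rand(64, 64)        # stand-in for a grayscale image
print(wavelet_texture_features(img))
```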

Suchismita Das, Shruti Garg, G. Sahoo
A New Approach for Hand Gesture Based Interface

This paper presents a new approach for controlling mouse movement and implementing mouse functions using a real-time camera. Most existing approaches involve changing mouse parts, such as adding more buttons or changing the position of the tracking ball. Instead of changing the hardware design, our method uses a camera, image comparison technology and motion detection technology to control mouse movement and implement its functions (right click, left click, scrolling and double click).

T. M. Bhruguram, Shany Jophin, M. S. Sheethal, Priya Philip
Multi-document Summarization Based on Sentence Features and Frequent Itemsets

Information retrieval is the process of searching for information and related knowledge within collected documents or from the web. Users are presented with vast amounts of information that suffer from redundancy and irrelevance, and searching for the required information in this huge collection is a tiresome task. This motivated researchers to provide high quality summaries that allow the user to quickly locate the desired information. In this paper an attempt is made to improve the performance of summarization techniques using sentence features such as length, position, centroid and noun, and by adding a new Noun-Verb pair feature. The second technique exploits a modified FIS (Frequent Itemset Sequence) generation algorithm for summarization. Redundancy elimination techniques are applied to achieve an efficient summary from multiple documents. The performance of the proposed algorithms is compared with the existing MEAD summarization technique using the F-measure. The introduction of the Noun-Verb pair improves the quality of summarization compared to the existing MEAD technique and our proposed FIS technique.

J. Jayabharathy, S. Kanmani, Buvana
Performance Evaluation of Evolutionary and Artificial Neural Network Based Classifiers in Diversity of Datasets

In the last two decades, we have seen explosive growth in our capabilities to both generate and collect data. Advances in scientific data collection (e.g. from remote sensors or space satellites), the widespread use of bar codes for almost all commercial products, and the computerization of many business and government transactions have generated a sea of data, so there is a need for automatic tools and techniques to handle such huge collections. These tools and techniques are the subject of the emerging field of knowledge discovery in databases (KDD) and data mining. Data mining plays an important role in discovering information to support the decision making of a decision support system, and it has been an active area of research in the last decade. Classification is one of the important tasks of data mining, and different kinds of classifiers have been suggested and tested to predict future events based on unseen data. This paper compares the performance of an evolutionary genetic-algorithm-based classifier and an artificial neural network based classifier on a diversity of datasets. The performance evaluation metrics are predictive accuracy, training time and comprehensibility. The evolutionary classifier shows better comprehensibility and predictive accuracy than the ANN based classifier, but is slower.

Pardeep Kumar, Nitin, Vivek Kumar Sehgal, Durg Singh Chauhan
Some Concepts of Incomplete Multigranulation Based on Rough Intuitionistic Fuzzy Sets

The definition of basic rough sets depends upon a single equivalence relation defined on the universe, or several equivalence relations taken one at a time. In the view of granular computing, classical rough set theory is based upon a single granulation. The basic rough set model was extended to a rough set model based on multi-granulations (MGRS) in [6], where the set approximations are defined by using multiple equivalences on the universe, and their properties were investigated. Using the hybridized rough fuzzy set model introduced by Dubois and Prade [2], a rough fuzzy set model based on multi-granulation was introduced and studied by Wu and Kou [15]. Topological properties of rough sets, introduced by Pawlak in terms of their types, were recently studied by Tripathy and Mitra [11], and were extended to the context of incomplete multi-granulation by Tripathy and Raghavan [12]. The concept of multi-granulations based on rough fuzzy sets was recently introduced by Tripathy and Nagaraju [13]. In a recent paper, Tripathy et al. [14] introduced the concept of incomplete multigranulation on rough intuitionistic fuzzy sets (MGRIFS) and studied some of its topological properties. In this paper we continue further by introducing the concept of accuracy measures on MGRIFS and proving some of their properties. Our findings hold for both complete and incomplete intuitionistic fuzzy rough set models based upon multi-granulation. The concepts and results established in [13] and [14] open a new direction in the study of multigranulation for further research.

B. K. Tripathy, G. K. Panda, Arnab Mitra
Data Mining Model Building as a Support for Decision Making in Production Management

The paper presents the next stages of a project oriented towards the use of data mining techniques and the knowledge discovered from production systems through them, applied to the management of these systems. The production data was obtained in previous stages of the project and is stored in a data warehouse that was proposed and developed by the authors. A data mining model has been created using specific methods and selected techniques for defined problems of production system management. The main focus of the article is the proposal of this data mining model.

Pavol Tanuska, Pavel Vazan, Michal Kebisek, Oliver Moravcik, Peter Schreiber
Multi-Objective Zonal Reactive Power Market Clearing Model for Improving Voltage Stability in Electricity Markets Using HFMOEA

This paper presents the development of a new multi-objective zonal reactive power market clearing (ZRPMC-VS) model for improving the voltage stability of a power system. In the proposed multi-objective ZRPMC-VS model, two objective functions, the total payment function (TPF) for reactive power support from generators and synchronous condensers, and a voltage stability enhancement index (VSEI), are optimized simultaneously while satisfying various power system constraints, using a hybrid fuzzy multi-objective evolutionary algorithm (HFMOEA). The results obtained using HFMOEA are compared with the well known NSGA-II solution technique. This analysis helps independent system operators (ISOs) to take better decisions in clearing the reactive power market in a competitive market environment.

Ashish Saini, Amit Saraswat
Comparative Study of Image Forgery and Copy-Move Techniques

Image forgery means manipulation of a digital image to conceal some meaningful or useful information of the image. There are cases in which it is difficult to identify the edited region in the original image. The detection of a forged image is driven by the need for authenticity and for maintaining the integrity of the image. This paper surveys different types of image forgeries and existing techniques for detecting forged images, and it highlights various copy-move detection methods based on their robustness and computational complexity.

M. Sridevi, C. Mala, Siddhant Sanyam
Single Sideband Encoder with Nonlinear Filter Bank Using Denoising for Cochlear Implant Speech Processor

Cochlear Implants (CI) are the most successful neural prosthesis used to restore normal hearing to the profoundly deaf by electrical stimulation of the auditory nerves. The speech coder is crucial in a cochlear implant for obtaining a close resemblance to normal hearing, and noise reduction techniques further enable satisfactory hearing in noisy conditions. We propose a new method of sound processing which gives improved speech recognition. To achieve this goal we implemented a denoising technique and adopted SSB demodulation along with a nonlinear filterbank, the Dual Resonance Non-Linear (DRNL) filterbank, which is capable of modeling the behavior of the human cochlea. A comparative analysis was done to understand the performance of the proposed method against the existing method. Simulation results showed a significant improvement in speech recognition over the existing method.

Rohini S. Hallikar, M. Uttara Kumari, K. Padmaraju
Crosstalk Reduction Using Novel Bus Encoders in Coupled RLC Modeled VLSI Interconnects

Most of the encoding methods proposed in recent years have dealt only with RC-modeled VLSI interconnects. For deep submicron (DSM) technologies, on-chip inductive effects have rapidly increased due to increasing clock frequencies, decreasing signal rise times and increasing lengths of on-chip interconnects. This issue is an important concern for signal integrity and overall chip performance. Therefore, this paper proposes an efficient bus encoder using the Bus Inverting (BI) method, which dramatically reduces both crosstalk and power dissipation in RLC-modeled interconnects. The proposed encoder consumes significantly lower power, which makes it suitable for current high-speed, low-power VLSI interconnects. The proposed design demonstrates an overall reduction in power dissipation and crosstalk delay of 59.43% and 72.87%, respectively.

G. Nagendra Babu, Brajesh Kumar Kaushik, Anand Bulusu
Event Triggering Mechanism on a Time Base: A Novel Approach for Sporadic as well as Periodic Task Scheduling

A novel concept combining the individual time-triggered and event-triggered methodologies is presented, aiming at gaining the benefits of both mechanisms and attaining high system-level performance. The major advantage of the presented work is the best utilization of the available system resources with gained flexibility. A detailed explanation of the proposed approach is presented, and simulation studies were conducted to validate it.

Ramesh Babu Nimmatoori, A. Vinay Babu, C. Srilatha
A High Level Approach to Web Content Verification

In this paper we present a tool for visually imposing constraints over the content of XML-based webpages and automatically repairing such webpages in case they do not comply with the imposed constraints. The tool is based on the XCentric programming language and relies on a highly declarative model.

Liliana Alexandre, Jorge Coelho
Histogram Correlation for Video Scene Change Detection

In this paper a novel and simple scene change detection algorithm based on the correlation between the frames of a video is proposed. The first frame of the video is taken as a reference frame, and the correlation between the histogram of the reference frame and the histograms of all video frames is computed. Plotting the computed correlation values against the frame number distinguishes scene changes from motion changes: when the correlation values are constant over a number of frames, there is a motion scene in which the background does not change; a gradual change of the correlation values over a number of frames indicates a gradual scene change; and a sharp change of these values indicates an abrupt scene change. Experimental results show that this method is effective for motion, abrupt and gradual shot transition detection, achieving an F-measure exceeding 0.89 for gradual shot transitions, compared with 0.84 when using a PCA based method.
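A minimal sketch of the described detector: histogram each frame, correlate it with the reference (first) frame's histogram, and classify the run of correlation values by how sharply they change. The thresholds here are illustrative assumptions, and the frames are synthetic stand-ins.

```python
import numpy as np

def hist_correlation_series(frames, bins=64):
    """Pearson correlation of each frame's histogram with the reference
    (first) frame's histogram, as in the described method."""
    ref_hist, _ = np.histogram(frames[0], bins=bins, range=(0, 255))
    return np.array([np.corrcoef(ref_hist,
                                 np.histogram(f, bins=bins, range=(0, 255))[0])[0, 1]
                     for f in frames])

def label_changes(corr, abrupt_jump=0.3, gradual_jump=0.05):
    """Sharp drop between consecutive values -> abrupt cut; sustained
    drift -> gradual transition; flat series -> motion only."""
    d = np.abs(np.diff(corr))
    if (d > abrupt_jump).any():
        return "abrupt scene change"
    if (d > gradual_jump).any():
        return "gradual scene change"
    return "motion only (background unchanged)"

frames = [np.random.randint(0, 256, (48, 64)) for _ in range(10)]
print(label_changes(hist_correlation_series(frames)))
```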

Nisreen I. Radwan, Nancy M. Salem, Mohamed I. El Adawy
Microposts’ Ontology Construction

The social networking website Facebook offers its users a feature called “status updates” (or just “status”), which allows users to create Microposts directed to all their contacts or a subset thereof; readers can respond to the Microposts or click a “Like” button to show their appreciation. This paper is concerned with adding semantic meaning, in the sense of unambiguous intended ideas, to such Microposts. We can make a start towards the semantic web by adding semantic annotations to web resources, and ontologies are used to specify the meaning of such annotations. An ontology provides a vocabulary for representing and communicating knowledge about some topic, together with a set of semantic relationships that hold among the terms in that vocabulary. To increase the efficiency of ontology based applications, there is a need for a mechanism that reduces the manual work in developing an ontology. In this paper, we propose an approach to Microposts' ontology construction.

Beenu Yadav, Harsh Verma, Sonika Gill, Prachi Bansal
A Comparative Study of Clustering Methods for Relevant Gene Selection in Microarray Data

Classification of microarray cancer data has drawn the attention of the research community for better clinical diagnosis in the last few years. Microarray datasets are characterized by high dimension and small sample size; hence conventional wrapper methods for relevant gene selection cannot be applied directly to such datasets due to the large computation time. In this paper, a two stage approach is proposed to determine a subset containing relevant and non-redundant genes for better classification of microarray data. In the first stage, genes are partitioned into distinct clusters to identify redundant genes; to determine the better choice of clustering algorithm for grouping redundant genes, four different clustering methods were investigated. Experiments on four well known cancer microarray datasets showed that the hierarchical agglomerative approach with complete linkage performed best in terms of average classification accuracy on three datasets. Comparison with other state-of-the-art methods has shown that the proposed approach, which involves gene clustering, is effective in reducing redundancy among the selected genes to provide better classification.
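A sketch of the first-stage clustering under the paper's best-performing configuration (hierarchical agglomerative clustering with complete linkage), assuming scikit-learn is available. Picking one representative gene per cluster by correlation with the class label is a plausible but assumed criterion, not necessarily the paper's own.

```python
import numpy as np
from sklearn.cluster import AgglomerativeClustering

def select_representative_genes(X, y, n_clusters=30):
    """X: (n_samples, n_genes) expression matrix; y: class labels.

    Cluster genes (columns) so that co-expressed, redundant genes land
    together, then keep the gene most correlated with the label from
    each cluster.
    """
    genes = X.T                                    # one row per gene
    labels = AgglomerativeClustering(
        n_clusters=n_clusters, linkage="complete").fit_predict(genes)
    selected = []
    for c in range(n_clusters):
        members = np.where(labels == c)[0]
        rel = [abs(np.corrcoef(X[:, g], y)[0, 1]) for g in members]
        selected.append(members[int(np.argmax(rel))])
    return np.array(selected)

X = np.random.rand(40, 500)                        # 40 samples, 500 genes
y = np.random.randint(0, 2, 40)
print(select_representative_genes(X, y)[:10])
```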

Manju Sardana, R. K. Agrawal
An Optimal Approach for DICOM Image Segmentation Based on Fuzzy Techniques

In this paper an optimal method for DICOM CT image segmentation is explored, integrating FCM thresholding with a fuzzy level set for medical image processing. FCM thresholding gives finer segmentation results when compared to the Otsu method, and the optimization property of FCM is improved when it is combined with local thresholding. The application of the fuzzy level set further enhances the segmentation results. Experimental results based on statistical metrics show that the optimal approach enhances the segmented results with fine regions.

J. Umamaheswari, G. Radhamani
A Two-Phase Item Assigning in Adaptive Testing Using Norm Referencing and Bayesian Classification

Due to advances in information technology and a varied learner group, e-learning has become popular, and computer based assessment has become a prevalent method of administering tests. Randomization of test items may have an unfair effect on test takers, which is unproductive for the outcome of the test. There is a need to develop an Intelligent Tutoring System that assigns intelligent questions depending on the student's responses in the testing session, and it is more productive when questions are assigned based on ability from the early stage itself. Also, if only standard multiple-choice questions are used, the real embedded nature of computer assessment is sacrificed, so items with differently constrained constructs are included to bring out the complex skills and the analytical and comprehensive abilities of learners. This study therefore focuses on building a framework to automatically assign intelligent questions with different constructs based on the learner's ability at entry. Using norm referencing, questions are classified based on item difficulty. Item discrimination is computed, and only the items that can discriminate between performers are accumulated in the item pool, to give the tutoring system the maximum effect of intelligence. The level of a new learner is predicted by means of Naïve Bayesian classification and the corresponding item is posed. Thereby the objective of an Intelligent Tutoring System is achieved by using both adaptability and intelligence in testing.

R. Kavitha, A. Vijaya, D. Saraswathi
Implementation of Multichannel GPS Receiver Baseband Modules

Global Positioning Systems are mainly used for finding the location of an object anywhere on the globe. GPS receivers can be implemented using software defined radio techniques. In this paper, the hardware implementation of the baseband (acquisition and tracking) modules of a GPS receiver using System Generator 9.2 is carried out. The implementation will be tested on the Lyrtech small-form-factor software-defined-radio platform, which consists of three layers: the upper layer is the radio frequency (ISM band receiver) layer, the middle layer is the ADACMasterIII layer, and the last is the digital processing (DSP) layer. Data transfer between the FPGA Virtex4 SX35 and the DSP module is done using a TMS320DM6446 Davinci processor. Generation of a 17 MHz intermediate frequency from the RF signal received by the GPS receiver has been achieved, and the process of building the baseband acquisition and tracking modules is in progress. Acquisition is implemented using the parallel code phase search algorithm, performed using the FFT, while tracking is implemented using a Costas loop and a Delay Lock Loop (DLL). First the baseband modules are built in Simulink and the simulation results tested; once this is achieved, real-time hardware implementation will be done. The results will lead to the development of indigenous GPS receivers with single and multiple channels within the same reconfigurable hardware, adaptive for consistent receiving and tracking of the signals.

Kota Solomon Raju, Y. Pratap, Virendra Patel, Gaurav Kumar, S. M. M. Naidu, Amit Patwardhan, Rabinder Henry, P. Bhanu Prasad
Towards a Practical “State Reconstruction” for Data Quality Methodologies: A Customized List of Dimensions

Data quality (DQ) has been defined as “fitness for use” of the data (also called Information Quality). A single aspect of data quality is defined as a “dimension” such as “consistency”, “accuracy”, “completeness”, or “timeliness”. In order to assess and improve data quality, “methodologies” have been defined. Data quality methodologies are a set of guidelines and techniques that are designed for assessing, and perhaps, improving data quality in a given application or organization. Most data quality methodologies use a pre-defined list of dimensions to assess the quality of data. This pre-defined list is usually based on previous research and may not be related to the specific application at hand. As a prelude (or state reconstruction phase) for methodologies, a useful list of dimensions specific to the current application or organization must be collected. In this paper we propose a state reconstruction phase in order to achieve that.

Reza Vaziri, Mehran Mohsenzadeh
A Hybrid Reputation Model through Federation of Peers Having Analogous Function

The widespread usage of peer to peer (P2P) systems is visible in all major application areas such as file sharing, high performance computing, P2P TV and P2P IPTV. Conversely, the foremost challenge in the growth of P2P systems is the dearth of an efficient scheme to handle malicious and mysterious nodes, and the QoS of the system is deeply reduced by free riders. To facilitate growth, the system should be able to trim down the influence of malicious nodes with a feasible, proficient and scalable reputation model. Trust value computation among peers forms the basis of a reputation model. This paper recommends a hybrid reputation model built through a federation of peers having analogous function; grouping peers with analogous function is introduced to reduce unnecessary searching. The competence of the model is augmented by the following methods: applying an election algorithm, with a suggested modification, as a solution to central point failure; forming peers with analogous function into a ring structure, which reduces search time through the network; and, in addition, making the behavior prediction of peers more accurate using a Markov chain.

G. Sreenu, P. M. Dhanya
An Amalgam Approach for Feature Extraction and Classification of Leaves Using Support Vector Machine

This paper describes the need for the development of an automatic plant recognition system for the classification of plant leaves, and presents an automatic Computer Aided Plant Leaf Recognition (CAP-LR) system. To implement the system, the input image is first pre-processed in order to remove background noise and enhance the leaf image. In the second stage the system efficiently extracts the different feature vectors of the leaves, covering geometric, texture and color features, and gives them as input to a Support Vector Machine (SVM) for classification into plant leaves or tree leaves. The method is validated by a K-Map which calculates accuracy, sensitivity and efficiency. Experimental results show that the system has a fast processing speed and a high recognition rate.

N. Valliammal, S. N. Geethalakshmi
Applying Adaptive Strategies for Website Design Improvement

The use of web data mining to maintain websites and improve their functionality is an important field of study. Website data may range from unstructured to semi-structured, and its purpose is to show relevant data to the user; this is possible only when we understand the specific preferences that define visitor behavior on a website. The two predominant paradigms for finding information on the Web are navigation and search, and building on them we can design an adaptive website. With the growth of the World Wide Web, the development of web-based technologies and the growth in web content, the structure of a website becomes more complex, and web navigation becomes a critical issue for both web designers and users. In this paper we propose a method to improve a website by applying three adaptive strategies: dynamic maps, highlighting and buffering. The effect of a dynamic map is apparent in that it can improve the adaptivity of a website, highlighting can shorten the time needed to find the user's target pages, and buffering pages can reduce page response time.

Vinodani Katiyar, Kamal Kumar Srivastava, Atul Kumar
WebTrovert: An AutoSuggest Search and Suggestions Implementing Recommendation System Algorithms

There are hundreds of websites and apps struggling to find algorithms for the perfect search to optimize their resources; however, there are very few success stories.

In this paper we aim to build a recommendation system, WebTrovert, which is based on practically designed algorithms. It comprises a social networking platform holding user information and user data in the form of documents and videos.

It incorporates autosuggest search and suggestions to enhance the productivity and user friendliness of the website.

Akrita Agarwal, L. Annapoorani, Riya Tayal, Minakshi Gujral
A Study of the Interval Availability and Its Impact on SLAs Risk

The obligations that telecommunications providers have with their customers are nowadays clearly specified in SLA contracts, and the availability offered during the contract period is one of the most relevant variables in SLAs. Accurately modeling the transient solution posed by the need to consider interval availability is still an open challenge. A common policy for making models simpler is the use of steady state assumptions; nevertheless, this simplification may put the contract fulfillment at risk, as stochastic variations of the measured availability are significant over a typical contract period. This paper makes a theoretical study of the interval availability and proposes an approximation for evaluating the cumulative downtime distribution of a system component using renewal theory. We study the evolution of the distribution of the interval availability as the observation period, i.e., the duration of the contract, increases, and show its respective impact on the SLA success probability. In addition, using the proposed approximation, we analyze numerically the behavior of the cumulative downtime distribution and the SLA risk under processes that do not follow Markovian assumptions.

Andres J. Gonzalez, Bjarne E. Helvik
Application of Intervention Analysis on Stock Market Forecasting

In today’s financial market, different financial events have a direct impact on stock market values, and even a slight change in those events may result in a huge difference in stock prices, so considering these effects is very important in forecasting stock values. Most research so far considers only forecasting, not these effects. This paper studies the effects of some of these events on financial market forecasting. We focus on the effects of financial events such as GDP, Consumer Sentiment and Jobless Claims on stock market forecasting and analyze them. These events are treated as intervention effects, described in this study as temporary but immediate and abrupt. We have therefore tried not only to estimate the period of effect of these events but also to use the intervention values in forecasting. The forecast values are then compared to forecast values obtained from a fusion model based on concordance and a Genetic Algorithm (GA). The concept is validated using financial time series data (the S&P 500 Index and NASDAQ) as sample data sets. We have also analyzed how often our forecast values show the same movement as the actual market values. The developed tool can be used not only for forecasting but also for in-depth analysis of the stock market.

Mahesh S. Khadka, K. M. George, N. Park, J. B. Kim
Partial Evaluation of Communicating Processes with Temporal Formulas and Its Application

This paper presents a framework that extends a partial evaluation method for transformational programs to a method for reactive CSP processes. Temporal logic formulas are used to represent constraints on the sets of the sequences of communication actions executed by the processes. We present a set of simple rules for specializing processes with temporal formulas which contain X (next) operators and/or G (invariant) operators. We present an example of an application of our partial evaluation method to improve the security of concurrent systems.

Masaki Murakami
Performance Analysis of a Hybrid Photovoltaic Thermal Single Pass Air Collector Using ANN

This paper presents the performance analysis of a semi-transparent hybrid photovoltaic single pass air collector considering four weather conditions (types a, b, c and d) of the New Delhi weather station of India using an ANN technique. The MATLAB 7.1 neural networks toolbox has been used for defining and training the ANN for calculation of thermal, electrical and overall thermal energy and overall exergy. The ANN models use ambient temperature, number of clear days, and global and diffuse radiation as input parameters. The transfer function, neural network configuration and learning parameters have been selected based on the highest convergence during training and testing of the networks. About 3000 sets of data from four weather stations (Bangalore, Mumbai, Srinagar and Jodhpur) have been given as input for training, and data of a fifth weather station (New Delhi) has been used for testing. The ANN model has been tested with the Levenberg-Marquardt training algorithm to select the best training algorithm, and the feedforward back-propagation algorithm with the logsig transfer function has been used in this analysis. The results of the ANN model have been compared with analytical values on the basis of root mean square error.

Deepali Kamthania, Sujata Nayak, G. N. Tiwari
An Effective Software Implemented Data Error Detection Method in Real Time Systems

In this paper, a software-based technique is presented for detecting soft errors that damage the data and values of programs. The proposed technique, called CPD (Critical Path Duplication), is based on duplication of the program's critical path. This path is extracted from the data flow graph of the program and has the greatest length, so there is a high probability of error occurrence on it. In the CPD technique, the instructions of the critical path are duplicated, and separate variables and registers are assigned to them. To analyze the proposed technique, fault injection into variables and registers is used, and the achieved results are compared with a full duplication method. Experimental results show that the CPD technique has 54% and 25% less performance and memory overhead, respectively, than the full duplication method. The fault coverage is reduced by about 24%, which is acceptable in safety-critical applications that are sensitive to speed and space overheads.

Atena Abdi, Seyyed Amir Asghari, Saadat Pourmozaffari, Hassan Taheri, Hossein Pedram
Preprocessing of Automated Blood Cell Counter Data and Generation of Association Rules in Clinical Pathology

This paper applies the preprocessing phases of Knowledge Discovery in Databases (KDD) to automated blood cell counter data and generates association rules using the Apriori algorithm. The functions of an automated blood cell counter in a clinical pathology laboratory and the phases of Knowledge Discovery in Databases are explained briefly. Twelve thousand records were taken from a clinical laboratory for processing, and the preprocessing steps of the KDD process were applied to the blood cell counter data. The paper then applies the Apriori algorithm to the blood cell counter data and generates interesting association rules that are useful for medical diagnosis.
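A compact, self-contained Apriori pass over a few illustrative transactions, each a set of discretized blood-counter findings (the attribute names are made up), generating frequent itemsets and single-consequent rules above assumed support and confidence thresholds.

```python
from itertools import combinations

def apriori(transactions, min_support=0.4, min_conf=0.7):
    n = len(transactions)
    support = {}
    items = {i for t in transactions for i in t}
    level = [frozenset([i]) for i in items]
    while level:
        # Count candidates of the current size and keep the frequent ones.
        counts = {c: sum(c <= t for t in transactions) for c in level}
        frequent = [c for c, k in counts.items() if k / n >= min_support]
        support.update({c: counts[c] / n for c in frequent})
        # Join step: build next-size candidates from the frequent sets.
        level = list({a | b for a in frequent for b in frequent
                      if len(a | b) == len(a) + 1})
    rules = []
    for itemset in (s for s in support if len(s) > 1):
        for rhs in itemset:
            lhs = itemset - {rhs}
            conf = support[itemset] / support[lhs]
            if conf >= min_conf:
                rules.append((set(lhs), rhs, round(conf, 2)))
    return rules

tx = [{"WBC_high", "RBC_low", "HGB_low"},
      {"WBC_high", "HGB_low"},
      {"RBC_low", "HGB_low"},
      {"WBC_high", "RBC_low", "HGB_low"}]
print(apriori(tx))
```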

D. Minnie, S. Srinivasan
The Particular Approach for Personalised Knowledge Processing

Researchers, teachers, librarians, and individuals in daily practice perform various activities that require the processing of large amounts of knowledge and a dynamic information flow. The database application BIKE (Batch Knowledge and Information Editor) was developed within an implementation of Technology Enhanced Learning. Based on the paradigm of “batch knowledge processing”, it works as a universal, multi-purpose, pre-programmed “all-in-one” environment. The environment enables individuals to take personalised approaches to producing teaching and learning materials, batch information retrieval, a personnel information system and other applications, some of which are presented in this paper.

Stefan Svetsky, Oliver Moravcik, Pavol Tanuska, Jana Stefankova, Peter Schreiber, Pavol Vazan
Metrics Based Quality Assessment for Retrieval Ability of Web-Based Bioinformatics Tools

Today, there is a need for sharing and building resources that can be made available around the world, anywhere and at any time. This makes the web a necessity, and its usage and utilization depend on those who set these resources up for global sharing. In the conceptual view of resource sharing, one class of web based tool only extracts resources from the web, such as extracting information from databases holding many thousands of real-world object entities left for global sharing. Bioinformatics tools aim at the same: they are used for solving real-world problems using DNA and amino acid sequences and related information, by mathematical, statistical and computing methods. Most tools in this area are web-based, since biological resources are real entities that should be kept updated based on ongoing research, which requires vast storage. Tools in this area cannot be built on a single database, so databases like NCBI, PDB, EMBL, Medline etc. have been developed to share their resources. At present, the development of bioinformatics tools for real-time decision making is increasing tremendously, so it is vital to evaluate the performance of a tool by means of its retrieval ability. Since most tools are web-based and utilize various databases for information retrieval or for mining information, it is vital to evaluate the retrieval ability along with the error report of a tool for performance based quality assessment. A metric is a measure that quantifies the characteristics of a product as numerical data that can be observed. In this paper a few web-based bioinformatics tools that retrieve documents from the PubMed database have been taken, and their performance has been evaluated quantitatively through metrics. The selected metrics used for performance evaluation include information retrieval, error report, F-measure etc., and a detailed analysis of the results is narrated. The observations made from the analyzed results will help provide a guideline for developing better tools, or selecting a better tool, for better decision making with enhanced quality.

Jayanthi Manicassamy, P. Dhavachelvan, R. Baskaran
Exploring Possibilities of Reducing Maintenance Effort in Object Oriented Software by Minimizing Indirect Coupling

A good object oriented software design is much more effective when it has highly maintainable class components. This paper describes an investigation into the use of indirect coupling to provide early indications of maintenance effort in object oriented software. The properties of interest are: (i) the potential maintainability of a class and (ii) the likelihood that a class will be affected by maintenance changes made to the overall system. The research shows that minimizing indirect coupling can provide useful indications of software maintenance effort, which may have a significant influence on the effort spent during system maintenance and testing.

Nirmal Kumar Gupta, Mukesh Kumar Rohil
A New Hashing Scheme to Overcome the Problem of Overloading of Articles in Usenet

Usenet is a popular distributed messaging and file sharing service. Usenet floods articles over an overlay network to fully replicate them across all servers; however, replication of Usenet's full content requires that each server pay the cost of receiving (and storing) over 1 Tbyte/day. This paper shows the design and implementation of a Usenet database in a multilevel hash table, for a Usenet system that allows a set of cooperating sites to keep a shared, distributed copy of Usenet articles. Standard multiple-choice hashing schemes, in which each item is stored in one of several candidate buckets, improve space utilization and are very amenable to a Usenet implementation; unfortunately, they occasionally require a large number of items to be moved to perform an insertion or deletion in the Usenet database. This paper shows that it is possible to significantly increase the space utilization of a multiple-choice hashing scheme by allowing at most one item to be moved during an insertion.

The paper also presents the problems that occur in this type of method, together with partial solutions; the approach does not, however, solve the problem of near-exponential growth or the problems of Usenet's backbone peers.
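The core idea, insert into one of two hash locations and allow at most one resident to be relocated on collision, can be sketched as follows; single-key slots and the particular second hash function are simplifying assumptions, not the paper's exact multilevel design.

```python
class TwoChoiceOneMoveTable:
    """Each key has two candidate slots; on insertion, if both are taken,
    try to move ONE resident to its alternate slot to make room."""

    def __init__(self, size=101):
        self.size = size
        self.slots = [None] * size        # each slot holds one key or None

    def _choices(self, key):
        h1 = hash(key) % self.size
        h2 = (hash(key) * 2654435761 + 1) % self.size   # assumed 2nd hash
        return h1, h2

    def insert(self, key):
        a, b = self._choices(key)
        for s in (a, b):
            if self.slots[s] is None:
                self.slots[s] = key
                return True
        # Both occupied: try relocating each resident to ITS alternate slot.
        for s in (a, b):
            r1, r2 = self._choices(self.slots[s])
            alt = r2 if s == r1 else r1
            if self.slots[alt] is None:
                self.slots[alt] = self.slots[s]   # the single allowed move
                self.slots[s] = key
                return True
        return False                              # would need more than one move

    def lookup(self, key):
        return any(self.slots[s] == key for s in self._choices(key))

t = TwoChoiceOneMoveTable()
for art in ("article-1", "article-2", "article-3"):
    t.insert(art)
print(t.lookup("article-2"))
```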

Monika Saxena, Praneet Saurabh, Bhupendra Verma
Bio-inspired Computational Optimization of Speed in an Unplanned Traffic and Comparative Analysis Using Population Knowledge Base Factor

Bio-inspired computational optimization of speed in unplanned traffic, and its comparative analysis, is a very promising research problem. Searching for an efficient optimization method or technique to formulate the optimal solution of a given problem is very challenging, and increasing the traffic flow in an unplanned zone is a widely concerning issue; however, there has been limited research effort on optimizing lane usage together with speed. This paper presents a novel technique to solve the problem optimally using a knowledge base analysis of vehicle speeds and a partial modification of a bio-inspired algorithm (Ant Colony Optimization), which in turn acts as a guide and baseline for designing lanes optimally to provide better optimized traffic with fewer transitions between lanes.

Prasun Ghosal, Arijit Chakraborty, Sabyasachee Banerjee
Transliterated SVM Based Manipuri POS Tagging

Manipuri is a scheduled Indian language which has two scripts: a borrowed Bengali script and the original Meitei Mayek script. Manipuri is a resource-poor language, especially in Meitei Mayek text. This paper deals with Support Vector Machine (SVM) based Part of Speech (POS) tagging of Bengali script text, which is then transliterated to Meitei Mayek after POS tagging. So far, POS tagging of Meitei Mayek Manipuri has not been reported, and this could be the first attempt.

Kishorjit Nongmeikapam, Lairenlakpam Nonglenjaoba, Asem Roshan, Tongbram Shenson Singh, Thokchom Naongo Singh, Sivaji Bandyopadhyay
A Survey on Web Service Discovery Approaches

Web services are playing an important role in e-business and e-commerce applications. As web service applications are interoperable and can work on any platform, large scale distributed systems can be developed easily using web services. Finding the most suitable web service from a vast collection of web services is crucial for the successful execution of applications. The traditional web service discovery approach is a keyword based search using UDDI, but various other approaches for discovering web services are also available: some are syntax based while others are semantic based, and having a discovery system that can work automatically is a further concern of service discovery approaches. As these approaches differ, one solution may be better than another depending on requirements, and selecting a specific service discovery system is a hard task. In this paper, we give an overview of the different approaches for web service discovery described in the literature, presenting a survey of how these approaches differ from each other.

Debajyoti Mukhopadhyay, Archana Chougule
Intensity Based Adaptive Fuzzy Image Coding Method: IBAFC

A new design method for image compression, Intensity Based Adaptive Fuzzy Coding (IBAFC), is presented. In this design, the image is decomposed into non-overlapping square blocks, and each block is classified as either an edge block or a smooth block; this classification is based upon a predefined threshold compared to the adaptive quantization level of each block. Each block is then coded either as fuzzy F-transform compressed, for an edge block, or by sending the mean value of the block, for a smooth block. The experimental results show that the proposed IBAFC scheme is superior to conventional AQC and intensity based AQC (IBAQC) on measures such as MSE and PSNR, along with visual quality.

Deepak Gambhir, Navin Rajpal
Periocular Feature Extraction Based on LBP and DLDA

Periocular recognition is an emerging field of research and people have experimented with some feature extraction techniques to extract robust and unique features from the periocular region. In this paper, we propose a novel feature extraction approach to use periocular region as a biometric trait. In this approach we first applied Local Binary Patterns (LBPs) to extract the texture information from the periocular region of the image and then applied Direct Linear Discriminant Analysis (DLDA) to produce discriminative low-dimensional feature vectors. The approach is evaluated on the UBIRIS v2 database and we achieved 94% accuracy which is a significant improvement in the performance of periocular recognition.
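A minimal numpy sketch of the LBP stage described above: threshold each pixel's eight neighbours against the centre, form an 8-bit code, and histogram the codes into a descriptor. The DLDA projection that follows in the paper is omitted here, and the synthetic image is a stand-in.

```python
import numpy as np

def lbp_histogram(gray, bins=256):
    """Basic 8-neighbour Local Binary Pattern histogram of a 2-D image.

    Each interior pixel gets an 8-bit code: bit k is 1 when the k-th
    neighbour is >= the centre pixel.
    """
    g = gray.astype(np.int32)
    c = g[1:-1, 1:-1]
    # 8 neighbours in a fixed clockwise order.
    offsets = [(-1, -1), (-1, 0), (-1, 1), (0, 1),
               (1, 1), (1, 0), (1, -1), (0, -1)]
    code = np.zeros_like(c)
    for k, (dy, dx) in enumerate(offsets):
        nb = g[1 + dy: g.shape[0] - 1 + dy, 1 + dx: g.shape[1] - 1 + dx]
        code |= ((nb >= c).astype(np.int32) << k)
    hist, _ = np.histogram(code, bins=bins, range=(0, bins))
    return hist / hist.sum()                      # normalized descriptor

img = np.random.randint(0, 256, (60, 80))
print(lbp_histogram(img)[:8])
```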

Akanksha Joshi, Abhishek Gangwar, Renu Sharma, Zia Saquib
Towards XML Interoperability

Nowadays, distributed computing has become a common phenomenon. As systems are designed from the bottom up with networking in mind, distributed computing makes it very easy for computers to cooperate, and today the scientific world enjoys the benefits provided by distributed computing under the broad umbrella of client server architecture. Information is requested and used at various distant physical or logical locations as required by users, and it is very unlikely that all the engaged users use the same computing environment. XML (Extensible Markup Language) technology has emerged as an efficient medium for information transfer, attempting to offer a similar kind of information environment, up to an extent, through its key property of interoperability. Interoperability is the ability of software and hardware on different machines from different vendors to share data. In this paper we employ a few criteria to evaluate the interoperability of XML in heterogeneous distributed computing environments.

Sugam Sharma, S. B. Goyal, Ritu Shandliya, Durgesh Samadhiya
Erratum: Speed Optimization in an Unplanned Lane Traffic Using Swarm Intelligence and Population Knowledge Base Oriented Performance Analysis

This chapter has been retracted due to self-plagiarism; a significant proportion of the content was previously published in another chapter.

Prasun Ghosal, Arijit Chakraborty, Sabyasachee Banerjee
Backmatter
Metadata
Title: Advances in Computer Science, Engineering & Applications
Edited by: David C. Wyld, Jan Zizka, Dhinaharan Nagamalai
Copyright year: 2012
Publisher: Springer Berlin Heidelberg
Electronic ISBN: 978-3-642-30157-5
Print ISBN: 978-3-642-30156-8
DOI: https://doi.org/10.1007/978-3-642-30157-5