
About This Book

This book constitutes the proceedings of the First International Conference on Advances in Computing and Information Technology, ACITY 2011, held in Chennai, India, in July 2011. The 55 revised full papers presented were carefully reviewed and selected from numerous submissions. The papers feature significant contributions to all major fields of Computer Science and Information Technology, in both theoretical and practical aspects.

Table of Contents

Frontmatter

Advances in Computing and Information Technology

Output Regulation of the Unified Chaotic System

This paper investigates the problem of output regulation of the unified chaotic system (Lu, Chen, Cheng and Celikovsky, 2002). Explicitly, state feedback control laws to regulate the output of the unified chaotic system have been derived so as to track the constant reference signals. The control laws are derived using the regulator equations of C.I. Byrnes and A. Isidori (1990), who solved the problem of output regulation of nonlinear systems involving neutrally stable exosystem dynamics. Numerical simulations are shown to illustrate the results.

Sundarapandian Vaidyanathan
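For context, the plant that the control laws act on can be sketched numerically. The following Python fragment is a minimal forward-Euler simulation of the commonly cited parametrization of the unified chaotic system (α = 0 recovers the Lorenz system, α = 1 the Chen system); it is illustrative only and does not include the paper's state feedback controller.

```python
def unified_step(state, alpha, dt):
    # One forward-Euler step of the unified chaotic system:
    # x' = (25a+10)(y-x), y' = (28-35a)x - xz + (29a-1)y, z' = xy - (8+a)/3 z
    x, y, z = state
    dx = (25 * alpha + 10) * (y - x)
    dy = (28 - 35 * alpha) * x - x * z + (29 * alpha - 1) * y
    dz = x * y - (8 + alpha) / 3 * z
    return (x + dt * dx, y + dt * dy, z + dt * dz)

def simulate(alpha=1.0, steps=10000, dt=0.001, start=(1.0, 1.0, 1.0)):
    # Integrate from `start`; the trajectory stays on a bounded attractor.
    s = start
    for _ in range(steps):
        s = unified_step(s, alpha, dt)
    return s
```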

Global Chaos Synchronization of Hyperchaotic Bao and Xu Systems by Active Nonlinear Control

This paper investigates the global chaos synchronization of hyperchaotic systems, viz. synchronization of identical hyperchaotic Bao systems (Bao and Liu, 2008), and synchronization of non-identical hyperchaotic Bao and Xu systems. Active nonlinear feedback control is the method used to achieve the synchronization of the chaotic systems addressed in this paper. Our theorems on global chaos synchronization for hyperchaotic Bao and Xu systems are established using Lyapunov stability theory. Since the Lyapunov exponents are not required for these calculations, the active control method is effective and convenient to synchronize identical and different hyperchaotic Bao and Xu systems. Numerical simulations are also given to illustrate and validate the various synchronization results derived in this paper.

Sundarapandian Vaidyanathan, Suresh Rasappan

Stabilization of Large Scale Discrete-Time Linear Control Systems by Observer-Based Reduced Order Controllers

This paper investigates the stabilization of large scale discrete-time linear control systems by observer-based reduced order controllers. Sufficient conditions are derived for the design of observer-based reduced order controllers for the large scale discrete-time linear control systems by obtaining a reduced order model of the original linear plant using the dominant state of the system. A separation principle has been established in this paper which shows that the observer poles and controller poles can be separated and hence the pole placement problem and observer design problem are independent of each other.

Sundarapandian Vaidyanathan, Kavitha Madhavan

Review of Parameters of Fingerprint Classification Methods Based on Algorithmic Flow

Classification refers to associating a given fingerprint with one of the existing classes already recognized in the literature. A search over all the records in the database takes a long time, so the aim is to reduce the size of the search space by choosing an appropriate subset of the database to search. Classifying fingerprint images is a very difficult pattern recognition problem, due to the small interclass variability and the large intraclass variability. This paper presents a sequence flow diagram which will help in developing clarity on designing algorithms for classification based on various parameters extracted from the fingerprint image. It discusses in brief the ways in which the parameters are extracted from the image. Existing fingerprint classification approaches use these parameters as input for classifying the image. Parameters like orientation map, singular points, spurious singular points, ridge flow and hybrid features are discussed in the paper.

Dimple Parekh, Rekha Vig

Adv-EARS: A Formal Requirements Syntax for Derivation of Use Case Models

The development of complex systems frequently involves extensive work to elicit, document and review functional requirements that are usually written in unconstrained natural language, which is inherently imprecise. The use of formal techniques in requirements engineering would be of immense importance, as it would provide automated support in deriving use case models from the functional requirements. In this paper we propose a formal syntax for requirements called Adv-EARS. We define a grammar for this syntax such that a requirements document in this format can be grammatically parsed, and the prospective actors and use cases are automatically derived from the parse tree. The use case diagram is then automatically generated based on the actors and use cases and their relationships. We have used the requirements of an insurance system as a case study to illustrate our approach.

Dipankar Majumdar, Sabnam Sengupta, Ananya Kanjilal, Swapan Bhattacharya
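The derivation step can be illustrated on a single event-driven requirement template. The template, field names, and example requirement below are hypothetical stand-ins, not the actual Adv-EARS grammar:

```python
import re

# Illustrative only: one EARS-style event-driven template,
# "When <trigger>, the <actor> shall <use case>."
PATTERN = re.compile(
    r"^When (?P<trigger>.+), the (?P<actor>.+?) shall (?P<usecase>.+)\.$"
)

def derive(requirement):
    """Return (actor, use case) derived from one templated requirement."""
    m = PATTERN.match(requirement)
    if m is None:
        raise ValueError("requirement does not match the template")
    return m.group("actor"), m.group("usecase")
```

A full parser would cover every template of the grammar and feed the derived actors and use cases into use case diagram generation.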

Tender Based Resource Scheduling in Grid with Ricardo’s Model of Rents

The policy of resource scheduling determines the way in which consumer jobs are allocated to resources and how each resource queues those jobs so that each job finishes within its deadline. A tender based scheduling policy requires the resources to bid for a job, while the consumer processes the bids and awards the job to the lowest bidder. Using an incentive based approach, this tender based policy can be used to provide fairness in profit for the resources and successful job execution for the consumers. This involves adjusting the price, Competition Degree (CD) and the job position in the resource queue. In this paper, this model is further extended by incorporating resource categorization and modifying the resource bidding using 'Group Targeting'. The resources are classified using Ricardo's theory of rents, taking into account the speed and type of each resource. This allows the consumer to make a decision using the category of a resource along with its price, which introduces a quality element into the bid processing mechanism. This is modeled using the parameter Quality Degree (QD) introduced on the consumer side. This categorization and modified bid processing result in a scheduling policy closer to market-like behavior.

Ponsy R. K. Sathiabhama, Ganeshram Mahalingam, Harish Kumar, Dipika Ramachandran
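The bid-processing idea of blending price with a category-derived quality element can be sketched as follows; the scoring blend and the Quality Degree weighting are illustrative assumptions, not the paper's exact formulation:

```python
def award_job(bids, quality_degree=0.5):
    """Pick a winning bid from [(resource, price, category_score), ...].

    quality_degree (QD) in [0, 1] blends price against resource
    category: QD=0 awards purely to the lowest bidder, QD=1 purely
    to the best category.
    """
    prices = [p for _, p, _ in bids]
    lo, hi = min(prices), max(prices)
    span = (hi - lo) or 1.0
    def score(bid):
        _, price, category = bid
        norm_price = (price - lo) / span          # 0 = cheapest bid
        return (1 - quality_degree) * norm_price - quality_degree * category
    return min(bids, key=score)[0]
```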

An Adaptive Pricing Scheme in Sponsored Search Auction: A Hierarchical Fuzzy Classification Approach

Sponsored Search Auctions (SSA) are gaining widespread attention in the web commerce community because of their highly targeted customers and the billions of dollars of revenue generated in the online market. Unlike other forms of auctions, this class possesses fairly complex interactions among its key players: users, advertisers and search engines. Therefore, research issues pertaining to SSA are being explored with great momentum in eclectic domains, e.g. game theory, algorithmic theory and machine learning. However, problems related to different pricing schemes in SSA need more focus from researchers, especially in analyzing adaptive pricing measures. This work is an effort towards making diligent use of the information available about different auction situations by combining the best of the major popular pricing schemes, in which switching among pricing schemes is made by hierarchical fuzzy classification. The effectiveness of the proposed scheme is illustrated through experimental results.

Madhu Kumari, Kamal K. Bharadwaj

Optimization of Disk Scheduling to Reduce Completion Time and Missed Tasks Using Multi-objective Genetic Algorithm

The crucial challenge that decides the success of a real-time disk scheduling algorithm lies in simultaneously achieving two contradicting objectives, namely completion time and missed tasks. This work is motivated toward developing such an algorithm. The goal of this paper is to demonstrate that the simultaneous optimization of completion time and missed tasks produces an efficient schedule for real-time disk scheduling. The objective function is designed to minimize the two parameters. An extensive experimental evaluation comparing the performance of the proposed system against other disk scheduling algorithms was conducted on 1000 disk request sets. The observations show that the proposed scheme offers the minimum completion time and the fewest missed tasks.

R. Muthu Selvi, R. Rajaram
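At the core of any such multi-objective formulation is Pareto dominance over the two objectives. A minimal sketch of dominance and front extraction (not the paper's genetic algorithm itself):

```python
def dominates(a, b):
    """True if schedule a Pareto-dominates b on (completion_time, missed_tasks),
    both minimized: no worse in every objective, strictly better in one."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def pareto_front(schedules):
    """Keep the non-dominated (completion_time, missed_tasks) pairs."""
    return [s for s in schedules
            if not any(dominates(t, s) for t in schedules if t is not s)]
```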

Goal Detection from Unsupervised Video Surveillance

Unsupervised video surveillance that can automatically learn, predict or detect events can be useful in many practical situations. This work describes how unsupervised surveillance can be used for goal detection in basketball videos. We present a system which takes as input a video stream of a basket and an agent trying to score a goal, and produces an analysis of the behavior of the ball in the scene to detect goals. To achieve this functionality, our system relies on two modular blocks. The first one detects and tracks moving balls in the sequence. The second module takes these trajectories as input and makes a goal versus non-goal decision. We present details of the system, together with results on a number of real video sequences, and also provide a quantitative analysis of the results. The approach described here uses object detection and mean-shift tracking to detect and track the basketball in a video. The goal decision is based on the positions of the ball, its current and immediate past positions in the image frame, with respect to a matrix representing the basket.

Chirag I. Patel, Ripal Patel, Palak Patel

Data Mining Based Optimization of Test Cases to Enhance the Reliability of the Testing

Software testing is any activity aimed at evaluating an attribute or capability of a program or system and determining that it meets its required results. Software testing is an important activity in the Software Development Life Cycle. Test case selection is a crucial activity in testing, since the number of automatically generated test cases is usually enormous and possibly unfeasible. Also, a considerable number of test cases are redundant, that is, they exercise similar features of the application and/or are capable of uncovering a similar set of faults. The strategy is aimed at selecting the less similar test cases while providing the best possible coverage of the functional model from which test cases are generated. Test suite selection techniques reduce the effort required for testing by selecting a subset of test suites. In previous work, the problem has been considered as a single-objective optimization problem. However, real world testing can be a complex process in which multiple testing criteria and constraints are involved. The paper utilizes a hybrid, multi-objective algorithm that combines the efficient approximation of the evolutionary approach with the capability of a data mining algorithm to produce higher-quality test cases.

Lilly Raamesh, G. V. Uma
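One simple way to realize "selecting the less similar test cases" is a greedy pick over Jaccard similarity of covered features; this sketch is a stand-in for, not a reproduction of, the paper's hybrid evolutionary/data-mining algorithm:

```python
def jaccard(a, b):
    a, b = set(a), set(b)
    union = a | b
    return len(a & b) / len(union) if union else 1.0

def select_dissimilar(test_cases, budget):
    """Greedily select `budget` test cases (each a set of covered
    features), preferring the one least similar to anything already
    selected, so redundant cases are skipped."""
    remaining = list(test_cases)
    selected = [remaining.pop(0)]          # seed with the first case
    while remaining and len(selected) < budget:
        best = min(remaining,
                   key=lambda t: max(jaccard(t, s) for s in selected))
        remaining.remove(best)
        selected.append(best)
    return selected
```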

An Insight into the Hardware and Software Complexity of ECUs in Vehicles

Modern automotives integrate a large number of electronic devices to improve driving safety and comfort. This growing number of Electronic Control Units (ECUs) with sophisticated software escalates the vehicle system design complexity. In this paper we explain the complexity of ECUs in terms of hardware and software, and we also explore the possibility of using the Common Object Request Broker Architecture (CORBA) for the integration of add-on software in ECUs. This reduces the complexity of the embedded system in vehicles and eases ECU integration by reducing the total number of ECUs in the vehicles.

Rajeshwari Hegde, Geetishree Mishra, K. S. Gurumurthy

Local Binary Patterns, Haar Wavelet Features and Haralick Texture Features for Mammogram Image Classification Using Artificial Neural Networks

The objective of this study is the classification of mammogram images into benign and malignant using Artificial Neural Network. This framework is based on combining Local Binary Patterns, Haar Wavelet features and Haralick Texture features. The study shows the importance of Computer Aided Medical Diagnosis in successful decision making by calculating the likelihood of a disease. This multi feature approach for classification obtains an average classification accuracy of 98.6% for training, validation and testing.

Simily Joseph, Kannan Balakrishnan

A Hybrid Genetic-Fuzzy Expert System for Effective Heart Disease Diagnosis

This paper presents a genetic algorithm (GA)-based fuzzy logic approach for a computer aided disease diagnosis scheme. The aim is to design a fuzzy expert system for heart disease diagnosis. The designed system is based on the Cleveland Heart Disease database. Originally there were thirteen attributes involved in predicting heart disease. In this work a genetic algorithm is used to determine the attributes that contribute most towards the diagnosis; the thirteen attributes are reduced to six using genetic search. Fuzzy expert systems are used for developing knowledge based systems in medicine. The proposed system uses the Mamdani inference method. The system, designed in Matlab, can be viewed as an alternative to existing methods for detecting the presence of heart disease.

E. P. Ephzibah

Genetic Algorithm Technique Used to Detect Intrusion Detection

The paper presents a genetic algorithm based intrusion detection system; it is a simulation-based system in the networking area. The first system in the line is an anomaly-based IDS implemented as a simple linear classifier. This system exhibits both a high detection rate and a high false-positive rate. For that reason, we have added a simple system based on if-then rules that filters the decisions of the linear classifier and in that way significantly reduces the false-positive rate. In the first step of our solution we deploy feature extraction techniques in order to reduce the amount of data that the system needs to process. Hence, our system is simple enough not to introduce significant computational overhead, but at the same time it is accurate, adaptive and fast due to genetic algorithms. The model is verified on the KDD99 benchmark dataset.

Payel Gupta, Subhash K. Shinde
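The two-stage structure, a linear classifier whose positives are filtered by if-then rules, can be sketched as below; the feature vector, weights, and example rule are hypothetical, not taken from the paper:

```python
def linear_score(features, weights, bias):
    # Simple linear anomaly score: w . f + bias
    return sum(w * f for w, f in zip(weights, features)) + bias

def classify(conn, weights, bias, rules):
    """Flag a connection as an intrusion if the linear classifier fires
    AND no whitelisting if-then rule vetoes it; the rule layer is what
    cuts the false-positive rate."""
    raw = linear_score(conn["features"], weights, bias) > 0
    if raw and any(rule(conn) for rule in rules):
        return False            # a rule overrides the classifier
    return raw
```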

Online Delaunay Triangulation Using the Quad-Edge Data Structure

Previous works involving the online Delaunay triangulation problem required that the incoming request lie within the triangulation, or within a predefined initial triangulation framework containing all the incoming points. No mention is made of online Delaunay triangulation when the request point lies outside the triangulation, i.e. on the unbounded side of the convex hull of the triangulation. In this work, we give a solution to the online Delaunay triangulation problem for incoming request points lying on the unbounded side of the convex hull bounding the Delaunay triangulation, as well as for points lying inside the triangulation. We use the quad-edge data structure for implementing the Delaunay triangulation.

Chintan Mandal, Suneeta Agarwal
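The test at the heart of any Delaunay implementation, including quad-edge based ones, is the incircle predicate used to decide edge flips. A standard determinant form (shown here with exact arithmetic on the inputs; a production implementation would need robust predicates):

```python
def in_circle(a, b, c, d):
    """Return True if point d lies strictly inside the circle through
    a, b, c (given in counter-clockwise order) -- the flip test of
    Delaunay triangulation, via the lifted 3x3 determinant."""
    ax, ay = a[0] - d[0], a[1] - d[1]
    bx, by = b[0] - d[0], b[1] - d[1]
    cx, cy = c[0] - d[0], c[1] - d[1]
    det = ((ax * ax + ay * ay) * (bx * cy - cx * by)
         - (bx * bx + by * by) * (ax * cy - cx * ay)
         + (cx * cx + cy * cy) * (ax * by - bx * ay))
    return det > 0
```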

A Novel Event Based Autonomic Design Pattern for Management of Webservices

A system is said to be adaptive if its behavior automatically changes according to its context. Systems based on the service-oriented architecture (SOA) paradigm must be able to bind arbitrary Web services at runtime. Web services composition has been an active research area over the last few years. However, the technology is still not mature and several research issues need to be addressed. In this paper, we propose an autonomic design pattern that describes the dynamic composition and adaptation of Web services based on context. This pattern is primarily an extension of the Case-based Reasoning, Strategy and Observer design patterns. We propose a framework where the service context is configurable to accommodate the needs of different users and can adapt to dynamically changing environments. This permits reusability of a service in different contexts and achieves a level of adaptiveness and contextualization without recoding and recompiling the overall composed services. The execution of the adaptive composite service is provided by an observer model. Three core services, a coordination service, a context service, and an event service, are implemented to automatically schedule and execute the component services, which adapt to user-configured contexts and environment changes at run time. We demonstrate the benefits of our proposed design pattern through an experimental setup with an implementation that does not generate stubs at the client side.

Vishnuvardhan Mannava, T. Ramesh

Towards Formalization of Ontological Descriptions of Services Interfaces in Services Systems Using CL

With the advent of semantic technologies, the Internet is moving from Web 2.0 to Web 3.0. Social networks, the Semantic Web, blogs, etc. are all components of Web 3.0. In this paper, we formally represent the description of the proposed service systems by expressing the constraints and capabilities of the system in a standards-based language: Common Logic. We translate and represent the WSML ontology, web services, relations, axioms and other entity metadata descriptions of the system in ISO Common Logic. The temporal constraints of the system discussed are also represented using service policies. The extended service policies (WS-Policy Language) are discussed for representing the temporal constraints of the system.

Amit Bhandari, Manpreet Singh

Advanced Data Warehousing Techniques for Analysis, Interpretation and Decision Support of Scientific Data

R&D organizations handling many research and development projects produce very large amounts of scientific and technical data. The analysis and interpretation of these data is crucial for the proper understanding of scientific and technical phenomena and the discovery of new concepts. Data warehousing using multidimensional views and on-line analytical processing (OLAP) has become very popular in both business and science in recent years, and these are essential elements of decision support, analysis and interpretation of data. Data warehouses for scientific purposes pose several great challenges to existing data warehouse technology. This paper provides an overview of scientific data warehousing and OLAP technologies, with an emphasis on their data warehousing requirements. The methods that we use include the efficient computation of data cubes by integration of MOLAP and ROLAP techniques, the integration of data cube methods with dimension relevance analysis and data dispersion analysis for concept description, and data cube based multi-level association, classification, prediction and clustering techniques.

Vuda Sreenivasarao, Venkata Subbareddy Pallamreddy

Anti-synchronization of Li and T Chaotic Systems by Active Nonlinear Control

The purpose of this paper is to study chaos anti-synchronization of identical Li chaotic systems (2009), identical T chaotic systems (2008) and non-identical Li and T chaotic systems. In this paper, sufficient conditions for achieving anti-synchronization of the identical and non-identical Li and T systems are derived using active nonlinear control and our stability results are established using Lyapunov stability theory. Since the Lyapunov exponents are not required for these calculations, the active nonlinear feedback control method is effective and convenient to anti-synchronize the identical and non-identical Li and T chaotic systems. Numerical simulations are also given to illustrate and validate the anti-synchronization results for the chaotic systems addressed in this paper.

Sundarapandian Vaidyanathan, Karthikeyan Rajagopal

Message Encoding in Nucleotides

This paper suggests a message encoding scheme in nucleotide strands for small text files. The proposed scheme leads to ultra-high volume data density and depends on the adoption of transformation algorithms such as the Burrows-Wheeler transform and Move-to-Front for generating better context information. Huffman encoding further compresses the transformed text message. We use a mapping function to encode the message in nucleotides from the binary strand, and we tested the suggested scheme on a collection of small text files. The testing results showed that the proposed scheme reduces the number of nucleotides needed to represent a text message compared with existing methods.

Rahul Vishwakarma, Satyanand Vishwakarma, Amitabh Banerjee, Rohit Kumar
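The final mapping stage, binary strand to nucleotides, can be sketched with a straightforward two-bits-per-base table. The specific table below is an illustrative assumption; the paper's actual mapping function, and the preceding BWT/MTF/Huffman stages, are not reproduced here:

```python
# Hypothetical 2-bit mapping: 00->A, 01->C, 10->G, 11->T.
TO_NT = {"00": "A", "01": "C", "10": "G", "11": "T"}
FROM_NT = {v: k for k, v in TO_NT.items()}

def encode(data: bytes) -> str:
    # Bytes -> bit string -> nucleotide strand (4 bases per byte).
    bits = "".join(f"{byte:08b}" for byte in data)
    return "".join(TO_NT[bits[i:i + 2]] for i in range(0, len(bits), 2))

def decode(strand: str) -> bytes:
    # Inverse mapping: nucleotides -> bit string -> bytes.
    bits = "".join(FROM_NT[nt] for nt in strand)
    return bytes(int(bits[i:i + 8], 2) for i in range(0, len(bits), 8))
```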

XIVD: Runtime Detection of XPath Injection Vulnerabilities in XML Databases through Aspect Oriented Programming

With the growing acceptance of XML technologies for documents and protocols, it is logical that security should be integrated with XML solutions. In a web application, improper user input is the root cause of a wide variety of attacks. The XML Path (XPath) language is used for querying information from the nodes of an XML document. XPath injection is an attack technique used to exploit applications that construct XPath queries from user-supplied input to query or navigate XML documents, analogous to SQL injection in databases. Hence, we propose an approach to detect XPath injection attacks in XML databases at runtime through Aspect Oriented Programming (AOP). Our approach intercepts the XPath expression (i.e., the XQuery) from the web application through AOP and parses the XQuery expression to find the inputs placed in the expression. The identified inputs are used to build an XML file, which is validated against a proposed schema. The validation establishes the correctness of the XQuery.

Velu Shanmughaneethi, Ra. Yagna Pravin, S. Swamynathan
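The root problem being detected can be illustrated with a much-simplified input check; the real XIVD pipeline intercepts the expression via AOP and validates inputs against a schema, which this sketch does not reproduce:

```python
def is_suspicious(user_input):
    """Flag inputs that could break out of a quoted XPath literal,
    e.g. the classic ' or '1'='1 bypass. A crude stand-in for
    schema-based validation of intercepted query inputs."""
    markers = ("'", '"', " or ", " and ", "=", "[", "]")
    lowered = user_input.lower()
    return any(m in lowered for m in markers)
```

Without such a check, a login query like //user[name/text()='INPUT'] can be turned into a tautology by the injected quote and or-clause.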

Multilevel Re-configurable Encryption and Decryption Algorithm

To increase the strength of encoded data over existing cryptographic techniques and to keep the data secure against cryptanalysis, a multi-level encryption algorithm with an efficient public key system is proposed. The multi-level encryption steps up the strength of the algorithm by using five keys to encrypt each character, which makes cryptanalysis impossible without detecting all five keys. The key generation and distribution method suits the algorithm well, as the purpose of multilevel keys, enhancing security step by step, holds while detecting the keys. The proposed algorithm possesses the property of re-configurability, in which the logical set of operations performed to encrypt the data can be carried out in six ways. The keys are not static values; each time a character is encrypted the key value changes, leaving an intruder perplexed. The encrypted data is represented in the form of patches of color.

Manoharan Sriram, V. Vinay Kumar, Asaithambi Saranya, E. Tamarai Selvam

Segmentation of Printed Devnagari Documents

Document segmentation is one of the most important phases in machine recognition of any language. Correct segmentation of individual symbols decides the success of a character recognition technique. It is used to decompose an image of a sequence of characters into sub-images of individual symbols by segmenting lines and words. Devnagari is the most popular script in India. It is used for writing Hindi, Marathi, Sanskrit and Nepali. Moreover, Hindi is the third most popular language in the world. Devnagari documents consist of vowels, consonants and various modifiers, hence proper segmentation of a Devnagari word is challenging. A simple approach based on bounding boxes to segment Devnagari documents is proposed in this paper. Various challenges in segmentation of the Devnagari script are also discussed.

Vikas J. Dongre, Vijay H. Mankar

An Adaptive Jitter Buffer Playout Algorithm for Enhanced VoIP Performance

The QoS standard of a VoIP session degrades if its stringent time requirements are not met. Low end-to-end delay of the voice packets and low packet loss must be maintained. Jitter between voice packets must also be within tolerable limits. Jitter hampers voice quality and makes the VoIP call uncomfortable to the user. Very often, buffers are used to store the received packets for a short time before playing them at equal spaced intervals to minimize jitter. However, this introduces the problem of added end-to-end delay and discarded packets. In this paper, some established adaptive jitter buffer playout algorithms have been studied and a new algorithm has been proposed. The network used for the analysis of the algorithms has been simulated using OPNET modeler 14.5.A. The proposed algorithm kept jitter within a tolerable limit along with drastic reduction of delay and loss compared to other algorithms analyzed in this paper.

Atri Mukhopadhyay, Tamal Chakraborty, Suman Bhunia, Iti Saha Misra, Salil Kumar Sanyal
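As background, the classic exponentially weighted playout-delay estimator, one of the established adaptive algorithms that work in this area compares against, looks like the sketch below; the paper's own proposed algorithm differs, and the weighting constant is the commonly quoted default, not a value from the paper:

```python
class PlayoutEstimator:
    """Exponentially weighted estimate of network delay and its
    variation; the playout delay is set a safety margin (4 deviations)
    above the smoothed delay to absorb jitter."""

    def __init__(self, alpha=0.998002):
        self.alpha = alpha
        self.d = None    # smoothed network delay
        self.v = None    # smoothed delay variation
    def update(self, delay):
        if self.d is None:
            self.d, self.v = delay, 0.0
        else:
            a = self.alpha
            self.d = a * self.d + (1 - a) * delay
            self.v = a * self.v + (1 - a) * abs(delay - self.d)
        return self.d + 4.0 * self.v   # playout delay for the next talkspurt
```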

Optimizing VoIP Call in Diverse Network Scenarios Using State-Space Search Technique

A VoIP based call has stringent QoS requirements with respect to delay, jitter, loss, MOS and R-Factor. Various QoS mechanisms are being implemented to satisfy these requirements. These mechanisms must be adaptive under diverse network scenarios. Moreover such mechanisms must be implemented in proper sequence, otherwise they may conflict with each other. The objective of this paper is to address the problem of adaptive QoS maintenance and sequential execution of available QoS implementation mechanisms with respect to VoIP under varying network conditions. In this paper, we generalize this problem as a state-space problem and thereby solve it. Firstly, we map the problem of QoS optimization into state-space domain and then apply incremental heuristic search. We implement it under various network and user scenarios in a VoIP test-bed to optimize the performance. Finally, we discuss the advantages and uniqueness of our approach.

Tamal Chakraborty, Atri Mukhopadhyay, Suman Bhunia, Iti Saha Misra, Salil Kumar Sanyal

State-Based Dynamic Slicing Technique for UML Model Implementing DSA Algorithm

The Unified Modeling Language has been widely used in software development for modeling from the problem domain to the solution domain. Major problems lie in comprehension and testing, which occur throughout the whole process. Program slicing is an important approach to analyzing, understanding, testing and maintaining programs. It is a technique for analyzing a program by focusing on statements which have a dependence relation with the slicing criterion. Program slicing is of two types: (i) static slicing and (ii) dynamic slicing. Dynamic slicing refers to slicing a particular execution of the program and may significantly reduce the size of the program slice, because runtime information collected during execution is used to compute the slice. In this paper we introduce an approach for constructing a dynamic slice of a Unified Modeling Language (UML) model using the sequence diagram, state chart diagram and class diagram along with the activity diagram. First we construct an intermediate representation known as the model dependency graph (MDG), which combines information extracted from the various diagrams. Then the dynamic slice is computed by integrating the activity models into the MDG. For a given slicing criterion, the DSA algorithm traverses the constructed MDG to identify the relevant model elements.

Behera Mamata Manjari, Dash Rasmita, Dash Rajashree
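Once the model dependency graph (MDG) is built, computing a slice reduces to a backward reachability traversal. A minimal sketch, with the MDG as a plain adjacency mapping rather than the paper's full UML-derived structure:

```python
def dynamic_slice(mdg, criterion):
    """Backward slice over a model dependency graph given as
    {element: set(elements it depends on)}; returns every model
    element the slicing criterion transitively depends on."""
    slice_, stack = set(), [criterion]
    while stack:
        node = stack.pop()
        if node in slice_:
            continue
        slice_.add(node)
        stack.extend(mdg.get(node, ()))
    return slice_
```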

Self Charging Mobile Phones Using RF Power Harvesting

RF power harvesting is one of the diverse fields where research is still ongoing. The energy of the RF waves used by devices can be harvested and used to operate them in a more effective and efficient way. This paper highlights the performance of energy harvesting in an efficient way by using a simple voltage doubler. With slight modifications we attained a high output voltage from harvested RF energy. A modified form of the existing Schottky diode based voltage doubler circuit is presented to achieve high output power for an average input RF power of 20 dBm. The performance of the circuit is studied with simulation results in ADS tools.

Ajay Sivaramakrishnan, Karthik Ganesan, Kailarajan Jeyaprakash Jegadishkumar

A Survey on Bluetooth Scatternet Formation

A Bluetooth ad hoc network can be formed by interconnecting piconets into scatternets. The constraints and properties of Bluetooth scatternets present special challenges in forming an ad hoc network efficiently. In this paper, the research contributions in this arena are brought together to give an overview of the state of the art.

Simply stated, Bluetooth is a wireless communication protocol. Since it is a communication protocol, you can use Bluetooth to communicate with other Bluetooth-enabled devices. In this sense, Bluetooth is like any other communication protocol that you use every day, such as HTTP, FTP, SMTP, or IMAP. Bluetooth has a client-server architecture; the one that initiates the connection is the client, and the one that receives the connection is the server. Bluetooth is a great protocol for wireless communication because it is capable of transmitting data at nearly 1 Mb/s while consuming 1/100th of the power of Wi-Fi. We discuss criteria for different types of scatternets and establish general models of scatternet topologies. Then we review the state-of-the-art approaches to Bluetooth scatternet formation and contrast them.

Pratibha Singh, Sonu Agrawal

Text Mining Based Decision Support System (TMbDSS) for E-governance: A Roadmap for India

In this digital age most government regulations are available online; similarly, with the advent of a number of electronic online forums, the opportunity for gathering citizens' petitions and stakeholders' views on government policy has increased greatly. However, the volume and the complexity of analyzing unstructured data make it difficult to extract useful information from this data. On the other hand, text mining (TM) has the capability to deal with this type of data. TM techniques can help policy makers by identifying the relatedness between existing regulations and proposed policy drafts. In this article we discuss how text-mining techniques can help in the retrieval of information and relationships from unstructured data sources, thereby assisting policy makers in discovering associations between existing policies, proposed policies and citizens' opinions expressed in electronic public forums. An integrated text mining based architecture for e-governance decision support is presented, along with a discussion of the Indian scenario.

Gadda Koteswara Rao, Shubhamoy Dey

IMPACT - Intelligent Memory Pool Assisted Cognition Tool: A Cueing Device for the Memory Impaired

Memory impairment results from a variety of diseases such as Alzheimer's, Parkinson's, brain trauma and aging. Those affected become preoccupied with their current state and emotions when dealing with their environment as their memory deteriorates. There are about 30 million people worldwide afflicted with such conditions. They lose insight into their own condition, forgetting even their loved ones and becoming unable to locate their residence, along with a myriad of other inabilities. Medicine has evolved quickly, but there is still no definitive method to diagnose or treat their memory loss. The only finding revealed is the depletion of brain cells and the loss of the hormone secretion needed to carry sensory messages; the underlying reason for this behavior of the cells remains an unsolved mystery. Besides losing their memory, they develop cold behavior and become socially and emotionally unbalanced. The most common forms of assistance are caregivers or life logging through writing down reminiscences, taking photos of their surroundings and the people close to them, mobile phones, etc. While all of these can be of some help, they only put more stress on an already damaged brain, resulting in depression and other mental problems. The proposed tool IMPACT acts like a second brain with a repository of memory comprising previous experiences in multimedia format, which is sensed and fetched when the user comes across a similar person or surrounding, thus enabling them to recall that situation through cueing interaction.

Samuel Cyril Naves

A Secure Authentication System Using Multimodal Biometrics for High Security MANETs

Mobile Ad hoc NETworks (MANETs) are collections of wireless mobile devices with restricted broadcast range and resources; communication is achieved by relaying data along appropriate routes that are dynamically discovered and maintained through collaboration between the nodes. A MANET is a self-configuring, dynamic, multi-hop radio network without any fixed infrastructure. The main challenge in the design of such networks is how to secure the information communicated through them. Biometrics offers a possible solution to this problem in MANETs, since it is directly tied to user identity and needs little user intervention.

The proposed Multimodal Biometric-based Authentication Combined Security System provides authentication using face biometrics and security using fingerprint biometrics. It has three advantages over previous works. First, for authentication, the eigenface of the sender is generated and attached to the data to be transferred. Second, to enhance security, the data and the sender’s eigenface are encrypted using a key extracted from the receiver’s fingerprint biometric. Third, to reduce transmission-based attacks, the fingerprint-based cryptographic key is randomized by applying a genetic operator. Thus, the system provides authentication, security and revocability for high-security applications in mobile environments.

B. Shanthini, S. Swamynathan

Error Detection and Correction for Secure Multicast Key Distribution Protocol

Integrating an efficient error detection and correction scheme, with low encoding and decoding complexity, into the distribution of keying material for secure multicast communication is an important issue: the large amount of information carried over the wireless channel suffers many errors due to channel noise, and the key must also reach the group members securely. In this paper, we propose a new efficient key distribution protocol that provides stronger security and integrates an encoding method on the sender side and a decoding method on the receiver side. To achieve a higher level of security, we propose a key distribution protocol based on Euler’s totient function. To provide efficient error detection and correction while distributing keying and rekeying information, we introduce a Tanner-graph-based encoding stopping set construction algorithm on the sender and receiver sides of the multicast communication. The two major operations for managing multicast group membership are joining and leaving. The encoding and decoding complexity of the approach is computed, and the proposed approach is shown to have lower decoding time complexity.

P. Vijayakumar, S. Bose, A. Kannan, V. Thangam, M. Manoji, M. S. Vinayagam
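The abstract does not spell out how Euler’s totient function enters the key distribution protocol, but the function itself is standard; a minimal sketch of computing φ(n), on which such a scheme would rest, is:

```python
def euler_totient(n):
    # Count integers in [1, n] coprime to n using the product formula
    # phi(n) = n * prod(1 - 1/p) over the distinct prime factors p of n.
    result, p = n, 2
    while p * p <= n:
        if n % p == 0:
            while n % p == 0:
                n //= p
            result -= result // p
        p += 1
    if n > 1:  # remaining prime factor
        result -= result // n
    return result
```

For example, φ(10) = 4, since 1, 3, 7 and 9 are the integers coprime to 10. How φ(n) is combined with group keys is protocol-specific and not given in the abstract.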

Empirical Validation of Object Oriented Data Warehouse Design Quality Metrics

Data warehouses store information that enables knowledge workers to make better and faster decisions. As decision-support information systems, they must provide a high level of data quality and quality of service. Various metrics have been defined and theoretically validated to measure data warehouse quality in a consistent and objective manner; once quality is measured, it can be managed and improved. In this paper we take these design quality metrics and validate them empirically, conducting an experiment using regression analysis and drawing conclusions from the analysis so that the metrics can be used by researchers and practitioners.

Jaya Gupta, Anjana Gosain, Sushama Nagpal

Plagiarism Detection of Paraphrases in Text Documents with Document Retrieval

Document retrieval finds documents relevant to user queries, while plagiarism is the act of copying someone’s work without acknowledgement. Paraphrasing is a type of plagiarism in which content from the source is altered. This paper proposes a new document retrieval system and paraphrase plagiarism detection for text documents using a multi-layered self-organizing map (MLSOM). In the proposed system a tree structure is extracted that hierarchically represents the document features as document, pages and paragraphs. To handle the tree-structured documents efficiently, MLSOM is used as the clustering algorithm. Using MLSOM, documents can be compared for plagiarism detection via local similarity. Paraphrased plagiarism is detected by computing the similarity between sentences of two documents, a form of local similarity detection.

S. Sandhya, S. Chitrakala

An XAML Approach for Building Management System Using WCF

Building automation is a critical issue in recent scenarios; the process is handled by a Building Automation System (BAS). A BAS comprises different kinds of information that enable work towards an intelligent building system. Web Services technology has been used to integrate different BASs, but it supports only the stateless HTTP protocol. This paper presents a next-generation Internet technology, Windows Communication Foundation (WCF), to integrate different building automation systems. WCF contracts read and write Building Automation Control Network (BACnet) data points on a BACnet network; these contracts can be called by other enterprise applications to realize BAS integration and obtain real-time BACnet data for facilities management. The BMS is applied to a BAS consisting of a BACnet network, in which applications use sensors, actuators and controllers for building control. The paper presents a Service-Oriented Architecture (SOA) for the Building Management System (BMS) using WCF and the Extensible Application Markup Language (XAML), which provides a client-side GUI that can be reused for different kinds of applications. Finally, it discusses the challenges in securing a BMS.

Surendhar Thallapelly, P. Swarna Latha, M. Rajasekhara Babu

Hand Gesture Recognition Using Skeleton of Hand and Distance Based Metric

This paper concerns image processing and computer vision concepts for the interpretation of gestures. Gestures can convey instructions to a machine (computer) or commands to a robot; this is known as human-machine interaction, or human-computer interaction (HCI). Hand gestures are an ideal way of exchanging information between a human and a computer, robot, or any other device. We compute the skeleton of the hand using a distance transformation technique and use it for recognition instead of the entire hand, because of its robustness to translation, rotation and scaling. The skeleton is computed for every hand posture in the entire hand motion and superimposed on a single image called the Dynamic Signature of that gesture type. A gesture is recognized by comparing its current Dynamic Signature against a gesture alphabet set using the Image Euclidean distance measure.

K. Sivarajesh Reddy, P. Swarna Latha, M. Rajasekhara Babu
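The skeleton-extraction step described above can be illustrated with a minimal pure-Python sketch (not the authors’ implementation): the distance transform assigns each foreground pixel its Euclidean distance to the nearest background pixel, and the skeleton is taken here as the ridge, i.e. the local maxima, of that transform.

```python
def distance_transform(mask):
    # Brute-force Euclidean distance from each foreground cell to the
    # nearest background cell (fine for small illustrative masks).
    h, w = len(mask), len(mask[0])
    bg = [(i, j) for i in range(h) for j in range(w) if not mask[i][j]]
    dist = [[0.0] * w for _ in range(h)]
    for i in range(h):
        for j in range(w):
            if mask[i][j]:
                dist[i][j] = min(((i - a) ** 2 + (j - b) ** 2) ** 0.5
                                 for a, b in bg)
    return dist

def skeleton(mask):
    # Ridge of the distance transform: foreground cells whose distance
    # is a local maximum over their 8-neighbourhood.
    d = distance_transform(mask)
    h, w = len(mask), len(mask[0])
    out = set()
    for i in range(h):
        for j in range(w):
            if mask[i][j] and all(
                d[i][j] >= d[a][b]
                for a in range(max(0, i - 1), min(h, i + 2))
                for b in range(max(0, j - 1), min(w, j + 2))):
                out.add((i, j))
    return out
```

On a 3x3 block of foreground inside a 5x5 mask, the skeleton collapses to the single centre pixel, which is the robustness-to-scale property the paper relies on.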

DWCLEANSER: A Framework for Approximate Duplicate Detection

Data quality has become a major area of concern in data warehousing. The prime aim of a data warehouse is to store quality data that effectively supports decision support systems. Data quality is improved by employing data cleaning techniques, which detect and remove errors and discrepancies from data. This paper presents a novel framework for detecting both exact and approximate duplicates in a data warehouse. The proposed approach reduces the complexity of previously designed frameworks by providing efficient data cleaning techniques. In addition, appropriate methods are framed to manage outliers and missing values in the datasets, and comprehensive repositories are provided that will be useful for incremental data cleaning.

Garima Thakur, Manu Singh, Payal Pahwa, Nidhi Tyagi

A Dynamic Slack Management Technique for Real-Time System with Precedence and Resource Constraints

Energy consumption is a critical design issue in embedded systems, especially battery-operated ones. Dynamic Voltage Scaling and Dynamic Frequency Scaling allow the supply voltage and processor frequency to be adjusted to the workload demand for better energy management. For a set of real-time tasks with precedence and resource constraints executing on a distributed embedded system, we propose a dynamic energy-efficient scheduling algorithm with a Weighted First Come First Served (WFCFS) scheme that also considers the run-time behaviour of tasks to further exploit the idle periods of processors. Our algorithm is compared with the Service-Rate-Proportionate (SRP) Slack Distribution Technique, which uses FCFS and weighted scheduling schemes, and achieves about 6 percent more energy savings with increased reliability.

Santhi Baskaran, Perumal Thambidurai

Multi-level Local Binary Pattern Analysis for Texture Characterization

Texture is a core element in numerous computer vision applications. The objective of this paper is to present a novel methodology for learning and recognizing textures. The local binary pattern (LBP) operator offers an efficient way of analyzing textures. A multi-level local binary pattern operator, an extension of LBP, is proposed for extracting texture features from images. The operator finds the association of LBP operators at multiple levels, which helps to identify macro features. Depending on the size of the operator, octets are framed and the LBP responses of the octets are noted; their occurrence histograms are combined to form the texture descriptor. The proposed operator is gray-scale invariant and computationally simple, since it can be realized with a few operations in a local neighborhood. A non-parametric statistic, the G-statistic, is used in the classification phase: the classifier is trained with images of known texture classes to build a model for each class. Experimental results show that the approach discriminates well between textures.

R. Suguna, P. Anandhakumar
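The basic single-level LBP operator that the multi-level descriptor builds on can be sketched as follows (the multi-level extension is the paper’s contribution and is not reproduced here):

```python
def lbp_code(img, i, j):
    # Basic 3x3 LBP: threshold the 8 neighbours against the centre pixel
    # and pack the comparison bits into one byte. Adding a constant to
    # all pixels leaves the code unchanged, hence gray-scale invariance.
    offsets = [(-1, -1), (-1, 0), (-1, 1), (0, 1),
               (1, 1), (1, 0), (1, -1), (0, -1)]
    c = img[i][j]
    code = 0
    for bit, (di, dj) in enumerate(offsets):
        if img[i + di][j + dj] >= c:
            code |= 1 << bit
    return code

def lbp_histogram(img):
    # Occurrence histogram of LBP codes over all interior pixels;
    # this histogram serves as the texture descriptor.
    hist = [0] * 256
    for i in range(1, len(img) - 1):
        for j in range(1, len(img[0]) - 1):
            hist[lbp_code(img, i, j)] += 1
    return hist
```

A constant image yields the all-ones code 255 everywhere, and shifting every gray value by the same offset leaves each code unchanged, which illustrates the invariance claimed in the abstract.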

Brain Tissue Classification of MR Images Using Fast Fourier Transform Based Expectation-Maximization Gaussian Mixture Model

This paper proposes MR image segmentation based on a Fast Fourier Transform based Expectation-Maximization Gaussian Mixture Model (EM-GMM) algorithm. A plain GMM uses no spatial correlation when classifying tissue type and assumes that each tissue class is described by a single Gaussian distribution; these assumptions lead to poor performance, because the strong spatial correlation between neighboring pixels goes unused. The FFT based EM-GMM algorithm improves classification accuracy by taking the spatial correlation of neighboring pixels into account and performing the segmentation in the Fourier domain instead of the spatial domain. The solution via FFT is significantly faster than the classical spatial-domain solution, just O(N log₂ N) instead of O(N²), and therefore enables the use of EM-GMM in high-throughput and real-time applications.

Rajeswari Ramasamy, P. Anandhakumar
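The speed-up claimed above comes from the convolution theorem: a length-N circular convolution computed through the FFT costs O(N log N) rather than the O(N²) of direct summation. A self-contained radix-2 sketch of that idea (illustrative only; the paper applies it inside EM-GMM segmentation, not shown here):

```python
import cmath

def fft(a, inverse=False):
    # Radix-2 Cooley-Tukey FFT, O(N log N); len(a) must be a power of two.
    n = len(a)
    if n == 1:
        return a[:]
    sign = 1 if inverse else -1
    even = fft(a[0::2], inverse)
    odd = fft(a[1::2], inverse)
    out = [0j] * n
    for k in range(n // 2):
        tw = cmath.exp(sign * 2j * cmath.pi * k / n) * odd[k]
        out[k] = even[k] + tw
        out[k + n // 2] = even[k] - tw
    return out

def fft_convolve(x, h):
    # Circular convolution via the convolution theorem:
    # conv(x, h) = IFFT(FFT(x) * FFT(h)) / N; x and h must have the
    # same power-of-two length.
    n = len(x)
    X = fft([complex(v) for v in x])
    H = fft([complex(v) for v in h])
    y = fft([X[k] * H[k] for k in range(n)], inverse=True)
    return [v.real / n for v in y]
```

Convolving with a unit impulse returns the signal unchanged, and an impulse shifted by one position circularly shifts the signal, which is a quick correctness check on the transform pair.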

Network Intrusion Detection Using Genetic Algorithm and Neural Network

Intrusion detection is a classification problem in which classification accuracy is very important. In network intrusion detection, the large number of features increases time and space costs, and irrelevant features add noise, so feature selection plays an essential role. Selecting the best features is vital to the performance, speed, accuracy and reliability of the detector. In this paper we propose a new feature selection method based on a Genetic Algorithm to improve detection accuracy and efficiency: the Genetic Algorithm performs feature selection and optimization, while a Back Propagation Neural Network evaluates the detector’s performance in terms of detection accuracy. The approach is verified on the KDD Cup 99 dataset.

A. Gomathy, B. Lakshmipathi
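A generic sketch of GA-based feature selection over bitmask chromosomes follows; the fitness function here is a toy stand-in for the paper’s BPNN-based detection accuracy, and the population size, generation count and mutation rate are illustrative, not the authors’ settings.

```python
import random

def ga_select(n_features, fitness, pop_size=20, gens=30, seed=1):
    # Genetic search over feature-subset bitmasks: tournament selection,
    # one-point crossover, bit-flip mutation, and elitism. In the paper
    # the fitness of a subset would come from a trained neural network.
    rng = random.Random(seed)
    pop = [[rng.randint(0, 1) for _ in range(n_features)]
           for _ in range(pop_size)]
    for _ in range(gens):
        scored = sorted(pop, key=fitness, reverse=True)
        nxt = scored[:2]                       # keep the two best
        while len(nxt) < pop_size:
            p1, p2 = (max(rng.sample(pop, 3), key=fitness)
                      for _ in range(2))       # tournament of 3
            cut = rng.randrange(1, n_features)
            child = p1[:cut] + p2[cut:]        # one-point crossover
            if rng.random() < 0.1:             # bit-flip mutation
                k = rng.randrange(n_features)
                child[k] ^= 1
            nxt.append(child)
        pop = nxt
    return max(pop, key=fitness)

# Toy fitness: features 0-2 are "informative", each selected feature
# carries a small cost (a hypothetical stand-in for BPNN accuracy).
def toy_fitness(mask):
    return sum(mask[:3]) - 0.1 * sum(mask)

best = ga_select(8, toy_fitness)
```

With a fixed seed the search is fully deterministic, which makes experiments such as the paper’s KDD Cup 99 evaluation reproducible.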

Performance Evaluation of IEEE 802.15.4 Using Association Process and Channel Measurement

IEEE 802.15.4 is a new standard addressing the needs of low-rate wireless personal area networks (LR-WPANs), with a focus on enabling wireless sensor networks. The standard is characterized by a high level of simplicity, allowing low-cost and low-power implementations. It operates primarily in the 2.4 GHz ISM band, which makes the technology easily applicable and available worldwide. However, IEEE 802.15.4 is potentially vulnerable to interference from other wireless technologies working in this band, such as IEEE 802.11 and Bluetooth. This paper gives a short overview of IEEE 802.15.4 and carefully analyzes its properties and performance through simulation and channel measurement. Furthermore, it analyzes an association scheme named the Simple Association Process (SAP) and compares SAP with the original IEEE 802.15.4 protocol. The analytic results are validated via ns-2 simulations.

Jayalakhsmi Vaithiyanathan, Ramesh Kumar Raju, Geetha Sadayan

Efficient Personalized Web Mining: Utilizing the Most Utilized Data

Given the growth of information on the web, getting exactly the information a user is looking for is a very tedious process. Many search engines generate listings based on user profiles. This paper describes one such process, in which a rating is given to each link the user clicks. Rather than hiding uninteresting links, both interesting and uninteresting links are listed, sorted according to the weight given to each link by the number of visits the particular user made and the amount of time spent on it.

L. K. Joshila Grace, V. Maheswari, Dhinaharan Nagamalai
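One possible reading of the ranking scheme, sketched with illustrative weights (the abstract fixes neither the coefficients nor the exact scoring formula):

```python
def rank_links(stats, w_visits=1.0, w_time=0.05):
    # Score each link from its visit count and dwell time in seconds
    # (the weights are hypothetical), then list every link, interesting
    # or not, best first -- matching the abstract's "sort, don't hide".
    scored = {url: w_visits * visits + w_time * seconds
              for url, (visits, seconds) in stats.items()}
    return sorted(stats, key=lambda url: scored[url], reverse=True)

# Hypothetical per-user profile: url -> (visits, seconds spent).
profile = {"news.example/a": (12, 300),
           "shop.example/b": (2, 30),
           "blog.example/c": (7, 600)}
```

With these weights the long-dwell blog outranks the frequently visited news page, showing how the two signals trade off.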

Performance Comparison of Different Routing Protocols in Vehicular Network Environments

Due to mobility constraints and high dynamics, routing is a challenging task in Vehicular Ad hoc NETworks (VANETs). In this work, we evaluate the performance of routing protocols in vehicular network environments, with the objective of assessing their applicability in different vehicular traffic scenarios. Both position-based and topology-based routing protocols are considered: the topology-based protocols AODV and DSR and the position-based protocol LAR are evaluated in city and highway scenarios. Since the mobility model has a significant effect on simulation results, we use the Intelligent Driver Model (IDM) based tool VanetMobiSim to generate realistic mobility traces. Performance metrics such as packet delivery ratio and end-to-end delay are evaluated using NS-2. Simulation results show that position-based routing protocols give better performance than topology-based ones.

Akhtar Husain, Ram Shringar Raw, Brajesh Kumar, Amit Doegar

A BMS Client and Gateway Using BACnet Protocol

A Building Management System (BMS) is a computer-based control system installed in buildings that controls and monitors the building’s mechanical and electrical equipment, such as ventilation, lighting, power systems, fire systems, and security systems. The aim is to integrate elements of the BMS using an open standard known as BACnet (Building Automation Control and Network). BACnet is the most widely used protocol in industry for automation control, used to control and monitor the status of different units; any security system installed in a company can be controlled using the same philosophy.

The main objective is to use the BACnet protocol, an open standard, to develop a BMS client application capable of displaying, controlling and monitoring all BACnet entities irrespective of the manufacturer, and to develop a gateway that interfaces fire panels so they can communicate over the network using BACnet.

Chaitra V. Bharadwaj, M. Velammal, Madhusudan Raju

Implementation of Scheduling and Allocation Algorithm

This paper presents a new evolutionary algorithm developed for scheduling and allocation for an elliptic filter, covering both the scheduling of operations and the allocation of resources. The Scheduling and Allocation Algorithm is compared with scheduling algorithms such as As Soon As Possible, As Late As Possible, Mobility Based Shift, FDLS, FDS and MOGS on the elliptic filter. Execution time and resource utilization are calculated for different elliptic filters, and it is reported that the proposed algorithm increases the speed of operation by reducing the number of control steps. The work also analyses the magnitude, phase and noise responses of the elliptic filter under the different scheduling algorithms.

Sangeetha Marikkannan, Leelavathi, Udhayasuriyan Kalaiselvi, Kavitha

Interfacing Social Networking Sites with Set Top Box

Today’s viewers can be overwhelmed by the amount of content made available to them. In a world of hundreds of thousands, sometimes hundreds of millions, of choices, we need powerful and effective interfaces to help viewers manage their viewing choices and find the content that is right for them.

The project explores how merging ubiquitous consumer electronics with the sociable web can improve the user experience of these devices, increase the functionality of both, and help distribute content in a more sociable way. The motivation is the idea of providing a ‘lean back’ mode for communicating via social networking sites, achieved using a remote control for a TV-like device. All one needs to communicate in this mode is a TV (with a remote) and a set-top box (STB) provided with IP.

This project saves customers time and money, since they need not have a computer or a dedicated Internet connection to use the services of social networking sites, and it acts as an add-on feature from the service provider. The project can also be extended to include still more useful services of the World Wide Web; future work would allow the customer to do TV shopping and net surfing.

Kulkarni Vijay, Anupama Nandeppanavar

Comparison between K-Means and K-Medoids Clustering Algorithms

Clustering is a common technique for statistical data analysis: the process of grouping similar objects into groups, or more precisely, the partitioning of a data set into subsets according to some defined distance measure. It is an unsupervised learning technique in which interesting patterns and structures can be found directly from very large data sets with little or no background knowledge, and it is used in many fields, including machine learning, data mining, pattern recognition, image analysis and bioinformatics. In this research, the representative algorithms K-Means and K-Medoids are examined and analyzed based on their basic approach.

Tagaram Soni Madhulatha
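The basic difference between the two algorithms (K-Means updates each centre to the coordinate mean, while K-Medoids restricts centres to actual data points) can be seen in a compact sketch; this is a generic illustration, not the paper’s experimental code.

```python
def assign(points, centers):
    # Nearest-centre assignment under squared Euclidean distance.
    return [min(range(len(centers)),
                key=lambda c: sum((p - q) ** 2
                                  for p, q in zip(pt, centers[c])))
            for pt in points]

def k_means(points, k, iters=20):
    # Centres are coordinate means, so they may fall between data points.
    centers = [list(p) for p in points[:k]]
    for _ in range(iters):
        labels = assign(points, centers)
        for c in range(k):
            members = [p for p, l in zip(points, labels) if l == c]
            if members:
                centers[c] = [sum(col) / len(members)
                              for col in zip(*members)]
    return centers, assign(points, centers)

def k_medoids(points, k, iters=20):
    # Centres (medoids) are always actual data points, which makes the
    # method less sensitive to outliers than the mean-based update.
    def cost(meds):
        return sum(min(sum((p - q) ** 2 for p, q in zip(pt, m))
                       for m in meds) for pt in points)
    medoids = list(points[:k])
    for _ in range(iters):
        for i in range(k):          # greedy swap of each medoid
            for cand in points:
                trial = medoids[:i] + [cand] + medoids[i + 1:]
                if cost(trial) < cost(medoids):
                    medoids = trial
    return medoids, assign(points, medoids)
```

On two well-separated blobs both methods find the same partition; they differ when outliers drag the mean away from the data, which is the trade-off the paper examines.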

Performance of Routing Lookups

We look at the performance of routing lookups when techniques for restructuring binary search trees are applied. We try to obtain near-optimal routing lookups with bounded worst-case performance. For this, we look at the problem of constructing search trees so that the average lookup time is minimized while keeping the worst-case lookup time within a fixed bound.

S. V. Nagaraj

Multi Agent Implementation for Optimal Speed Control of Three Phase Induction Motor

Agent and multi-agent systems are becoming a new way to design and control complex systems. The induction motor is widely used in industrial applications, but due to its highly nonlinear behavior its control is very complex. Recently, nonlinear speed control techniques have improved the dynamic performance of electric drives.

In this paper an agent-based approach is developed to control the speed of an induction motor. A multi-agent system is designed and simulated for an indirect vector-controlled three-phase induction motor. To implement the multi-agent system, a classical controller (PI) and intelligent controllers based on fuzzy logic and neural networks are developed. The system is built with the Simulink toolbox in MATLAB. The speed responses of the controllers are compared in terms of rise time, steady-state error and overshoot, and the results are also compared with those of the multi-agent system.

Rathod Nirali, S. K. Shah

Indexing and Querying the Compressed XML Data (IQCX)

The Extensible Markup Language was designed to carry data, providing a platform for defining one’s own tags. XML documents are immense in nature, so there has been an ever-growing need for efficient storage structures and high-performance query techniques. Though several storage structures are available, the QUICX (Query and Update Support for Indexed and Compressed XML) compact storage structure has proved efficient in terms of data storage. The major causes of performance loss in query processing are the storage structure used and the lack of efficient query processing techniques. The IQCX approach focuses on indexing and querying the compressed data stored in QUICX, thereby increasing the compression ratio. The proposed indexing technique exploits the high degree of redundancy exhibited by XML documents, and thus enhances query performance compared to querying without indexing.

Radha Senthilkumar, N. Suganya, I. Kiruthika, A. Kannan

Discovering Spatiotemporal Topological Relationships

Discovering spatiotemporal topological relationships deals with discovering geometric relationships, such as disjoint, cover, intersection and overlap, between every pair of spatiotemporal objects, and the change of such relationships with time, from spatiotemporal databases. Spatiotemporal databases deal with changes to spatial objects over time; applications in this domain process spatial, temporal and attribute data elements to find the evolution of spatial objects and the changes in their topological relationships, and they require the storage, management and processing of complex spatiotemporal data. In this paper we discuss the design of a spatiotemporal database and a methodology for discovering various kinds of spatiotemporal topological relationships. A prototype of the system is implemented on top of the open-source object-relational spatial database management system PostgreSQL with PostGIS. The algorithms are tested on historical cadastral datasets created using OpenJump, and the results are visualized using the OpenJump software.

K. Venkateswara Rao, A. Govardhan, K. V. Chalapati Rao

A Novel Way of Connection to Data Base Using Aspect Oriented Programming

In recent years aspect-oriented programming (AOP) has found increasing interest among researchers in software engineering. Aspects are abstractions that capture and localise cross-cutting concerns. Although persistence has been considered an aspect of a system, aspects in the persistence domain in general, and in databases in particular, have been largely ignored. This paper brings the notion of aspects to object-oriented databases: some cross-cutting concerns are identified and addressed using aspects, an aspect-oriented extension of an OODB is discussed, and various open issues are pointed out. We use AOP to enable dynamic adaptation in existing programs and reusability during database connections, and propose an approach to implement dynamic adaptability and reusability, especially for connecting to databases. We have used AspectJ, a Java-based language, to create aspects in an Eclipse-supported framework.

Bangaru Babu Kuravadi, Vishnuvardhan Mannava, T. Ramesh

Enhanced Anaphora Resolution Algorithm Facilitating Ontology Construction

An enormous explosion in the number of World Wide Web pages occurs every day, and since the efficiency of most information processing systems is found to be low, the potential of Internet applications is often underutilized. The web can be utilized efficiently when similar web pages are rigorously and exhaustively organized and clustered based on domain knowledge (semantic-based) [1]. An ontology, which is a formal representation of domain knowledge, aids in such efficient utilization, and the performance of almost all semantic-based clustering techniques depends on the constructed ontology describing the domain knowledge [6]. The proposed methodology provides enhanced pronominal anaphora resolution, one of the key aspects of semantic analysis in Natural Language Processing, for obtaining cross references [19] within a web page, thereby providing better ontology construction. Experiments on the data sets show better efficiency of the proposed method compared to earlier traditional algorithms.

L. Jegatha Deborah, V. Karthika, R. Baskaran, A. Kannan

A New Data Mining Approach to Find Co-location Pattern from Spatial Data

Spatial co-location patterns represent subsets of Boolean spatial features whose instances are often located in close geographic proximity; these patterns derive meaningful relations between spatial data. Co-location rules can be identified by spatial statistics or by data mining approaches. Among data mining methods, association-rule-based approaches divide into transaction-based and distance-based approaches. Transaction-based approaches focus on defining transactions over space so that an Apriori algorithm can be used, but the natural notion of a transaction is absent in spatial data sets embedded in continuous geographic space. A new distance-based approach is therefore developed to mine co-location patterns from spatial data using the concept of a proximity neighborhood. A new interest measure, the participation index, is used for spatial co-location patterns, as it possesses an anti-monotone property. An algorithm to discover co-location patterns is designed that generates candidate locations and their table instances, and finally the co-location rules are generated to identify the patterns.

M. Venkatesan, Arunkumar Thangavelu, P. Prabhavathy
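The participation index can be sketched for the simplest case of a two-feature pattern (the paper’s algorithm handles general patterns and their table instances):

```python
def participation_index(inst, d=1.5):
    # inst: {feature: [(x, y), ...]} with exactly two features; two
    # instances are neighbours when their Euclidean distance is at most
    # d (an illustrative threshold). The participation ratio of a
    # feature is the fraction of its instances that have a neighbour of
    # the other feature; the participation index is the minimum ratio,
    # the anti-monotone measure enabling Apriori-style pruning.
    def near(p, q):
        return (p[0] - q[0]) ** 2 + (p[1] - q[1]) ** 2 <= d * d
    pa, pb = inst.values()
    ra = sum(any(near(p, q) for q in pb) for p in pa) / len(pa)
    rb = sum(any(near(p, q) for p in pa) for q in pb) / len(pb)
    return min(ra, rb)
```

If half of each feature’s instances sit near the other feature, the index is 0.5; a pattern whose index falls below a chosen threshold is pruned, and by anti-monotonicity all its supersets can be pruned with it.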

Backmatter
