
2011 | Book

Advanced Computing

First International Conference on Computer Science and Information Technology, CCSIT 2011, Bangalore, India, January 2-4, 2011. Proceedings, Part III

Edited by: Natarajan Meghanathan, Brajesh Kumar Kaushik, Dhinaharan Nagamalai

Publisher: Springer Berlin Heidelberg

Book series: Communications in Computer and Information Science


About this book

This volume constitutes the third of three parts of the refereed proceedings of the First International Conference on Computer Science and Information Technology, CCSIT 2011, held in Bangalore, India, in January 2011. The 46 revised full papers presented in this volume were carefully reviewed and selected. The papers are organized in topical sections on soft computing (AI, neural networks, fuzzy systems, etc.); distributed and parallel systems and algorithms; security and information assurance; ad hoc and ubiquitous computing; and wireless ad hoc networks and sensor networks.

Table of Contents

Frontmatter

Soft Computing (AI, Neural Networks, Fuzzy Systems, etc.)

Analysis of the Severity of Hypertensive Retinopathy Using Fuzzy Logic

The eye, the organ of vision in humans, is housed in a bony socket called the orbit and is protected from the external air by the eyelids. Hypertensive retinopathy is one of the leading causes of blindness among the working population worldwide. The retina is one of the "target organs" damaged by sustained hypertension: subjected to excessively high blood pressure over a prolonged period, the small blood vessels of the eye are damaged, thickening, bulging and leaking. Early detection can potentially reduce the risk of blindness. An automatic method to detect thickening, bulging and leaking from low-contrast digital images of retinopathy patients is developed. Images undergo preprocessing to remove noise, and the segmentation stage clusters the image into two distinct classes using the fuzzy c-means algorithm. The method has been tested on 50 images and its performance evaluated. The results are encouraging and satisfactory, and the method is to be validated further by testing 200 samples.
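
The segmentation step described here, clustering the preprocessed image into two classes with fuzzy c-means, can be illustrated with a minimal generic sketch on 1-D pixel intensities; the synthetic data and parameters below are assumptions, not the authors' pipeline or dataset.

```python
import numpy as np

def fuzzy_c_means(x, c=2, m=2.0, n_iter=100, eps=1e-5, seed=0):
    """Generic fuzzy c-means on 1-D samples x, returning centers and memberships."""
    rng = np.random.default_rng(seed)
    u = rng.random((x.shape[0], c))
    u /= u.sum(axis=1, keepdims=True)            # memberships sum to 1 per sample
    for _ in range(n_iter):
        um = u ** m
        centers = (um.T @ x) / um.sum(axis=0)    # fuzzily weighted cluster centers
        d = np.abs(x[:, None] - centers[None, :]) + 1e-12
        u_new = 1.0 / (d ** (2 / (m - 1)))
        u_new /= u_new.sum(axis=1, keepdims=True)
        if np.max(np.abs(u_new - u)) < eps:
            u = u_new
            break
        u = u_new
    return centers, u

# Stand-in for denoised fundus-image intensities: two intensity populations.
pixels = np.concatenate([np.random.normal(60, 10, 500),
                         np.random.normal(180, 15, 500)])
centers, u = fuzzy_c_means(pixels)
labels = u.argmax(axis=1)                        # hard two-class segmentation
```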

Aravinthan Parthibarajan, Gopalakrishnan Narayanamurthy, Arun srinivas Parthibarajan, Vigneshwaran Narayanamurthy
An Intelligent Network for Offline Signature Verification Using Chain Code

It has been observed that every signature is distinctive, which is why the use of signatures as a biometric has been supported and implemented in various technologies. It is almost impossible even for the signer to repeat the same signature exactly every time. We propose an intelligent system for off-line signature verification using chain code. Since dynamic features are not available off-line, the goal becomes harder to achieve. The chain code is extracted locally, and a feed-forward back-propagation neural network is used as the classifier. The chain code is a simple directional feature extracted from a thinned image of the signature, because a contour-based system requires more memory. An intelligent network is proposed for training and classification, and the results are compared with a basic energy-density method. The chain-code method is found to be very effective when the number of samples available for training is limited, which is also the practically realistic case.
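
As a hedged illustration of the kind of directional chain-code feature mentioned here, the sketch below builds an 8-bin Freeman-direction histogram over a thinned binary signature image; the paper's exact local extraction may differ.

```python
import numpy as np

# Freeman 8-direction offsets: 0=E, 1=NE, 2=N, 3=NW, 4=W, 5=SW, 6=S, 7=SE
DIRS = [(0, 1), (-1, 1), (-1, 0), (-1, -1), (0, -1), (1, -1), (1, 0), (1, 1)]

def direction_histogram(thinned):
    """8-bin directional histogram over a thinned binary image (illustrative feature)."""
    hist = np.zeros(8)
    rows, cols = np.nonzero(thinned)
    for r, c in zip(rows, cols):
        for code, (dr, dc) in enumerate(DIRS):
            rr, cc = r + dr, c + dc
            if 0 <= rr < thinned.shape[0] and 0 <= cc < thinned.shape[1] and thinned[rr, cc]:
                hist[code] += 1
    return hist / max(hist.sum(), 1)      # normalise so the feature is size-invariant

sig = np.zeros((5, 5), dtype=int)
sig[2, 1:4] = 1                           # a short horizontal stroke
print(direction_histogram(sig))           # mass concentrated in the E/W bins (codes 0 and 4)
```

A vector like this could then be fed to a feed-forward classifier of the kind the abstract describes.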

Minal Tomar, Pratibha Singh
An Improved and Adaptive Face Recognition Method Using Simplified Fuzzy ARTMAP

Face recognition has been one of the most active research areas of pattern recognition since the early 1990s. This paper proposes a new face recognition method based on Principal Component Analysis (PCA), Linear Discriminant Analysis (LDA) and Simplified Fuzzy ARTMAP (SFAM). The combination of PCA and LDA improves on the capability of LDA or PCA used alone, and the neural classifier, SFAM, is used to reduce the number of misclassifications. Experiments are conducted on the ORL database, and the results demonstrate SFAM's efficiency as a recognizer. SFAM has the added advantage of being adaptive: if, during the testing phase, the network encounters a face it was not trained on, it identifies it as a new face and also learns it. Thus SFAM can be used in applications where the database needs to be updated.

Antu Annam Thomas, M. Wilscy
An Automatic Evolution of Rules to Identify Students’ Multiple Intelligence

The proposed work focuses on a genetic-fuzzy approach to identify students' skills. It is an integrated approach of education and technology implementing the Theory of Multiple Intelligence. The objective is to reduce the system's development and maintenance effort and to automatically evolve strong rules. The proposed model is a novel evolutionary hybrid approach to measure and classify multiple intelligence in a user-friendly way. The paper includes the general architecture of the model with front-end and back-end designs, including the encoding strategy, fitness function, crossover operator, sample evolved rules and results. It concludes with the scope and application of the work to other domains.

Kunjal Mankad, Priti Srinivas Sajja, Rajendra Akerkar
A Survey on Hand Gesture Recognition in Context of Soft Computing

Hand gesture recognition is a natural way of human-machine interaction, and many researchers in academia and industry are now working in this direction. It enables humans to interact with machines easily and conveniently without wearing any extra device, and it can be applied from sign language recognition to robot control, and from virtual reality to intelligent home systems. In this paper we discuss the work done in the area of hand gesture recognition, with a focus on soft computing based methods such as artificial neural networks, fuzzy logic and genetic algorithms. We also describe hand detection methods applied to the preprocessed image; most researchers use fingertips for hand detection in appearance-based modeling. Finally, we compare the results reported by different researchers after implementation.

Ankit Chaudhary, J. L. Raheja, Karen Das, Sonia Raheja
Handwritten Numeral Recognition Using Modified BP ANN Structure

In this work the classification efficiency of a feed-forward neural network architecture is analyzed using different activation functions for the neurons of the hidden and output layers and varying the number of neurons in the hidden layer. 250 numerals were gathered from 35 people to create the samples. After binarization, these numerals were combined to form training patterns for the neural network. The network was trained by adjusting its connection strengths at every iteration. Experiments were performed with all combinations of the activation functions logsig and tansig for the hidden and output layers. The results show that as the number of hidden neurons increases, the network trains in fewer epochs and the recognition accuracy increases up to a certain level, beyond which it starts to decrease due to overfitting.
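
The experiment outlined here, varying the hidden-layer size and the logsig/tansig activation pair in a feed-forward network, can be approximated with scikit-learn as below; the built-in 8x8 digits dataset stands in for the authors' 250 collected numerals, and the 'logistic'/'tanh' activations approximate logsig/tansig.

```python
from sklearn.neural_network import MLPClassifier
from sklearn.datasets import load_digits
from sklearn.model_selection import train_test_split

# Stand-in data: scikit-learn's 8x8 digits rather than the authors' collected numerals.
X, y = load_digits(return_X_y=True)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

for hidden in (10, 25, 50, 100):              # sweep the hidden-layer size
    for act in ("logistic", "tanh"):          # roughly logsig / tansig
        net = MLPClassifier(hidden_layer_sizes=(hidden,), activation=act,
                            max_iter=500, random_state=0)
        net.fit(X_tr, y_tr)
        print(hidden, act, round(net.score(X_te, y_te), 3))
```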

Amit Choudhary, Rahul Rishi, Savita Ahlawat
Expert System for Sentence Recognition

The problem of using natural languages as a medium of input to computational systems has long intrigued and attracted researchers. The problem becomes especially acute for systems that must deal with massive amounts of input in the form of sentences, commands or phrases, since a large number of such phrases may look vastly different in lexical and grammatical structure yet convey similar meanings. In this paper, we describe a novel approach involving an artificial neural network to address this problem for inputs in English. The proposed system uses a Self-Organizing Map (SOM) to recognize and classify input sentences into classes representing phrases or sentences with similar meaning. After detailed analysis and evaluation, we have been able to reach a maximum efficiency of approximately 92.5% for the system. The proposed expert system could be extended for use in efficient and robust systems such as intelligent medical systems, intelligent web browsing, telemarketing and several others that take text input in the form of commands or sentences in natural language and produce suitable output.

Bipul Pandey, Anupam Shukla, Ritu Tiwari
RODD: An Effective Reference-Based Outlier Detection Technique for Large Datasets

Outlier detection has gained considerable interest in several fields of research, including various sciences, medical diagnosis, fraud detection, and network intrusion detection. Most existing techniques are either distance based or density based. In this paper, we present an effective reference-point-based outlier detection technique (RODD) which performs satisfactorily on high-dimensional real-world datasets. The technique was evaluated in terms of detection rate and false positive rate over several synthetic and real-world datasets, and its performance is excellent.
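
As a rough illustration of distance-to-reference scoring (a generic baseline, not the RODD algorithm itself, whose details are in the paper), the sketch below flags samples that lie far from every reference point; the reference choice and data are assumptions.

```python
import numpy as np

def reference_outlier_scores(X, refs):
    """Score each sample by its distance to the nearest reference point (illustrative)."""
    d = np.linalg.norm(X[:, None, :] - refs[None, :, :], axis=2)
    return d.min(axis=1)          # far from all references => large score => suspicious

rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0, 1, (200, 3)),        # bulk of the data
               rng.uniform(5, 8, (5, 3))])        # five planted outliers
refs = np.quantile(X, [0.25, 0.5, 0.75], axis=0)  # simple reference points inside the bulk
scores = reference_outlier_scores(X, refs)
print(np.argsort(scores)[-5:])                    # indices of the most outlying samples
```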

Monowar H. Bhuyan, D. K. Bhattacharyya, J. K. Kalita

Distributed and Parallel Systems and Algorithms

A Review of Dynamic Web Service Composition Techniques

The requester's service request sometimes includes multiple related functionalities to be satisfied by a Web service. In many cases a Web service has limited functionality that is not sufficient to meet the requester's complex functional needs. The discovery mechanism for such a complex service request involving multiple tasks (operations) may fail due to the unavailability of suitable Web services advertised in the registry. In such a scenario, the available atomic or composite Web services need to be composed to satisfy the requester's complex request. Dynamic Web service composition generates and executes the composition plan based on the requester's runtime functional and nonfunctional requirements. This paper provides a review of Web service composition architectures and the techniques used to generate new (value-added) services.

Demian Antony D’Mello, V. S. Ananthanarayana, Supriya Salian
Output Regulation of Arneodo-Coullet Chaotic System

This paper investigates the problem of output regulation of the Arneodo-Coullet chaotic system, which is one of the paradigms of the chaotic systems proposed by A. Arneodo, P. Coullet and C. Tresser (1981). Explicitly, state feedback control laws to regulate the output of the Arneodo-Coullet chaotic system have been derived so as to track the constant reference signals as well as to track periodic reference signals. The control laws are derived using the regulator equations of C.I. Byrnes and A. Isidori (1990), who solved the problem of output regulation of nonlinear systems involving neutrally stable exosystem dynamics. The output regulation of the Coullet chaotic system has important applications in Electrical and Communication Engineering. Numerical simulations are shown to verify the results.

Sundarapandian Vaidyanathan
A High-Speed Low-Power Low-Latency Pipelined ROM-Less DDFS

Present-day research on direct digital frequency synthesizers (DDFS) lays emphasis on ROM-less architectures, which offer high speed, low power and high spurious-free dynamic range (SFDR). The DDFS, which generates sine or cosine waveforms over a broad frequency range, has wide application in signal processing and telecommunications. In this paper, a high-speed, low-power, low-latency pipelined ROM-less DDFS architecture is proposed, implemented and tested on a Xilinx Virtex-II Pro University FPGA board. The proposed ROM-less DDFS design has a 32-bit phase input and 16-bit amplitude resolution with a maximum amplitude error of 1.5 × 10⁻⁴. The FPGA implementation exhibits an SFDR of −94.3 dBc and a maximum operating frequency of 276 MHz while consuming only 22 K gates and 1.05 mW/MHz. The high speed of operation and the low power make the proposed design suitable for up- and down-conversion in communication transceivers.

Indranil Hatai, Indrajit Chakrabarti
A Mathematical Modeling of Exceptions in Healthcare Workflow

Though workflow technology is being used to automate enterprise activities, its use in the healthcare domain is at a nascent stage. As healthcare workflows deal directly with human life, they must take precautions against undesired situations during execution. In this paper we analyze the domain and develop a mathematical model. We present a comprehensive view of the genesis of exceptions and the corresponding actions. The concept is further explained with the help of a case study.

Sumagna Patnaik
Functional Based Testing in Web Services Integrated Software Applications

In this paper we analyze the distinct features of web-based applications and the testing done to ensure security and efficiency in the communication of data between client and host. Most work on web applications has focused on making them more powerful, but relatively little has been done to ensure their quality. Important quality attributes for web applications include reliability, availability, interoperability and security. The SOAP protocol is used as the communication protocol, carrying XML messages over HTTP. Based on the analysis, functional testing is used to ensure different levels of quality control of web service applications in various circumstances.

Selvakumar Ramachandran, Lavanya Santapoor, Haritha Rayudu
Design and Implementation of a Novel Distributed Memory File System

To improve the performance and efficiency of applications, the right balance among CPU throughput, memory performance and the I/O subsystem is required. With parallel processors increasing number-crunching capabilities, the limitations of I/O systems have come to the fore and have become the major bottleneck in achieving better turnaround times, for large I/O-bound jobs in particular. In this paper, we discuss the design, implementation and performance of a novel distributed memory file system that utilizes the free memory of cluster nodes over different interconnects to alleviate the above-mentioned problem.

Urvashi Karnani, Rajesh Kalmady, Phool Chand, Anup Bhattacharjee, B. S. Jagadeesh
Decentralized Dynamic Load Balancing for Multi Cluster Grid Environment

Load balancing is essential for efficient utilization of resources and enhancing the performance of a computational grid. Job migration is an effective way to dynamically balance the load among multiple clusters in a grid environment: due to the limited capacity of a single cluster, it is necessary to share the underutilized resources of other clusters. Each cluster maintains static and dynamic information about its neighbors, including transfer delay and load. This paper addresses the issues in multi-cluster load balancing based on job migration across separate clusters. A decentralized grid model, as a collection of clusters for a computational grid environment, is proposed, and a Sender-Initiated Decentralized Dynamic Load Balancing (SI-DDLB) algorithm is introduced. The algorithm estimates system parameters such as the resource processing rate and the load on each resource, and balances the load by migrating jobs to the least loaded neighboring resource, taking the transfer delay into account. The algorithm also checks the availability of the selected resource before dispatching a job for execution, since the probability of failure is higher in a dynamic grid environment. The main goal of the proposed algorithm is to reduce the response time of the jobs. The proposed algorithm has been verified using the GridSim simulation toolkit, and simulation results show that it is feasible and improves system performance considerably.
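
The sender-initiated selection rule outlined here, choose an available neighbor with the smallest estimated completion time including the transfer delay, might be sketched like this; the field names and numbers are hypothetical, not the SI-DDLB implementation.

```python
from dataclasses import dataclass

@dataclass
class Neighbor:
    name: str
    load: float            # queued work at the neighbor (estimated seconds of processing)
    rate: float            # processing rate of the neighbor
    transfer_delay: float  # job migration delay to this cluster (seconds)
    available: bool        # availability check before dispatching

def pick_target(neighbors, job_size):
    """Estimated completion = transfer delay + (existing load + job) / processing rate."""
    candidates = [n for n in neighbors if n.available]
    if not candidates:
        return None
    return min(candidates, key=lambda n: n.transfer_delay + (n.load + job_size) / n.rate)

neighbors = [Neighbor("C1", load=40, rate=10, transfer_delay=2.0, available=True),
             Neighbor("C2", load=10, rate=5,  transfer_delay=0.5, available=True)]
print(pick_target(neighbors, job_size=5).name)   # "C2": 0.5 + 15/5 = 3.5 s vs 2.0 + 45/10 = 6.5 s
```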

Malarvizhi Nandagopal, V. Rhymend Uthariaraj
Adoption of Cloud Computing in e-Governance

The cloud is a model, or architecture, and a new computing paradigm with SOA as its base architecture. Cloud computing has evolved as a key platform for sharing resources that include infrastructure, software, applications, and business processes. e-Governance plays a vital role in any organization, and clouds with different layers can support e-Governance services. The cloud offers different services, which are integrated and reused. Since e-Governance relies on distributed services, it requires substantial infrastructure; cloud services help reduce infrastructure and software costs. This paper describes how to adopt cloud computing in e-Governance applications to reduce infrastructure and platform cost, increase network security and scalability, and enable quick implementation.

Rama Krushna Das, Sachidananda Patnaik, Ajita Kumar Misro
Efficient Web Logs Stair-Case Technique to Improve Hit Ratios of Caching

Cache prefetching can improve the hit ratio and speed up user access. Predictive Web prefetching refers to the mechanism of deducing the forthcoming page accesses of a client based on its past accesses. Network congestion remains one of the main barriers to the continuing success of the Internet; for Web users, congestion manifests itself as unacceptably long response times. One possible remedy to the latency problem is to use caching at the client, at the proxy server, or within the Internet. However, Web documents are becoming increasingly dynamic, which limits the potential benefit of caching. The performance of a Web caching system can be dramatically increased by integrating document prefetching into its design. Although prefetching reduces the response time of a requested document, it also increases the network load, as some documents will be prefetched unnecessarily. In this paper, we develop a Stair-Case prune algorithm to mine popular pages, together with their conditional probabilities, from the proxy log and store them in a rule table. Then, based on the requested contents and the rule table, a prediction is made under certain preconditions. Simulation shows that our approach achieves a much better hit ratio than the other approaches.
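
A minimal sketch of the rule-table idea: mine the most likely next page and its conditional probability from proxy-log sessions, and keep only rules above a confidence threshold. This is a generic first-order predictor, not the Stair-Case prune algorithm itself; the session data and threshold are illustrative.

```python
from collections import defaultdict, Counter

def build_rule_table(sessions, min_conf=0.4):
    """Conditional next-page probabilities mined from proxy-log sessions."""
    following = defaultdict(Counter)
    for pages in sessions:
        for cur, nxt in zip(pages, pages[1:]):
            following[cur][nxt] += 1
    rules = {}
    for cur, counts in following.items():
        nxt, hits = counts.most_common(1)[0]
        conf = hits / sum(counts.values())
        if conf >= min_conf:                 # only keep confident prefetch rules
            rules[cur] = (nxt, conf)
    return rules

sessions = [["/home", "/news", "/sports"],
            ["/home", "/news", "/weather"],
            ["/home", "/mail"]]
rules = build_rule_table(sessions)
print(rules.get("/home"))    # ('/news', 0.66...) -> candidate page to prefetch after /home
```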

Khushboo Hemnani, Dushyant Chawda, Bhupendra Verma
A Semantic Approach to Design an Intelligent Self Organized Search Engine for Extracting Information Relating to Educational Resources

With the phenomenal growth of the World Wide Web, current online education systems have tried to incorporate artificial intelligence and semantic web resources into their design and architecture. The semantic web is an evolving extension of the World Wide Web in which web content is organized meaningfully in a structured format using the Web Ontology Language (OWL), thus permitting information to be found, shared and integrated more easily. This paper presents a methodology to design an intelligent system to retrieve information relating to educational resources. A knowledge library for the pedagogic domain is created using ontology and knowledge management technologies, and a strategy is devised to group the related topics in each subject and present them to the user in a single search along with the prerequisites. The efficacy of our approach is demonstrated by implementing a prototype and comparing the retrieval results with online search engines.

B. Saleena, S. K. Srivatsa, M. Chenthil Kumar
Cluster Bit Collision Identification for Recognizing Passive Tags in RFID System

Radio Frequency Identification (RFID) is a technology that uses electromagnetic or electrostatic coupling in the radio frequency (RF) portion of the electromagnetic spectrum to uniquely identify an object. RFID systems often experience situations in which tags responding to a single reader at the same time collide with each other, leading to retransmission of tag IDs, wasted bandwidth and increased total delay. In this paper, a novel anti-collision algorithm, the Cluster Bit Collision Identification (CBCID) anti-collision protocol, is proposed to shorten the responses generated by tags, minimize the time slots consumed in recognizing all tags, and minimize the average identification delay. CBCID checks for collisions bit by bit and reduces the number of tag counters to one compared with other protocols. CBCID is compared with existing approaches, namely QT, QTsl, QTim and ABS, to evaluate its efficiency in a worst-case environment. Results indicate that as the number of tags in the interrogation zone increases exponentially, CBCID excels in minimizing both the time slots consumed and the number of collisions incurred.

Katheeja Parveen, Sheik Abdul Khader, Munir Ahamed Rabbani
Integration Testing of Multiple Embedded Processing Components

Integration testing of complex embedded systems (CES) and the associated interconnection network has not been discussed much in the literature. This paper focuses on integration testing among Embedded Processing Components (EPCs) that are loosely coupled and interconnected via I/O ports. The paper models an EPC as a deterministic FSM and describes a fault model that captures errors in the interface between the hardware and software. The EPC integration with the rest of the ES (or another EPC) can be viewed as a system under test (SUT) composed of two parts: EPC1, which requires integration testing with the other, EPC2. The paper models EPC integration testing as a general fault localization problem [15] between communicating FSMs. An example shows that integration testing of two subsystems is a subset of the general fault diagnosis problem and that the faulty machine can be identified during integration.

Hara Gopal Mani Pakala, K. V. S. V. N. Raju, Ibrahim Khan

Security and Information Assurance

A New Defense Scheme against DDoS Attack in Mobile Ad Hoc Networks

Mobile ad hoc networks (MANETs) are highly vulnerable to attacks because of their unique characteristics such as open network architecture, shared wireless medium, stringent resource constraints and highly dynamic network topology. In particular, distributed denial-of-service (DDoS) attacks can severely cripple network performance with relatively little effort expended by the attacker, and they heavily throttle TCP throughput. A new defense scheme based on flow monitoring is proposed to defend against such attacks in mobile ad hoc networks. Our proposed defense mechanism uses medium access control (MAC) layer information to detect the attackers, and includes bandwidth reservation and distributed rate control. Once the attackers are identified, all packets from those nodes are blocked, and the network resources are made available to legitimate users.

S. A. Arunmozhi, Y. Venkataramani
A Model for Delegation Based on Authentication and Authorization

Sharing information while maintaining privacy and security is a requirement in distributed environments. Mitigating threats in a distributed environment requires constant vigilance and defense-in-depth, yet most systems lack a secure model that guarantees end-to-end security. We devise a model that mitigates a number of threats to the distributed computing pervasive in enterprises. The authentication process is part of a larger information assurance systemic approach that requires all active entities (users, machines and services) to be named and credentialed. Authentication is bilateral using PKI credentialing, and authorization is based upon Security Assertion Markup Language (SAML) attribution statements. Communication across domains is handled as a federation activity using WS-* protocols. We present the architectural model, elements of which are currently being tested in an operational environment. Elements of this architecture include real-time computing, edge-based distributed mashups, and dependable, reliable computing. The architecture is also applicable to a private cloud.

Coimbatore Chandersekaran, William R. Simpson
Identification of Encryption Algorithm Using Decision Tree

The task of identifying an encryption algorithm from ciphertext alone is considered a challenging one, and very little work has been done in this area, mostly considering block ciphers or symmetric-key ciphers. In this paper, we propose an approach for identifying the encryption algorithm of various ciphers using the decision tree generated by the C4.5 algorithm. A system is developed which extracts eight features from a ciphertext and classifies the encryption algorithm using the C4.5 classifier. The success rate of the proposed method is in the range of 70 to 75 percent.
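
A hedged sketch of the classification step: train a decision tree on per-ciphertext feature vectors. Note that scikit-learn grows CART trees rather than C4.5, and the features and labels below are random placeholders, not real ciphertext statistics or the paper's eight features.

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier
from sklearn.model_selection import cross_val_score

# Hypothetical data: each row holds 8 statistical features extracted from one ciphertext
# (e.g. byte-frequency moments, entropy); y names the algorithm that produced it.
rng = np.random.default_rng(0)
X = rng.random((300, 8))
y = rng.integers(0, 3, 300)          # 0/1/2 are placeholder cipher labels

clf = DecisionTreeClassifier(criterion="entropy", max_depth=6, random_state=0)
print(cross_val_score(clf, X, y, cv=5).mean())   # cross-validated identification accuracy
```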

R. Manjula, R. Anitha
A Novel Mechanism for Detection of Distributed Denial of Service Attacks

The increasing popularity of web-based applications has led to several critical services being provided over the Internet. This makes it imperative to monitor network traffic so as to prevent malicious attackers from depleting the resources of the network and denying services to legitimate users. This paper presents a mechanism for protecting a web server against a distributed denial of service (DDoS) attack. Incoming traffic to the server is continuously monitored and any abnormal rise in the inbound traffic is immediately detected. The detection algorithm is based on a statistical analysis of the inbound traffic on the server and a robust hypothesis testing framework. While the detection process is on, sessions from legitimate sources are not disrupted, and the load on the server is restored to the normal level by blocking the traffic from the attacking sources. To cater to different scenarios, the detection algorithm has various modules with varying levels of computational and memory overhead. The approximate modules are fast and involve less overhead but have lower detection accuracy, while the accurate modules involve complex detection logic and more overhead but achieve very high detection accuracy. Simulations of the proposed mechanism demonstrate the effectiveness of the scheme.
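
A toy version of the "approximate module" idea, flag an abnormal rise when the observed request rate exceeds a baseline by several standard deviations; the paper's actual hypothesis-testing framework is more elaborate, and the traffic numbers here are synthetic.

```python
import numpy as np

def detect_anomaly(window_rates, baseline_mean, baseline_std, k=3.0):
    """Flag an abnormal rise when the observed rate exceeds mean + k*sigma (sketch)."""
    current = np.mean(window_rates)
    return current > baseline_mean + k * baseline_std, current

baseline = np.random.default_rng(1).poisson(100, 600)      # requests/s under normal load
mean, std = baseline.mean(), baseline.std()
print(detect_anomaly([102, 98, 105], mean, std))            # (False, ...) normal traffic
print(detect_anomaly([480, 510, 495], mean, std))           # (True,  ...) suspected attack
```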

Jaydip Sen
Authenticating and Securing Mobile Applications Using Microlog

This paper elucidates the research and implementation of Microlog in J2ME applications. This small yet powerful logging library logs detailed background transactions and acts as a tool for detecting unauthorized users trying to access the application, by logging to remote servers and devices via various logging destinations. It also captures useful runtime information, such as malfunction codes and unexpected errors and behaviours, and the log thus generated can be printed using a portable Bluetooth printer. Since J2ME is platform independent, it works well with Microlog, providing a capability worth exploring by future J2ME developers.

Siddharth Gupta, Sunil Kumar Singh
Assisting Programmers Resolving Vulnerabilities in Java Web Applications

We present in this paper a new approach to the detection and correction of security vulnerabilities in Java Web applications using program slicing and transformation. Our vulnerability detector is based on an extended program slicing algorithm and handles taint propagation through strings. Our prototype is implemented as an Eclipse plug-in and leverages the WALA library to fix XSS vulnerabilities in an interactive manner. We also show that our approach offers good performance, both computationally and in terms of vulnerabilities found.

Pranjal Bathia, Bharath Reddy Beerelli, Marc-André Laverdière
Estimating Strength of a DDoS Attack Using Multiple Regression Analysis

Anomaly-based DDoS detection systems construct a profile of the traffic normally seen in the network and identify anomalies whenever the traffic deviates from the normal profile beyond a threshold. The extent of this deviation is normally not utilized. This paper reports the evaluation results of a proposed approach that utilizes the extent of deviation from the detection threshold to estimate the strength of a DDoS attack using a multiple regression model. A relationship is established between the strength of DDoS attacks and the observed deviation in sample entropy. Various statistical performance measures, such as the coefficient of determination (R²), coefficient of correlation (CC), sum of squared errors (SSE), mean squared error (MSE), root mean squared error (RMSE), normalized mean squared error (NMSE), Nash-Sutcliffe efficiency index (η) and mean absolute error (MAE), are used to measure the performance of the regression model. Internet-type topologies used for simulation are generated using the Transit-Stub model of the GT-ITM topology generator, and the NS-2 network simulator on Linux is used as the simulation test bed for launching DDoS attacks with varied attack strengths. The simulation results are promising, as we are able to estimate the strength of a DDoS attack with a very low error rate using the multiple regression model.
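
The estimation idea can be illustrated with an ordinary multiple regression fit and some of the same error metrics; the entropy-deviation features and attack strengths below are synthetic stand-ins, not the paper's simulation data.

```python
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.metrics import r2_score, mean_squared_error, mean_absolute_error

# Hypothetical data: two deviation features (e.g. entropy deviation, volume deviation)
# versus the attack strength that produced them.
rng = np.random.default_rng(0)
x = rng.uniform(0.0, 1.0, (100, 2))
y = 5 + 30 * x[:, 0] + 10 * x[:, 1] + rng.normal(0, 2, 100)

model = LinearRegression().fit(x, y)         # multiple regression of strength on deviations
pred = model.predict(x)
print("R2 ", round(r2_score(y, pred), 3))
print("MSE", round(mean_squared_error(y, pred), 3))
print("MAE", round(mean_absolute_error(y, pred), 3))
```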

B. B. Gupta, P. K. Agrawal, R. C. Joshi, Manoj Misra
A Novel Image Encryption Algorithm Using Two Chaotic Maps for Medical Application

The advancement of information technology has made it possible to transmit and retrieve medical information more effectively in recent years. Secure medical image transmission helps maintain the confidentiality of information; such security measures are highly essential for multimedia data transfer from a local site to a specialist at a remote location. This paper provides a secure medical image encryption technique using dual-chaos-based circular mapping. The original images are divided into blocks and zigzag scanning is performed. To encrypt the image, a chaos-based circular shift mapping procedure and cryptographic scrambling are adopted. The efficiency of the proposed scheme is evaluated in terms of statistical measures such as cross-correlation and peak signal-to-noise ratio (PSNR). The proposed image encryption scheme is found to yield better results and can be tested on real-time problems.
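
As a generic illustration of chaos-based image encryption (a single logistic-map keystream XOR, not the paper's dual-chaos circular shift mapping or zigzag scan), the following shows how a chaotic map with a secret initial value can drive reversible pixel encryption.

```python
import numpy as np

def logistic_keystream(n, x0=0.3141, r=3.99):
    """Byte keystream from the logistic map x <- r*x*(1-x); x0 plays the role of the key."""
    xs = np.empty(n)
    x = x0
    for i in range(n):
        x = r * x * (1 - x)
        xs[i] = x
    return (xs * 256).astype(np.uint8)

def encrypt(img, x0=0.3141):
    flat = img.reshape(-1)
    ks = logistic_keystream(flat.size, x0)
    return (flat ^ ks).reshape(img.shape)       # XOR is its own inverse: decrypt = encrypt

img = np.arange(64, dtype=np.uint8).reshape(8, 8)   # stand-in for a medical image block
cipher = encrypt(img)
assert np.array_equal(encrypt(cipher), img)          # the same key recovers the image
```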

G. A. Sathishkumar, K. Bhoopathybagan, N. Sriraam, S. P. Venkatachalam, R. Vignesh
Chest X-Ray Analysis for Computer-Aided Diagnostic

X-ray is a classical method for diagnosing some chest diseases, which are curable if detected in their early stages. Detection of chest diseases is mostly based on chest X-ray images (CXR), which is a time-consuming process. In some cases, medical experts have overlooked diseases in their first examination of a CXR, and the disease signs could only be detected when the images were re-examined. Furthermore, the number of CXRs to examine is large and far beyond the capability of the available medical staff, especially in developing countries.

A computer-aided diagnosis (CAD) system can mark prospected areas on CXR for careful examination by medical doctors, and can give alarm in the cases that need urgent attention.

This paper reports our continuous work on the development of a CAD system. Some preliminary results for detection of early symptoms of some chest diseases like tuberculosis, cancer, lung collapse, heart failure, etc. are presented.

Kim Le
Overcoming Social Issues in Requirements Engineering

The aim of this paper is to create awareness of the importance of social issues in requirements engineering by identifying those issues and analyzing them using inputs given by several companies across the world. The paper also discusses how to overcome these social issues and how the software industry currently handles them.

Selvakumar Ramachandran, Sandhyarani Dodda, Lavanya Santapoor

Ad Hoc and Ubiquitous Computing

Range-Free Localization for Air-Dropped WSNs by Filtering Neighborhood Estimation Improvements

Many situation management applications involve an aerial deployment of a dense sensor network over the area of interest. In this context, current range-free localization proposals, based on an iterative refinement and exchange of node estimations, are not directly applicable because they introduce a high traffic overhead. In this paper, we propose to control this overhead by avoiding the transmission of packets which do not contribute to improving the result of the localization algorithm. In particular, a node does not transmit its current position estimation if it does not significantly differ from the estimations of its neighborhood. Simulation results show that the proposed filter significantly reduces the number of packets required by the localization process.
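
The filtering rule, suppress a broadcast when the new estimate does not significantly differ from what the neighborhood already holds, can be sketched as below; the threshold, units and data structures are assumptions rather than the paper's parameters.

```python
import numpy as np

def should_broadcast(my_estimate, neighbor_estimates, min_gain=1.0):
    """Transmit only if the new estimate differs from every known neighbor estimate."""
    if not neighbor_estimates:
        return True
    diffs = [np.linalg.norm(np.subtract(my_estimate, e)) for e in neighbor_estimates]
    return min(diffs) > min_gain     # suppress packets that add < min_gain metres of change

print(should_broadcast((10.0, 4.0), [(10.3, 4.2)]))   # False: not worth a packet
print(should_broadcast((10.0, 4.0), [(14.0, 9.0)]))   # True: significantly new information
```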

Eva M. García, Aurelio Bermúdez, Rafael Casado
Evolution of Various Controlled Replication Routing Schemes for Opportunistic Networks

Opportunistic networks are a recent evolution in the wireless community; basically through cooperation and coordination, they constitute a special type of wireless mobile ad hoc network. These networks are formed instantaneously in a random manner, provided the basic network elements exist in the vicinity or within reachable limits. In such networks an end-to-end path often does not exist; contact is opportunity based and may break soon after discovery. Many realistic scenarios fit this situation, such as wildlife-tracking sensor networks, military networks and vehicular ad hoc networks, to mention a few. To transmit information under such circumstances, researchers have proposed various efficient forwarding (single copy), replication-based and controlled-replication routing schemes. In this paper, we explore, investigate and analyze most of these schemes [1] [2] [3] [4] [5] [6] and present our findings by consolidating critical parameters and issues, and we suggest possible future options and scope.

Hemal Shah, Yogeshwar P. Kosta
Collaborative Context Management and Selection in Context Aware Computing

The objective of pervasive computing is to merge computing and computing applications into the surroundings instead of having computers as discrete objects. Applications must adjust their behavior to ever-changing surroundings, and this adjustment involves proper capture, management and reasoning about context. This paper proposes representing context in a hierarchical form, storing context data in an object-relational database rather than an ordinary database, and selecting context using a heuristic pruning method. The semantics of the context are managed by an ontology, while the context data are handled by the object-relational database; these two modeling elements are associated with each other by semantic relations built into the ontology. The separation of modeling elements loads only relevant context data into the reasoner, so that only a limited amount of context data enters the reasoning space, which further improves the performance of the reasoning process.

B. Vanathi, V. Rhymend Uthariaraj
Privacy Preservation of Stream Data Patterns Using Offset and Trusted Third Party Computation in Retail-Shop Market Basket Analysis

Privacy preservation has been widely discussed in recent years; it prevents the disclosure of sensitive information during knowledge discovery. There are many distributed applications, including retail shops, where streams of digital data are collected from time to time, and the collaborating parties are generally interested in finding global patterns for their mutual benefit. A few proposals address these issues, but in existing methods the global pattern computation is carried out by one of the sources itself, and a single offset is used to perturb the personal data, which fails in many situations, for example when not all patterns are initiated at the initial participating party. Our approach addresses these problems for retail shops by using different offsets to perturb the sensitive information and a trusted third party to carry out the global pattern computation.
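
A toy sketch of the offset idea: each shop perturbs its itemset counts with its own offset, and only the trusted third party, which knows the offsets, recovers and publishes the global counts. The shop names, itemsets and numbers are illustrative only.

```python
# Each shop shares only perturbed counts; the offsets are known to the trusted third party.
shop_counts = {"A": {"milk,bread": 120}, "B": {"milk,bread": 95}}
offsets = {"A": 37, "B": 11}

perturbed = {s: {k: v + offsets[s] for k, v in c.items()} for s, c in shop_counts.items()}

def third_party_global(perturbed, offsets):
    """Trusted third party removes each shop's offset and aggregates the global pattern."""
    total = {}
    for shop, counts in perturbed.items():
        for item, v in counts.items():
            total[item] = total.get(item, 0) + v - offsets[shop]
    return total

print(third_party_global(perturbed, offsets))   # {'milk,bread': 215}
```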

Keshavamurthy B.N., Durga Toshniwal

Wireless Ad Hoc Networks and Sensor Networks

Application of Euclidean Distance Power Graphs in Localization of Sensor Networks

Localization of sensor nodes in a wireless sensor network is needed for many practical uses. If the nodes are considered as vertices of a globally rigid graph, then the nodes can be uniquely localized up to translation, rotation and reflection. We give a construction of a globally rigid graph through Euclidean distance powers of the unit disk graph.

G. N. Purohit, Seema Verma, Usha Sharma
Retracted: A New Protocol to Secure AODV in Mobile AdHoc Networks

In this paper we propose a game theoretic approach called The New Protocol and integrate it into the reactive Ad hoc On-demand Distance Vector (AODV) routing protocol to provide defense against blackhole attacks. The idea is based on the concept of non-cooperative game theory. The AODV-NEW protocol outperforms AODV in terms of the number of dropped packets when blackhole nodes exist within a MANET (Mobile Ad Hoc Network).

Avinash Krishnan, Aishwarya Manjunath, Geetha J. Reddy
Spelling Corrector for Indian Languages

With advancements in computational linguistic processing, paper work is being replaced by documents in the form of soft copies. Though many software tools and keyboards are available to produce such documents, the accuracy is not always acceptable and the chance of errors is high. This paper proposes a model that can be used to correct spellings of Indian languages in general and Telugu in particular, and discusses the experimental setup and results of the implementation. A few spell checkers and correctors have been developed earlier; the corrector proposed in this paper differs in the approach used. The main contributions of the paper are the implementation of the correction algorithm and the proposed architecture.

K. V. N. Sunitha, A. Sharada
Voltage Collapse Based Critical Bus Ranking

Identification of critical or weak buses for a given operating condition is an important task in the load dispatch centre, and it has become more vital in view of the threat of voltage instability leading to voltage collapse. This paper presents a fuzzy approach for ranking critical buses in a power system based on a line flow index and the voltage profiles at load buses. The line flow (LF) index determines the maximum load that can be connected to a bus while maintaining stability before the system reaches its bifurcation point. The LF index and the voltage profiles at the load buses are represented in fuzzy set notation and evaluated using fuzzy rules to compute a composite index, based on which the critical buses are ranked. The bus with the highest rank is the weakest bus, as it can withstand only a small additional load before causing voltage collapse. The proposed method is tested on a five-bus test system.

Shobha Shankar, T. Ananthapadmanabha
Multiplexer Based Circuit Synthesis with Area-Power Trade-Off

Due to the regularity of their implementation, multiplexers are widely used in VLSI circuit synthesis. This paper proposes a technique for decomposing a function into 2-to-1 multiplexers while performing an area-power trade-off. To the best of our knowledge, this is the first effort to incorporate leakage into the power calculation for multiplexer-based decomposition. With respect to an initial ROBDD (Reduced Ordered Binary Decision Diagram) representation of the function, the scheme shows more than 30% reduction in area, leakage and switching for the LGSynth91 benchmarks without performance degradation. It also enumerates the trade-offs present in the solution space for different weights associated with these three quantities.
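
The underlying decomposition is Shannon expansion: each selected variable drives one 2-to-1 multiplexer whose data inputs are the two cofactors. Below is a generic recursive sketch from a truth table, without the paper's ROBDD starting point or its area/power/leakage costing.

```python
def mux_decompose(truth, vars):
    """Recursive Shannon expansion of a truth table into 2-to-1 multiplexers (generic sketch)."""
    if len(set(truth)) == 1:                 # constant cofactor: no multiplexer needed
        return str(truth[0])
    sel, rest = vars[0], vars[1:]
    half = len(truth) // 2
    lo = mux_decompose(truth[:half], rest)   # cofactor with sel = 0
    hi = mux_decompose(truth[half:], rest)   # cofactor with sel = 1
    return f"MUX(sel={sel}, 0->{lo}, 1->{hi})"

# f(a,b,c) = a XOR b XOR c, truth table ordered a b c = 000, 001, ..., 111
print(mux_decompose([0, 1, 1, 0, 1, 0, 0, 1], ["a", "b", "c"]))
```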

Sambhu Nath Pradhan, Santanu Chattopadhyay
Exergaming – New Age Gaming for Health, Rehabilitation and Education

The urban lifestyle is hectic and information rich. A busy work pace and digital entertainment take time away from real-world physical exercise, while lifestyle diseases increase globally. Exergaming, a term combining "exercise" and "gaming", has great potential to provide new service business opportunities for the entertainment and recreation sectors as well as healthcare. In this paper, I review some new exergaming prototypes. I also present current rehabilitation schemes, especially PS3-based rehabilitation of children with hemiplegia. The Nintendo Wii is an emerging contender in the health field, and I discuss Wii-based balance and wrist therapies that are becoming widespread; Wii Fit and Wii Sports are becoming a hit among health-conscious people. Also, researchers from the University of Ulster have developed new web games for upper-limb rehabilitation. The use of PSPs in English lab classes is also shown. Together, these examples show how much the gaming industry contributes to these fields today.

Ankit Kamal
Inclusion/Exclusion Protocol for RFID Tags

It is not uncommon to encounter objects with several RFID tags. However, the tags on these objects are generally mobile and may move to or from (or both) the object. No existing RFID authentication protocol considers this scenario. Moreover, an authentication protocol in such a scenario is potentially vulnerable to relay attacks, where a tag that is not present on the object may pretend to be present. We present an authentication protocol that facilitates inclusion as well as exclusion of RFID tags on an object while simultaneously providing immunity to relay attacks.

Selwyn Piramuthu
Min Max Threshold Range (MMTR) Approach in Palmprint Recognition

Palmprint recognition is an effective biometric authentication method for automatically identifying a person. The features in a palmprint include principal lines, wrinkles and ridges, which are of different lengths and thicknesses; they cannot be analysed at a single resolution, so a multi-resolution analysis technique is required. Here, the wavelet transform is proposed as the multi-resolution technique to extract these features, and Euclidean distance is used for similarity measurement. In addition, a Min Max Threshold Range (MMTR) method is proposed that helps increase overall system accuracy by matching a person against multiple threshold values. In this technique, the person is first authenticated at the global level using a reference threshold, and then at the local level using the range of minimum and maximum thresholds defined for that person. Generally, personal authentication is done using only a reference threshold, which leaves a chance of false acceptance; by also checking the per-person minimum and maximum threshold range, a falsely accepted person can be distinguished from a genuinely accepted one. MMTR is an effective technique to increase the accuracy of the palmprint authentication system by reducing the False Acceptance Rate (FAR). Experimental results indicate that the proposed method drastically reduces the False Acceptance Rate.
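
The two-stage decision described here, a global reference threshold followed by a per-person minimum/maximum range, might look like this in outline; the distance metric, feature vectors and threshold values are placeholders, not the paper's wavelet features or tuned thresholds.

```python
import numpy as np

def mmtr_decision(probe, templates, person_min, person_max, ref_threshold):
    """Two-stage acceptance: global reference threshold, then the person's own range."""
    d = min(np.linalg.norm(probe - t) for t in templates)   # best-match Euclidean distance
    if d > ref_threshold:
        return False                          # rejected at the global level
    return person_min <= d <= person_max      # accepted only inside the person's range

templates = [np.array([0.1, 0.9, 0.3]), np.array([0.2, 0.8, 0.4])]   # enrolled feature vectors
probe = np.array([0.15, 0.85, 0.35])
print(mmtr_decision(probe, templates, person_min=0.0, person_max=0.2, ref_threshold=0.5))
```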

Jyoti Malik, G. Sainarayanan, Ratna Dahiya
Power and Buffer Overflow Optimization in Wireless Sensor Nodes

Prolonging the life span of the network is the prime focus in highly energy-constrained wireless sensor networks. Only a sufficient number of active nodes can ensure proper coverage of the sensing field and connectivity of the network; if most nodes deplete their batteries, the network cannot be maintained. A long-lived network therefore requires long-lived sensor nodes, and hence power optimization at the node level becomes as important as power optimization at the network level. In this paper the need for a dynamically adaptive sensor node is highlighted in order to optimize power at individual nodes.

We analyze a wireless sensor node using queuing theory. A sensor node is modeled as a tandem queue in which the first server is the processor or microcontroller and the second server in series is the transmitter. Both servers have finite and very small buffers, since sensor nodes are tiny devices with very limited hardware.

We analyze and simulate two sensor node models. First we consider a sensor node working with fixed service rates (processing rate and transmission rate). Second, we consider an adaptive sensor node that can vary its service rates as required to ensure quality of service. We simulate both models using MATLAB and compare their performance in terms of lifetime, power consumption, buffer overflow probability, idle time, etc.

We compare the performance of both models under normal workloads as well as under catastrophe (heavy workload). In both situations the adaptive service model outperforms the fixed service model: it saves power during normal periods, increasing the lifetime, and during a catastrophe it consumes more power but ensures QoS (Quality of Service) by reducing the overflow probability.
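
A small event-driven sketch of the tandem-queue view (processor followed by transmitter, both with finite buffers), written in Python rather than the MATLAB used in the paper; it only illustrates how a higher service rate reduces the buffer-overflow fraction, with all rates and buffer sizes chosen arbitrarily.

```python
import random

def tandem_sim(arrival_rate, mu1, mu2, buf1, buf2, horizon=20_000, seed=0):
    """Toy tandem queue (processor -> transmitter) with finite buffers.
    Returns the fraction of packets lost because a buffer was full."""
    random.seed(seed)
    t, q1, q2 = 0.0, 0, 0
    next_arr = random.expovariate(arrival_rate)
    next_d1 = next_d2 = float("inf")
    arrived = dropped = 0
    while t < horizon:
        t = min(next_arr, next_d1, next_d2)
        if t == next_arr:                              # packet arrives at the processor queue
            arrived += 1
            if q1 < buf1:
                q1 += 1
                if q1 == 1:
                    next_d1 = t + random.expovariate(mu1)
            else:
                dropped += 1
            next_arr = t + random.expovariate(arrival_rate)
        elif t == next_d1:                             # processor finishes, hand to transmitter
            q1 -= 1
            next_d1 = t + random.expovariate(mu1) if q1 else float("inf")
            if q2 < buf2:
                q2 += 1
                if q2 == 1:
                    next_d2 = t + random.expovariate(mu2)
            else:
                dropped += 1
        else:                                          # transmitter finishes a packet
            q2 -= 1
            next_d2 = t + random.expovariate(mu2) if q2 else float("inf")
    return dropped / max(arrived, 1)

print(tandem_sim(0.9, 1.0, 1.0, 5, 5))    # fixed (slower) service rates
print(tandem_sim(0.9, 1.5, 1.5, 5, 5))    # faster (adaptive) rates: fewer overflows
```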

Gauri Joshi, Sudhanshu Dwivedi, Anshul Goel, Jaideep Mulherkar, Prabhat Ranjan
Web Log Data Analysis and Mining

Log files contain information about the user name, IP address, time stamp, access request, number of bytes transferred, result status, referrer URL and user agent, and are maintained by web servers. Analysing these log files gives a clear picture of user behaviour. This paper gives a detailed discussion of these log files: their formats, their creation, access procedures, their uses, the various algorithms applied to them, and the additional parameters that can be recorded in log files to enable more effective mining. It also presents the idea of creating an extended log file.
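
A common starting point for such analysis is parsing each Combined Log Format line into the fields listed above and aggregating; the regular expression and sample line below are illustrative, not the paper's extended format.

```python
import re
from collections import Counter

# Combined Log Format: host ident user [time] "request" status bytes "referer" "agent"
LOG_RE = re.compile(r'(\S+) (\S+) (\S+) \[(.*?)\] "(.*?)" (\d{3}) (\S+) "(.*?)" "(.*?)"')

line = ('203.0.113.7 - alice [10/Jan/2011:13:55:36 +0530] "GET /index.html HTTP/1.1" '
        '200 2326 "http://example.com/start" "Mozilla/5.0"')

host, ident, user, ts, request, status, size, referer, agent = LOG_RE.match(line).groups()
print(host, status, request.split()[1])      # who asked for what, and the result status

# A typical mining step: count the most requested URLs across many lines.
hits = Counter()
for l in [line]:                             # replace [line] with open("access.log")
    m = LOG_RE.match(l)
    if m:
        hits[m.group(5).split()[1]] += 1
print(hits.most_common(3))
```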

L. K. Joshila Grace, V. Maheswari, Dhinaharan Nagamalai
Steganography Using Version Control System

In this paper two different steganography techniques based on change tracking are discussed. The first method uses the change tracking feature of MS Word for data hiding; message embedding and extraction in an MS Word document are discussed briefly along with an example. The second method, steganography using a version control system, is also proposed in this paper. It elaborates the idea of using a version control system for data hiding: such a system keeps track of changes by maintaining multiple versions of a project, and one of these versions can be utilized as a cover medium. Since a project generally consists of many files, a long message can be fragmented and one fragment embedded in each file of the project. Experimentation is carried out using Microsoft Visual SourceSafe as the version control system and a C# sample project as the cover project.

Vaishali S. Tidake, Sopan A. Talekar
Erratum: A New Protocol to Secure AODV in Mobile AdHoc Networks

The paper “A New Protocol to Secure AODV in Mobile AdHoc Networks” appearing on pages 378-389 of this publication has been retracted due to a severe case of plagiarism.

Avinash Krishnan, Aishwarya Manjunath, Geetha J. Reddy
Backmatter
Metadata
Title
Advanced Computing
Edited by
Natarajan Meghanathan
Brajesh Kumar Kaushik
Dhinaharan Nagamalai
Copyright Year
2011
Publisher
Springer Berlin Heidelberg
Electronic ISBN
978-3-642-17881-8
Print ISBN
978-3-642-17880-1
DOI
https://doi.org/10.1007/978-3-642-17881-8
