
2014 | Book

Proceedings of International Conference on Internet Computing and Information Communications

ICICIC Global 2012

Editors: Swamidoss Sathiakumar, Lalit Kumar Awasthi, M. Roberts Masillamani, S. S. Sridhar

Publisher: Springer India

Book Series: Advances in Intelligent Systems and Computing


About this book

The book presents high-quality research papers presented by experts at the International Conference on Internet Computing and Information Communications 2012, organized by the ICICIC Global organizing committee (on behalf of The CARD Atlanta, Georgia, and CREATE Conferences Inc). The objective of this book is to present the latest work done in the field of Internet computing by researchers and industrial professionals across the globe, and to take a step toward reducing the research divide between developed and underdeveloped countries.

Table of Contents

Frontmatter
Secure Text Steganography

Steganography is the art of hiding data within other data used as a cover; text files are commonly used for hiding data. The main aspects of steganography are capacity and security: capacity refers to how much data can be hidden in the cover carrier, while security concerns the ability of an unauthorized party to disclose or alter the data. The aim of this project is to implement an algorithm that reduces the size of objects created using steganography while making the security level of each approach stronger. The project presents an overview of text steganography and various existing text-based steganography techniques, and highlights some of the problems inherent in text steganography as well as issues with existing solutions. A new information-hiding approach using interword spacing is proposed, which reduces the amount of information needed for hiding and generates stego text of maximum capacity according to the length of the secret message.

P. Akhilandeswari, Jabin G. George
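As a rough illustration of the interword-spacing idea in the abstract above, the following minimal Python sketch hides one bit per word gap. The function names and the encoding convention (single space encodes 0, double space encodes 1) are invented for illustration and are not taken from the paper.

```python
# Minimal interword-spacing sketch: single space -> 0, double space -> 1.

def embed(cover_text: str, secret_bits: str) -> str:
    words = cover_text.split()
    if len(secret_bits) > len(words) - 1:
        raise ValueError("cover text has too few word gaps for the message")
    out = [words[0]]
    for i, word in enumerate(words[1:]):
        gap = "  " if i < len(secret_bits) and secret_bits[i] == "1" else " "
        out.append(gap + word)
    return "".join(out)

def extract(stego_text: str, n_bits: int) -> str:
    bits, run = [], 0
    for ch in stego_text:
        if ch == " ":
            run += 1
        elif run:
            bits.append("1" if run == 2 else "0")  # 2 spaces -> 1, 1 space -> 0
            run = 0
    return "".join(bits[:n_bits])

stego = embed("the quick brown fox jumps over the lazy dog", "1011")
assert extract(stego, 4) == "1011"
```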
Policy-Based Energy Management in Smart Homes

This paper proposes the use of policies in smart homes to manage energy efficiently and reduce peak energy demand. In peak hours, demand increases and supply providers bring additional power plants online to supply more power, which results in higher operating costs and carbon emissions. To meet peak demand, utility companies have to build additional power plants, which may be operated only for short periods of time. Therefore, reducing peak load will reduce the need to build additional power plants and decrease carbon emissions. Our policy-based framework achieves peak shaving so that power consumption adapts to the available power while ensuring the comfort level of the inhabitants and, at the same time, taking device characteristics into account. Our MATLAB simulation results indicate that the proposed policy-driven homes can effectively contribute to demand-side power management.

T. K. Anandalakshmi, S. Sathiakumar, N. Parameswaran
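The paper's actual policy language is not given in the abstract, so the following toy Python sketch only illustrates the general peak-shaving idea: under an assumed available-power budget, deferrable loads are shed in reverse priority order. All device names, fields, and the policy rule itself are hypothetical.

```python
# Toy peak-shaving policy: keep high-priority loads on until the power
# budget is exhausted; defer the rest. All values are illustrative.
def peak_shave(devices, available_kw):
    plan, used = [], 0.0
    for d in sorted(devices, key=lambda d: d["priority"]):
        if used + d["kw"] <= available_kw or not d["deferrable"]:
            plan.append((d["name"], "on"))
            used += d["kw"]
        else:
            plan.append((d["name"], "deferred"))
    return plan

devices = [
    {"name": "fridge", "kw": 0.2, "priority": 0, "deferrable": False},
    {"name": "heater", "kw": 2.0, "priority": 1, "deferrable": True},
    {"name": "washer", "kw": 1.0, "priority": 2, "deferrable": True},
]
print(peak_shave(devices, available_kw=1.5))
# -> fridge on, heater deferred, washer on
```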
Saturation Throughput and Delay Analysis of IEEE 802.11 Broadcast Transmissions Scheme

In this paper, the performance of an IEEE 802.11-based single-hop broadcast network with saturated nodes is studied by determining the threshold parameters for both saturation throughput and delay. An upper limit for the achievable throughput is derived by maximizing the broadcast saturation throughput, and an analytical expression for the optimal contention window size required to meet this objective is obtained, along with an expression for the maximum saturation throughput. The results reveal that the maximum saturation throughput is independent of the number of contending stations. This work also addresses the computation of the minimum delay in a wireless ad-hoc network with saturated nodes, and determines the optimal contention window that minimizes the saturation delay under broadcast. The minimum-delay parameter is significant in such networks since their applications are generally not delay tolerant.

Gayathri Narayanan
Spatial Query Monitoring in Wireless Broadcast Environment

Wireless data broadcast is a promising technique for information dissemination that leverages the computational capabilities of mobile devices to enhance the scalability of the system. In this environment, the data are continuously broadcast by the server, interleaved with indexing information for query processing. Clients may then tune in to the broadcast channel and process their queries locally without contacting the server, performing location updates only when the monitoring process indicates that they would likely alter the query results. Previous work on spatial query processing for wireless broadcast systems has considered only snapshot queries over static data. Here, we use the simple K-Means clustering algorithm to cluster sensor node data in a wireless sensor network and to monitor the objects continuously.

Koenni Naresh, J. Thangakumar, Durgamahesh Pannem
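Since the abstract names the simple K-Means algorithm, here is a compact, self-contained NumPy sketch of K-Means over 2-D sensor readings; the broadcast-channel indexing machinery of the paper is not modeled, and all names and data are illustrative.

```python
# Plain K-Means: alternate nearest-center assignment and center updates.
import numpy as np

def kmeans(points: np.ndarray, k: int, iters: int = 50, seed: int = 0):
    rng = np.random.default_rng(seed)
    centers = points[rng.choice(len(points), k, replace=False)]
    for _ in range(iters):
        # Assign each point to its nearest center.
        d = np.linalg.norm(points[:, None, :] - centers[None, :, :], axis=2)
        labels = d.argmin(axis=1)
        # Move each center to the mean of its assigned points.
        new = np.array([points[labels == j].mean(axis=0) if (labels == j).any()
                        else centers[j] for j in range(k)])
        if np.allclose(new, centers):
            break
        centers = new
    return centers, labels

pts = np.vstack([np.random.default_rng(1).normal(m, 0.3, (30, 2))
                 for m in (0.0, 3.0)])
centers, labels = kmeans(pts, k=2)
```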
Enhancing Data Caching in Ad-hoc Networks Through Benefit-Based Technique

A mobile ad-hoc network (MANET) is a demand-based, self-configurable network without any existing infrastructure. Data caching can significantly improve the efficiency of information access in a wireless ad-hoc network by reducing access latency and bandwidth usage. Mobile Hosts (MHs) can move arbitrarily and communicate with other MHs using multihop wireless links. However, designing efficient distributed caching algorithms is nontrivial when network nodes have limited memory. In this paper, we consider the cache placement problem of minimizing total data access cost in ad-hoc networks with multiple data items and nodes with limited memory capacity, defining benefit as the reduction in total access cost. The approximation algorithm is amenable to a localized distributed implementation, which is shown via simulations to perform close to the approximation algorithm itself.

Koteswara Rao Makke, J. Thangakumar, B. V. Suresh Reddy, Yannam Somaiah
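The abstract defines benefit as the reduction in total access cost; the hedged sketch below shows one plausible greedy placement under that definition, assuming access cost is request rate times hop count to the nearest copy. It illustrates the benefit-based idea only and is not the paper's actual approximation algorithm.

```python
def greedy_placement(nodes, items, dist, rate, size, capacity):
    """Greedily cache the (node, item) pair with the best benefit per unit
    of memory until no placement has positive benefit. Mutates capacity."""
    cache = {n: set() for n in nodes}

    def best_hops(m, i):
        # Hop count to the nearest current copy (origin server or replica).
        holders = [h for h in nodes if i in cache[h]] + [items[i]]
        return min(dist[(m, h)] for h in holders)

    while True:
        best, best_ratio = None, 0.0
        for n in nodes:
            for i in items:
                if i in cache[n] or size[i] > capacity[n]:
                    continue
                # Benefit = total reduction in access cost if i is cached at n.
                gain = sum(rate[(m, i)] * max(0, best_hops(m, i) - dist[(m, n)])
                           for m in nodes)
                if gain / size[i] > best_ratio:
                    best, best_ratio = (n, i), gain / size[i]
        if best is None:
            return cache
        n, i = best
        cache[n].add(i)
        capacity[n] -= size[i]

# Toy line topology A - B - C; item i1 originates at A, C requests it heavily.
nodes = ["A", "B", "C"]
items = {"i1": "A"}
dist = {(a, b): abs(ord(a) - ord(b)) for a in nodes for b in nodes}
rate = {("A", "i1"): 0.0, ("B", "i1"): 1.0, ("C", "i1"): 5.0}
size = {"i1": 1.0}
capacity = {"A": 1.0, "B": 1.0, "C": 1.0}
print(greedy_placement(nodes, items, dist, rate, size, capacity))
```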
Intrusion Detection in Cloud Computing Implementation of (SAAS & IAAS) Using Grid Environment

Security requires user authentication with passwords and digital certificates, and confidentiality for the transmission of data in a distributed system. An Intrusion Detection System (IDS) detects intrusions by means of knowledge and behavior analysis. In this paper, we introduce cloud computing to increase data efficiency and satisfy user requests. We also include grid computing to make cloud computing more efficient and reliable, and to increase the performance of the systems accessing the server; this is needed because many users log in at the same time and the server is unable to provide equal performance to every system. We achieve this by borrowing capacity from the systems connected to the server and providing it to the systems accessing the server.

S. Manthira Moorthy, M. Roberts Masillamani
Inter Departure Time Analysis Over Heterogeneous Platforms Using Distributed IPv6 Traffic Generator Tool

IPv6 networks are at present under vast deployment in production networks as well as in the Internet. Tasks such as resource reservation, capacity planning, and effective security deployment necessitate an understanding of IPv6 flow behavior in the nodes as well as in the network. To accomplish this, a tool that generates application and attack flows based on sound mathematical models is essential. Since different flows have different characteristics, the model parameters for a particular flow feature must be assigned appropriate values. In our tool, we have identified five flow features for characterizing the flows: inter-departure time (IDT), packet size (PS), flow count (FC), flow volume (FV), and flow duration (FD). Random sampling from distributions such as the Exponential, Pareto, Poisson, Cauchy, Gamma, Student's t, Weibull, and Log-Normal is considered. IPv6 TCP and UDP packets are constructed and transmitted for the flows according to the feature values drawn from the assumed distribution model. Both direct model parameter specification and trace-based model parameter learning are incorporated in our tool. Using the tool, we have analyzed node behavior for the IDT feature over different hardware platforms for IPv6 flows, presented the IDT analysis, and modeled the bit rate using a three-parameter function fitted to the experimental measurements. The parameters estimated for different platforms with this model are also reported.

S. P. Meenakshi, S. V. Raghavan, S. M. Bhaskar
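A small sketch of the model-driven IDT generation step, assuming NumPy's random generators; the distribution parameters below are placeholders, not values from the paper.

```python
# Sample inter-departure times (IDTs) from a chosen distribution model and
# turn them into packet departure instants via a cumulative sum.
import numpy as np

rng = np.random.default_rng(42)

idt_models = {
    "exponential": lambda n: rng.exponential(scale=0.01, size=n),  # mean 10 ms
    "pareto":      lambda n: 0.001 * (1 + rng.pareto(a=2.5, size=n)),
    "weibull":     lambda n: 0.01 * rng.weibull(a=1.5, size=n),
    "lognormal":   lambda n: rng.lognormal(mean=-5.0, sigma=0.8, size=n),
}

def departure_times(model: str, n_packets: int) -> np.ndarray:
    return np.cumsum(idt_models[model](n_packets))

print(departure_times("exponential", 5))
```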
Modeling-Simulation of an Underground Wireless Communication Channel

Wireless communication inside mines and tunnels is very different from that in terrestrial environments because of the strong attenuation of signals. Here, we have tried to develop an empirical model for the underground wireless communication channel based on experimental data. The model is built on available outdoor and indoor propagation models such as the Okumura-Hata, COST231, and ITU indoor propagation models, combined by a superposition method: the most appropriate model among the available ones is chosen for the given data, and regression methods are applied for curve fitting. Correction factors are then added based on two parameters, namely diffraction and low-frequency interference losses. Losses due to penetration and multipath are assumed to be constant. MATLAB is used for curve fitting.

M. N. Jayaram, C. R. Venugopal
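As an illustration of the regression/curve-fitting step, here is a hedged SciPy sketch fitting a log-distance path-loss form to synthetic measurements; neither the functional form nor the data reproduces the paper's model.

```python
# Fit PL(d) = PL0 + 10 n log10(d / d0), d0 = 1 m, to synthetic loss data.
import numpy as np
from scipy.optimize import curve_fit

def path_loss(d, pl0, n):
    return pl0 + 10.0 * n * np.log10(d)

d = np.array([1, 5, 10, 20, 50, 100], dtype=float)     # distance (m)
pl = np.array([60, 74, 80, 86, 94, 100], dtype=float)  # measured loss (dB)

popt, pcov = curve_fit(path_loss, d, pl, p0=(60.0, 2.0))
print("PL0 = %.1f dB, exponent n = %.2f" % tuple(popt))
```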
Review on Heart Sound Analysis Technique

Heart auscultation (the interpretation by a physician of heart sounds) is a fundamental activity of cardiac diagnosis. It is, however, a difficult skill to acquire, so it would be convenient to diagnose failures using monitoring techniques. This paper reviews different signal processing techniques for analyzing Heart Sound (HS) vibration signals, which are mainly used to diagnose these diseases. Conventional methods for fault diagnosis are mainly based on observing amplitude differences in the time or frequency domain, using tools such as the Fourier Transform (FT), Short Time Fourier Transform (STFT), and wavelet transform. This paper covers a spectral analysis method for heart sounds using autoregressive power spectral density (AR-PSD) to discriminate normal from abnormal HS, another diagnosis method based on wavelet packet analysis, and classifiers such as the Hidden Markov Model (HMM) and Artificial Neural Network (ANN).

U. More Monali, R. Shastri Aparana
Solutions for Security in Mobile Agent System

To get instant access to data in one place, software called a 'Mobile Agent' is used, which moves from host to host and brings back the required information. With advances in technology, access to information has become easier and easier, but this also raises concerns about the security of the systems involved in the process and of the data being used or taken. This paper is concerned with the security threats that a mobile agent system can experience and the solutions that have been proposed for securing such systems. Of the many proposed security solutions, some are simulation based while others are based on mathematical derivation; simulation-based models are more practical for solving real-world scenarios.

Neelam Dayal, Lalit Kumar Aswathi
An Automatic MRI Brain Segmentation by Using Adaptive Mean-Shift Clustering Framework

A novel, fully automatic, adaptive, and robust procedure for brain tissue classification from three-dimensional (3D) magnetic resonance head images (MRI) is described in this paper. We propose an automated scheme for MRI brain segmentation in which an adaptive mean-shift methodology is utilized to categorize brain voxels into one of three main tissue types: gray matter, white matter, and cerebrospinal fluid. The MRI image space is characterized by a high-dimensional feature space that includes multimodal intensity features in addition to spatial features. An adaptive mean-shift algorithm clusters the joint spatial-intensity feature space, extracting a representative set of high-density points within the feature space, otherwise known as modes. Tissue segmentation is obtained by a follow-up phase of intensity-based clustering of the modes into the three tissue categories. Owing to its nonparametric nature, adaptive mean-shift can deal successfully with nonconvex clusters and produce convergence modes that are better candidates for intensity-based categorization than the initial voxels. The performance of this brain tissue classification procedure is demonstrated through quantitative and qualitative validation experiments on both simulated MRI data (10 subjects) and real MRI data (43 subjects). The proposed method is validated on 3D single and multimodal datasets, for both simulated and real MRI data, and is shown to perform well in comparison with other state-of-the-art methods without the use of a preregistered statistical brain atlas.

J. Bethanney Janney, A. Aarthi, S. Rajesh Kumar Reddy
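For intuition, a compact fixed-bandwidth mean-shift sketch in NumPy is shown below; the paper's method is an adaptive-bandwidth variant over joint spatial-intensity features, which is considerably more involved, and the 2-D toy data here are stand-ins.

```python
# Fixed-bandwidth mean shift: each point iteratively moves to the
# Gaussian-kernel-weighted mean of all points, converging toward modes.
import numpy as np

def mean_shift_modes(points, bandwidth=1.0, iters=30):
    modes = points.copy()
    for _ in range(iters):
        d2 = ((modes[:, None, :] - points[None, :, :]) ** 2).sum(-1)
        w = np.exp(-d2 / (2 * bandwidth ** 2))
        modes = (w[:, :, None] * points[None, :, :]).sum(1) / w.sum(1, keepdims=True)
    return modes

pts = np.vstack([np.random.default_rng(0).normal(m, 0.2, (40, 2))
                 for m in (0.0, 2.0)])
modes = mean_shift_modes(pts, bandwidth=0.5)
# Points from each blob converge near that blob's density mode.
```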
Automatic Silkworm Egg Counting Mechanism for Sericulture

Sericulture is the art of rearing silkworms for the production of cocoons, the raw material for the production of silk. Silkworm seed production is one of the important activities of sericulture, in which silkworm seed known as Disease Free Layings (DFLs) is prepared in seed centers and supplied to farmers for rearing. It is very important to count the number of silkworm eggs accurately so that farmers can be paid accordingly and do not suffer a loss. To generate statistics, fecundity and hatching percentage are measured by counting silkworm eggs. This counting is usually performed in a manual, visual, non-automatic way, which is error-prone and time-consuming. This work approaches the development of automatic methods to count the number of silkworm eggs using image processing, particularly color segmentation and mathematical morphology.

Rupali Kawade, Jyoti Sadalage, Rajveer Shastri, S. B. Deosarkar
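An illustrative sketch of the counting pipeline (thresholded mask, morphological cleanup, connected-component labeling) using SciPy on a synthetic binary image; the paper's actual color segmentation is more involved than this.

```python
# Count objects in a binary mask: morphological opening removes small
# specks, then connected-component labeling counts the remaining blobs.
import numpy as np
from scipy import ndimage

def count_eggs(mask: np.ndarray) -> int:
    cleaned = ndimage.binary_opening(mask, structure=np.ones((3, 3)))
    _, n_objects = ndimage.label(cleaned)
    return n_objects

# Synthetic "egg sheet": two blobs on an empty background.
img = np.zeros((20, 20), dtype=bool)
img[2:7, 2:7] = True
img[10:16, 12:18] = True
print(count_eggs(img))   # -> 2
```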
Proficient Energy Consumption Algorithm Using HMAC and ANT Colony-Based Algorithm

In this research, we deal with an Ant colony-based clustering algorithm (BACCA) for wireless sensor networks, which uses low-energy adaptive clustering hierarchy (LEACH) as its prototype, a good approximation of a proactive network protocol in the case of adaptive routing. On this basis, we estimate the amount of energy consumed in each case, and then use hybrid medium access control (HMAC) along with TDMA, providing multiple hops in a single medium access control, end-to-end quality of service, latency-sensitive traffic flows, and high-priority channel access. Computer simulation shows that these mechanisms result in a significant improvement in energy consumption. Finally, we consider bandwidth utilization using SFF and dynamic first fit (DFF), by which we minimize the bandwidth, a constraint for many users in the case of adaptive routing.

G. Saravanan, J. V. Anand, M. Roberts Masillamani
Secure Remote Access Fleet Entry Management System Using UHF Band RFID

A fleet management system based on UHF-band RFID is proposed. The system is applied to vehicles entering and leaving through road gates. It consists of an RFID tag present in the fleet vehicle, a reader antenna, a reader controller, and monitoring and commanding software. The whole system runs on an open-source platform, with Java code used for controlling it. The UHF band extends from 300 MHz to 3 GHz, which helps the system detect an incoming fleet vehicle at a distance of 30 m around the antenna, even when the vehicle travels at a speed of 45 km/h.

Nagarajan Sathish, P. Ranjana
Distributed Data Mining in the Grid Environment

Grid computing has emerged as an important new branch of distributed computing focused on large-scale resource sharing and high performance. Many applications require the analysis of very large data sets, which are often geographically distributed and of increasing complexity. In these areas, grid technologies provide effective computational support for applications such as knowledge discovery. This paper is an introduction to grid infrastructure and its potential for machine learning tasks.

C. B. SelvaLakshmi, S. Murali, P. Chanthiya, P. N. Karthikayan
An Efficient Method for Improving Hiding Capacity for JPEG2000 Images

Information hiding techniques have recently become important in a number of application areas. The redundancy of digital media, as well as the characteristics of the human visual system, makes it possible to hide messages. This paper gives a detailed review of various information hiding techniques in both the spatial and wavelet domains and focuses on the problem of enhancing the hiding capacity of JPEG2000 images, a challenging problem because of their limited redundancy and bit-stream truncation. Spatial-domain techniques span the Least Significant Bit (LSB) method and the Least Pixel Adjustment Process (LPAP), while transform-domain techniques include the Discrete Cosine Transform (DCT) method, Discrete Wavelet Transform (DWT) based techniques, and spread-spectrum techniques. Finally, the JPEG2000 architecture and a high-capacity steganography scheme for the JPEG2000 baseline system are described. Hiding capacity is very important for efficient covert communication, but the available redundancy in JPEG2000 compressed images is very limited, and bit-stream truncation makes hiding information difficult. The proposed high-capacity steganography scheme for the JPEG2000 baseline system therefore applies the bit-plane encoding procedure multiple times to cope with bit-stream truncation, and uses a redundancy evaluation method to increase the hiding capacity.

T. Shahida, C. C. Sobin
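As a reference point for the spatial-domain baseline the review covers, here is a minimal LSB embed/extract sketch over 8-bit pixels; this is the classic LSB method, not the proposed JPEG2000 bit-plane scheme.

```python
# Classic LSB steganography: overwrite the least significant bit of each
# pixel with one message bit.
import numpy as np

def lsb_embed(pixels: np.ndarray, bits: str) -> np.ndarray:
    flat = pixels.ravel().copy()
    for k, b in enumerate(bits):
        flat[k] = (flat[k] & 0xFE) | int(b)
    return flat.reshape(pixels.shape)

def lsb_extract(pixels: np.ndarray, n_bits: int) -> str:
    return "".join(str(p & 1) for p in pixels.ravel()[:n_bits])

img = np.arange(16, dtype=np.uint8).reshape(4, 4)
stego = lsb_embed(img, "1010")
assert lsb_extract(stego, 4) == "1010"
```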
Electronic Customer Relationship Management (e-CRM): Data Integration for Technical Institutions

Educational institutions worldwide are undergoing fundamental shifts in how they operate and interact with their "customers": students, alumni, donors, faculty members, and staff members. Kotler and Fox [32] state that "the best organization in the world will be ineffective if the focus on 'customers' is lost. First and foremost is the treatment of individual students, alumni, parents, friends, and each other (internal customers). Every contact counts!" Many organizations are familiar with using CRM (Customer Relationship Management) to manage and enhance the customer relationship. A good customer relationship can bring great benefits and a competitive advantage to an organization, and in this era of technology, e-CRM (Electronic Customer Relationship Management) is acknowledged as another potential solution for business. The focus is currently shifting from improving internal operations to concentrating more on customers. Technical education customers are demanding more attention and immediate service, that is, "Internet time". Proactive institutions are now adjusting their practices by refocusing their efforts externally; because of the need to concentrate more on customers, many institutions are once again turning to technology, this time to CRM software, which goes several steps further than ERP by helping institutions maximize their customer-centric resources. The purpose of this study is to show how e-CRM can help technical institutions integrate the data from customer touch points.

Kini K. Shashidhar, D. H. Manjaiah
An Efficient Image Fusion Technique Using Wavelet with PCA

Image fusion is a process in which a high-resolution panchromatic (PAN) image is combined with a low-resolution multispectral (MS) image to form a new single image that contains both the spatial information of the PAN image and the spectral information of the MS image. When a wavelet transform is applied alone, the fusion result is often not good; integrating the wavelet transform with a traditional fusion method gives better results. The decimated and undecimated wavelets used in image fusion can be categorized into three classes: orthogonal, biorthogonal, and nonorthogonal. In this study, a fusion technique is proposed that uses both the wavelet transform and the PCA method for fusing IRS-1D images from the LISS III scanner for the locations Vishakhapatnam and Hyderabad, India. The fusion results are compared using statistical performance measures and analyzed. It was ascertained that the wavelet with PCA is superior to the other wavelet transform methods.

C. M. Sheela Rani
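A hedged NumPy sketch of the PCA step alone: fusion weights are taken from the principal eigenvector of the two images' joint covariance. The paper combines this with a wavelet decomposition, which is omitted here, and the random images are stand-ins for co-registered PAN/MS bands.

```python
# PCA-based fusion weights from the principal eigenvector of the 2x2
# covariance of the two (flattened) input images.
import numpy as np

def pca_fusion(img_a: np.ndarray, img_b: np.ndarray) -> np.ndarray:
    data = np.stack([img_a.ravel(), img_b.ravel()])   # 2 x N samples
    vals, vecs = np.linalg.eigh(np.cov(data))
    v = np.abs(vecs[:, np.argmax(vals)])              # principal eigenvector
    w = v / v.sum()                                   # normalized weights
    return w[0] * img_a + w[1] * img_b

rng = np.random.default_rng(0)
pan = rng.random((8, 8))
ms = rng.random((8, 8))
fused = pca_fusion(pan, ms)
```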
Forensic Investigation Processes for Cyber Crime and Cyber Space

Computers are an integral part of our life; a significant percentage of today's transactions and processes take place using the computer and the Internet. People have readily adopted Internet technology and innocently trust it, while remaining ignorant of the limitations and threats to system security. With the advance of technology, equally or more advanced forms of crime have started emerging. Different types of cyber attacks from various sources may adversely affect computers, software, a network, an agency's operations, an industry, or the Internet itself. Thus, companies and their products aim to take the assistance of legal and computer forensics. Digital forensics deals with computer-based evidence to determine who, what, where, when, and how crimes are being committed. Computer and network forensics has evolved to assure the proper presentation of cyber crime evidentiary data in court. Forensic tools and techniques are an integral part of criminal investigations, used to investigate suspect systems, gather and preserve evidence, reconstruct or simulate the event, and assess the current state of an event. In this paper we deliberate on two aspects: first, the various types of crimes in cyber space and the various sources of cyber attacks; and second, investigation processes for various cyber attacks with the help of digital forensic tools like WinHex [1].

K. K. Sindhu, Rupali Kombade, Reena Gadge, B. B. Meshram
Feasibility Study for Implementing Brain Computer Interface Using Electroencephalograph

The purpose of a Brain Computer Interface (BCI) is to develop an interface between a human and a computer that allows a device to be controlled via brain signals alone. The conformance of the system to the individual brain patterns of the subject is the major concern when developing such an interface. In this paper, we begin by exploring a variety of brain sensing technologies for detecting the specific forms of brain activity used in HCI research, and then analyze the properties of the most prevalent technology used in HCI research, the electroencephalograph. We then proceed with our aim of conducting a feasibility study of this implementation idea and verify it by clearly determining the requirements and setting up the appropriate facilities for research. We also describe and discuss experiments showing the differences in brain wave patterns that support this feasibility.

S. S. Sridhar, R. Shivaraman
Scope of Cloud Computing for Multimedia Application

This paper deals with the recently emerging technology of cloud computing and how it can be used for multimedia applications. With so many sources of data having multimedia features, the amount of data that needs to be processed has increased significantly and alarmingly, so processing these data on the device itself is becoming very inefficient due to its limited resources, especially in the case of mobile devices with limited memory and battery life. This paper therefore analyzes whether the concept of cloud computing can be used for processing and storing data related to multimedia applications and, if so, what difficulties could arise.

Uttam Vijay, Lalit Kumar Awasthi
Jena with SPARQL to Find Indian Natural Plants Used as Medicine for Diseases

The voluminous information available on the Web necessitates an integrative approach that retrieves, sequences, and processes information. Semantic Web technology promises to store, share, and retrieve information with meaningful relationships, and allows heterogeneous data models to be linked in graph form without changing the existing data storage. This moves the document-centric idea of the current Web toward more fine-grained semantic structures. Our paper focuses on the construction of an ontology of natural food resources related to diseases, with a focus on context-based retrieval. The Jena framework is used to build Semantic Web applications with the ontology represented in RDF (Resource Description Framework) and OWL (Web Ontology Language) form, and SPARQL (Simple Protocol And RDF Query Language) is used to retrieve various query patterns. Thus, using SPARQL with Jena, more specific and semantically related resources can be identified and retrieved without affecting the existing data models.

Vadivu Ganesan, Hopper S. Waheeta, H. Srimathi
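Jena is a Java framework; as a language-neutral stand-in, the sketch below runs the same kind of SPARQL pattern with Python's rdflib. The plant/disease vocabulary is invented for illustration and is not the paper's ontology.

```python
# Load a tiny RDF graph and query it with SPARQL (rdflib standing in for Jena).
import rdflib

g = rdflib.Graph()
g.parse(data="""
@prefix ex: <http://example.org/> .
ex:Tulsi ex:treats ex:Cold ;        ex:grownIn ex:India .
ex:Neem  ex:treats ex:SkinDisease ; ex:grownIn ex:India .
""", format="turtle")

q = """
PREFIX ex: <http://example.org/>
SELECT ?plant ?disease WHERE {
    ?plant ex:treats ?disease ;
           ex:grownIn ex:India .
}
"""
for row in g.query(q):
    print(row.plant, "treats", row.disease)
```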
Early Detection of Diabetic Retinopathy by CRA (Cluster Rejection Approach)

Diabetic Retinopathy (DR) is a major public health issue, since it can lead to blindness in patients with diabetes. A large number of existing methods for detecting DR emphasize robust modeling of microaneurysms (MAs) through explicit segmentation of the optic disk and vessels; such methods involve complex modeling, resulting in high computational cost and a time-consuming process. To improve efficiency, we propose a new approach for detecting DR based on a cluster rejection methodology for detecting MAs in retinal images. The proposed technique involves cluster separation of the retinal image, selection of candidate sets based on a simple threshold, and rejection of candidates from the DR-affected retinal image. Our methodology for detecting MAs results in easy computation on fundus images and a less time-consuming process.

S. Vijayalakshmi Karthiga, T. Sudalai Muthu, M. Roberts Masillamani
A Review Paper on IEEE 802.11 WLAN

IEEE 802.11 wireless networks are becoming omnipresent nowadays, providing mobility as well as flexibility to users accessing information. At present they act as an alternative to wired networks, but they may soon replace wired networks completely. The protocol is based on multiple access, where a node competes with other nodes to get access to the communication medium and transmit its data. A major aspect of wireless technology is roaming, defined as the ability to seamlessly change from one wireless AP to another. TCP as used in wired networks is not appropriate for wireless networks, because TCP assumes all losses result from congestion, which is not always the case on wireless links. Also, wireless networks are more vulnerable to security threats than wired networks.

Vikram Singh, Lalit Kumar Awasthi
Compression of Color Image by Using Dual Tree Complex Wavelet Transform

In this paper, we explore the use of the dual-tree complex wavelet transform (CWT), which is nearly shift invariant and directionally selective in two and higher dimensions. The multidimensional dual-tree CWT is nonseparable but is based on computationally efficient, separable filter banks (FBs). This paper describes how the complex wavelet transform with directional properties is designed and how it is used in image compression. When we take the dual-tree complex wavelet transform, many wavelet coefficients are close to zero and show intra-subband dependency. We further evaluate the performance of the SPIHT coding scheme for coding these coefficients. The proposed scheme gives a higher compression rate and lower MSE than the DWT-based scheme, and the dual-tree complex wavelet transform SPIHT scheme outperforms the DWT-based scheme at lower bit rates.

Vimal Kumar, Surendra Kumar Agarwal
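SPIHT coding and the dual-tree CWT are beyond a short snippet, but the hedged PyWavelets sketch below illustrates the underlying idea that compression exploits near-zero transform coefficients, using a plain DWT (the baseline the paper compares against) with hard thresholding; wavelet, level, and keep-fraction are arbitrary choices.

```python
# Decompose, zero all but the largest `keep` fraction of coefficients,
# then reconstruct; most coefficients are near zero, so little is lost.
import numpy as np
import pywt

def dwt_threshold_compress(img, wavelet="db2", level=3, keep=0.05):
    coeffs = pywt.wavedec2(img, wavelet, level=level)
    arr, slices = pywt.coeffs_to_array(coeffs)
    thresh = np.quantile(np.abs(arr), 1.0 - keep)
    arr = np.where(np.abs(arr) >= thresh, arr, 0.0)
    rec = pywt.waverec2(pywt.array_to_coeffs(arr, slices,
                                             output_format="wavedec2"), wavelet)
    return rec[: img.shape[0], : img.shape[1]]  # crop any boundary padding

img = np.random.default_rng(0).random((64, 64))
rec = dwt_threshold_compress(img)
print("MSE:", float(np.mean((img - rec) ** 2)))
```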
Review of Software Quality Metrics for Object-Oriented Methodology

This paper presents a review of metrics used in object-oriented programming. It includes a small set of the most well-known and commonly applied traditional metrics that can be applied to object-oriented methodology, and a set of object-oriented metrics (specifically applicable to object-oriented programming) for software development. The need for such metrics is especially great when an organization is keen on adopting them to develop good quality software. The demand for new or improved metrics for software development has increased, with object-oriented methodology being the most prominent.

Suresh Yeresime, Jayadeep Pati, Santanu Ku Rath
RRTS: A Task Scheduling Algorithm to Minimize Makespan in Grid Environment

Task scheduling is one of the major issues in a grid environment; it is an essential process for utilizing resources efficiently by reducing completion time. The performance of the grid can be enhanced by using efficient task scheduling algorithms. In this paper, we propose a new technique called Round Robin Task Scheduling (RRTS) for minimizing the makespan using the concept of Round Robin. The idea of the approach is to execute the tasks using a Dynamic Time Slice (DTS). Our experimental analysis shows better results than other task scheduling algorithms (Minimum Execution Time (MET), Minimum Completion Time (MCT), Min-Max, and Max-Min) in terms of makespan and average resource utilization.

Sanjaya Kumar Panda, Sourav Kumar Bhoi, Pabitra Mohan Khilar
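The abstract does not spell out the DTS rule, so the sketch below uses a fixed time slice as a stand-in while illustrating round-robin, time-sliced dispatch of tasks onto identical resources and the resulting makespan; all names and values are illustrative.

```python
# Round-robin, time-sliced dispatch onto identical resources; unfinished
# tasks rejoin the back of the queue. Makespan = latest completion time.
import heapq
from collections import deque

def rr_makespan(task_lengths, n_resources, slice_len):
    queue = deque(task_lengths)
    resources = [(0.0, r) for r in range(n_resources)]  # (next-free time, id)
    heapq.heapify(resources)
    makespan = 0.0
    while queue:
        t_free, r = heapq.heappop(resources)
        remaining = queue.popleft()
        run = min(remaining, slice_len)
        t_done = t_free + run
        if remaining - run > 1e-12:
            queue.append(remaining - run)
        heapq.heappush(resources, (t_done, r))
        makespan = max(makespan, t_done)
    return makespan

print(rr_makespan([8, 4, 6, 2], n_resources=2, slice_len=3))  # -> 10.0
```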
A Network Survivability Approach to Resist Access Point Failure in IEEE 802.11 WLAN

IEEE 802.11 WLAN is one of the most promising and in-demand technologies for communication, but it is affected by faults in the Access Points (APs), which degrade network performance; the network should therefore tolerate these faults to preserve its performance and efficiency. In our approach, we have designed a fault tolerance technique to resist faults in APs. It consists of three phases: design of a minimum-cost spanning tree, creation of a node priority table, and establishment of routes to connect the nodes using a Network Survivability Algorithm. By this method, we obtain the network coverage area, the priorities of the nodes according to their degree, and a route to connect the nodes after AP failure. We consider both Single-Point Failure and Multi-Point Failure, and the method shows better results in tolerating faults by utilizing the maximum number of nodes, making it a cost-effective model.

Sourav Kumar Bhoi, Sanjaya Kumar Panda, Pabitra Mohan Khilar
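For the first phase, here is a compact Prim's-algorithm sketch for building a minimum-cost spanning tree over an assumed adjacency-dict graph; the node priority table and survivability routing phases are not shown, and the AP names are hypothetical.

```python
# Prim's algorithm: grow the tree from a start node by repeatedly taking
# the cheapest edge that reaches an unvisited node.
import heapq

def prim_mst(graph, start):
    """graph: {node: [(weight, neighbor), ...]} (undirected).
    Returns the tree as (weight, parent, child) edges."""
    visited = {start}
    frontier = [(w, start, v) for w, v in graph[start]]
    heapq.heapify(frontier)
    tree = []
    while frontier and len(visited) < len(graph):
        w, u, v = heapq.heappop(frontier)
        if v in visited:
            continue
        visited.add(v)
        tree.append((w, u, v))
        for w2, x in graph[v]:
            if x not in visited:
                heapq.heappush(frontier, (w2, v, x))
    return tree

g = {"AP1": [(1, "AP2"), (4, "AP3")],
     "AP2": [(1, "AP1"), (2, "AP3")],
     "AP3": [(4, "AP1"), (2, "AP2")]}
print(prim_mst(g, "AP1"))   # [(1, 'AP1', 'AP2'), (2, 'AP2', 'AP3')]
```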
Novel 2D Real-Valued Sinusoidal Signal Frequencies Estimation Based on Propagator Method

This paper considers the problem of estimating the frequencies of multiple 2D real-valued sinusoidal signals, also known as real X-texture mode signals, in the presence of additive white Gaussian noise. An algorithm for estimating the frequencies of real-valued 2D sine waves based on the propagator method is developed. The technique is a direct method that does not require any peak search. A new data model for the individual dimensions is proposed, in which the dimension of the signal subspace equals the number of frequencies present in the observation. The propagator method-based estimation technique is then applied to the individual dimensions using this data model. The performance of the proposed method is demonstrated and validated through computer simulation.

Sambit Prasad Kar, P. Palanisamy
Creating Network Test Setup Using Virtualization and Simulation

How can virtualization and simulation be used to create a network test setup? In a research and development environment, there are many instances when the user has to test an end-to-end network setup. During the development phase, the user may not have access to an end-to-end setup because other modules may still be under development. In this scenario, the user can create an end-to-end network setup using simulation and virtualization to test the real module using the hardware/software-in-the-loop concept. The tester thus integrates all three (simulation, virtualization, and the actual module) to get close to real-world results. In this paper, we explore how to create such a test setup, specifically for networking and telecom applications, and we also explore mathematically how close the results of these test setups are to the real world. To create this end-to-end network test setup, one needs to combine VM networks, simulators or a simulated network, and actual applications, as shown in Fig. 1. This test setup can be configured as per the requirements.

Fig. 1 Possible conceptual network test setup derived from a real complex network

Abhay Shriramwar
Automated Graphical User Interface Regression Testing

Regression testing is performed after software modification to verify the correctness and quality of a software product. The modification is either corrective or adaptive. Normally in regression testing, the existing test cases or test scripts are rerun on the modified product. In this paper, we generate test scripts and test cases automatically for regression testing of graphical user interfaces. We have taken three example applications and used the capture-and-replay feature of an automated tool to capture the event sequences and generate test scripts automatically.

Madhumita Panda, Durga Prasad Mohapatra
Video Search Using Map Reduce Framework in an Education Cloud

Video search systems have become popular in recent years. Such a system prompts the user for a string query and quickly retrieves the matching video for playing. There is still no established method for scalable, fast search in large distributed video databases: when the number of online users reaches a certain scale, the response time of the server degrades greatly. Many video resources are stored in distributed databases and should be accessible to all users, and if many users access the videos at the same time, the load on the server increases. To solve this problem, cloud computing technology is used. A distributed database is used for storing and indexing videos, and the system uses the Map Reduce paradigm for retrieving videos in a distributed fashion. The Map Reduce approach splits tasks into sub-tasks and assigns them to the various virtual nodes in the cloud, where they are processed and the results consolidated into the final output. Thus, the processing speed is increased while the processing time is greatly reduced.

E. Iniya Nehru, P. Seethalakshmi, S. Sujatha, M. Nandhini
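A toy, single-process sketch of the MapReduce pattern described above, assuming a tiny {video_id: description} index split into shards; a real deployment would run the map and reduce phases on distributed workers rather than in one loop.

```python
# Map: emit (video_id, match_count) per shard. Reduce: merge and rank.
from collections import defaultdict

videos = {
    "v1": "intro to cloud computing lecture",
    "v2": "map reduce tutorial for beginners",
    "v3": "cloud storage and indexing demo",
}

def map_phase(query, shard):
    terms = query.lower().split()
    return [(vid, sum(t in text for t in terms)) for vid, text in shard.items()]

def reduce_phase(pairs):
    scores = defaultdict(int)
    for vid, s in pairs:
        scores[vid] += s
    return sorted(scores.items(), key=lambda kv: -kv[1])

shards = [dict(list(videos.items())[:2]), dict(list(videos.items())[2:])]
mapped = []
for shard in shards:                      # pretend each shard is a worker
    mapped += map_phase("cloud indexing", shard)
print(reduce_phase(mapped))               # ranked (video_id, score) list
```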
Multimedia Service Delivery on Next-Generation Platforms: A Survey on the Need for Control, Adaptation and Inter-Mediation

With the increasing popularity of interactive and innovative multimedia streaming applications on mobile devices, next-generation communication platforms are expected to be ready for such multimedia service delivery. This paper highlights the technical hurdles that arise from end-user and device mobility and investigates an optimal multimedia service delivery mechanism to deal with mobility-related issues. The proposed solution, the "Mobility and Quality" Service enabler, uses media control and mediation to offer a cross-layered, application-independent solution for multimedia service delivery. The paper also presents preliminary experiments and results.

C. Balakrishna
Generating Test Data for Path Coverage Based Testing Using Genetic Algorithms

In this paper, we develop an approach to generating test data for path coverage based testing using a genetic algorithm. We use the control flow graph and cyclomatic complexity of the example program to find the number of feasible paths present in the program and compare it with the actual number of paths covered by the genetic algorithm, which generates the test data automatically. We show that our algorithm gives 100 % coverage, successfully covering all feasible paths, and we observe that the genetic algorithm is much more effective at generating test data in a shorter time while giving better coverage.

Madhumita Panda, Durga Prasad Mohapatra
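A small genetic-algorithm sketch for test-data generation, assuming a branch-distance fitness toward one target path of a toy triangle classifier; the paper's example programs, encoding, and GA settings are not reproduced here.

```python
# Evolve integer triples until the "equilateral" branch (a == b == c) is hit;
# fitness is the branch distance |a - b| + |b - c| (0 means path covered).
import random

def branch_distance(a, b, c):
    return abs(a - b) + abs(b - c)

def evolve(pop_size=40, gens=100, lo=0, hi=100):
    rng = random.Random(7)
    pop = [[rng.randint(lo, hi) for _ in range(3)] for _ in range(pop_size)]
    for _ in range(gens):
        pop.sort(key=lambda ind: branch_distance(*ind))
        if branch_distance(*pop[0]) == 0:
            return pop[0]                     # target path covered
        parents = pop[: pop_size // 2]
        children = []
        while len(children) < pop_size - len(parents):
            p, q = rng.sample(parents, 2)
            cut = rng.randrange(1, 3)
            child = p[:cut] + q[cut:]         # one-point crossover
            if rng.random() < 0.3:            # mutation
                child[rng.randrange(3)] = rng.randint(lo, hi)
            children.append(child)
        pop = parents + children
    return min(pop, key=lambda ind: branch_distance(*ind))

print(evolve())   # e.g. [42, 42, 42], an input exercising the target path
```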
Media Streaming Using Multiple Description Coding in Overlay Networks

In this paper we examine the performance of two types of overlay networks, i.e., Peer-to-Peer (P2P) and Content Delivery Network (CDN), for media streaming using Multiple Description Coding (MDC). In both approaches, many servers simultaneously serve one requesting client with complementary descriptions. This approach improves reliability and decreases the data rate each server has to provide. We have implemented both approaches in the ns-2 network simulator. The experimental results indicate that MDC-based media streaming performs better over a P2P network than over a CDN.

Sachin Yadav, Shailendra Mishra, Ranjeeta Yadav
DSR and DSDV Routing Protocol Analysis Using NS2 and Association Rule Mining Technique

A mobile ad hoc network (MANET) has no fixed networking infrastructure and consists of mobile nodes that communicate with each other. Since the nodes are mobile, routing in an ad hoc network is a challenging task, and efficient routing protocols can deliver better performance in such networks. In this study, we compare the performance of two prominent reactive and proactive routing protocols for MANETs, Dynamic Source Routing (DSR) and Destination Sequenced Distance Vector (DSDV), using Network Simulator 2 (NS2). Our objective is to compare DSR and DSDV over CBR connections with varying speed and various network parameters, measuring performance metrics such as packet delivery ratio for the two routing protocols. We also use an association rule mining technique, the Apriori algorithm, to analyze the major dropping nodes in the NS2 simulation, generating association rules for given min_support and min_confidence thresholds.

Vinay Yadav, Divakar Singh
Innovative Fusion of Ear and Fingerprint in Biometrics

Biometric-based personal identification is regarded as an effective method of identification. In multimodal systems, more than one biometric feature is used for identification, which makes feature duplication and spoofing nearly impossible. This paper proposes an identification method for a multimodal biometric system using two traits, i.e., ear and fingerprint. A new technique is introduced for choosing the region of interest (ROI) of the ear and fingerprint images: two local points, the Canal Intertranguiano and the starting point of the helix, are taken to define the ROI for the ear, and the same procedure is followed to take the ROI from the fingerprint image. After feature extraction, the ear and fingerprint features undergo a fusion process; the fusion of the two extracted feature sets is done using a concatenation technique. A matching process is then carried out to identify a person.

C. Malathy, K. Annapurani, A. K. Sadiq
Performance Analysis of Various Feature Extraction Techniques in Ear Biometrics

Many feature extraction techniques are available to extract the features of the ear. In this paper, we concentrate on identifying the best feature extraction method, analyzing linear and nonlinear methods such as Linear Discriminant Analysis (LDA), Principal Component Analysis (PCA), and Kernel Principal Component Analysis (KPCA). We also combine the LDA and PCA methods so that the best properties of the two are retained. For experimentation, we used ear images obtained from publicly available sources. The experimental results show that the combination of LDA and PCA gives better performance, in both verification rate and false acceptance rate, than the other techniques used individually.

K. Annapurani, C. Malathy, A. K. Sadiq
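A brief scikit-learn sketch of a combined PCA-then-LDA pipeline on random stand-in "ear feature" vectors; the dimensionalities and data are assumptions, not the paper's setup.

```python
# PCA first reduces dimensionality; LDA then maximizes class separability.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

rng = np.random.default_rng(0)
X = np.vstack([rng.normal(c, 1.0, (20, 50)) for c in (0.0, 1.0, 2.0)])
y = np.repeat([0, 1, 2], 20)          # three subjects, 20 samples each

X_pca = PCA(n_components=10).fit_transform(X)
X_lda = LinearDiscriminantAnalysis(n_components=2).fit_transform(X_pca, y)
print(X_lda.shape)                    # (60, 2)
```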
Mobile Agent Model with Efficient Communication for Distributed Data Mining

This paper presents the management of large-scale distributed systems using the mobile agent concept in a distributed environment, an increasingly challenging task. The primary goal is to ensure efficient use of resources, services, and networks and to provide timely services to end users, using mobile agents with decentralized processing and control over distributed resources so as to minimize network traffic, speed up management tasks, provide more scalability, and provide more flexibility in the development and maintenance of applications. The problem with traditional techniques is that as the number of visited nodes grows, the mobile agent size also increases, making migration harder. One possible solution is to visit a fixed number of nodes, return to the agent home or send all data to it (reducing the mobile agent size), and start the task again on the remaining nodes. The initial size of a mobile agent also affects agent performance, since the larger the size, the more difficult the migration.

S. Kavitha, K. Senthil Kumar, K. V. Arul Anandam
Development of Locally Weighted Projection Regression for Concurrency Control in Computer-Aided Design Database

Concurrency control (CC) is the activity of synchronizing operations issued by concurrently executing programs on a shared database. It is an important concept for performing proper transactions on objects, avoiding loss of data and ensuring proper updating of data in the database. This paper presents the development of locally weighted projection regression (LWPR) for concurrency control while developing a bolted connection using Autodesk Inventor 2008. In implementing concurrency control, this work ensures that associated parts cannot be accessed by more than one person, due to locking. The LWPR learns the objects and the type of transactions to be performed based on which node in the output layer of the network exceeds a threshold value, and learning stops once all the objects have been exposed to the LWPR. We have attempted to use LWPR for storing lock information when multiple users are working on computer-aided design (CAD).

A. Muthukumaravel, S. Purushothaman, R. Rajeswari
Performance Study of Combined Artificial Neural Network Algorithms for Image Steganalysis

Steganalysis is a technique for detecting the presence of hidden information. Artificial neural networks (ANNs) are a widespread method for steganalysis; the back propagation algorithm (BPA), radial basis function (RBF), and functional update back propagation algorithm (FUBPA) are some of the popular ANN algorithms for detecting hidden information. Training and testing performance improves when two algorithms are combined instead of being used separately. This paper analyzes the performance of the combined algorithms BPARBF and FUBPARBF. Of the two combinations, FUBPARBF provides more promising results than BPARBF, since FUBPA needs fewer iterations for the network to converge. Organizing the retrieved information, however, remains a challenging task.

P. Sujatha, S. Purushothaman, R. Rajeswari
Implementation of Radial Basis Function Network in Content-Based Video Retrieval

This paper presents the retrieval of a video from a given database using the radial basis function (RBF) network method. The features of the videos are used for training and testing the RBF network in the developed algorithm. The features of the frames of a video are extracted from the contents in the form of text, audio, and image. In this analysis, the RBF network is programmed to retrieve the words spoken by four different speakers in video presentations.

S. Prasanna, S. Purushothaman, R. Rajeswari
Implementation of Textile Image Segmentation Using Contextual Clustering and Fuzzy Logic

This paper presents a segmentation analysis of textile images. These images contain innumerable textures, whose content may be regularly arranged, repeated, or random in a tessellated fashion. It is not necessary for the entire image to be segmented, but at least one full object has to be segmented correctly in an image. In this work, a systematic approach has been developed to extract textures from the given texture images: the features of the textile images are extracted and used for segmenting the images using contextual clustering and fuzzy logic. The proposed methods are combined to improve segmentation accuracy and to analyze the effects of the algorithms' parameters on texture segmentation.

R. Shobarani, S. Purushothaman
Detections of Intima-Media Thickness in B-Mode Carotid Artery Images Using Segmentation Methods

This study presents investigations carried out on the carotid artery to identify the intima-media thickness of arteries affected by plaques. B-mode ultrasound video of the artery is used as the data for processing, and the frames of the video are processed to determine the plaque properties of the artery. To achieve this, two segmentation techniques are applied to each frame, and the features extracted from the frames are consolidated to assess the condition of the artery. The information in a frame is converted into features whose values are estimated by an artificial neural network (ANN) algorithm. ANNs have not been used extensively for this purpose in the past; here, an ANN is used to estimate the plaque thickness in the carotid artery.

V. Savithri, S. Purushothaman
An Effective Segmentation Approach for Lung CT Images Using Histogram Thresholding with EMD Refinement

Image segmentation is an important step in extracting information from medical images, and segmentation of pulmonary chest computed tomography (CT) images is a precursor to most pulmonary image analysis. The purpose of lung segmentation is to separate the voxels corresponding to lung tissue from the surrounding anatomy. This paper presents an automated CT lung image segmentation approach that utilizes histogram-based thresholding with Earth Mover's Distance (HTEMD) based refinement; the final segmented output is further refined by morphological operators. The performance of HTEMD is compared with Otsu's method, K-Means, and histogram thresholding using fuzzy measures.

Khan Z. Faizal, V. Kavitha
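As a minimal reference for one of the baselines compared against, here is a self-contained NumPy implementation of Otsu's histogram threshold on a synthetic 8-bit slice; the HTEMD refinement itself is not reproduced.

```python
# Otsu's method: choose the threshold maximizing between-class variance.
import numpy as np

def otsu_threshold(img: np.ndarray) -> int:
    hist = np.bincount(img.ravel(), minlength=256).astype(float)
    prob = hist / hist.sum()
    best_t, best_var = 0, 0.0
    for t in range(1, 256):
        w0, w1 = prob[:t].sum(), prob[t:].sum()
        if w0 == 0 or w1 == 0:
            continue
        mu0 = (np.arange(t) * prob[:t]).sum() / w0
        mu1 = (np.arange(t, 256) * prob[t:]).sum() / w1
        var_between = w0 * w1 * (mu0 - mu1) ** 2
        if var_between > best_var:
            best_t, best_var = t, var_between
    return best_t

# Synthetic bimodal slice: dark "lung" mode and bright "tissue" mode.
rng = np.random.default_rng(3)
slice8 = np.concatenate([rng.normal(60, 10, 500), rng.normal(180, 12, 500)])
slice8 = np.clip(slice8, 0, 255).astype(np.uint8).reshape(25, 40)
print(otsu_threshold(slice8))
```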
Analysis of Mobile Agents Applications in Distributed System

With the increasing demand to extend data mining technology to datasets inherently distributed among a large number of autonomous and heterogeneous sources over a network, a new innovative technology based on code mobility, especially mobile agents, has emerged. The primary point for distributed systems is that mobile agents have a number of key features that allow them to reduce network load and overcome network latency. Mobile agents offer a more uniform approach to handling code and data in a distributed system: they can encapsulate protocols, and they can work remotely, even asynchronously and disconnected from the network. This improves the latency and bandwidth of client-server applications and reduces vulnerability to network disconnection. This paper presents an analysis of the various applications of mobile agents and some of the benefits and challenges of this new technology.

K. Senthil Kumar, S. Kavitha
Backmatter
Metadata
Title
Proceedings of International Conference on Internet Computing and Information Communications
Editors
Swamidoss Sathiakumar
Lalit Kumar Awasthi
M. Roberts Masillamani
S. S. Sridhar
Copyright Year
2014
Publisher
Springer India
Electronic ISBN
978-81-322-1299-7
Print ISBN
978-81-322-1298-0
DOI
https://doi.org/10.1007/978-81-322-1299-7