
About This Book

This book constitutes the refereed proceedings of the International Conference on High Performance Architecture and Grid Computing, HPAGC 2011, held in Chandigarh, India, in July 2011. The 87 revised full papers presented were carefully reviewed and selected from 240 submissions. The papers are organized in topical sections on grid and cloud computing; high performance architecture; and information management and network security.

Table of Contents

Frontmatter

Theme - 1: Grid and Cloud Computing

Era of Cloud Computing

Cloud Computing offers an entirely new way of looking at IT infrastructure. From a hardware point of view, cloud computing offers seemingly never-ending computing resources available on demand, thereby eliminating the need to budget for hardware that may only be used in peak timeframes. Cloud computing eliminates an up-front commitment by users, thereby allowing agencies to start small and increase hardware resources only when there is an increase in their needs. Moreover, cloud computing provides the ability to pay for the use of computing resources on a short-term basis and to release them as needed. In this paper we focus on the areas, issues, and future of Cloud Computing.

Pramod Kumar Joshi, Sadhana Rana

An Overview on Soft Computing Techniques

Soft computing is a term applied to a field within computer science which is characterized by the use of inexact solutions to computationally hard tasks such as the solution of NP-complete problems, for which an exact solution cannot be derived in polynomial time. This paper briefly explains soft computing and its components, along with the need for, use, and efficiency of those components. Soft computing differs from conventional (hard) computing in that, unlike hard computing, it is tolerant of imprecision, uncertainty, partial truth, and approximation. In effect, the role model for soft computing is the human mind. The guiding principle of soft computing is: exploit the tolerance for imprecision, uncertainty, partial truth, and approximation to achieve tractability, robustness and low solution cost.

K. Koteswara Rao, G. SVP Raju

A Novel Approach for Task Processing through NEST Network in a Grid

With the increase in the complexity of tasks, complex architectures such as grid systems and cluster computing are employed to process huge amounts of data. The major issues of such task processing systems include heterogeneity, load balancing, synchronization, etc. The networks employed to perform complex computations are hybrid forms of peer networks that utilize the power of peer nodes to perform computations. The proposed architecture is an attempt to process tasks provided by a set of users with a load balancing mechanism and node prioritization for task allocation through the Nest network. The Nest network proposed for the Grid is a peer network that processes the complex tasks provided by users and returns the processed output.

Tarun Gupta, Vipin Tyagi

TCP/IP Security Protocol Suite for Grid Computing Architecture

Grid computing is a term referring to the combination of computer resources from multiple administrative domains to attain a common goal. The grid can be thought of as a distributed system with non-interactive workloads that involve a large number of files. In this paper, we propose a solution for various security issues found in High Performance Grid Computing Architecture. We analyze different network layers available in Grid Protocol Architecture and identify various security disciplines at its different layers. We also analyze various Security Suites available for TCP/IP Internet Protocol Architecture. The paper intends to achieve two major tasks. First, it defines the various Security Disciplines on different layers of Grid Protocol Architecture. Second, it proposes different Security Suites applicable for different levels of Security Disciplines available in different layers of TCP/IP Security Protocol Suite.

Vikas Kamra, Amit Chugh

Security Issues in Cloud Computing

The cloud is a next-generation platform that provides dynamic resource pooling, virtualization and high resource availability. It is one of today’s most enticing technology areas due to advantages like cost efficiency and flexibility. However, there are significant and persistent concerns about cloud computing that are impeding momentum and will compromise the vision of cloud computing as a new information technology procurement model. A general understanding of cloud computing refers to the concepts of grid computing, utility computing, software as a service, storage in the cloud and virtualization. It enables a virtual organization to share geographically distributed resources as its members pursue common goals, assuming the absence of a central location, omniscience and a pre-existing trust relationship. This paper is a survey specifically of the different security issues that have emanated from the nature of the service delivery models of a cloud computing system.

Pardeep Sharma, Sandeep K. Sood, Sumeet Kaur

Classification of Software Quality Attributes for Service Oriented Architecture

In the last few years, Service-Oriented Architecture (SOA) has emerged as an extensive field of research due to its support for a wide range of quality attributes. SOA is becoming a popular architectural pattern for developing distributed systems with prominent quality attributes. Web Services implemented using SOA have several quality issues, such as performance, security, reliability and degree of interoperability or reusability. This paper presents a comprehensive study of the positive and negative effects of software quality attributes (SQA) in developing distributed systems. It also describes the issues related to each quality attribute in developing distributed systems. Finally, a classification framework of SQA shows the relationship between SOA and SQA.

Satish Kumar, Neeta Singh, Anuj Kumar

Energy Efficiency for Software and Services on the Cloud

The market for cloud computing services has continued to expand despite a general decline in economic activity in most of the world. Cloud computing is computation, software, data access, and storage services that do not require end-user knowledge of the physical location and configuration of the system that delivers the services.

This paper provides an in-depth analysis of the energy efficiency benefits of cloud computing, including an assessment of the software as a service (SaaS), platform as a service (PaaS), and infrastructure as a service (IaaS) markets. It also highlights the key demand drivers and technical developments related to cloud computing, in addition to a detailed quantification of energy savings and GHG reduction opportunities under a cloud computing adoption scenario, with a forecast period extending through 2020.

Priyanka Bhati, Prerna Sharma, Avinash Sharma, Jatin Sutaria, M. Hanumanthapa

Evaluation of Grid Middleware Frameworks for Execution of MPI Applications

Execution of large-scale parallel applications that span multiple distributed sites is important to realize the potential of computational grids. Developers face various problems while running applications on multiple clusters. In the last few years many groups have developed middleware frameworks that enable execution of MPI applications on multiple clusters where the slave nodes of a cluster have private or hidden IP addresses. This paper evaluates and compares such middleware frameworks for execution of MPI applications and discusses the merits of the solutions.

Abhishek Jain, Sathish S. Vadhiyar

Virtualization as an Engine to Drive Cloud Computing Security

In this paper we propose virtualization as an engine to drive cloud-based security. Cloud computing is an approach for the delivery of services, while virtualization is one possible service that could be delivered. Virtualization is a computing technology that enables a single user to access multiple physical devices: a single computer controlling multiple machines, or one operating system utilizing multiple computers to analyze a database. It enables better security, and may also be used for running multiple applications on each server rather than just one, enabling us to consolidate our servers and do more with less hardware. Large corporations with little downtime tolerance and airtight security requirements may find that virtualization fits them best. Thin clients and software as a service will free users from being tied to their computers, and allow them to access their information anywhere they can find an internet connection. With growing pressure to move in this direction, we suggest virtualization for cloud-based security.

Snehi Jyoti, Snehi Manish, Gill Rupali

Multi-dimensional Grid Quorum Consensus for High Capacity and Availability in a Replica Control Protocol

In distributed systems it is often necessary to provide coordination among multiple concurrent processes to tolerate contention, periods of asynchrony and a number of failures. Quorum systems provide a decentralized approach for such coordination. In this paper, we propose a replica control protocol using a multi-dimensional grid quorum consensus, which is a generalization of the read-one-write-all (ROWA) protocol, the Grid quorum consensus protocol and the D-Space quorum consensus protocol. It provides very high read availability and read capacity while maintaining reconfigurable levels of write availability and fault tolerance.

Vinit Kumar, Ajay Agarwal

Efficient Task Scheduling Algorithms for Cloud Computing Environment

Cloud Computing refers to the use of computing, platform, and software as a service. It is a form of utility computing where customers need not own the necessary infrastructure and pay only for what they use. Computing resources are delivered as virtual machines. In such a scenario, task scheduling algorithms play an important role, the aim being to schedule tasks effectively so as to reduce the turnaround time and improve resource utilization. This paper presents two scheduling algorithms for scheduling tasks taking into consideration their computational complexity and the computing capacity of the processing elements. The CloudSim toolkit is used for experimentation. Experimental results show that the proposed algorithms exhibit good performance under heavy loads.

S. Sindhu, Saswati Mukherjee
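
As a rough illustration of the kind of capacity-aware scheduling the abstract above describes (not the authors' actual algorithms), the following Python sketch assigns the longest pending tasks to the processing element with the earliest estimated finish time; the task lengths and PE capacities are made-up values.

```python
# Hypothetical sketch: longer tasks go first, each to the processing
# element (PE) that would finish them earliest given its capacity.
def schedule(tasks_mi, pe_mips):
    """tasks_mi: task lengths in million instructions;
    pe_mips: processing capacity of each PE in MIPS."""
    ready = [0.0] * len(pe_mips)        # time at which each PE becomes free
    plan = []
    for length in sorted(tasks_mi, reverse=True):   # longest tasks first
        # estimated finish time of this task on each PE
        finish = [ready[i] + length / pe_mips[i] for i in range(len(pe_mips))]
        best = min(range(len(pe_mips)), key=finish.__getitem__)
        ready[best] = finish[best]
        plan.append((length, best))
    return plan, max(ready)             # assignment plan and overall makespan

plan, makespan = schedule([400, 100, 250, 300], [1000, 500])
print(plan, makespan)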

“Cloud Computing: Towards Risk Assessment”

Cloud Computing is a revolutionary trend that not only minimizes processing cost but also enhances the Return on Investment (ROI), although several risks still challenge the paradigm. To ensure the confidentiality, integrity and availability of crucial data in the Cloud, policies and processes must be created to address this expanded reliance on these extended models. Although SLAs (Service Level Agreements) and NDAs (Non-Disclosure Agreements) exist, "it’s not enough for everybody. Some people do want to go deeper." There are many questions which are still unanswered, for instance regulatory compliance, the location of data centers, and their physical and network security. Although Cloud Computing is an excellent outsourcing idea, many believe that it also presents a long list of legal and other security concerns. In this paper we focus on assessing the various risks present at different layers of the Cloud architecture, their potential consequences, and plausible remedial actions.

Bharat Chhabra, Bhawna Taneja

Efficient Grid Scheduling with Clouds

An efficient technique for scheduling in grids is explored in this paper and is further extended with clouds. Here, we consider bandwidth availability while selecting resources for job scheduling. Thus, this strategy selects a resource in such a manner that, along with computational capability, the ability of the resource to quickly respond to a task is also taken into account by means of the available bandwidth. This is further extended with the cloud in order to tackle the non-availability of resources in a grid environment. Thus, if peak demand arises in a grid environment, we instantiate an on-demand cloud resource customized to meet the grid user requirements. The response time, and thus the total completion time, for a job is lowered as the waiting time of jobs is reduced, which is evident from the experimental results.

L. Yamini, G. LathaSelvi, Saswati Mukherjee

Security Concerns in Cloud Computing

Since its inception, the IT industry has experienced a variety of natural evolution points, most marked by rapid change followed by years of internalization and consumption. According to most observers, the industry is rapidly evolving toward services as a core component of how consumers and business users interact with both software and one another. The hype is deafening in places, and the key to success is recognizing that “cloud” adoption does not represent an all-or-nothing proposition. Organizations that use cloud computing as a service infrastructure would like to critically examine the security and confidentiality issues for their business-critical, sensitive applications. Yet guaranteeing the security of corporate data in the cloud is difficult, if not impossible, as providers offer different services like SaaS, PaaS and IaaS, and each service has its own security issues. This paper discusses the security issues, requirements and challenges that cloud service providers face during cloud engineering, and the various deployment models for eliminating the security concerns.

Puneet Jai Kaur, Sakshi Kaushal

Cloud Computing – The Future

Cloud services are expected to become the driving force of IT innovation for the foreseeable future. Most companies are trying to be a part of the story either as enablers, vendors or service providers. With this, the cloud market is expected to grow at a phenomenal rate, with both large enterprises and SMEs going for it. Enterprise concerns over security, lock-in, etc. will be overcome by the benefits of the cloud. Large enterprises will prefer going for private or hybrid cloud deployment, while SMEs will prefer public clouds.

Vinay Chawla, Prenul Sogani

Cloud Computing: A Need for a Regulatory Body

There has been a massive rise in spending on cloud technologies over the past two years, and every IT setup is expanding its horizons into cloud services and related technologies. However, the absence of a regulatory body, along with issues such as data protection, has kept many companies out of the cloud, particularly those engaged in Financial Services, Health Care Services or Secret or Government Services, where data leakage or protection cannot be compromised. In this paper, we initially review the generic cloud computing term and its types and services; later, we assert the need for a regulatory body and propose a model for this body, which looks after various aspects like protocols, security, data interactions, etc.

Bikramjit Singh, Rizul Khanna, Dheeraj Gujral

Clustering Dynamic Class Coupling Data to Measure Class Reusability Pattern

Identification of reusable components during the process of software development is an essential activity. Data mining techniques can be applied to identify sets of software components having dependence amongst each other. In this paper an attempt has been made to identify the groups of classes having dependence amongst each other existing in the same repository. We explore a document clustering technique based on tf-idf weighting to cluster classes from a vast collection of class coupling data for a particular Java project/program. For this purpose, dynamic analysis of the Java application is first done using UML diagrams to collect class import coupling data. In the second step, the coupling data of each class is treated as a document and represented using the VSM (using TF and IDF). Finally, in the third step, the basic K-means clustering technique is applied to find clusters of classes. Each cluster is then ranked for its goodness.

Anshu Parashar, Jitender Kumar Chhabra
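
A minimal sketch of the tf-idf-plus-K-means pipeline described in the abstract above, using scikit-learn; the class names and coupling documents are hypothetical, and the dynamic UML-based data collection step is omitted.

```python
# Illustrative sketch (not the paper's exact pipeline): each class's import
# couplings are treated as a "document", vectorized with tf-idf, and grouped
# with basic K-means.
from sklearn.cluster import KMeans
from sklearn.feature_extraction.text import TfidfVectorizer

# one pseudo-document per class: the names of the classes it imports/uses
coupling_docs = {
    "OrderService":  "Order OrderDao Logger",
    "OrderDao":      "Order Connection Logger",
    "ReportView":    "Chart Renderer Logger",
    "ChartRenderer": "Chart Renderer Canvas",
}
vec = TfidfVectorizer()                 # VSM representation with TF and IDF
X = vec.fit_transform(coupling_docs.values())

km = KMeans(n_clusters=2, n_init=10, random_state=0).fit(X)
for name, label in zip(coupling_docs, km.labels_):
    print(name, "-> cluster", label)
```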

Cloud Computing in Education: Make India Better with the Emerging Trends

The objective of this paper is to study the impact of cloud computing on modern education. Further, the study also attempts to answer whether the services of cloud computing are significant in the education sector. Education institutions are under increasing pressure to deliver more for less, and they need to find ways to offer rich, affordable services and tools. Both public and private institutions can use the cloud to deliver better services, even as they work with fewer resources. By sharing IT services in the cloud, an educational institution can outsource noncore services and better concentrate on offering students, teachers, faculty, and staff the essential tools to help them succeed.

Sunita Manro, Jagmohan Singh, Rajan Manro

Enhancing Grid Resource Scheduling Algorithms for Cloud Environments

Cloud computing is the latest evolution in the distributed computing paradigm and is being widely adopted by enterprises and organizations. The inherent benefits like instant scalability, pay per use, rapid elasticity, cost effectiveness, self-manageable service delivery and broader network access make cloud computing ‘the preferred platform’ for deploying applications and services. However, the technology is in a nascent stage and needs to be proven. The biggest challenge confronting service providers is effective provisioning and scheduling of cloud services to consumers while leveraging the cost benefits of this computing paradigm. This paper attempts to investigate the key concerns for cloud resource management and explores possible alternatives that can be adapted from existing Grid technology.

Pankaj Deep Kaur, Inderveer Chana

Development of Efficient Artificial Neural Network and Statistical Models for Forecasting Shelf Life of Cow Milk Khoa – A Comparative Study

Khoa is a very popular milk product used to make a variety of sweets in India. Khoa is made by thickening milk and heating it in an open iron pan. In this study, feedforward Backpropagation Neural Network (BPNN), Radial Basis Function Neural Network (RBFNN) and Multiple Linear Regression (MLR) models have been developed to predict the shelf life of cow milk khoa stored at 37 °C. Five input parameters, viz., moisture, titratable acidity, free fatty acids, tyrosine and peroxide value, are considered to predict the sensory score. The dataset comprised 48 observations. The accuracy of these models was judged with percent Root Mean Square Error (%RMSE). The BPNN model with the Bayesian regularization algorithm provided stable and consistent results. The residual shelf life of khoa was also computed using regression equations based on sensory scores. The BPNN model exhibited the best fit (%RMSE, 4.38) followed by the MLR model (%RMSE, 9.27) and the RBFNN model (%RMSE, 10.84).

Sumit Goyal, A. K. Sharma, R. K. Sharma
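
The %RMSE accuracy measure cited above can be computed as the root mean of squared relative errors, expressed as a percentage. A minimal sketch, assuming that definition; the observed and predicted sensory scores below are made-up values, not the paper's dataset.

```python
import math

# Percent root mean square error between observed and predicted values;
# assumes the relative-error form of %RMSE common in shelf-life modeling.
def percent_rmse(observed, predicted):
    n = len(observed)
    rel_sq = [((o - p) / o) ** 2 for o, p in zip(observed, predicted)]
    return math.sqrt(sum(rel_sq) / n) * 100

print(percent_rmse([8.0, 7.5, 6.9], [7.8, 7.6, 6.5]))  # illustrative scores
```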

QoS for Grid Systems

QoS is important to the adoption of Grid technologies. Grid computing makes it possible for users to participate in distributed applications requiring data to be stored and delivered in a timely manner. Users may wish to have control over Quality of Service (QoS) so that data is transferred on time in a distributed environment. Large-scale Grids are composed of a huge number of components from different sites, and this requires efficient workflow management and QoS. All the important components of this framework, its integrated services, and how workflows are managed with QoS are covered in this paper.

Vandana, Tamanna Sehgal

Creating Information Advantage in Cloudy World

To create knowledge from data management in this cloudy world, we must have consistent, available and scalable data management systems capable of serving billions of bytes of data to individual users as well as large internet enterprises. One of the main issues everyone faces is the complication of data security, in spite of the different tools and security services provided. The security of cloud computing services is a contentious issue which may be delaying their adoption, and security depends on the methods adopted for data management. In this paper we analyze the design choices and recommended approaches that allow modern data management systems to achieve these goals as compared to traditional databases.

Chahar Ravita, Mangla Vikram

Theme - 2: High Performance Architecture

Design of CMOS Energy Efficient Single Bit Full Adders

Here, three new low power single bit full adders using 9 and 10 transistors are presented. The proposed adders have the advantage of low power consumption with small area requirements due to the lower number of transistors. The low power objective has been achieved at circuit level by designing the adders with optimized XOR gates and a multiplexer approach. Direct paths between supply voltage and ground have been minimized in these designs. The circuits have been simulated in 0.18 μm CMOS technology using SPICE. The first adder shows power dissipation of 23.8595 pW with a maximum output delay of 67.5566 fs at a supply voltage of 1.8 V. The second adder shows power dissipation of 43.1258 pW with a maximum output delay of 58.9935 fs. The third adder shows power dissipation of 33.5163 pW with a delay of 62.065 fs. Further, simulations have been carried out with different supply voltages [1.8–3.3] V. The power consumption of the proposed full adders has been compared with earlier reported circuits, and the proposed circuits show better results.

Manoj Kumar, Sujata Pandey, Sandeep K. Arya

Exploring Associative Classification Technique Using Weighted Utility Association Rules for Predictive Analytics

Association rule discovery determines the “inter-dependence” among various items in a transactional database. Data mining researchers have improved the quality of association rule discovery for business development by integrating influential factors like the quantity of items sold (weight) and profit (utility) for extracting association patterns. This paper proposes a new model (associative classifier) based on weightage and utility for useful mining of substantial class association rules. In the process of predicting class labels, not all attributes have the same importance, so our framework considers the different frequencies of individual items as their weights, and varied significance can be assigned to different attributes as their utilities according to their predictive capability. Initially, the proposed framework uses the CBA-RG algorithm to produce a set of class association rules from a database and exploits the downward closure property of the Apriori algorithm. Subsequently, the set of mined class association rules is subjected to weightage and utility constraints like W-gain and U-gain, and a combined Utility Weighted Score (UW-Score) is calculated for the mining of class association rules. We propose a theoretical model for a new associative classifier that takes advantage of valuable class association rules based on the UW-Score.

Mamta Punjabi, Vineet Kushwaha, Rashmi Ranjan

Bio-enable Security for Operating System by Customizing Gina

Security is a core part of computer systems and the applications based on them. The Gina DLL can be treated as the heart of security for the Windows operating system, and users can customize it to secure the operating system. This paper briefly summarizes the customization of the Gina DLL to provide password and fingerprint security, with biometrics as the main tool. The principles behind biometrics are common and used in everyday life. A Hamster device is connected to the system for fingerprint recognition, and security is provided at the startup stage of the operating system by customizing the Gina DLL.

Swapnaja A. Ubale, S. S. Apte

A Destination Capability Aware Dynamic Load Balancing Algorithm for Heterogeneous Environment

The complexity of both data-intensive and computation-intensive problems has increased in the computational world. These problems require sophisticated mathematical and statistical techniques as well as knowledge creation, and involve complex computations performed on huge amounts of data. A number of large computational problems now work in terms of petabytes of data rather than the limited volumes of only megabytes or gigabytes of earlier days. In the present scenario, efficient utilization of system resources is necessary to solve large computational problems within desirable time constraints. In this paper, we present a dynamic load balancing algorithm for a cluster created using heterogeneous commodity hardware. The algorithm uses system resources efficiently and effectively depending on the nature of the applications. Using this algorithm we observed an improvement in the execution time of parallel applications when tasks are processed on favorable nodes instead of randomly assigned nodes.

Sharma Rajkumar, Kanungo Priyesh, Chandwani Manohar

Reliable Mobile Agent in Multi – Region Environment with Fault Tolerance for E-Service Applications

Mobile agent technology is emerging fast on the web today and can be applied to developing distributed applications like information retrieval. Mobile agents travel through servers to perform their tasks. Ensuring the availability of mobile agents in the presence of agent server failures during their travel is a challenging issue. Communication among mobile agents in a multi-region environment, while they are on the move, makes the issue more complex. Considering this scenario, we provide a reliable mobile agent model with fault tolerance which ensures that a mobile agent, and the information it has collected, is available even in the case of server failure. This model also works for the complete itinerary failure of a region. The model is experimented with on the Internet using IBM Aglets, a tool for mobile agents. The experimental results are encouraging and support the claim that this mobile agent model is more reliable.

M. Vigilson Prem, S. Swamynathan

Composition of Composite Semantic Web Services Using Abductive Event Calculus

Web Service composition is necessary when a single Web Service cannot satisfy complex functional requirements. One of the key challenges in composite semantic Web Services is the composition of their atomic processes. In this work a novel architecture is proposed for the composition of composite semantic web services. An algorithm is used for service discovery which performs a fine-grained match at the level of the atomic process, rather than at the level of the entire service. The architecture takes advantage of abductive event calculus, using an abductive theorem prover to generate a plan for the composition of the atomic services.

D. Paulraj, S. Swamynathan

Ant Colony Optimization Based Congestion Control Algorithm for MPLS Network

Multi-Protocol Label Switching (MPLS) is a mechanism in high-performance telecommunications networks which directs and carries data from one network node to the next with the help of labels. MPLS makes it easy to create "virtual links" between distant nodes, and it can encapsulate packets of various network protocols. MPLS is a highly scalable, protocol-agnostic, data-carrying mechanism. Packet-forwarding decisions are made solely on the contents of the label, without the need to examine the packet itself. This allows one to create end-to-end circuits across any type of transport medium, using any protocol. There is high traffic when transmitting data in MPLS networks due to the emerging requirements of MPLS and the associated Internet usage. This paper proposes an Ant Colony Optimization (ACO) technique for traffic management in MPLS networks. ACO is a swarm intelligence methodology which offers highly optimized techniques for dozens of engineering problems. In our proposed work, ACO provides more optimal values than existing algorithms.

S. Rajagopalan, E. R. Naganathan, P. Herbert Raj
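
The core ACO decision rule picks a path with probability proportional to its pheromone level raised to a weight alpha, times a heuristic desirability raised to a weight beta. A minimal sketch of that rule as it might apply to choosing among label-switched paths; the LSP names, pheromone levels, delays and weights are illustrative assumptions, not the paper's values.

```python
import random

# Sketch of the standard ACO selection rule: selection weight combines
# pheromone (tau) and heuristic desirability (eta), here inverse link delay.
def choose_path(paths, tau, eta, alpha=1.0, beta=2.0):
    weights = [tau[p] ** alpha * eta[p] ** beta for p in paths]
    total = sum(weights)
    r, acc = random.random() * total, 0.0   # roulette-wheel selection
    for p, w in zip(paths, weights):
        acc += w
        if r <= acc:
            return p
    return paths[-1]

paths = ["LSP1", "LSP2"]
tau = {"LSP1": 0.6, "LSP2": 0.4}            # pheromone laid by earlier ants
eta = {"LSP1": 1 / 20.0, "LSP2": 1 / 35.0}  # inverse of measured delay (ms)
print(choose_path(paths, tau, eta))
```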

Low Power Optimized Array Multiplier with Reduced Area

Multiplication is a fundamental operation in most arithmetic computing systems. Multipliers are an indispensable part of DSP processing, FFT, convolution and many more areas where computation is required. In this paper an improved, optimized design of a 32-bit unsigned array multiplier with low power and reduced area is proposed. The power dissipation of the optimized multiplier design is reduced by 3.82 percent and by more than 30 percent as compared to multipliers using ripple carry and carry select adders, respectively. The area reduction is achieved mainly by reducing the gate count.

Padma Devi, Gurinder Pal Singh, Balwinder Singh

Simulink Library Development and Implementation for VLSI Testing in Matlab

In ATPG, faults in VLSI circuits are detected with the D-algorithm, PODEM and FAN algorithms. This paper emphasizes presenting the first two algorithms in MATLAB. Implementation of these algorithms for complex VLSI circuits is a very tedious job, so an environment in Simulink is presented here, which is further verified on some benchmark circuits. Simulink provides an environment for intellectual property (IP) building-block based circuit engineering design as well as a project simulation environment. PODEM requires the exact values of controllability and observability; for effective and fast calculation of these measures, Simulink-based designed models are used.

Gurinder Pal Singh, Balwinder Singh

Processing of Image Data Using FPGA-Based MicroBlaze Core

This paper proposes a technique for storing image data in FPGA memory and subsequently processing the stored image data using the MicroBlaze processor core of a Xilinx FPGA device. Though related research exists on processing image data using the DSP blocks available in FPGA devices, very little work exists on processing it using an FPGA-based processor core. This type of design is extremely important for real-time embedded system design for image processing applications. Our work deals with the inversion of a binary image in the FPGA memory and the recovery of the inverted image into its original form for verification of the inversion process. We have used the Xilinx EDK 11.1 tool and a Spartan 3E FPGA kit, and MATLAB is used for pre- and post-processing of the image data.

Swagata Samanta, Soumi Paik, Shreedeep Gangopadhyay, Amlan Chakrabarti

Parametric Analysis of Zone Routing Protocol

A Mobile Ad-hoc Network (MANET) consists of a set of autonomous, self-configuring, decentralized, power-constrained mobile hosts that may communicate with one another from time to time without any base station support. Each host is equipped with a CSMA/CA (carrier sense multiple access with collision avoidance) transceiver. Routing is the process of finding a path from a source to some arbitrary destination on the network. In this paper the Zone Routing Protocol (ZRP), a hybrid routing protocol for MANETs, is evaluated. ZRP is a most promising protocol, widely used in scenarios where nodes are placed in a zone structure. The performance of this protocol is analyzed using the performance metrics of throughput, packet delivery ratio and average end-to-end delay. The total packets routed through the Interzone Routing Protocol (IERP) are also analyzed, using the network simulator QualNet 5.0.2.

Rani Astya, Parma Nand, S. C. Sharma

Vision of 5G Communication

This paper is designed to introduce the reader to fundamental information on future, or next-generation, technology. Fourth-generation systems have been implemented in only a few countries, and 4G remains predominantly in research and development; this paper therefore gives an overview of the next-generation system, i.e., 5G communication. It presents an overview of fifth generation mobile networks with emphasis on current and future trends in the areas of wireless networking, multimedia technology, network architecture, and network services. Related research in the development of future mobile systems is highlighted, beginning with prognoses for 5G communication, an architecture model of wireless communication, and a comparison with all previous generations of the technology.

Mohd. Maroof Siddiqui

Secure Satellite Images Transmission Scheme Based on Chaos and Discrete Wavelet Transform

Many applications based on satellite communication, like national defence and security, rely on satellite images as an important source of information. It is therefore mandatory to secure satellite imagery while transmitting it over communication channels, to protect it from unauthorized access and usage. In this paper, a chaotic logistic map based satellite image encryption scheme is proposed to meet the requirements of secure satellite-based communication. The algorithm is based on the concept of permuting the pixels of the satellite image and then improving the distribution of pixel gray values from a cryptographic viewpoint. The permutation of image pixels is carried out in the discrete wavelet domain, and the relationship between the encrypted and the original satellite image is confused using a chaotic-state modulation technique, thereby significantly increasing the resistance to statistical attacks. Experimental results demonstrate that the scheme has the advantages of high sensitivity to the secret key and a large key space. Moreover, the encrypted satellite images have uniform gray level distributions, high entropies and low correlation coefficients. Hence, the theoretical and experimental analyses confirm that the proposed scheme has high security and can be applied for practical satellite image protection.

Musheer Ahmad, Omar Farooq
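
A minimal sketch of the chaotic-permutation idea underlying the abstract above: iterating the logistic map x_{n+1} = μ·x_n·(1 − x_n) from a secret initial condition, then ranking the resulting sequence to obtain a key-dependent pixel permutation. The DWT-domain processing and chaotic-state modulation steps of the actual scheme are omitted, and all parameter values are illustrative.

```python
# Sketch: a key-dependent permutation derived from the logistic map.
# x0 and mu play the role of the secret key; mu near 4 gives chaotic orbits.
def logistic_permutation(n, x0=0.3456, mu=3.99, burn_in=1000):
    x = x0
    for _ in range(burn_in):        # discard transient iterations
        x = mu * x * (1 - x)
    seq = []
    for _ in range(n):
        x = mu * x * (1 - x)
        seq.append(x)
    # ranking the chaotic values yields a permutation of pixel indices
    return sorted(range(n), key=seq.__getitem__)

perm = logistic_permutation(8)
pixels = [10, 20, 30, 40, 50, 60, 70, 80]   # toy "image" row
shuffled = [pixels[i] for i in perm]
print(perm, shuffled)
```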

Computational Analysis of Availability of Process Industry for High Performance

This paper proposes a methodology to evaluate the availability of a rice plant in a realistic environment. A complex mechanical system consisting of five subsystems is considered. On the failure of a parallel unit of any subsystem, the system remains operative for a short period of time. The paper discusses three states of the system: good state, reduced state and failed state. The problem is formulated using the supplementary variable technique, and Lagrange’s method is used to solve the governing equations. The availability of the system is evaluated, followed by a behavior analysis of the subsystems.

Shakuntla, A. K. Lal, S. S. Bhatia

A Preprocessing Technique for Recognition of Online Handwritten Gurmukhi Numerals

In this paper, a preprocessing technique involving removal of duplicate points, normalization, interpolation of missing points, sharp point detection, hook removal and smoothing is applied for recognition of online handwritten Gurmukhi numerals. The above stages are performed on data collected from different persons. It is observed that our preprocessing technique improves the feature extraction rate by increasing the accuracy of recognition of features such as holes and junctions.

Rajesh Kumar Bawa, Rekha Rani

A Framework for Vulnerability Analysis during Software Maintenance

The need for vulnerability analysis during software maintenance has been highly stressed by many vulnerability response experts. An analysis of why and how a vulnerability happened is crucial for developing appropriate countermeasures to prevent its recurrence. In this paper, we present a framework for vulnerability analysis to be applied during software maintenance. The framework helps in better and more efficient cause detection, identification of the reasons for breaches, and development of countermeasures for already existing as well as new vulnerabilities.

Jitender Kumar Chhabra, Amarjeet Prajapati

Performance Optimization for Logs of Servers

The performance of any entity is said to be high if it does more units of work in a smaller amount of time, and tuning can make a significant change in the efficiency of a system. Servers are an important consideration these days, with their potential to serve a greater number of clients at different levels of architectural tiers. Hence, there is a need to enhance the performance of server logs so that errors and related administrative tasks can be known more accurately and updated to the required levels, minimizing the impact on performance. A number of parameters influence the performance of these servers; their automated optimization results in greater help to analysts and allows the servers to persistently provide services to clients.

M. Vedaprakash, Ramakrishna Alavala, Veena R. Desai

Ontology Based Information Retrieval for Learning Styles of Autistic People

In this paper an ontology-based prototype system for information retrieval on the Internet is described. A user is interested in focused results regarding a product with some specific characteristics; a product may have different characteristics like size, length, color, functionality-based parameters, etc. It is, however, difficult for autistic people to identify appropriate keywords due to their lack of ability to process and retain information, so a large amount of unwanted and irrelevant data is included in the outcome. In this proposal the user may type search queries using some words. The objective is to find the right set of keywords from the search paragraph and retrieve the correct patterns or products from the web. This is based on the memories of such people and the learning styles that help them find the desired result.

Sanchika Gupta, Deepak Garg

Analyze the Performance of New Edge Web Application’s over N-Tiers Layer Architecture

This paper is an empirical case study to predict or estimate the performance and variability of similar software frameworks used for web application development. First, we explore and analyze the PHP and ASP.NET web application frameworks considering quality attributes. Second, we develop the same web application, Online Book’s Mart (a web application to purchase books online), in both PHP and ASP.NET. Finally, we conduct automated testing to determine and analyze the applications’ performance. The software architecture, CSS, database design and database constraints were kept simple and the same for both applications, i.e., the applications developed in PHP and ASP.NET. This similarity helps establish a realistic comparison of the applications’ performance and variability, which is measured with the help of automated scripts.

Pushpendra Kumar Singh, Prabhakar Gupta, S. S. Bedi, Krishna Singh

Self-Configurable Scheduling Algorithm for Heterogeneous Computing Systems

Research in real-time task scheduling algorithms is a challenging problem in high performance computing systems; in particular, achieving the mapping of tasks to processors is a key design issue in heterogeneous computing environments. There are many existing scheduling algorithms covered in the literature, but none of them maps a specific task to its corresponding processor. In this paper, we build a new scheduler model that takes timing requirement constraints, (specific processor, specific task) pairs, and load balancing into account. The paper also addresses the percentage of tasks missing their deadlines due to the system running out of resources: when the scheduler makes use of a cloud of resources, the missed tasks can be scheduled properly. The new scheduler combines all types of real-time and non-real-time tasks and schedules them. Hereafter, we call this algorithm the Self-Configurable Scheduling (SCS) algorithm. The algorithm automatically adjusts jobs among processing elements, with a feedback loop from each processing element to the load balancer. We build a simulation that can estimate shortages of resources in order to add resources and also allocate missed tasks to the cloud of resources.

A. Prashanth Rao, A. Govardhan

Performance Analysis of Proposed MAES Cryptographic Techniques

Cryptography is an emerging technology which is important for network security. Research on cryptography is still in its developing stages, and a considerable research effort is still required for secured communication. This paper is devoted to the security and attack aspects of cryptographic techniques. Simulation-based information content tests such as Entropy, Floating Frequency, Histogram, N-gram, Autocorrelation and Periodicity are performed on the ciphers. Simulation-based randomness tests such as the Frequency test, Poker test, Serial test and Long run test are performed on the ciphers using CrypTool. Finally, we benchmark the proposed MAES cryptographic algorithm in search of the best compromise in security.

Richa Kalra, Ankur Singhal, Rajneesh Kaler, Promila Singhal
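
One of the information content tests listed above, Shannon entropy, can be sketched as follows: a strong cipher's output should approach 8 bits per byte, whereas highly regular data scores far lower. The sample inputs are illustrative.

```python
import math
import os
from collections import Counter

# Shannon entropy (bits per byte) of a byte stream: -sum(p * log2(p))
# over the observed byte frequencies.
def entropy(data: bytes) -> float:
    counts = Counter(data)
    n = len(data)
    return -sum(c / n * math.log2(c / n) for c in counts.values())

print(entropy(os.urandom(4096)))   # random bytes: close to 8.0
print(entropy(b"AAAABBBB"))        # two symbols, equally likely: exactly 1.0
```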

Analysis of Core-Level Scale-Out Efficiency for OpenMP Programs on Multi-core Processors

The majority of existing OpenMP compilers select the maximum number of available processing cores on a multi-core machine at runtime to execute a parallelized program on that machine. In this paper, we show that the use of the maximum number of available cores does not necessarily result in speedup or efficiency. We show that in a considerable number of cases the use of more cores results in diminishing returns on execution time and efficiency. To help in choosing the proper number of cores, we propose an analytical method to estimate the execution times of OpenMP programs using different numbers of cores while considering the synchronization, excess computation and load imbalance overheads caused by the chosen number of cores. We validate our proposed method through a case study covering the most recurrent and important structures and constructs of OpenMP.

Sareh Doroodian, Nima Ghaemian, Mohsen Sharifi

SQLIVD - AOP: Preventing SQL Injection Vulnerabilities Using Aspect Oriented Programming through Web Services

Security remains a major threat to the entire Web for many kinds of transactions. Most threats are created through application-level vulnerabilities and have been exploited with serious consequences. Among the various types of application-level vulnerabilities, command injection is the most common type of threat in web applications, and among command injection attacks, SQL injection attacks are extremely prevalent, ranked as the second most common form of attack on the web. SQL injection attacks involve the construction of an application’s input data that results in the execution of malicious SQL statements. Hence, this paper (SQLIVD-AOP) proposes a mechanism to intercept SQL statements, without any modification of the application, using Aspect Oriented Programming; to analyze each query for its legitimacy; and to customize the errors. This mechanism differs from others in its query interception and its separation of the main scripting code from the SQL injection handling code. The SQL validation and injection detection code is implemented by means of web services.

V. Shanmughaneethi, Ra. Yagna Pravin, C. Emilin Shyni, S. Swamynathan

Analysis and Study of Incremental K-Means Clustering Algorithm

This paper studies the incremental behaviour of partitioning-based K-means clustering. The incremental clustering is designed using the cluster metadata captured from the K-means results. Experimental studies show that the actual K-means clustering performs better when the number of clusters increases, the number of objects increases, or the length of the cluster radius decreases, while the incremental clustering performs better when new data objects are inserted into the existing database. In the incremental approach, the K-means clustering algorithm is applied to a dynamic database where the data may be frequently updated; the approach computes the new cluster centers directly from the new data and the means of the existing clusters instead of rerunning the K-means algorithm. The paper thus describes up to what percentage of delta change in the original database incremental K-means clustering behaves better than actual K-means. The approach can also be used for large multidimensional datasets.

Sanjay Chakraborty, N. K. Nagwani
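
A minimal sketch of the incremental idea described above: keep each cluster's mean and size as metadata, then fold new points into the nearest existing center with a running-mean update instead of re-running K-means on the whole database. The centers and points are made-up values.

```python
# Sketch: incremental center update using stored cluster metadata
# (mean and size), avoiding a full re-clustering of the old database.
def incremental_update(centers, sizes, new_points):
    for p in new_points:
        # nearest existing center by squared Euclidean distance
        j = min(range(len(centers)),
                key=lambda i: sum((a - b) ** 2 for a, b in zip(centers[i], p)))
        n = sizes[j]
        # running-mean update: new_mean = (old_mean * n + p) / (n + 1)
        centers[j] = tuple((c * n + x) / (n + 1) for c, x in zip(centers[j], p))
        sizes[j] = n + 1
    return centers, sizes

centers, sizes = [(1.0, 1.0), (8.0, 8.0)], [10, 10]   # metadata from K-means
print(incremental_update(centers, sizes, [(2.0, 2.0), (9.0, 7.0)]))
```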

Computational Model for Prokaryotic and Eukaryotic Gene Prediction

In this paper we design a computational model for prokaryotic and eukaryotic gene prediction using a clustering algorithm. The input DNA (deoxyribonucleic acid) sequence is spliced and the open reading frames are identified. For identification of consensus sequences, various data mining algorithms are applied for the creation of clusters. This model saves implementation time: as the whole database is available online, the sequence to be predicted is simply taken from any one of the available databases. Several experiments have been done in which the parameters of gene prediction are changed manually. The performance has been tested on different unknown DNA sequences found on the internet. The sequences having a score greater than or equal to the threshold value are entered into one cluster, the rest of the sequences having a score less than the given threshold are entered into a second cluster, and the GC (guanine and cytosine) content percentage is calculated.

Sandeep Kaur, Anu Sheetal, Preetkanwal Singh
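
The GC-content percentage mentioned above, together with a threshold-based split into two clusters, can be sketched as follows; the sequences and the 50% threshold are illustrative, not the paper's values.

```python
# GC content: percentage of guanine (G) and cytosine (C) bases in a sequence.
def gc_content(seq: str) -> float:
    seq = seq.upper()
    return 100.0 * sum(seq.count(b) for b in "GC") / len(seq)

# Threshold-based clustering in the spirit of the abstract: sequences scoring
# at or above the threshold go into one cluster, the rest into a second.
def threshold_cluster(seqs, threshold=50.0):
    high = [s for s in seqs if gc_content(s) >= threshold]
    low = [s for s in seqs if gc_content(s) < threshold]
    return high, low

seqs = ["ATGCGCGCTA", "ATATATATGC", "GGCCGGCCAT"]
for s in seqs:
    print(s, f"{gc_content(s):.1f}% GC")
print(threshold_cluster(seqs))
```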

Detection of Malicious Node in Ad Hoc Networks Using Statistical Technique Based on CPU Utilization

We propose a strategy based on a statistical value provided by each node of the network for detecting malicious activity, comparing a node’s present characteristic value with the old estimated value. If the difference between the two values is higher than the expected value, that particular node becomes suspicious, and a knowledge-based system can take the decision to expel the malicious node from the network topology.

Deepak Sharma, Deepak Prashar, Dalwinder Singh Salaria, G. Geetha
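
A minimal sketch of the statistical comparison described above, assuming an exponentially weighted estimate of each node's CPU utilization plays the role of the "old estimated value"; the smoothing factor and tolerance are illustrative assumptions, not the paper's parameters.

```python
# Sketch: flag a node as suspicious when its new CPU-utilization sample
# deviates from the running estimate by more than a tolerance, then fold
# the sample into the estimate (exponentially weighted moving average).
def update_and_check(estimate, sample, alpha=0.2, tolerance=15.0):
    suspicious = abs(sample - estimate) > tolerance
    new_estimate = (1 - alpha) * estimate + alpha * sample
    return new_estimate, suspicious

est = 20.0                          # learned "normal" CPU utilization (%)
for sample in [22.0, 19.0, 55.0]:   # a sudden jump suggests malicious work
    est, flag = update_and_check(est, sample)
    print(f"sample={sample:5.1f}  estimate={est:5.2f}  suspicious={flag}")
```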

Optimum Controller for Automatic Generation Control

This paper deals with automatic generation control of an area consisting of multiple generating sources, i.e., hydro, thermal and gas. A one percent load perturbation is given to each area, considering combinations of thermal, thermal-hydro and thermal-hydro-gas generating stations, and the response of the system frequency is analyzed. An accurate transfer function model is first required to analyze the system. To investigate the system’s dynamic performance, an optimal control design is implemented in the wake of the 1% step load disturbance.

Rahul Agnihotri, Gursewak Singh Brar, Raju Sharma

Abstraction of Design Information from Procedural Program

In the past two decades there has been continuous change in software development. Organizations use different programming languages for developing different software applications. The applications developed earlier were based on procedural programming languages like ‘C’, FORTRAN, COBOL, etc. The applications being developed now may be based on object-oriented languages, procedural languages or a mix of both. In order to understand how an information system is designed, one may need to understand the behavior of the program, and this behavior can be understood with the help of design information. This design information about the application program can be abstracted in the form of a data flow diagram.

In this paper we propose a methodology to abstract the behavior of the program and then represent this behavior in the form of a data flow diagram through a series of steps.

R. N. Kulkarni, T. Aruna, N. Amrutha

Design of an Intelligent and Adaptive Mapping Mechanism for Multiagent Interface

The main intent of this work is to propose an intelligent interface that facilitates agent interaction in homogeneous as well as heterogeneous ontologies. The literature survey indicates that existing mapping mechanisms serve well for homogeneous domains, but very few researchers have attempted to propose a mapping interface for heterogeneous domains that possesses learning abilities and is therefore adaptive by nature. This work uniquely contributes towards the future vision of an intelligent and adaptive mapping mechanism that not only overcomes the shortcomings of existing mapping mechanisms but is also time efficient. The performance of the proposed strategy has been evaluated and compared with existing strategies in the related fields, and the results are found to be competitive.

Aarti Singh, Dimple Juneja, A. K. Sharma

Autonomous Robot Motion Control Using Fuzzy PID Controller

The roles of autonomous robots are increasing in different aspects of engineering and everyday life. This paper describes an autonomous robot motion control system based on a fuzzy logic Proportional Integral Derivative (PID) controller. Fuzzy rules are embedded in the controller to tune the gain parameters of the PID and make them useful in real-time applications. The paper discusses the design aspects of a fuzzy PID controller for a mobile robot that decreases rise time, removes steady-state error quickly and avoids overshoot. The performance of the robot design has been verified with rule-based evaluation using Matlab, and the results obtained have been found to be robust. Overall, the performance criteria in terms of rise time, steady-state error and overshoot have been found to be good.

Vaishali Sood
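
A minimal sketch of a fuzzy-tuned PID loop in the spirit of the abstract above: a crude two-rule fuzzy inference scales the proportional gain with the error magnitude before each PID update. The gains, membership breakpoints and error samples are illustrative assumptions, not the paper's design.

```python
# Sketch: fuzzy gain scheduling ("small error -> gentler action,
# large error -> stronger action") feeding a standard PID update.
def fuzzy_gain(error, kp_base=1.0):
    e = abs(error)
    small = max(0.0, 1 - e / 0.5)   # membership: error is "small"
    large = min(1.0, e / 0.5)       # membership: error is "large"
    # weighted average of rule outputs: 0.6*kp for small, 1.5*kp for large
    return kp_base * (0.6 * small + 1.5 * large) / (small + large)

def pid_step(error, state, kp, ki=0.1, kd=0.05, dt=0.01):
    integral, prev_error = state
    integral += error * dt
    derivative = (error - prev_error) / dt
    u = kp * error + ki * integral + kd * derivative
    return u, (integral, error)

state = (0.0, 0.0)
for error in [0.8, 0.4, 0.1]:       # shrinking heading error toward target
    u, state = pid_step(error, state, kp=fuzzy_gain(error))
    print(f"error={error:.2f}  control={u:.3f}")
```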

A Multiresolution Technique to Despeckle Ultrasound Images

Ultrasonography is a very prevalent technique for imaging soft tissue structures and organs of the human body. But when an ultrasound image is captured it gets noisy, and this added noise, known as speckle noise, hinders the diagnostic process of radiologists and doctors. In this paper a method to remove speckle noise from ultrasound images is proposed. Many methods have been proposed in the spatial, frequency and wavelet domains. Here a new thresholding method in the wavelet domain is proposed which takes into account the statistical properties of the image using a weighted window. The performance of the proposed algorithm is compared with conventional methods based on Peak Signal to Noise Ratio (PSNR) and Mean Square Error (MSE). Results show that the proposed algorithm performs better than conventional methods.

Parvinder Kaur, Baljit Singh
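
The MSE and PSNR quality measures used above can be sketched as follows for 8-bit images flattened to lists of pixel values; the toy image data is illustrative.

```python
import math

# Mean Square Error between a reference image and a processed image.
def mse(original, processed):
    n = len(original)
    return sum((a - b) ** 2 for a, b in zip(original, processed)) / n

# Peak Signal to Noise Ratio in dB; higher PSNR means the denoised image
# is closer to the original (peak = 255 for 8-bit images).
def psnr(original, processed, peak=255.0):
    m = mse(original, processed)
    return float("inf") if m == 0 else 10 * math.log10(peak ** 2 / m)

clean = [100, 120, 130, 90]          # toy flattened image
denoised = [102, 118, 131, 92]
print(mse(clean, denoised), psnr(clean, denoised))
```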

Theme - 3: Information Management and Network Security

Design and Analysis of the Gateway Discovery Approaches in MANET

The demand for anytime, anywhere connectivity has increased rapidly with the tremendous growth of the Internet in the past decade and the huge influx of highly portable devices such as laptops, PDAs, etc. In order to provide users with the huge pool of resources and global services available from the Internet, and to widen the coverage area of the MANET, there is a growing need to integrate ad hoc networks with the Internet. Due to the differences in protocol architecture between MANETs and the Internet, we need gateways which act as bridges between them. Gateway discovery in a hybrid network is considered a critical and challenging task, and with decreasing pause time and a greater number of sources it becomes even more complex. Due to the scarcity of network resources in MANETs, efficient discovery of the gateway becomes a key issue in the design and development of future hybrid networks. In this paper we describe the design and implementation of various gateway discovery approaches and carry out a systematic simulation-based performance study of these approaches using NS2 under different network scenarios. The performance analysis has been done on the basis of three metrics: packet delivery fraction, average end-to-end delay and normalized routing load.

Koushik Majumder, Sudhabindu Ray, Subir Kumar Sarkar

Wireless Sensor Network Security Research and Challenges: A Backdrop

If sensor networks are to attain their potential, security is one of the most important aspects to be taken care of. The need for security in military applications is obvious, but even more benign uses, such as home health monitoring, habitat monitoring and sub-surface exploration, require confidentiality. WSNs are perfect for detecting environmental, biological or chemical threats over large-scale areas, but maliciously induced false alarms could completely negate the value of the system. The widespread deployment of sensor networks is directly related to their security strength. These facts form the basis for this survey paper, which presents a brief overview of the challenges in designing a security mechanism for WSNs, classifies different types of attacks, and lists available protocols, while laying out an outline for the proposed work.

Dimple Juneja, Atul Sharma, A. K. Sharma

Automated Test Case Generation for Object Oriented Systems Using UML Object Diagrams

To reduce the effort in identifying adequate test cases and to improve the effectiveness of the testing process, a graph-based method has been suggested to automate test case generation for Unified Modeling Language object diagrams. The system files produced in the modeling exercise have been used to list all possible valid and invalid test cases required to validate the software. The diagrams are treated as graphs to generate the test cases. The effectiveness of the test cases has been evaluated using mutation testing.

M. Prasanna, K. R. Chandran

Dead State Recovery Based Power Optimization Routing Protocol for MANETs (DSPO)

Mobile ad hoc networks are sets of small, low-cost, low-power sensing devices with wireless communication capabilities. The energy concerned includes the receiver’s processing energy, the transmitter’s energy requirement for transmission, and losses in the form of heat from the transmitter devices. All nodes in the network are mobile, and for measuring the efficiency at a particular instant, the nodes are considered to be communicating in half-duplex mode. In this paper, we introduce the DSPO algorithm, an automated recovery based power awareness algorithm that deals with the self-recovery of nodes upon recognition of a dead state, thus preventing the network model from going into a state of congestion and overheads. DSPO is an enhanced form of the AODV protocol that has the ability of self-recovery regarding the security issues of the network structure. The simulations are performed using the NS2 simulator [11], and the results obtained show that considering the energy, bandwidth and mobility factors enhances the performance of the network model and thus increases the throughput of ad hoc networks by increasing the life of the nodal structure.

Tanu Preet Singh, Manmeet Kaur, Vishal Sharma

On the Potential of Ricart-Agrawala Algorithm in Mobile Computing Environments

The Ricart-Agrawala protocol [1] is one of the classical solutions to the mutual exclusion problem. Although the protocol was invented essentially for failure-free static distributed systems, it has been adapted by various researchers for almost all changing computing paradigms, from classical to contemporary. The purpose of this article is to highlight the strength of the concept used in the Ricart-Agrawala protocol.

Bharti Sharma, Rabinder Singh Bhatia, Awadhesh Kumar Singh

Analysis of Digital Forensic Tools and Investigation Process

The popularity of the Internet has changed not only our view of life but also the view of crime in our society and all over the world. The day-by-day increase in the number of computer crimes is the reason for forensic investigation. Digital forensics is used to bring to justice those persons who are responsible for computer or digital crimes. In this paper, we explain both types of forensic tools, commercial as well as open source, and make comparisons between them. We also classify digital forensics and digital crimes according to their investigation procedures.

We also propose a model for the investigation process for any type of digital crime. This model is simple, gives efficient results for any type of digital crime, and provides a better way to reduce the time needed for investigation.

Seema Yadav, Khaleel Ahmad, Jayant Shekhar

Evaluation of Normalized Routing Load for MANET

A Mobile Ad hoc Network (MANET) is a collection of wireless mobile nodes forming a temporary network without any pre-existing network infrastructure. Stable routing over such a network is a very critical task, as the wireless links are highly error-prone and can go down frequently due to the dynamic network topology. In this paper, an evaluation of the prominent on-demand routing protocols DSR and AODV has been done by varying the network size. An effort has been made to evaluate the performance of these protocols using the random waypoint model. The simulator used is NS 2.34. The performance of each protocol has been studied using a self-created network scenario and by analyzing the normalized routing load with respect to pause time. Based on the performance evaluation, recommendations have been made about the significance of the protocols under various circumstances.

Sunil Taneja, Ashwani Kush

Reliability and Performance Based Resource Selection in Grid Environment

Over the last few decades, the development of Internet and Grid technology has rapidly increased the number of resources to which a user, program, or community may have access. When a large number of resources fulfill the minimum criteria imposed by the user, the burden of selecting the best resource falls on the user, and a wrong selection imposes overhead and cost on the user. An efficient and reliable resource selection mechanism is therefore required that relieves the user of this burden and selects the best resource. In this paper, we propose a two-phase approach for efficient resource selection. The aim is to identify the most available, reliable, and fastest resources for running an application. In this context, we introduce an approach for resource selection that maximizes the quality, reliability, and efficiency of the selected resources while minimizing other overheads.

Rajesh Kumar Bawa, Gaurav Sharma

Elliptic Curve Cryptography: Current Status and Research Challenges

Three types of standard public-key cryptographic systems can be considered secure, efficient, and commercially practical: (i) integer factorization systems (e.g. RSA), (ii) discrete logarithm systems (e.g. DSA), and (iii) elliptic curve cryptosystems (ECC). The security of these systems is based on the relative complexity of the underlying mathematical problem. Of these systems, for a given key size, ECC is the most secure public-key cryptosystem.

A survey of various protocols based on ECC has been done in the paper. The protocols are classified according to their use in various cryptographic security mechanisms, i.e. key agreement protocols, digital signatures and encipherment. A comparison of ECC with conventional public-key systems reveals that ECC is best suited for applications such as mobile computing, wireless sensor networks and other devices with constrained resources.
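To make the key-agreement use case concrete, here is a toy elliptic-curve Diffie-Hellman exchange in Python over the small textbook curve y^2 = x^3 + 2x + 2 (mod 17); real deployments use standardized curves such as P-256, and everything below is illustrative only.

    # Toy ECDH over a tiny curve, for illustration only.
    P, A, B = 17, 2, 2          # field prime and curve coefficients
    G = (5, 1)                  # generator of a subgroup of order 19

    def inv(x):                 # modular inverse via Fermat's little theorem
        return pow(x, P - 2, P)

    def add(p, q):
        if p is None: return q  # None represents the point at infinity
        if q is None: return p
        (x1, y1), (x2, y2) = p, q
        if x1 == x2 and (y1 + y2) % P == 0:
            return None
        if p == q:
            m = (3 * x1 * x1 + A) * inv(2 * y1) % P      # tangent slope
        else:
            m = (y2 - y1) * inv((x2 - x1) % P) % P       # chord slope
        x3 = (m * m - x1 - x2) % P
        return (x3, (m * (x1 - x3) - y1) % P)

    def mul(k, p):              # double-and-add scalar multiplication
        r = None
        while k:
            if k & 1: r = add(r, p)
            p, k = add(p, p), k >> 1
        return r

    a, b = 3, 7                         # private scalars of the two parties
    shared_a = mul(a, mul(b, G))        # Alice's view of the shared point
    shared_b = mul(b, mul(a, G))        # Bob's view; equal by commutativity
    assert shared_a == shared_b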

Sheetal Kalra, Sandeep K. Sood

SBFDR: Sector Based Fault Detection and Recovery in Wireless Sensor Networks

Sensor networks are usually large collections of sensing nodes that collect data from a monitored environment and transmit it to a base station by multi-hop wireless communication. The occurrence of faults in wireless sensor networks is very high due to wireless communication and the random deployment policy. Energy conservation is another challenge in improving the applicability of wireless sensor networks. In this paper, we propose a sector-based fault detection and recovery technique (SBFDR) that is also energy efficient. In SBFDR, sensor nodes are arranged into clusters. The cluster head and the sensor nodes jointly detect sensor node faults, which are then recovered by the cluster head's fault recovery policy. The simulation results show that the SBFDR technique detects sensor node faults and recovers faulty nodes in an energy-efficient manner; energy loss and fault recovery time are very low compared with other popular fault detection and recovery techniques.

Indrajit Banerjee, Prasenjit Chanak, Hafizur Rahaman

Study and Analysis of Incremental Apriori Algorithm

This paper studies the threshold value of database change up to which an incremental Apriori algorithm performs better. A new incremental Apriori algorithm is also proposed that outperforms the existing algorithm in terms of computation time. The performance of frequent-set generation algorithms on dynamic databases is a major problem, since a number of runs are required to accommodate database changes. The study determines the percentage change to the original database that decides whether the user should re-run the full algorithm or reuse the previously computed results and generate the frequent sets incrementally. The purpose of this paper is twofold: first, to avoid scans of the older database and the corresponding support-count effort for newly added records by using intermediate data and results; and second, to solve the problem of efficiently updating association rules after a nontrivial number of new records have been added to a database.
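As a rough illustration of the incremental idea, the sketch below merges stored support counts with counts taken over the increment only; it is a simplification under stated assumptions, not the authors' algorithm, and in particular itemsets that were pruned as infrequent from the stored counts would still require a rescan of the old database.

    # Minimal sketch: update k-itemset supports without rescanning the old DB.
    from itertools import combinations

    def count_itemsets(transactions, k):
        """Support counts of all k-itemsets in a list of transactions."""
        counts = {}
        for t in transactions:
            for items in combinations(sorted(t), k):
                counts[items] = counts.get(items, 0) + 1
        return counts

    def incremental_frequent(old_counts, old_n, new_transactions, k, min_sup):
        """Merge stored counts with counts over the increment only.
        Note: itemsets pruned from old_counts as infrequent are undercounted
        here; the full algorithm must rescan the old DB for such borderline
        itemsets."""
        delta = count_itemsets(new_transactions, k)
        total_n = old_n + len(new_transactions)
        merged = dict(old_counts)
        for items, c in delta.items():
            merged[items] = merged.get(items, 0) + c
        return {i: c for i, c in merged.items() if c / total_n >= min_sup}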

Neeraj Kumar Sharma, N. K. Nagwani

Energy Aware and Energy Efficient Routing Protocol for Adhoc Network Using Restructured Artificial Bee Colony System

Wireless communication is one of the fastest growing technologies all over the world. In particular, ad hoc networks are applied widely in many different applications, including major engineering systems, vehicular networks, etc. Optimal routing is an open issue in ad hoc networks, and many researchers have focused their attention on it and developed various methodologies that are feasible in certain situations. This paper proposes a honey-bee mating algorithm, a swarm intelligence technique, for ad hoc routing; the technique has already been applied to data clustering, scheduling, resource allocation and other optimization problems. The various benchmarks proposed by researchers for the artificial honey bee show better results than existing techniques. This paper restructures the artificial bee colony algorithm from the initialization phase to the implementation phase and shows better results than the existing methodology.

B. Chandra Mohan, R. Baskaran

Implementing Key Management for Security in Ad Hoc Network

Key management is important to the security of a Mobile Ad hoc NETwork (MANET). Based on (t, n) threshold cryptography, this paper introduces a mobile agent that exchanges private key and network topology information with the nodes in the network. This method not only reduces network overload but also improves the speed and success ratio of authentication. Any t nodes in a network of size n can cooperate to authenticate a new node wanting to join the network. Carrying a private key and some state variables such as survival time, the mobile agent navigates the network according to a visit-balancing policy, namely, the node with the fewest visits is visited first by the mobile agent.
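The (t, n) threshold primitive such schemes build on is typically realized with Shamir secret sharing; a minimal Python sketch follows, with an illustrative field prime and no claim to match the paper's implementation.

    # Shamir (t, n) sharing sketch: any t of the n shares rebuild the secret.
    import random

    PRIME = 2**127 - 1  # a Mersenne prime, large enough for a 126-bit secret

    def make_shares(secret, t, n):
        """Evaluate a random degree-(t-1) polynomial with f(0) = secret."""
        coeffs = [secret] + [random.randrange(PRIME) for _ in range(t - 1)]
        return [(x, sum(c * pow(x, i, PRIME) for i, c in enumerate(coeffs)) % PRIME)
                for x in range(1, n + 1)]

    def reconstruct(shares):
        """Lagrange interpolation at x = 0 over any t shares."""
        secret = 0
        for j, (xj, yj) in enumerate(shares):
            num = den = 1
            for m, (xm, _) in enumerate(shares):
                if m != j:
                    num = num * (-xm) % PRIME
                    den = den * (xj - xm) % PRIME
            secret = (secret + yj * num * pow(den, PRIME - 2, PRIME)) % PRIME
        return secret

    shares = make_shares(123456789, t=3, n=5)
    assert reconstruct(shares[:3]) == 123456789   # any 3 of 5 shares suffice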

Avinash Sharma, Narendra Agarwal, Satyabrata Roy, Ajay Sharma, Pankaj Sharma

Performance Evaluation of MAC- and PHY-Protocols in IEEE 802.11 WLAN

This work evaluates and compares the performance of an IEEE 802.11 WLAN scenario by evaluating QoS parameters such as medium access delay, end-to-end delay and retransmission attempts at a data rate of 2 Mbps, using different mechanisms in the PHY and MAC layers. The Point Coordination Function (PCF) and Distributed Coordination Function (DCF) of the MAC layer, together with the mechanisms used in the physical layer, i.e. frequency-hopping and direct-sequence spread spectrum (FHSS and DSSS), are investigated to provide better QoS using the OPNET simulator.

Vishal Sharma, Jagjit Malhotra, Harsukhpreet Singh

Key Authentication for MANET Security

Securing a mobile ad hoc network (MANET) is a big challenge because of the very nature of such networks. A particularly challenging problem is how to feasibly detect and defend against attacks on routing protocols. Security in mobile ad hoc networks is difficult to achieve, mainly because of the vulnerability of wireless links, the limited physical protection of nodes, the dynamically changing topology, the absence of a certification authority, and the lack of a centralized monitoring or management point. A major difficulty arises when a new node joins the network without any trust relation with the other nodes. In this paper, a mechanism is proposed whereby a mobile node needing secure data communication generates a dynamic secret session key with the desired destination node, either directly or via proxy mobile nodes. These dynamic secret session keys are generated using the Diffie-Hellman protocol.
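For context, a minimal sketch of the Diffie-Hellman exchange used to derive such session keys is shown below; the tiny group (p = 23, g = 5) is for illustration only, and real nodes would use a standardized large prime group.

    # Diffie-Hellman session-key sketch with a toy group.
    import hashlib
    import secrets

    P, G = 23, 5                         # toy group; use a 2048-bit safe prime in practice

    a = secrets.randbelow(P - 2) + 1     # source node's private exponent
    b = secrets.randbelow(P - 2) + 1     # destination node's private exponent

    A = pow(G, a, P)                     # public values exchanged over the MANET
    B = pow(G, b, P)

    k_src = pow(B, a, P)                 # both sides compute g^(ab) mod p
    k_dst = pow(A, b, P)
    assert k_src == k_dst

    # Hash the shared value down to a symmetric session key.
    session_key = hashlib.sha256(str(k_src).encode()).digest()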

Vijay Kumar, Rakesh Sharma, Ashwani Kush

Biometric Encryption: Combining Fingerprints and Cryptography

These days, biometric technologies are used to analyze human characteristics for security purposes. The most common physical biometric patterns analyzed are the fingerprint, hand, eye, face and voice. The advantages of using biometrics to verify a person's identity over using passwords or tokens have been broadly presented in many research papers. However, recent research has revealed that biometric technologies can be defeated with low-tech and cheap materials. This poses a new challenge when people are encouraged to use biometrics to enhance network security. In this paper, many approaches to counteracting these security threats are discussed. We also propose a new method called biometric encryption, which combines fingerprints and cryptography for enhanced security.

Mini Singh Ahuja, Sumit Chabbra

Node Architectures and Its Deployment in Wireless Sensor Networks: A Survey

In conventional wireless sensor networks (WSNs), adding a few mobile nodes can greatly improve the control and sensing capabilities of the network and can help researchers solve many challenges such as network deployment and scalability. The video capture, processing, and communication in wireless video sensor networks depend on the resources of the nodes forming the network. The major challenge in designing WSNs is the support of functional requirements, such as data latency, and non-functional requirements, such as data integrity. Careful sensor node placement can be a very effective means of optimization for achieving the desired design goals. In this paper, we thoroughly survey and contrast the existing sensor node architectures, and we also survey the research on optimized node placement in WSNs. We classify placement strategies as static or dynamic depending on whether the optimization is performed at deployment time or while the network is operational.

Sumit Kushwaha, Vinay Kumar, Sanjeev Jain

New Innovations in Cryptography and Its Applications

The invention of public-key cryptography was of central importance to the field of cryptography and provided answers to many key management problems for large scale networks. For all its benefits, however, public-key cryptography did not provide a comprehensive solution to the key management problem. Indeed, the possibilities brought forth by public-key cryptography heightened the need for sophisticated key management systems to answer questions such as the following:

"How can I easily encrypt a file once for a number of different people using public-key cryptography?"

"If I lose my keys, how can I decrypt all of my files that were encrypted with those keys?"

"How do I know that I really have Alice’s public key and not the public key of someone pretending to be Alice?"

"How can I know that a public key is still trustworthy?"

The paper discusses public-key cryptography and its use in applications such as key agreement, data encryption and digital signatures. It also covers public-key algorithms such as DH, RSA, and DSA and gives working explanations of these algorithms.
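As a worked illustration of one of the algorithms named above, here is textbook RSA with deliberately tiny primes (Python 3.8+ for the modular inverse); real keys use primes of a thousand or more bits.

    # Textbook RSA with classic toy parameters.
    p, q = 61, 53
    n = p * q                       # modulus: 3233
    phi = (p - 1) * (q - 1)         # 3120
    e = 17                          # public exponent, coprime to phi
    d = pow(e, -1, phi)             # private exponent: 2753

    m = 65                          # message as an integer < n
    c = pow(m, e, n)                # encrypt: 65^17 mod 3233 = 2790
    assert pow(c, d, n) == m        # decrypt recovers the message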

Saurabh Sharma, Neeraj Kumar Mishra

Competitive Equilibrium Theory and Its Applications in Computer Science

In the capitalist market, vital regulatory functions such as ensuring stability, competency, and fairness are relegated to pricing mechanisms. Thus, the competitive equilibrium theory of equilibrium prices acquired a prominent place in mathematical economics. With the advent of the Internet, extensive research has been done over the past few years at the boundary between computer science and economic theory. In this paper we discuss competitive equilibrium theory and its applications in computer science.

J. Ujwala Rekha, K. Shahu Chatrapati, A. Vinaya Babu

A Novel Approach for Information Dissemination in Vehicular Networks

Vehicular networks, or simply VANETs, are an important component in the development of Intelligent Transportation Systems. Owing to the features of VANETs, data dissemination is an important issue that has to be addressed. In this paper we discuss the types of information involved in the dissemination process and the existing approaches to data dissemination. A new approach is proposed for data dissemination using network coding, in which a packet is forwarded to vehicles only after coding. Simulations show that the proposed method reduces the number of broadcast packets sent on the network and increases channel throughput.
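For context, the simplest form of network coding is an XOR of two packets, letting one broadcast serve two receivers that each already hold one of the originals; the snippet below illustrates the principle and is not the paper's coding scheme.

    # Minimal XOR network-coding illustration.
    p1 = bytes([0x12, 0x34, 0x56])                         # held by vehicle A
    p2 = bytes([0xAB, 0xCD, 0xEF])                         # held by vehicle B
    coded = bytes(a ^ b for a, b in zip(p1, p2))           # one broadcast
    assert bytes(a ^ b for a, b in zip(coded, p1)) == p2   # A recovers p2
    assert bytes(a ^ b for a, b in zip(coded, p2)) == p1   # B recovers p1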

Rakesh Kumar, Mayank Dave

Understanding the Generation of Cellular Technologies

Due to the increasing demand for speed, multimedia support and other resources, the wireless world is looking forward to a new generation of technology to replace the third generation. This is where fourth-generation (4G) wireless communication comes into play. 4G wireless communication is expected to provide higher speed, higher capacity, lower cost and IP-based services. The main aim of 4G wireless is to replace the current core technology with a single universal technology based on IP. This paper deals with the features and challenges, the proposed architecture, multimedia support, applications and multiple access schemes of 4G.

Manjit Sandhu, Tajinder Kaur, Mahesh Chander, Anju Bala

Evaluation of Routing Schemes for MANET

Recent advancements in wireless technology have opened new vistas in the development of new wireless systems. A mobile ad hoc network is a self-configuring network of wireless devices connected by wireless links. Reactive routing protocols have been found to be user friendly and efficient compared with other routing protocols. In this study, a comparison and performance evaluation of two reactive routing protocols, AODV and DSR, is carried out using the NS-2 simulator (NS 2.34) to identify the protocol best suited for MANETs. The performance evaluation uses the random waypoint model.

Sima Singh, Ashwani Kush

Fuzzy Logic Based Routing Algorithm for Mobile Ad Hoc Networks

Mobile ad hoc networks consist of mobile nodes that communicate without an infrastructure: a self-configuring network connected by wireless links. All nodes move around randomly, changing the network topology dynamically. The primary challenge in building a MANET is equipping each device to continuously maintain the information required to route traffic properly. In this paper, a routing algorithm based on fuzzy logic is proposed that has low communication overhead and storage requirements. The proposed algorithm takes three input variables: signal power, mobility and delay. The absolute value of each parameter can span a large range at different points in the network.
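As a rough sketch of how such a fuzzy evaluation can look, the snippet below combines triangular memberships over the three inputs with a min-rule; all breakpoints are illustrative and not taken from the paper.

    # Triangular memberships plus a single min-rule over the three inputs.
    def tri(x, a, b, c):
        """Triangular membership with peak at b and feet at a and c."""
        if x <= a or x >= c:
            return 0.0
        return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

    def route_quality(signal, mobility, delay):
        good_signal = tri(signal, 0.3, 1.0, 1.7)      # inputs normalized to [0, 1]
        low_mobility = tri(mobility, -0.7, 0.0, 0.7)
        low_delay = tri(delay, -0.7, 0.0, 0.7)
        # Rule: a link is good if signal is strong AND mobility and delay are low.
        return min(good_signal, low_mobility, low_delay)

    print(route_quality(signal=0.9, mobility=0.2, delay=0.3))  # ~0.57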

Sonia Gupta, P. K. Bharti, Vishal Choudhary

Analysis of Security and Key Management Schemes for Authenticated Broadcast in Heterogeneous Wireless Sensor Networks

Security in wireless sensor networks (WSNs) is a critical issue due to their inherent limitations in computational capacity, storage and power. Packets may be dropped, discarded completely, or selectively forwarded by an anonymous party, and the network may be flooded with spurious global broadcasts. These kinds of attacks can be avoided by using multipath and authenticated broadcasts, which must be facilitated by the underlying key management architecture. Key management ensures that the communicating nodes possess the necessary keys while providing confidentiality, integrity and authenticity of the communicated data. The proposed work surveys the well-known security issues in WSNs and studies the various asymmetric (public-key) algorithms used for key distribution as well as encryption/decryption in sensor networks for authenticated message broadcast. Based on this analysis, a new method is proposed that improves the performance of the existing RSA algorithm by using the Chinese Remainder Theorem (CRT) in the decryption phase. It also offers countermeasures against attacks on the network layer of WSNs.
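The CRT optimization referred to above is standard and easy to state: decryption is split into two half-size exponentiations modulo p and q and the results are recombined. A minimal sketch with textbook-sized primes (illustrative only):

    # RSA decryption via the Chinese Remainder Theorem (Garner recombination).
    p, q = 61, 53
    n, e = p * q, 17
    d = pow(e, -1, (p - 1) * (q - 1))

    # CRT parameters, precomputed once at key generation.
    dp, dq = d % (p - 1), d % (q - 1)
    q_inv = pow(q, -1, p)

    def decrypt_crt(c):
        m1 = pow(c, dp, p)                 # exponentiation mod p only
        m2 = pow(c, dq, q)                 # exponentiation mod q only
        h = (q_inv * (m1 - m2)) % p        # recombine the two residues
        return m2 + h * q

    c = pow(65, e, n)
    assert decrypt_crt(c) == 65 == pow(c, d, n)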

P. Kalyani, C. Chellappan

Simulative Analysis of Bidirectional WDM/TDM-PON Using NRZ and RZ Downstream Signals and Narrowband AWG

In this paper, we compare the performance of a WDM-PON system for NRZ and RZ data formats operating at bit rates up to 20 Gb/s by varying the fiber length. It is observed that the multicast signals can be delivered to the designated subscribers with acceptable performance up to almost 45 km. At the Optical Network Unit (ONU), upstream data can be remodulated onto the downstream wavelength by using an Arrayed Waveguide Grating (AWG). It is found that the NRZ data format performs better at low bit rates; further, the system gives optimum performance for the middle channel due to lower inter-channel interference. The upstream signals can be effectively transmitted at up to 20 Gb/s, whereas downstream signals can be transmitted only at 2.5 Gb/s. Faithful transmission can be carried up to 50 km upstream and 60 km downstream, as the Bit Error Rate (BER) rises above 10^−9 beyond these distances.

Rajniti, Suman Anita, Sheetal Anu, Kumar Parveen

Data Mining Techniques for Prefetching in Mobile Ad Hoc Networks

Caching is a means of providing faster access to data for the requester. Data that is frequently accessed or required is kept in a cache (fast memory) to improve query latency. This technique has proved its worth in various environments, viz. the web, mobile communication, mobile computing, and processor architecture. The data items kept in the cache are chosen based on experience, i.e. the past record of which data items are accessed frequently. In this paper we propose a data caching and prefetching scheme for mobile ad hoc networks (MANETs). Through prefetching, the system tries to anticipate the future needs of mobile nodes (MNs); based on this assessment, the requisite data is prefetched from the server, reducing query latency and further improving data availability. We apply data mining techniques to prefetch the data.

Naveen Chauhan, L. K. Awasthi, Narottam Chand

An Image Steganography Approach Based upon Matching 

The use of the Internet has made data transfer very easy but at the same time very risky. A major concern these days is the security of data being transferred over the Internet, and steganography is a technique that proves very effective at achieving this. Steganography can be defined as a technique of embedding data inside some other object by altering its properties. This paper discusses a technique designed to hide a greater number of bits per pixel. The technique maps the secret data to one of the channels of the cover object and uses the LSBs of the other channels to mark the presence of data in that channel. The results show that the technique achieves high security and provides greater data-hiding capacity.
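For context, the basic single-channel LSB embedding that such techniques build on looks as follows; the paper's cross-channel mapping and marker bits are not reproduced here.

    # Basic LSB embedding in one channel of an image row.
    def embed_lsb(pixels, bits):
        """Overwrite the least significant bit of each byte with one secret bit."""
        assert len(bits) <= len(pixels)
        return [(p & ~1) | b for p, b in zip(pixels, bits)] + list(pixels[len(bits):])

    def extract_lsb(pixels, n_bits):
        return [p & 1 for p in pixels[:n_bits]]

    cover = [123, 200, 77, 54, 90, 33]       # e.g. one channel of a pixel row
    secret = [1, 0, 1, 1, 0, 0]
    stego = embed_lsb(cover, secret)
    assert extract_lsb(stego, 6) == secret
    assert all(abs(a - b) <= 1 for a, b in zip(cover, stego))  # change is imperceptible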

Sukhpreet Kaur, Sumeet Kaur

From Calculus to Number Theory, Paves Way to Break OSS Scheme

An authentication technique that also includes measures to counter repudiation by the source is termed a digital signature. Most cryptographic schemes rely on some hard mathematical problem. Ong-Schnorr-Shamir (OSS) is a digital signature scheme with the advantage of ease of implementation; breaking it was believed to be of similar difficulty to factoring the modulus n. In this paper, we break the OSS scheme by appealing to a famous calculus problem.

G. Geetha, Saruchi

Digital Image Watermarking Technique Based on Dense Descriptor

Inspired by the robustness and simplicity of the Weber Local Descriptor (WLD), a new digital image watermarking technique is proposed. The proposed technique is based on the WLD descriptor, a histogram representation of an image built from two components per pixel: differential excitation and orientation. Differential excitation is the relative difference between a center pixel and its surrounding neighbors, and orientation is the gradient orientation of the center pixel. The WLD descriptor of an image is robust against various geometric and photometric attacks, a feature that makes it attractive for digital image watermarking.
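For reference, the two WLD components are commonly defined over a 3x3 neighbourhood with centre pixel x_c and neighbours x_0, ..., x_7; note that indexing conventions vary between papers, so the formulas below are the usual textbook form rather than necessarily the authors' exact notation:

    \xi(x_c) = \arctan\left( \sum_{i=0}^{7} \frac{x_i - x_c}{x_c} \right), \qquad
    \theta(x_c) = \arctan\left( \frac{x_5 - x_1}{x_7 - x_3} \right)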

Ekta Walia, Anu Suneja

Novel Face Detection Using Gabor Filter Bank with Variable Threshold

Face detection is the task of locating human faces in a given image under all lighting conditions, scales and orientations. The face is a unique feature of every person, as are the pupil, iris and fingerprints. With the improvement of technology, neural networks and high-capacity processors have brought informatics into this area, and automatic face detection and recognition have been drawing attention in recent years. We propose an accurate face detection system that can detect faces under different contrasts and with hurdles such as spectacles, heavy beards and even closed eyes. We use a Gabor filter bank with a variable threshold for feature extraction and face detection.
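For readers unfamiliar with Gabor filter banks, a minimal kernel generator of the kind such detectors use is sketched below; parameter names follow the usual convention and are not taken from the paper.

    # Real part of a 2-D Gabor filter and a small 4-orientation bank.
    import numpy as np

    def gabor_kernel(size, wavelength, theta, sigma, gamma=0.5):
        """Real Gabor kernel with orientation theta (radians)."""
        half = size // 2
        y, x = np.mgrid[-half:half + 1, -half:half + 1]
        xr = x * np.cos(theta) + y * np.sin(theta)      # rotated coordinates
        yr = -x * np.sin(theta) + y * np.cos(theta)
        envelope = np.exp(-(xr**2 + gamma**2 * yr**2) / (2 * sigma**2))
        carrier = np.cos(2 * np.pi * xr / wavelength)
        return envelope * carrier

    # One scale, four orientations, as a filter bank would stack them.
    bank = [gabor_kernel(31, wavelength=8, theta=t, sigma=4)
            for t in np.linspace(0, np.pi, 4, endpoint=False)]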

P. K. Suri, Walia Ekta, Verma Amit

When to Stop Testing

The testing process detects variance between actual and expected results in order to produce good-quality software. Testing is necessary, but testing each and every part of the software is not feasible; there should be a threshold point at which to stop testing. In this paper we propose an analytical scheme that provides threshold values for stopping testing without compromising software quality. Using the proposed scheme, a developer can easily determine a sufficient level of testing.

Ajay Jangra, Gurbaj Singh, Chander Kant, Priyanka

An Efficient Power Saving Adaptive Routing (EPSAR) Protocol for Mobile Ad Hoc Networks (MANETs)

Mobile ad hoc networks (MANETs) are networks that have no centralized body and no fixed topology for communication among nodes. Finding a route from source to destination in a MANET is difficult because of the arbitrary mobility of nodes, and MANETs in general work in a multihop environment when selecting a route from source to destination. For this, a new scheme named Efficient Power Saving Adaptive Routing (EPSAR) is proposed, which results in an efficient and reliable path. In this paper we apply amendments to FRENSA to obtain a new, more efficient and reliable algorithm, apply it to three scenarios (best, average and worst case) for selecting a path, and examine the performance of the network and the complexity of the selected route. In all cases we take the parameters used in EPSAR and analyze the performance of the algorithm.

Ajay Jangra, Nitin Goel, Chander Kant, Priyanka

Agile Software: Ensuring Quality Assurance and Processes

In the present scenario, software systems are getting increasingly complex, and timelines and schedules are getting tighter day by day. Processes need to be adaptable rather than rigid, and the development process needs to be redesigned: the old concept of sequential phases must be updated with iteration, and processes must ensure user acceptance with an accepted level of software quality. The process of software development therefore needs to be reviewed. The concept of agility can be used to provide good-quality solutions for upcoming software systems: a sustainable approach capable of maintaining quality, accepting changes at any time with minimum cost, and rescheduling each phase of development. Agile software development can be a winner in the coming future. On the downside, agile processes are often criticized for their inability to work within a CMMI environment; this paper therefore proposes a new model for agile software development that includes customer feedback and project documentation as its major elements, making agile development more auditable, accountable and process-centric.

Narinder Pal Singh, Rachna Soni

Measure Complexity in Heterogeneous System

Heterogeneous systems are becoming more advanced and more complex day by day, and acknowledging the complexity of such systems is a significant challenge. Distributed systems such as grid systems, Internet systems, ubiquitous computing environments, storage systems, sensor networks and online enterprise systems often contain massive numbers of heterogeneous and mobile nodes. These systems are highly dynamic and fault-prone as well. As a result, they are difficult to debug: developers find it hard to program new applications and services for them, administrators find it complicated to manage and configure these complex, device-rich systems, and end users find them difficult to use for performing tasks. In this context, the aspects of complexity have been studied extensively; the concepts of time and space complexity of different kinds of algorithms are well understood, and the complexity of heterogeneous systems for people has been broadly acknowledged as a key problem.

Kuldeep Sharma

Backmatter
