
About this book

Advanced Computing, Networking and Informatics are three distinct disciplines of knowledge with no apparent overlap among them. However, their convergence is observed in many real-world applications, including cyber-security, internet banking, healthcare, sensor networks, cognitive radio and pervasive computing, among many others. These two-volume proceedings explore the combined use of Advanced Computing and Informatics in next-generation wireless networks and security, signal and image processing, ontology and human-computer interfaces (HCI). The two volumes together include 132 scholarly articles, accepted for presentation from over 550 submissions to the Third International Conference on Advanced Computing, Networking and Informatics, 2015, held in Bhubaneswar, India during June 23–25, 2015.

Table of Contents

Frontmatter

Distributed Systems, Social Networks, and Applications

Frontmatter

An Algorithm for Partitioning Community Graph into Sub-community Graphs Using Graph Mining Techniques

Using graph mining techniques, knowledge can be extracted from a community graph. In our work, we start with a discussion of the related definitions of graph partitioning, covering both mathematical and computational aspects. By partitioning a large community graph into smaller sub-community graphs, the derived knowledge can be extracted from a particular sub-graph, which makes extraction easier and faster. The partition targets the edges between community members belonging to different communities. We began by studying the techniques followed by different researchers, and then propose a new and simple algorithm for partitioning the community graph of a social network using graph techniques. An example demonstrates the strength and simplicity of the proposed algorithm.

Bapuji Rao, Anirban Mitra
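The edge-targeted partition described in the abstract above can be sketched as follows. This is a minimal illustration under my own assumptions (known node-to-community labels, undirected edges), not the authors' algorithm: inter-community edges are dropped, and the connected components that remain are the sub-community graphs.

```python
# Sketch: partition a community graph into sub-community graphs by
# removing edges between members of different communities, then
# collecting the connected components of what remains.
# (Illustrative only; labels and data are hypothetical.)

def partition_communities(edges, community_of):
    """edges: list of (u, v); community_of: {node: label}.
    Returns the connected components after inter-community edges are cut."""
    adj = {v: set() for v in community_of}
    for u, v in edges:
        if community_of[u] == community_of[v]:   # keep intra-community edges only
            adj[u].add(v)
            adj[v].add(u)
    seen, components = set(), []
    for start in adj:
        if start in seen:
            continue
        stack, comp = [start], set()
        while stack:                              # iterative DFS
            node = stack.pop()
            if node in comp:
                continue
            comp.add(node)
            stack.extend(adj[node] - comp)
        seen |= comp
        components.append(comp)
    return components

edges = [(1, 2), (2, 3), (3, 4), (4, 5), (5, 6)]          # (3, 4) crosses communities
labels = {1: "A", 2: "A", 3: "A", 4: "B", 5: "B", 6: "B"}
print(partition_communities(edges, labels))               # two sub-community graphs
```

Each returned component can then be mined independently, which is the speed-up the abstract claims.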

Traceback: A Forensic Tool for Distributed Systems

In spite of stringent security measures on the components of a distributed system and well-defined communication procedures between the nodes of the system, an exploit may be found that compromises a node and may be propagated to other nodes. This paper describes an incident-response method to analyse an attack. The analysis is required to patch the vulnerabilities and may be helpful in finding and removing backdoors installed by the attacker. It is done by logging all relevant information about each node in the system at regular intervals to a centralised store. The logs are compressed before being sent in order to reduce network traffic and use less storage space. The state of the system is also stored at regular intervals. This information is presented by a replay tool in a lucid, comprehensible manner using a timeline. The timeline shows the saved system states (of each node in the distributed system) as something similar to checkpoints. The events and actions stored in the logs act on these states, providing a replay of the events to the analyser. A time interval during which an attack is suspected to have occurred can thus be analysed thoroughly using this tool.

Sushant Dinesh, Sriram Rao, K. Chandrasekaran

Deterministic Transport Protocol Verified by a Real-Time Actuator and Sensor Network Simulation for Distributed Active Turbulent Flow Control

Total drag of common transport systems such as aircraft or railways is primarily determined by friction drag. Reducing this drag at high Reynolds numbers (>10^4) is currently investigated using flow control based on transversal surface waves. For application in transportation systems with large surfaces, a distributed real-time actuator and sensor network is in demand. To fulfill the requirement of real-time capability, a deterministic transport protocol with a master-slave strategy is introduced. The deterministic transport protocol was verified with our network model, implemented in Simulink using the TrueTime toolbox. In the model, the Master-Token-Slave (MTS) protocol is implemented between the application layer, following the IEEE 1451.1 smart transducer interface standard, and the Ethernet medium access protocol. The model provides interfaces to the flow control and the DAQ hardware, allowing additional testing in model-in-the-loop simulations.

Marcel Dueck, Mario Schloesser, Stefan van Waasen, Michael Schiek

Improved Resource Provisioning in Hadoop

Extensive use of the Internet is generating large amounts of data, and the mechanisms to handle and analyze these data are becoming more complicated day by day. The Hadoop platform provides a solution for processing huge data on large clusters of nodes, and the scheduler plays a vital role in improving Hadoop's performance. In this paper, MRPPR, a MapReduce Performance Parameter based Resource aware Hadoop Scheduler, is proposed. In MRPPR, performance parameters of the Map task (the time required to parse the data, map, sort and merge the result) and of the Reduce task (the time to merge, parse and reduce) are considered to categorize a job as CPU bound, Disk I/O bound or Network I/O bound. Based on the node status obtained from the TaskTracker's response, nodes in the cluster are classified as CPU busy, Disk I/O busy or Network I/O busy. A cost model is proposed to schedule a job to a node based on this classification, minimizing the makespan and attaining effective resource utilization. A performance improvement of 25–30 % is achieved with the proposed scheduler.

M. Divya, B. Annappa
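The classify-then-place idea in the abstract above can be sketched roughly as follows. All names and the selection rule are my illustrative assumptions, not MRPPR's actual cost model: a job's dominant measured cost decides its class, and it is placed on a node that is not saturated on that same resource.

```python
# Sketch of resource-aware placement (hypothetical names, not MRPPR's API):
# classify a job by its dominant cost component, then prefer a node whose
# busy resource differs from the job's bound.

def classify(job_times):
    """job_times: {'cpu': t, 'disk': t, 'net': t} -> dominant resource."""
    return max(job_times, key=job_times.get)

def schedule(job_times, node_busy):
    """node_busy: {node: resource that node is currently saturated on}."""
    bound = classify(job_times)
    for node, busy in node_busy.items():
        if busy != bound:          # complementary node found
            return node
    return next(iter(node_busy))   # fall back to any node

job = {"cpu": 120.0, "disk": 40.0, "net": 15.0}   # a CPU-bound job
nodes = {"n1": "cpu", "n2": "disk"}
print(classify(job), schedule(job, nodes))        # placed on the disk-busy node
```

A real scheduler would score all nodes with the paper's cost model rather than take the first complementary one; this only shows the matching principle.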

Cuckoo Search for Influence Maximization in Social Networks

In a social network, influence maximization is the problem of finding the optimal set of seeds by which influence can be maximized at the end of the diffusion process. Existing approaches include greedy approaches, genetic algorithms and ant colony optimization. Even though these existing algorithms take more time for diffusion, they are not able to generate a good number of influenced nodes. In this paper, a Cuckoo Search Diffusion Model (CSDM) is proposed, based on a metaheuristic known as the Cuckoo Search algorithm. It uses fewer parameters than other metaheuristic approaches, so parameter tuning is an easy task, which is the main advantage of the Cuckoo Search algorithm. Experimental results show that this model gives better results than previous works.

Nikita Sinha, B. Annappa
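The nest/egg mechanics behind cuckoo search can be sketched on a toy seed-selection problem. Everything below is an illustrative assumption, not the paper's CSDM: the graph is tiny, "influence" is replaced by a total-degree surrogate, and the Lévy-flight step is approximated by swapping one seed at random.

```python
import random

# Toy cuckoo-search sketch for influence-maximization-style seed selection.
# (Illustrative only; CSDM's diffusion model and fitness function differ.)

random.seed(1)
graph = {0: [1, 2, 3], 1: [0, 2], 2: [0, 1, 3], 3: [0, 2], 4: [5], 5: [4]}

def fitness(seeds):
    return sum(len(graph[s]) for s in seeds)       # surrogate for influence

def random_nest(k):
    return set(random.sample(sorted(graph), k))

def mutate(seeds):
    out = set(seeds)
    out.remove(random.choice(sorted(out)))                # drop one seed
    out.add(random.choice(sorted(set(graph) - out)))      # add another
    return out

def cuckoo_search(k=2, n_nests=6, iters=60):
    nests = [random_nest(k) for _ in range(n_nests)]
    best = max(nests, key=fitness)
    for _ in range(iters):
        cuckoo = mutate(random.choice(nests))             # new candidate "egg"
        worst = min(range(n_nests), key=lambda i: fitness(nests[i]))
        if fitness(cuckoo) > fitness(nests[worst]):
            nests[worst] = cuckoo                         # replace a worse nest
        nests[nests.index(min(nests, key=fitness))] = random_nest(k)  # abandon worst
        best = max(nests + [best], key=fitness)
    return best

best = cuckoo_search()
print(sorted(best), fitness(best))
```

Note the small parameter set (nest count, abandonment, step size), which is the tuning advantage the abstract highlights.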

Sociopedia: An Interactive System for Event Detection and Trend Analysis for Twitter Data

The emergence of social media has resulted in the generation of highly versatile, high-volume data. Most web search engines return a set of links or web documents as the result of a query, without any interpretation of the results to identify relations in a social sense. In the work presented in this paper, we attempt to create a search engine for social media data streams that can interpret inherent relations within tweets, using an ontology built from the tweet dataset itself. The main aim is to analyze evolving social media trends and provide analytics regarding certain real-world events, these being new product launches in our case. Once the tweet dataset is pre-processed to extract relevant entities, Wiki data about these entities is also extracted and semantically parsed to retrieve relations between the entities and their properties. Further, we perform various experiments for event detection and trend analysis in terms of representative tweets, key entities and tweet volume, which also provide additional insight into the domain.

R. Kaushik, S. Apoorva Chandra, Dilip Mallya, J. N. V. K. Chaitanya, S. Sowmya Kamath

Design and Implementation of a Hierarchical Content Delivery Network Interconnection Model

A Content Management System (CMS) is an infrastructure for efficient distribution, organization, and delivery of digital content. It is desirable that content be successfully delivered regardless of the end user's location or attachment network. For the end-to-end delivery of content, a virtual open content delivery infrastructure is formed by interconnecting several CDNs. In this paper, we focus on Content Delivery Network Interconnection (CDNI). An efficient hierarchical CDNI architecture, named HCDNI, is proposed to reduce the limitations of CDNIs. Next, a content distribution and redistribution scheme is proposed so that the searching time and the round-trip time for content delivery are minimized. Finally, analysis and simulation studies show that the proposed algorithm results in significant improvement in terms of data routing, path selection, and content distribution and redistribution.

Sayan Sen Sarma, S. K. Setua

Networking Systems and Architectures

Frontmatter

Solving Reliability Problems in Complex Networks with Approximated Cuts and Paths

The paper solves reliability problems for the design of complex networks without failure. Traditional approaches based on minimal paths and cuts require significant computational effort, as the problem is NP-hard. The major difficulty lies in calculating the minimal cuts and paths needed to improve exact reliability bounds. Therefore, a neural network algorithm based on approximated paths and cuts is proposed to deal with this difficulty. The proposed approach is divided into two parts: the first part approximates the computed minimal cuts and paths from two networks, and the second part improves the reliability bounds. The proposed approach has been tested on a mesh network of 256 nodes and a hyper-tree of 496 nodes. Its performance is compared with PSO for reliability-bound improvement at low failure probability.

Baijnath Kaushik, Haider Banka

Maximal Clique Size Versus Centrality: A Correlation Analysis for Complex Real-World Network Graphs

The paper presents the results of correlation analysis between node centrality (a computationally lightweight metric) and the maximal clique size (a computationally hard metric) that each node is part of in complex real-world network graphs, ranging from regular random graphs to scale-free graphs. The maximal clique size for a node is the size of the largest clique (number of constituent nodes) the node is part of. The correlation coefficient between the centrality value and the maximal clique size for a node is observed to increase with increase in the spectral radius ratio for node degree (a measure of the variation of node degree in the network). As the real-world networks get increasingly scale-free, the correlation between the centrality value and the maximal clique size increases. The degree-based centrality metrics are observed to be relatively better correlated with the maximal clique size compared to the shortest path-based centrality metrics.

Natarajan Meghanathan
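The measurement in the abstract above pairs, for each node, a cheap metric (degree centrality) with an expensive one (the size of the largest clique containing the node) and correlates them. A small self-contained sketch on a toy graph (my own example graph, brute-force clique search, not the paper's datasets):

```python
from itertools import combinations

# Sketch: degree centrality versus per-node maximal clique size,
# summarized by the Pearson correlation coefficient. Brute force is
# fine here because the toy graph has only 7 nodes.

edges = {(0, 1), (0, 2), (1, 2), (2, 3), (3, 4), (4, 5), (3, 5), (4, 6)}
nodes = range(7)

def connected(u, v):
    return (u, v) in edges or (v, u) in edges

def is_clique(group):
    return all(connected(u, v) for u, v in combinations(group, 2))

def max_clique_size(node):
    """Size of the largest clique that contains `node` (exponential search)."""
    others = [n for n in nodes if n != node]
    for size in range(len(others), -1, -1):
        for group in combinations(others, size):
            if is_clique((node,) + group):
                return size + 1
    return 1

def pearson(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs) ** 0.5
    vy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (vx * vy)

degrees = [sum(connected(n, m) for m in nodes if m != n) for n in nodes]
cliques = [max_clique_size(n) for n in nodes]
print(round(pearson(degrees, cliques), 3))
```

On real graphs the clique side needs a proper maximal-clique algorithm; the paper's point is that when the correlation is high, the cheap degree metric can stand in for the hard one.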

On Solving the Multi-depot Vehicle Routing Problem

Problems associated with seeking the lowest-cost vehicle routes to deliver demand from a set of depots to a set of customers are called Multi-depot Vehicle Routing Problems (MDVRP). The MDVRP is a generalization of the standard vehicle routing problem that involves more than one depot. In the MDVRP, each vehicle leaves a depot and must return to the same depot it started from. In this paper, the MDVRP is tackled using an iterated local search metaheuristic. Experiments are run on a number of benchmark instances of varying depot and customer sizes. The numerical results show that the proposed algorithm is competitive against state-of-the-art methods.

Takwa Tlili, Saoussen Krichen, Ghofrane Drira, Sami Faiz
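Iterated local search, the metaheuristic named in the abstract above, alternates hill-climbing with random "kicks" to escape local optima. The sketch below is a deliberately tiny instance under my own assumptions (one uncapacitated vehicle per depot, nearest-neighbour tours, relocation moves), not the paper's operators or benchmarks:

```python
import math
import random

# Minimal iterated-local-search sketch for a toy MDVRP.
# Each depot serves its assigned customers with one nearest-neighbour
# tour that starts and ends at that depot.

random.seed(0)
depots = {"D1": (0, 0), "D2": (10, 0)}
customers = {"c1": (1, 1), "c2": (2, 0), "c3": (9, 1), "c4": (8, 0)}

def dist(a, b):
    return math.hypot(a[0] - b[0], a[1] - b[1])

def tour_cost(depot, custs):
    pos, cost, todo = depots[depot], 0.0, set(custs)
    while todo:
        nxt = min(todo, key=lambda c: dist(pos, customers[c]))
        cost += dist(pos, customers[nxt])
        pos, todo = customers[nxt], todo - {nxt}
    return cost + dist(pos, depots[depot])       # return to the same depot

def total_cost(assign):
    return sum(tour_cost(d, cs) for d, cs in assign.items())

def relocate(assign, cust, dst):
    return {d: [c for c in cs if c != cust] + ([cust] if d == dst else [])
            for d, cs in assign.items()}

def local_search(assign):
    improved = True
    while improved:                               # first-improvement relocations
        improved = False
        for cust in customers:
            for dst in depots:
                cand = relocate(assign, cust, dst)
                if total_cost(cand) < total_cost(assign) - 1e-9:
                    assign, improved = cand, True
    return assign

def iterated_local_search(assign, kicks=100):
    best = local_search(assign)
    for _ in range(kicks):
        kicked = best
        for _ in range(2):                        # perturbation: two random moves
            kicked = relocate(kicked, random.choice(sorted(customers)),
                              random.choice(sorted(depots)))
        cand = local_search(kicked)
        if total_cost(cand) < total_cost(best):
            best = cand                           # accept only improvements
    return best

start = {"D1": list(customers), "D2": []}         # naive: serve everything from D1
best = iterated_local_search(start)
print({d: sorted(cs) for d, cs in best.items()}, round(total_cost(best), 2))
```

The instructive point is that single relocations alone get stuck (moving either far customer by itself is worse), so the two-move kick is what lets the search reach the good depot split.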

Prediction of Crop and Intrusions Using WSN

Nowadays, the major problems in the agriculture sector are low crop production, due to the small number of workers on farms, and animal intrusion. The main objective is to improve sustainable agriculture by enhancing the technology using wireless sensor networks. The system uses Micro Electro Mechanical Systems (MEMS) to measure temperature, humidity and moisture. The characteristic data obtained from the Wireless Sensor Network are compared with a pre-defined data set in the Knowledge Base, where historical data are stored. The corresponding decisions from the Knowledge Base are sent to the respective landowner's mobile through SMS using radio frequency, which has low power consumption. The sensors are coordinated using GPS and are connected to the base station in an ad hoc network using WLAN. Another common issue is animal intrusion, especially in places like Mettupalayam, Coimbatore, and Pollachi, where elephants destroy the crops. To protect the crops and the public, seismic sensors are used to detect the footfalls of elephants in hilly areas. These sensors use a geophone to record the footfalls of elephants, and an alert message is immediately sent to the people.

S. Sangeetha, M. K. Dharani, B. Gayathri Devi, R. Dhivya, P. Sathya

Ethernet MAC Verification by Efficient Verification Methodology for SOC Performance Improvement

This paper presents the verification of a Gigabit Ethernet Media Access Control (MAC) block, part of most networking SOCs, using the most advanced verification methodology, the Universal Verification Methodology (UVM). The main function of the MAC is to forward Ethernet frames to the PHY through an interface, and vice versa. With the use of the UVM factory and configuration mechanism, coverage-driven verification of MAC characteristics such as frame transmission and frame reception is achieved in the best possible way. Coverage metrics and self-checking reduce the time spent verifying the design. Using UVM, a reusable test bench is developed, which has been used to run different test scenarios on the same TB environment.

Sridevi Chitti, P. Chandrasekhar, M. Asharani, G. Krishnamurthy

MCDRR Packet Scheduling Algorithm for Multi-channel Wireless Networks

In this paper we consider the multi-channel Deficit Round Robin (MCDRR) scheduler for multi-channel wireless networks, to provide better fairness to users. The scheduler needs to exploit channel availability to achieve higher network performance. The MCDRR scheduling algorithm was first implemented in hybrid TDM/WDM optical networks, mainly to study multi-channel communication, where it provided nearly perfect fairness with ill-behaved flows under different sets of conditions. Shifting our focus back to wireless networks, many algorithms proposed for multi-channel wireless networks have fairness issues. This paper addresses the fairness issue by investigating the existing scheduler for the IEEE 802.11n multi-channel wireless network case, to provide efficient fair queueing. We take into account the availability of channels and of data packets, and efficiently utilize channels to achieve better fairness. Simulation results show that MCDRR for multi-channel wireless networks can provide nearly perfect fairness with ill-behaved flows under different sets of conditions. Finally, comparing our results, MCDRR performs better than the existing schedulers Round-Robin (RR) and Deficit Round-Robin (DRR) in terms of fairness and throughput.

Mithileysh Sathiyanarayanan, Babangida Abubakar
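The deficit-round-robin core that MCDRR builds on can be shown in a few lines. This is the textbook single-channel DRR mechanism, not the paper's multi-channel extension (which additionally cycles the grant over available channels); flow names and packet sizes are illustrative:

```python
from collections import deque

# Deficit round robin: each flow accumulates a quantum of credit per
# round and may send packets only while its deficit covers them, so a
# flow with oversized packets cannot grab more than its share.

def drr(flows, quantum, rounds):
    """flows: {name: deque of packet sizes}. Returns bytes sent per flow."""
    deficit = {f: 0 for f in flows}
    sent = {f: 0 for f in flows}
    for _ in range(rounds):
        for f, queue in flows.items():
            if not queue:
                deficit[f] = 0              # idle flows keep no credit
                continue
            deficit[f] += quantum
            while queue and queue[0] <= deficit[f]:
                pkt = queue.popleft()
                deficit[f] -= pkt
                sent[f] += pkt
    return sent

flows = {"well-behaved": deque([500] * 4),      # small packets
         "ill-behaved": deque([1500] * 10)}     # large packets, long backlog
sent = drr(flows, quantum=500, rounds=4)
print(sent)
```

Despite the ill-behaved flow's backlog of large packets, both flows receive comparable byte service over the four rounds, which is the fairness property the abstract claims carries over to the multi-channel case.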

A Random Access Registration and Scheduling Based MAC Protocol with Directional Antennas for Improving Energy Efficiency

In this paper, we consider a random access registration and scheduling based MAC protocol with directional antennas for improving energy efficiency. In this scheme, senders interested in sending packets register with the Access Point (AP) using a low data rate CSMA/CA channel. Based on the registered users, the AP schedules data transfer over the high data rate communication channel in an efficient way, using directional antennas for transmission. The scheme also provides flexibility in scheduling stations; we study two scheduling schemes: priority-based scheduling and Earliest Deadline First (EDF). Our simulation results show that the proposed scheme works better than CSMA/CA under high load, with less delay, and that the use of directional antennas results in significant energy savings compared to CSMA/CA.

Alisha, P. G. Poonacha
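Of the two policies studied above, EDF is easy to state precisely: among the registered senders, the AP always serves the one whose deadline expires soonest. A minimal sketch (station names and deadlines are hypothetical; registration and the antenna steering are omitted):

```python
import heapq

# Earliest Deadline First over registered senders: a min-heap keyed on
# deadline yields the AP's service order.

def edf_schedule(requests):
    """requests: [(station, deadline)] -> stations in EDF service order."""
    heap = [(deadline, station) for station, deadline in requests]
    heapq.heapify(heap)                 # min-heap: smallest deadline on top
    order = []
    while heap:
        _, station = heapq.heappop(heap)
        order.append(station)
    return order

print(edf_schedule([("s1", 30), ("s2", 10), ("s3", 20)]))  # s2 is most urgent
```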

A Key Agreement Algorithm Based on ECDSA for Wireless Sensor Network

Today, wireless sensor networks are used extensively for general purposes and in military aviation, which gives rise to stringent security requirements, yet these networks are easily susceptible to attack. Many researchers have provided security using symmetric key cryptography, but recent research shows that public key cryptography algorithms can also be used in WSNs. Major studies in the literature stress the RSA and ECC algorithms, but RSA consumes more energy than ECC. We propose a protocol to provide secure data delivery between a node and the gateway. Our protocol is based on ECDSA, and the proposed scheme imposes very light computational and communication overhead while reducing the amount of key storage. Analysis shows that the scheme has merits in key connectivity, energy consumption and communication.

Akansha Singh, Amit K. Awasthi, Karan Singh

Frame Converter for Cooperative Coexistence Between IEEE 802.15.4 Wireless Sensor Networks and Wi-Fi

With the exponentially increasing number of wireless communication users, a large number of wireless networks coexist in the 2400 MHz Industrial, Scientific and Medical (ISM) band. IEEE 802.15.4 Wireless Sensor Networks, with their low power consumption and low cost, are widely adopted in low data rate industrial and consumer applications. Wi-Fi (IEEE 802.11b/g) offers a high data rate and larger range. Both technologies coexist in the ISM band. Wi-Fi signals, having high signal strength, interfere with the weak signals of IEEE 802.15.4, which degrades the throughput performance of IEEE 802.15.4 wireless sensor networks; the result is coexistence without cooperation. The authors developed a Frame Converter System to establish cooperative coexistence between IEEE 802.15.4 and IEEE 802.11b/g networks. The frame converter is implemented at the Media Access Control layer using an ARM7 processor. It converts the frames of the IEEE 802.15.4 network to Wi-Fi frames, and Wi-Fi frames to IEEE 802.15.4 frames.

Rambabu A. Vatti, Arun N. Gaikwad

Research on Wireless Sensor Networks, VANETs, and MANETs

Frontmatter

Secured Time Stable Geocast (S-TSG) Routing for VANETs

Enhancement of safety and security on the road, and reduction of traffic congestion, is the theme of Intelligent Transportation Systems (ITS). A Vehicular Ad hoc Network, as an integral part of ITS, must protect against Denial of Service attacks and maintain Security Associations (SA), which is difficult due to its decentralized, open and dynamic nature, limited bandwidth and control overhead. The Dynamic Time Stable Geocast (DTSG) routing protocol is unable to protect against attacks by an intruder. Data authentication, data integrity and non-repudiation are security features which need to be taken into account to make a geocast protocol robust and secure. This paper proposes algorithms for achieving the above-mentioned goal of securing time stable geocast routing in a vehicular traffic environment. The proposed protocol is simulated in Network Simulator (ns2) and the results are compared with DTSG. The comparative analysis of results reveals the usefulness of the proposed protocol under security attack.

Durga Prasada Dora, Sushil Kumar, Omprakash Kaiwartya, Shiv Prakash

Reduction in Resource Consumption to Enhance Cooperation in MANET Using Compressive Sensing

Energy and bandwidth are scarce resources in a wireless network. In order to prolong their own life, nodes drop others' packets to save these resources, making them the major cause of selfish misbehavior or non-cooperation. To enforce node cooperation, this paper presents a reduction in resource consumption using Compressive Sensing. Our model compresses neighborhood sparse data such as routing table updates and other advertisements. We divide a MANET into neighborhoods, called neighborhood groups (NG). Sparse data are compressed by a neighborhood node and then forwarded to the leader node. The leader node joins all neighborhood data to reconstruct the original data and then broadcasts it in its neighborhood. This reduces resource consumption because the major computations are performed at the leader end, which saves the battery power of neighborhood nodes. Compressing sparse data before transmission reduces the amount of data transmitted in the network, which reduces total energy consumption and prolongs the life of the network. It also prevents several attacks, because individual nodes do not accept advertisements and updates directly but instead use information processed by the leader node.

Md. Amir Khusru Akhtar, G. Sahoo

AODV and ZRP Protocols Performance Study Using OPNET Simulator and EXata Emulator

ZRP and AODV are two important routing protocols for understanding mobile wireless networks, and each has its own properties and characteristics. A mobile ad hoc network is an infrastructure-free network with the ability of self-configuration, easy deployment, etc. Effective and efficient routing protocols help make this network more reliable. The characteristics of self-organization and the wireless medium make a Mobile Ad hoc Network (MANET) easy to set up and thus attractive to users, and its open and dynamic nature makes it easy to handle and operate. The motivation of this paper is to determine the basic differences between ZRP and AODV under two different simulators.

Virendra Singh Kushwah, Rishi Soni, Priusha Narwariya

Optimal Probabilistic Cluster Head Selection for Energy Efficiency in WSN

Conventional Low Energy Adaptive Clustering Hierarchy (LEACH) is a cluster-based routing protocol for Wireless Sensor Networks (WSN), which is effective in enhancing the lifetime of the nodes and thereby increasing the entire network life. The protocol is based on functionalities such as spatially distributed cluster formation, random selection of cluster heads, local processing of data in the clusters and transmission of aggregated data to the base station (BS). The cluster head (CH) is selected from the member nodes (MN) of each cluster based on the remaining energy at the node. In the literature, various versions of LEACH with enhanced network life are presented. In this paper, an efficient LEACH protocol is proposed in which the CH for every round is selected based on results on Voronoi tessellations from stochastic geometry and the remaining energy in the member node devices. In the proposed protocol, a novel method is used to choose the CHs, wherein the CHs and member nodes (MNs) of clusters are distributed as two independent homogeneous spatial Poisson Point Processes (PPPs). The probability of selecting the CHs and the threshold are derived using results from spatial statistics. The proposed algorithm selects an optimum number of CHs, leading to a reduction in the total energy spent in the network compared to conventional LEACH and similar algorithms. The network life is measured by the number of rounds. Monte-Carlo simulations are carried out for performance analysis of LEACH, TEEN and other PPP based protocols. Furthermore, the total energy dissipated in the network is fairly constant over the rounds, i.e. the distribution of total energy consumption by the network is fairly uniform throughout the network life.

Madhukar Deshmukh, Dnyaneshwar Gawali
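For background, the classical LEACH election threshold that the proposal above replaces can be written down directly (the paper's Voronoi/PPP-derived probability is not reproduced here):

```python
# Classical LEACH cluster-head self-election threshold:
#   T(n) = p / (1 - p * (r mod 1/p))  for nodes not CH in the last 1/p rounds,
#   T(n) = 0                          otherwise,
# where p is the desired CH fraction and r the current round. A node
# becomes CH when its uniform random draw falls below T(n).

def leach_threshold(p, r, was_ch_recently):
    if was_ch_recently:
        return 0.0
    return p / (1 - p * (r % (1 / p)))

# The threshold grows as the round advances within an epoch of 1/p
# rounds, so nodes that have not yet served become certain to self-elect.
print(round(leach_threshold(0.1, 0, False), 3),
      round(leach_threshold(0.1, 5, False), 3),
      round(leach_threshold(0.1, 9, False), 3))
```

The rotation guarantees every node serves as CH once per epoch; the paper's contribution is choosing the probability itself optimally rather than fixing p in advance.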

Firefly Algorithm Approach for Localization in Wireless Sensor Networks

Developments in sensor technology have led to low-power, low-cost and small-sized distributed wireless sensor networks (WSN). The self-organizing and distributed characteristics of wireless sensor networks have made them suitable for monitoring and control applications at home and in other environments. In most of these applications, location information plays a crucial role in increasing the performance and reliability of the network. This paper analyzes and implements a novel nature-inspired algorithm, the firefly localization algorithm. This is a distributed algorithm which uses the range-based trilateration method for the distance measurements required to estimate the location of a sensor node. The algorithm is simple to implement and has good convergence and accuracy.

R. Harikrishnan, V. Jawahar Senthil Kumar, P. Sridevi Ponmalar

Performance Analysis of Location and Distance Based Routing Protocols in VANET with IEEE802.11p

The vehicular ad hoc network (VANET), in which vehicles communicate with each other, is a new sub-category of MANET. Because of constrained roads and the very high speed of vehicles, routing is an issue in VANETs. Most papers have analyzed the performance of topology-based routing protocols. This paper analyzes the performance of the distance-effect routing algorithm for mobility (DREAM) and location aided routing (LAR) protocols for city and highway environments. Packet delivery ratio, throughput and delay metrics are considered in the analysis of the DREAM and LAR routing protocols, using the intelligent driver model (IDM) based VanetMobiSim and ns2 with IEEE 802.11p.

Akhtar Husain, S. C. Sharma

Fuzzy Based Analysis of Energy Efficient Protocols in Heterogeneous Wireless Sensor Networks

Recent technological advances in wireless sensor networks (WSN) have led to many new applications where energy awareness is essential. There are constraints on the use of wireless sensor networks, such as energy and environmental effects like temperature, pressure and sound. To overcome these issues, many new protocols have been developed in which energy awareness is an important consideration. Most of the work reported deals with the modification, design and development of new routing protocols, which work for different applications and wireless network architectures. This paper analyzes energy efficient operation in wireless sensor nodes. For this purpose, the Low Energy Adaptive Clustering Hierarchy (LEACH), Hierarchical Cluster based Routing (HCR), Stable Election Protocol (SEP) and Gateway Cluster Head Election–Fuzzy Logic (GCHE-FL) protocols were analyzed and compared. Clustering algorithms are used because they reduce the energy consumption of the sensor network. In the present work, fuzzy logic with three input parameters and a single output parameter is used for cluster head and gateway election in a heterogeneous wireless sensor network. It has been observed that with GCHE-FL the sensor nodes stay alive much longer compared to LEACH, SEP and HCR.

Rajeev Arya, S. C. Sharma

Node Application and Time Based Fairness in Ad-Hoc Networks: An Integrated Approach

In ad hoc networks, balancing overall delay among multiple flows with different priorities is a major challenge. Normally, packets with higher priority are scheduled to dispatch before packets with lower priority. However, priority is calculated by considering the weights of either the application or the region. As a result, flows with lower priority reach the destination after a long delay, or are dropped because of starvation. So an efficient mechanism must be framed for achieving delay-bounded service in heterogeneous applications of ad hoc networks. In this paper, we propose an integrated priority measurement technique to minimize overall delay among multiple flows. The proposed model uses the elapsed time as a factor in calculating priority. Packets that have suffered either a very large or a very small delay are assigned a lower priority, as they are either less important at the sink node or can still reach the destination within the time bound. Performance evaluation shows that this protocol moderates the end-to-end latency of the flows with lower priority.

Tapas Kumar Mishra, Sachin Tripathi

Bee Colony Optimization for Data Aggregation in Wireless Sensor Networks

The energy-constrained nature of wireless sensor networks has led to the need for data aggregation. The problem of finding an optimal data aggregation scheme is NP-hard. The bee colony system, a metaheuristic algorithm, imparts an inherent and natural means of optimization for optimal data aggregation. In this paper, Bee Colony Optimization (BCO) using a bee fuzzy system is used for data aggregation in wireless sensor networks (WSNs). Simulation is done using MATLAB. The results show a considerable improvement in the energy optimization of wireless sensor networks.

Sujit Kumar, Sushil Kumar

Cryptography and Security Analysis

Frontmatter

Extended Visual Cryptography Scheme for Multi-secret Sharing

This paper proposes a novel user-friendly visual cryptography scheme for multiple secret sharing. We generate meaningful shares for multiple secret images using cover images. These meaningful shares are distributed among participants, and all the shares are required to recover the secret images. The proposed scheme uses Boolean operations for generating the meaningful shares and recovering all the secret images. It achieves lossless recovery of multiple secrets and overcomes the problem of managing meaningless shares.

L. Siva Reddy, Munaga V. N. K. Prasad

Identifying HTTP DDoS Attacks Using Self Organizing Map and Fuzzy Logic in Internet Based Environments

The increasing usage of internet resources may lead to more cyber crimes in the network domain. Among the various kinds of attacks, HTTP flooding is one of the major threats to uninterrupted and efficient internet services, as it depletes resources at the application layer. It is hard to find traces of this attack because the attacker deletes all possible traces in the network; the only possible way to detect the attack is from the trace log file located on the server. This paper proposes a method using Self Organizing Maps (SOM) and fuzzy association rule mining to identify the attack. SOM is used to isolate unknown patterns and to identify suspicious sources, and the attacks are identified using fuzzy association rule mining. A statistical test has been carried out to measure the significance of the features used to identify legitimate or intrusive behavior.

T Raja Sree, S Mary Saira Bhanu

Improving the Visual Quality of (2, n) Random Grid Based Visual Secret Sharing Schemes

Visual secret sharing schemes (VSS) based on the concept of the random grid (RG) suffer from low visual quality of the decrypted secret. In this work a (2, n) RG based VSS is proposed to improve the contrast. The proposed scheme supports the traditional stacking decryption method. To further improve the visual quality, a decryption algorithm is proposed. Experimental results and a comparison of visual quality with related methods show that the proposed schemes perform well.

Lingaraj Meher, Munaga V. N. K. Prasad
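The (2, n) random-grid principle referenced above can be illustrated on a one-dimensional toy "image". This is a generic RG construction under my own encoding choices, not the paper's improved scheme: a white secret pixel is copied as the same random bit into all n shares, a black one gets independent random bits, and stacking (bitwise OR) any two shares makes black regions statistically darker.

```python
import random

# (2, n) random-grid VSS sketch: any two stacked shares reveal the
# secret with contrast; a single share is indistinguishable from noise.

random.seed(7)

def encode_pixel(secret_bit, n):
    """0 = white, 1 = black. White: all n shares carry one shared random
    bit (stacking preserves it, average darkness 1/2). Black: shares are
    independent random bits (OR of two is dark 3/4 of the time)."""
    if secret_bit == 0:
        b = random.randint(0, 1)
        return [b] * n
    return [random.randint(0, 1) for _ in range(n)]

def stack(share_a, share_b):
    return [a | b for a, b in zip(share_a, share_b)]

secret = [0, 0, 1, 1] * 50                      # 200-pixel toy image
n = 3
shares = list(zip(*(encode_pixel(b, n) for b in secret)))
recon = stack(shares[0], shares[1])             # any 2 of the n shares

white = [r for r, s in zip(recon, secret) if s == 0]
black = [r for r, s in zip(recon, secret) if s == 1]
print(sum(black) / len(black) > sum(white) / len(white))
```

The contrast gap (3/4 versus 1/2 expected darkness) is exactly what the proposed scheme and its extra decryption algorithm aim to widen.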

An Efficient Lossless Modulus Function Based Data Hiding Method

Data hiding methods are useful for secure communication over the internet: important data is hidden in cover data that is used as a decoy. A lossless data hiding method is judged by the maximum size of secret data that can be hidden without distortion and by the similarity between the original cover image and the stego image. In this work, modulus-based least significant bit substitution methods are studied, and a new enhancement is proposed to improve the quality of the stego image. The new method divides the cover and secret images into several blocks, and each compressed secret block is hidden in the corresponding cover block using the modulo operation. The extra data is stored in the free space generated by compression. The quality of the stego image in the proposed method is better than that of existing methods that hide the secret data using the modulo operation.

Nadeem Akhtar
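The modulus-function embedding that the abstract builds on is simple enough to show directly. This is the generic technique (one base-m digit per pixel, distortion-minimizing adjustment), not the paper's block-and-compression pipeline; extraction is exact by construction:

```python
# Modulus-function data hiding sketch: each cover pixel carries one
# base-m secret digit in (pixel mod m); of the candidate stego values
# sharing that residue, pick the one closest to the original pixel.

def embed(pixel, digit, m=8):
    base = pixel - pixel % m
    candidates = [base - m + digit, base + digit, base + m + digit]
    valid = [c for c in candidates if 0 <= c <= 255]   # clamp to 8-bit range
    return min(valid, key=lambda c: abs(c - pixel))    # least distortion

def extract(stego_pixel, m=8):
    return stego_pixel % m

cover = [100, 101, 102, 255, 0]
secret = [5, 0, 7, 1, 6]                               # base-8 digits
stego = [embed(p, d) for p, d in zip(cover, secret)]
print(stego, [extract(s) for s in stego])
```

Each stego pixel differs from its cover pixel by at most m - 1 levels while the digits come back unchanged, which is why stego quality (PSNR) is the figure of merit these methods compete on.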

On the Use of Gaussian Integers in Public Key Cryptosystems

We present a comparative analysis of the processes of factorization of Gaussian integers and rational integers, with the objective of demonstrating the advantages of using the former instead of the latter in RSA public key cryptosystems. We show that the level of security of a cryptosystem based on the use of the product of two Gaussian primes is much higher than that of one based on the use of the product of two rational primes occupying the same storage space. Consequently, to achieve a certain specific degree of security, the use of complex Gaussian primes would require much less storage space than the use of rational primes, leading to substantial saving of expenditure. We also set forth a scheme in which rings of algebraic integers of progressively higher and higher degrees and class numbers can be used to build cryptosystems that remain secure by forever staying ahead of advances in computing power.

Aakash Paul, Somjit Datta, Saransh Sharma, Subhashis Majumder
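The property such a cryptosystem builds on is that the norm of a Gaussian integer is multiplicative, so the norm of a product of two Gaussian primes factors exactly like an RSA modulus. A minimal sketch of the arithmetic (illustrative only; the tuple representation is an assumption):

```python
def gmul(a, b):
    """Multiply Gaussian integers (a0 + a1*i) * (b0 + b1*i)."""
    return (a[0] * b[0] - a[1] * b[1], a[0] * b[1] + a[1] * b[0])

def gnorm(a):
    """Multiplicative norm N(a) = a0^2 + a1^2."""
    return a[0] * a[0] + a[1] * a[1]

# (4 + i) and (5 + 2i) are Gaussian primes (norms 17 and 29 are
# rational primes); their product plays the role of the modulus.
p, q = (4, 1), (5, 2)
n = gmul(p, q)            # (18, 13)
# N(pq) = N(p) * N(q) = 17 * 29 = 493
```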

Stack Overflow Based Defense for IPv6 Router Advertisement Flooding (DoS) Attack

Internet Protocol version 6 (IPv6) is the future of the Internet, but it still has some serious security problems, e.g., the IPv6 Router Advertisement (RA) flood. IPv6 RA flooding is a severe Denial of Service (DoS) attack: using only a single computer, an attacker can attack all the computers connected to the local area network, leaving every victim machine frozen and unresponsive. In this chapter, we describe the IPv6 RA flood attack in a test environment and its effects on victims, and we propose an effective stack-based countermeasure. The proposed system uses a stack at the victim's machine; all incoming RA packets are pushed onto this stack before they are processed by the system. At regular intervals, an RA packet is popped from the stack and sent for processing. The proposed approach protects against the IPv6 RA flood, detects the attack, and raises an alarm to alert the user. With some modification, it can also be used to counter other DoS attacks. We also provide an algorithm and experimental results in this chapter.

Jai Narayan Goel, B. M. Mehtre
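The stack-based buffering described above can be sketched as follows (an illustrative sketch, not the authors' implementation; the class name, capacity, and alarm policy are assumptions): incoming RAs are pushed onto a bounded stack, one is popped per timer tick for normal processing, and an overflowing stack signals a flood.

```python
class RAStackDefense:
    """Buffer incoming Router Advertisements on a bounded stack and
    process them at a fixed rate; an overflowing stack signals a flood."""

    def __init__(self, capacity=100):
        self.stack = []
        self.capacity = capacity
        self.alarm = False

    def on_ra_packet(self, pkt):
        """Called for every incoming RA; returns False if dropped."""
        if len(self.stack) >= self.capacity:
            self.alarm = True      # likely RA flood: alert the user, drop
            return False
        self.stack.append(pkt)
        return True

    def process_one(self):
        """Called on a timer tick: pop one RA for normal processing."""
        return self.stack.pop() if self.stack else None
```

Because processing is rate-limited by the timer rather than by packet arrival, a flood fills the stack instead of freezing the host.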

Ideal Contrast Visual Cryptography for General Access Structures with AND Operation

In a Visual Cryptographic Scheme (VCS), reversing operations (NOT operations) are carried out during the reconstruction process to improve the quality of the reconstructed secret image. Many studies have reported on the use of reversing operations in both perfect and non-perfect black VCS. In 2005, Cimato et al. put forward an ideal contrast VCS with reversing, applicable to any access structure (IC-GA-VCS). Here there is no change in resolution for the reconstructed secret image in comparison with the original secret image (KI). In this paper, a proposal for replacing reversing operations with AND operations during reconstruction in perfect black VCS is shown. A comparison of the proposed modification with the Cimato et al. construction and other related work is also discussed.

Kanakkath Praveen, M. Sethumadhavan

Extending Attack Graph-Based Metrics for Enterprise Network Security Management

Measurement of enterprise network security is a long-standing challenge to the research community, yet practical security metrics are vital for securing enterprise networks. With constant change in the size and complexity of enterprise networks, and in their application portfolios, the network attack surface keeps changing, making the monitoring of security performance an increasingly difficult problem. Existing attack graph-based security metrics are inefficient in capturing change in the network attack surface. In this paper, we explore the possible use of graph-based distance metrics for capturing the change in the security level of dynamically evolving enterprise networks. We use classical graph similarity measures, namely Maximum Common Subgraph (MCS) and Graph Edit Distance (GED), as indicators of change in enterprise network security. Our experimental results show that graph similarity measures are efficient and capable of capturing the changing network attack surface in dynamic (i.e., time-varying) enterprise networks.

Ghanshyam S. Bopche, Babu M. Mehtre
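For graphs whose nodes are uniquely labeled (as attack-graph nodes keyed by vulnerability identifiers typically are), the Maximum Common Subgraph reduces to a set intersection, which gives a simple MCS-based distance. A hedged sketch of that standard formula (the normalization is a common textbook choice, not necessarily the one used by the authors):

```python
def mcs_distance(nodes1, edges1, nodes2, edges2):
    """MCS-based graph distance: 1 - |MCS| / max(|G1|, |G2|),
    where graph size counts nodes plus edges. For uniquely labeled
    graphs the MCS is just the intersection of node and edge sets."""
    common_nodes = nodes1 & nodes2
    # keep only shared edges whose endpoints survive in the common node set
    common_edges = {(u, v) for (u, v) in edges1 & edges2
                    if u in common_nodes and v in common_nodes}
    mcs_size = len(common_nodes) + len(common_edges)
    g1_size = len(nodes1) + len(edges1)
    g2_size = len(nodes2) + len(edges2)
    return 1.0 - mcs_size / max(g1_size, g2_size)
```

A distance of 0 means the attack surface is unchanged between the two snapshots; values near 1 indicate the network's exploitable paths have changed substantially.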

A Graph-Based Chameleon Signature Scheme

Ordinary digital signatures are universally verifiable, but they are unsuitable for certain applications that are personally or commercially sensitive. The chameleon signature, introduced by Krawczyk and Rabin, is non-interactive and prevents the receiver of a signed document from disclosing its contents to third parties. Most efficient chameleon signature schemes are based on known hard problems such as integer factorization and discrete logarithm. In this paper, a chameleon signature scheme is proposed, and to the best of our knowledge it is the first such scheme using graph theory. It is a collision-resistant scheme that also satisfies security requirements such as message hiding, key exposure freeness and semantic security.

P. Thanalakshmi, R. Anitha

Design of ECSEPP: Elliptic Curve Based Secure E-cash Payment Protocol

In the present e-commerce scenario, one of the most popular terms is e-cash. E-cash was developed to allow fully anonymous, secure electronic cash transfer to support online trading between buyers and sellers, and e-cash transfer systems have made electronic transactions possible. In this paper we propose an elliptic curve based secure e-cash payment protocol. The proposed system secures transactions not only through the nature of the curve but also by using a hash function to enhance the desired security measures. It also ensures mutual authentication, anonymity, non-repudiation and traceability of the users.

Aditya Bhattacharyya, S. K. Setua

Security Improvement of One-Time Password Using Crypto-Biometric Model

In many e-commerce systems, the one-time password (OTP) concept is used to counter network eavesdropping and replay attacks. However, if the OTP itself is attacked, the account of the legitimate client may also be compromised. This paper proposes a model for improving the security of OTPs using ECC with iris biometrics for e-commerce transactions. The model offers improved security with a shorter key length than RSA and avoids the need to remember private keys, as they are generated dynamically as and when required.

Dindayal Mahto, Dilip Kumar Yadav

An Image Encryption Technique Using Orthonormal Matrices and Chaotic Maps

Many image encryption techniques have been designed based on block-based transformation techniques or chaos-based algorithms. In this paper, a two-stage image encryption technique is proposed. In the first stage, a new method of generating orthonormal matrices is presented and used along with sparse matrices to generate key matrices, which are used to transform the original image. Since chaos-based encryption techniques are very efficient due to their sensitivity to initial conditions and system parameters, this paper makes an analytical study of some recent chaos-based image encryption techniques. In the second stage, the image is encrypted using the studied algorithm that offers an optimal balance between robustness and execution time. The transformation is very fast and the overall cryptosystem is very robust, as can be observed from entropy analysis, resistance to statistical and differential attacks, and a large key space.

Animesh Chhotaray, Soumojit Biswas, Sukant Kumar Chhotaray, Girija Sankar Rath
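A common second-stage chaos cipher of the kind surveyed above XORs the image bytes with a keystream drawn from the logistic map; sensitivity to the initial value x0 and parameter r is what provides the large key space. An illustrative sketch (the parameters and byte-extraction rule are assumptions, not the paper's chosen algorithm):

```python
def logistic_keystream(x0, r, n):
    """Generate n keystream bytes from the logistic map x -> r*x*(1-x).
    Tiny changes to x0 or r diverge quickly, giving key sensitivity."""
    x, out = x0, []
    for _ in range(n):
        x = r * x * (1.0 - x)
        out.append(int(x * 256) % 256)
    return out

def chaos_xor(data, x0=0.3141592, r=3.99):
    """XOR data with the chaotic keystream; applying it twice with the
    same (x0, r) key recovers the original bytes."""
    ks = logistic_keystream(x0, r, len(data))
    return bytes(b ^ k for b, k in zip(data, ks))
```

Because XOR is an involution, the same function serves as both encryption and decryption.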

Learning Probe Attack Patterns with Honeypots

The rapid growth of the Internet and Internet-based applications has given rise to a growing number of attacks on the network. The way an attacker attacks a system differs from one attacker to another. The sequence of attack, or the signature of an attacker, should be stored, analyzed and used to generate rules for mitigating future attack attempts. In this paper, we deploy a honeypot to record the activities of the attacker. While the attacker prepares for an attack, the IDS redirects him to the honeypot, making him believe that he is working with the actual system. The activities related to the attack are recorded by the honeypot as it interacts with the intruder; these recorded activities are analyzed by the network administrator and the rule database is updated. As a result, we improve the detection accuracy and security of the system using the honeypot, without any loss or damage to the original system.

Kanchan Shendre, Santosh Kumar Sahu, Ratnakar Dash, Sanjay Kumar Jena

Operating System and Software Analysis

Frontmatter

Fuzzy Based Multilevel Feedback Queue Scheduler

In the multilevel feedback queue scheduling algorithm, the major concern is to improve turnaround time while keeping the system responsive to the user. The presence of vagueness in a system can further affect these performance metrics. With this intent, we propose a fuzzy based multilevel feedback queue scheduler that deals with the vagueness of the parameters associated with tasks and improves system performance by reducing waiting time, response time, turnaround time and normalized turnaround time. Performance analysis shows that our methodology performs better than the standard multilevel feedback scheduling approach.

Supriya Raheja, Reena Dadhich, Smita Rajpal

Dynamic Slicing of Feature-Oriented Programs

We suggest a dynamic slicing algorithm for feature-oriented programs, which we name the Execution Trace file Based Dynamic Slicing (ETBDS) algorithm. The ETBDS algorithm constructs an intermediate program representation known as the Dynamic Feature-Oriented Dependence Graph (DFDG), based on the various dependences that exist amongst the program statements. We use an execution trace file to keep the execution history of the program. The dynamic slice is computed by first performing a breadth-first or depth-first traversal on the DFDG and then mapping the resultant nodes to program statements.

Madhusmita Sahu, Durga Prasad Mohapatra
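The final step, traversing the dependence graph from the slicing criterion, can be sketched as a breadth-first search (an illustrative sketch; the DFDG is represented here simply as a map from each statement to the statements it depends on, which is an assumption about the graph's encoding):

```python
from collections import deque

def dynamic_slice(dfdg, criterion):
    """Compute a dynamic slice by BFS over a dependence graph.
    dfdg maps each statement to the statements it depends on
    (data/control dependences recorded from the execution trace);
    the slice is every statement reachable from the criterion."""
    visited = {criterion}
    queue = deque([criterion])
    while queue:
        stmt = queue.popleft()
        for dep in dfdg.get(stmt, ()):
            if dep not in visited:
                visited.add(dep)
                queue.append(dep)
    return sorted(visited)
```

Swapping the deque's `popleft` for `pop` turns the same traversal into the depth-first variant the abstract also mentions; both return the same reachable set.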

Mathematical Model to Predict IO Performance Based on Drive Workload Parameters

Disk drive technologies have evolved rapidly over the last decade to address the needs of big data. Due to the rapid growth of social media, data availability and data protection have become essential, and both ideally depend on the reliability of the disk drive. Disk drive speed and performance at minimum cost still play a vital role in the current data storage industry compared to faster storage devices such as NVRAM and SSD. A disk drive performance model plays a critical role in sizing an application to deliver the performance the business needs. The proposed performance model predicts how well an application will perform on a selected disk drive, based on performance indices such as response time, MBPS and IOPS, when the drive runs the intended workload.

Taranisen Mohanta, Leena Muddi, Narendra Chirumamilla, Aravinda Babu Revuri

Measure of Complexity for Object-Oriented Programs: A Cognitive Approach

A new cognitive metric named “Propose Object-Oriented Cognitive Complexity” (POOCC) is proposed to measure the complexity of Object-Oriented (OO) programs. The proposed metric is based on fundamental factors such as the number of operands and operators in a method of an OO class, the cognitive weight of the basic control structures, and the ratio of accessing similar parameters of an OO class. The proposed cognitive metric is evaluated with the help of 16 C++ programs and validated against Weyuker's properties; seven of the nine properties are satisfied. Furthermore, this work also presents the relationship between POOCC and Lines of Code (LOC) to examine the density of the code.

Amit Kumar Jakhar, Kumar Rajnish

Measurement of Semantic Similarity: A Concept Hierarchy Based Approach

Resolving semantic heterogeneity is one of the major issues in many fields, namely natural language processing, search engine development, document clustering, geospatial information retrieval and knowledge discovery. Semantic heterogeneity is often considered an obstacle to realizing full interoperability among diverse datasets, and an appropriate measurement metric is essential to properly understand the extent of similarity between concepts. The proposed approach is based on the notion of a concept hierarchy, built here using WordNet, a semantic lexical database. A measurement metric is also proposed to quantify the extent of similarity between a pair of concepts. The work is compared with existing methodologies on the Miller-Charles benchmark dataset using three correlation coefficients (Pearson's, Spearman's and Kendall's Tau rank correlation coefficients). The proposed approach is found to yield better results than most of the existing techniques.

Shrutilipi Bhattacharjee, Soumya K. Ghosh
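A path-based similarity over a concept hierarchy of the kind described above can be sketched with a Wu-Palmer-style measure (illustrative only; the toy hierarchy and this particular metric are assumptions, not the paper's proposed measure):

```python
def ancestors(parent, c):
    """Path from concept c up to the root of the hierarchy, inclusive."""
    path = [c]
    while c in parent:
        c = parent[c]
        path.append(c)
    return path

def wup_similarity(parent, a, b):
    """Wu-Palmer-style similarity on a concept hierarchy:
    2*depth(LCS) / (depth(a) + depth(b)), with the root at depth 1."""
    pa, pb = ancestors(parent, a), set(ancestors(parent, b))
    lcs = next(c for c in pa if c in pb)        # lowest common subsumer
    depth = lambda c: len(ancestors(parent, c))
    return 2.0 * depth(lcs) / (depth(a) + depth(b))
```

Identical concepts score 1.0, and the score falls as the lowest common subsumer moves up the hierarchy, which is the intuition behind concept-hierarchy-based similarity.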

Evaluating the Effectiveness of Conventional Fixes for SQL Injection Vulnerability

The computer world is familiar with SQL, as it plays a major role in the development of web applications. Almost all applications have data to be stored for future reference, and most of them use an RDBMS; many applications choose their backend from the SQL variants. Large and important applications, such as banking and credit-card systems, hold highly sensitive data in their databases. With the incredible advancement in technology, almost no data can survive the omniscient eyes of attackers; the only thing that can be done is to make the attacker's work difficult. Conventional fixes help prevent attacks to an extent, but there is a need for authentic work on the effectiveness of these fixes. In this paper, we present a study of popular SQL Injection Attack (SQLIA) techniques and the effectiveness of conventional fixes in reducing them. To address SQLIAs in depth, a thorough background study was done and the mitigation techniques were evaluated using both automated and manual testing; we took the help of a renowned penetration testing tool, SQLMap, for the automated testing. The results indicate the importance of incorporating these mitigation techniques in the code rather than resorting to complex fixes that require both effort and time.

Swathy Joseph, K. P. Jevitha

Salt and Pepper Noise Reduction Schemes Using Cellular Automata

Two filters based on two-dimensional Cellular Automata (CA) are presented in this paper for salt and pepper noise reduction in images. Parallel algorithms for removing noise from corrupted images are in demand, so we utilize the idea of cellular automata to cater to this need. The filters are designed according to the neighborhood structure of a cell under different boundary conditions. The performance of the proposed filters is compared with that of existing filters in terms of peak signal-to-noise ratio (PSNR), and it is observed that the proposed filters are extremely promising for reducing salt and pepper noise. Their primary advantage is that they preserve more image details while suppressing noise.

Deepak Ranjan Nayak, Ratnakar Dash, Banshidhar Majhi
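One CA update step of such a filter can be sketched as follows (an illustrative sketch, not the papers' exact rules: each salt or pepper cell takes the median of its Moore neighborhood, out-of-image neighbors are simply ignored, and clean cells are left untouched, which is what preserves image detail):

```python
def ca_denoise_step(img):
    """One cellular-automaton step over a 2D list of pixel values:
    cells detected as salt (255) or pepper (0) are replaced by the
    median of their Moore neighborhood; all other cells are kept."""
    h, w = len(img), len(img[0])
    out = [row[:] for row in img]
    for i in range(h):
        for j in range(w):
            if img[i][j] not in (0, 255):
                continue                      # not salt/pepper: keep as is
            neigh = [img[i + di][j + dj]
                     for di in (-1, 0, 1) for dj in (-1, 0, 1)
                     if not (di == 0 and dj == 0)
                     and 0 <= i + di < h and 0 <= j + dj < w]
            neigh.sort()
            out[i][j] = neigh[len(neigh) // 2]
    return out
```

Every cell's update depends only on its neighborhood, so all cells can be updated in parallel, which is the CA property the abstract appeals to.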

Internet, Web Technology, and Web Security

Frontmatter

Optimizing the Performance in Web Browsers Through Data Compression: A Study

Web browsers are used on a daily basis, and web compression plays an important role in compressing web page content, thereby letting users load pages much faster. This study examines the concept of web compression and its existing techniques. The outcome of the study of existing research contributions presents the various challenges in web compression. Other problem issues related to this work are discussed and will be taken up in future work.

S. Aswini, G. ShanmugaSundaram, P. Iyappan

Architectural Characterization of Web Service Interaction Verification

Web service interaction verification utilizes disparate models, as it still does not have a model of its own. Adapting a different model is not always beneficial, as it may prune several significant characteristics worth considering in verification. The primary reason behind this adaptation is that the primitive characteristics of the Web service interaction model are not well identified, standardized, and established. In this article, we therefore investigate the primitive characteristics of the Web service interaction model that need to be considered in verification. Further, we study the appropriateness and effectiveness of two modeling and verification phenomena, namely model checking and module checking, with respect to the investigated primitive characteristics.

Gopal N. Rai, G. R. Gangadharan

Revamping Optimal Cloud Storage System

With the prosperous growth of cloud computing, it is widely used in business as well as in research today, considering its numerous advantages over traditional approaches. However, security and privacy concerns are growing day by day. The cloud is being utilized not only for software and platforms over the Internet, but also for storing confidential data. This paper presents a novel approach wherein a multi-cloud environment is used to mitigate data privacy concerns. The viability and design of this method are described in the paper.

Shraddha Ghogare, Ambika Pawar, Ajay Dani

Performance Improvement of MapReduce Framework by Identifying Slow TaskTrackers in Heterogeneous Hadoop Cluster

MapReduce is presently recognized as a significant parallel and distributed programming model with wide acclaim for large-scale computing. The MapReduce framework divides a job into map and reduce tasks and schedules these tasks in a distributed manner across the cluster. Scheduling of tasks and identification of “slow TaskTrackers” in heterogeneous Hadoop clusters is the focus of recent research, as MapReduce performance is currently limited by its default scheduler, which does not adapt well to heterogeneous environments. In this paper, we propose a scheduling method to identify “slow TaskTrackers” in a heterogeneous Hadoop cluster and implement it by integrating it with the Hadoop default scheduling algorithm. The performance of this method is compared with that of the Hadoop default scheduler. We observe that the proposed approach shows modest but consistent improvement over the default Hadoop scheduler in heterogeneous environments, minimizing overall job execution time.

Nenavath Srinivas Naik, Atul Negi, V. N. Sastry
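One simple way to flag slow TaskTrackers, sketched here as an assumption rather than the paper's exact criterion, is to compare each tracker's average task progress rate against a fraction of the cluster-wide mean (the threshold and the tracker names below are illustrative):

```python
def find_slow_tasktrackers(progress_rates, threshold=0.5):
    """Flag TaskTrackers whose average task progress rate falls below
    a fraction of the cluster-wide mean rate.
    progress_rates: {tracker_name: average progress per second}."""
    mean = sum(progress_rates.values()) / len(progress_rates)
    return sorted(t for t, r in progress_rates.items()
                  if r < threshold * mean)
```

A scheduler can then avoid assigning further tasks (or speculative copies) to the flagged trackers, which is how such identification shortens overall job execution time in a heterogeneous cluster.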

Router Framework for Secured Network Virtualization in Data Center of IaaS Cloud

A data center exploits network virtualization to fully utilize physical network resources by collocating tenants' virtual networks. The virtual networks consist of sets of virtual routers connected by virtual links. Network virtualization must efficiently embed virtual networks on the physical network of the data center, balancing load among physical resources to fully utilize the physical network. The virtual networks must also be securely managed so that they are not compromised by collocated users or by a data center network administrator who has direct access to the physical network. In this paper, we propose a router framework in which virtual routers and links can be securely placed on physical routers by adding a virtual plane on top of the data and control planes, two abstract protocols, and an enforcement of the Federation Access Control Model (FACM). The two abstract protocols, viz. the Secure Virtual Topology Embedding Protocol (SVTEP) and the Node-and-Path Label Distribution Protocol (NPLDP), are presented along with a theoretical evaluation of the proposed router framework against all the aforesaid requirements.

Anant V. Nimkar, Soumya K. Ghosh

Analysis of Machine Learning Techniques Based Intrusion Detection Systems

Attacks on computer networks are one of the major threats to Internet use today. Intrusion Detection Systems (IDS) are among the security tools available to detect possible intrusions in a network or a host. Research has shown that the application of machine learning techniques to intrusion detection can achieve a high detection rate as well as a low false positive rate. This paper discusses some commonly used machine learning techniques in Intrusion Detection Systems and reviews some existing machine learning based IDSs proposed by various authors.

Rupam Kr. Sharma, Hemanta Kumar Kalita, Parashjyoti Borah

Compression and Optimization of Web-Contents

The various image formats used online have different characteristics and are only appropriate in particular situations. This paper draws a boundary around each format so that it can be used efficiently. It also discusses tricks and tips for optimizing content-rich websites on mobile devices, which generally have low processing power. Compression is required to save bandwidth and provide speedy access to content, and thus increases the efficiency of websites, with the greatest effect on performance and overall load time over slow Internet connections. Reducing network latency has become very important so that many devices can connect to the server at the same time; it is even more important to save the bandwidth of users who pay to view websites. This paper aims to outline techniques for reducing the size of a website and to encourage developers to code efficiently.

Suraj Thapar, Sunil Kumar Chowdhary, Devashish Bahri
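The bandwidth saving that motivates web compression is easy to quantify with a standard-library DEFLATE pass over a repetitive response body (an illustrative measurement, not a technique from the paper; the sample markup is made up):

```python
import zlib

def deflate_ratio(payload: bytes, level=6) -> float:
    """Compress a response body as an HTTP server would for
    Content-Encoding: deflate, and return compressed/original size."""
    return len(zlib.compress(payload, level)) / len(payload)

# highly repetitive markup, typical of generated HTML lists
html = b"<ul>" + b"<li>item</li>" * 500 + b"</ul>"
ratio = deflate_ratio(html)
```

For markup this repetitive the ratio falls well below 10 %, which is why compressing text assets is usually the cheapest page-load optimization available.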

Harnessing Twitter for Automatic Sentiment Identification Using Machine Learning Techniques

User-generated content on Twitter gives an ample source for gathering individuals' opinions. Because of the huge number of tweets in the form of unstructured text, it is impossible to summarize the information manually. Accordingly, efficient computational methods are needed for mining and summarizing tweets from corpora, which requires knowledge of sentiment-bearing words. Many computational techniques, models and algorithms exist for identifying sentiment in unstructured text; most rely on machine learning techniques with a bag-of-words (BoW) representation as their basis. In this paper, we apply three machine learning algorithms (Naive Bayes (NB), Maximum Entropy (ME) and Support Vector Machines (SVM)) to sentiment identification of tweets, to study the effectiveness of various feature combinations. Our experiments demonstrate that NB with Laplace smoothing using unigrams and Part-of-Speech (POS) tags as features, and SVM with unigrams as features, are effective in classifying tweets.

Amiya Kumar Dash, Jitendra Kumar Rout, Sanjay Kumar Jena

Big Data and Recommendation Systems

Frontmatter

Analysis and Synthesis for Archaeological Database Development in Nakhon Si Thammarat

Nakhon Si Thammarat, a province in Southern Thailand, was the heartland of ancient cultures and kingdoms, developed over several centuries, which left behind tremendously valuable archaeological finds, including artifacts, structures, and sites. Unfortunately, these precious archaeological treasures are disturbed and destroyed daily by looting and modern land development. To keep a record of this irreplaceable archaeological data, we decided to develop an archaeological database. This database is managed within an RDBMS aimed at describing the relationships between artifacts, structures, and the historical, stratigraphic and environmental complexity of archaeological sites, and is directly connected to a digital archive containing the raw data. In order to analyze and synthesize the archaeological database, we surveyed VRA, a currently popular metadata standard for archaeological data, and other existing solutions. In this paper, we propose an extension of the VRA metadata structure and a logical database for the specific archaeological data of the area.

Kanitsorn Suriyapaiboonwattana

Evaluation of Data Mining Strategies Using Fuzzy Clustering in Dynamic Environment

Recent applications of data mining, such as biological, scientific and financial applications, involve data that changes regularly and is uncertain and incomplete. To keep the tendencies found in these data up to date, we need to extend existing data mining algorithms with dynamic characteristics. Soft computing methods are suitable for finding changes in uncertain data. To adapt to changes in the data, we can apply either of two approaches: update the algorithm ignoring the earlier state, or update it with respect to the earlier state. In this paper, we frame two fuzzy clustering methods based on these approaches, implement them in R, and compare them.

Chatti Subbalakshmi, G. Ramakrishna, S. Krishna Mohan Rao

A Survey of Different Technologies and Recent Challenges of Big Data

Big Data, the buzzword around the globe in recent days, refers to large-scale data with huge volume and variety and a genuinely complex structure. The last few years of Internet technology and the computer world have seen enormous growth in the popularity of cloud computing, and as a consequence, cloud applications are continually generating big data. There are various burning research problems associated with big data, such as how to store, analyze and visualize it to generate further outcomes. This paper first points out recently developed information technologies in the field of big data, and then outlines major key problems such as proper load balancing, storage and processing of small files, and de-duplication of big data.

Dipayan Dev, Ripon Patgiri

Privacy Preserving Association Rule Mining in Horizontally Partitioned Databases Without Involving Trusted Third Party (TTP)

In this research work, we design a security protocol that derives association rules securely from horizontally distributed databases without a trusted third party (TTP), even when the communication channel between the involved sites is unsecured. It ensures the privacy and security of the data owners with the help of elliptic curve based Diffie-Hellman key exchange and the Digital Signature Algorithm.

Chirag N. Modi, Ashwini R. Patil

Performance Efficiency and Effectiveness of Clustering Methods for Microarray Datasets

Numerous clustering methods work proficiently on low-dimensional data; however, clustering high-dimensional data remains challenging with respect to time complexity and accuracy. This paper presents the performance efficiency and effectiveness of K-means and agglomerative hierarchical clustering methods on microarray datasets, based on the Euclidean distance function and the quality measures precision, recall and F-measure, for varying numbers of clusters. Efficiency concerns the computational time required to process the dataset, while effectiveness concerns the accuracy of the clustering. Experimental results on microarray datasets reveal that the K-means clustering algorithm is favorable in terms of effectiveness, whereas the efficiency of the clustering algorithms depends on the dataset used in the empirical study.

Smita Chormunge, Sudarson Jena
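The precision/recall/F-measure evaluation used above can be sketched for a single cluster scored against a single true class (a standard computation; representing cluster and class as sets of item ids is illustrative):

```python
def f_measure(cluster, klass):
    """Precision, recall and F-measure of one cluster against one
    true class, both given as sets of item ids."""
    tp = len(cluster & klass)          # items correctly grouped together
    precision = tp / len(cluster)      # purity of the cluster
    recall = tp / len(klass)           # coverage of the class
    f = (2 * precision * recall / (precision + recall)
         if precision + recall else 0.0)
    return precision, recall, f
```

The overall clustering score is then typically a size-weighted average of the best F-measure each class attains over all clusters.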

Data Anonymization Through Slicing Based on Graph-Based Vertical Partitioning

Data anonymization is a technique that uses data distortion to preserve the privacy of public data to be published. Several data anonymization techniques and principles have been proposed in the past, such as k-anonymity, l-diversity, and slicing; slicing promises to address the drawbacks of the other two anonymization models. Our proposition is to use a graph-based vertical partitioning algorithm (GBVP) in the slicing process instead of the originally proposed Partitioning Around Medoids (PAM). We present several arguments that favor GBVP over PAM as the choice of clustering algorithm.

Kushagra Sharma, Aditi Jayashankar, K Sharmila Banu, B. K. Tripathy

An Efficient Approach for the Prediction of G-Protein Coupled Receptors and Their Subfamilies

G-protein coupled receptors are responsible for many physiochemical processes such as neurotransmission, metabolism, cellular growth and immune response, so it is necessary to design a robust and efficient approach for the prediction of G-protein coupled receptors and their subfamilies. To address this classification problem, we propose to use a weighted k-nearest neighbor classifier with the union of the best 50 features selected by Fisher score based feature selection, ReliefF, fast correlation based filter, minimum redundancy maximum relevancy, and support vector machine based recursive feature elimination. Using 10-fold cross validation, the proposed method achieved overall accuracies of 99.9 % and 98.3 %, MCC values of 1.00 and 0.98, ROC area values of 1.00 and 0.998, and precisions of 99.9 % and 98.3 % for predicting G-protein coupled receptors and their subfamilies, respectively.

Arvind Kumar Tiwari, Rajeev Srivastava, Subodh Srivastava, Shailendra Tiwari

Fault and Delay Tolerant Systems

Frontmatter

Max-Util: A Utility-Based Routing Algorithm for a Vehicular Delay Tolerant Network Using Historical Information

A vehicular delay tolerant network (VDTN) has emerged as a special type of delay tolerant network (DTN), characterized by the non-existence of end-to-end connectivity between source and destination nodes, thereby diminishing the possibility of delivering messages to the destination node. Hence, routing becomes a vital issue in VDTNs. In this paper, a utility-based routing algorithm (Max-Util) for a VDTN is proposed. The algorithm exploits the historical information of a node to compute its utility and decide whether to forward a message to it. This history comprises the node's past encounters with destination and relay nodes, the contact duration between nodes, and the remaining buffer size of the node in contact. Our evaluations show that Max-Util performs better in terms of message delivery ratio, overhead ratio and average delivery latency compared to existing DTN routing approaches such as PRoPHET and Spray-and-Wait.

Milind R. Penurkar, Umesh A. Deshpande
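A utility of the kind Max-Util computes can be sketched as a weighted combination of the history terms listed above (the linear form, the weights, and the function names are assumptions for illustration, not the paper's exact formula):

```python
def node_utility(encounters, avg_contact_duration, free_buffer,
                 w=(0.5, 0.3, 0.2)):
    """Illustrative relay utility: a weighted sum of normalized
    history terms (all inputs assumed scaled to [0, 1]):
    past encounters with the destination, average contact duration,
    and remaining buffer fraction."""
    return (w[0] * encounters
            + w[1] * avg_contact_duration
            + w[2] * free_buffer)

def should_forward(carrier, candidate):
    """Forward a message only to a node with strictly higher utility."""
    return node_utility(*candidate) > node_utility(*carrier)
```

Forwarding only toward strictly increasing utility is what bounds the overhead ratio: each message copy moves monotonically toward better-connected, better-provisioned relays.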

A Novel Approach for Real-Time Data Management in Wireless Sensor Networks

Wireless sensor networks (WSNs) have become an emerging area of research nowadays, playing an important role in supporting a wide range of applications. In this work, we propose an architecture for data management in WSNs in which, within each cluster, sensor nodes transfer data to their cluster head (CH), and among the CHs one is selected based on the highest energy level. The CH aggregates data and reduces the size of the data to be transmitted to the base station. We also propose an algorithm that determines how fast data is transferred from the source node to the base station.

Joy Lal Sarkar, Chhabi Rani Panigrahi, Bibudhendu Pati, Himansu Das

Contact Frequency and Contact Duration Based Relay Selection Approach Inside the Local Community in Social Delay Tolerant Network

A Social Delay Tolerant Network (DTN) allows mobile devices to exchange data opportunistically, without an end-to-end path, when they come into contact. As connectivity is opportunistic, the routing task becomes very challenging. Investigation of real social network traces shows that a node tends to meet a certain group of nodes more frequently and regularly than nodes outside its group; these groups form local communities. Many social characteristics of human beings (e.g., centrality, similarity) have been exploited to select an appropriate relay node, and many community based routing protocols in the literature select the relay node as one of the community members or the most central node inside the community. In this paper, we propose two approaches to relay selection inside the local community: a contact frequency based approach and a contact duration based approach. When the message carrier and the encountered node belong to the same community as the message's destination, relay selection is based on the contact frequency and contact duration of the node with the destination, reflecting the observation that people in the same community meet very often and spend more time with only a few members of their community than with all other members. To evaluate the performance of our approaches, we chose real traces from campus and conference environments. Simulation results on the real traces show that the proposed relay selection approaches outperform an existing scheme in terms of delivery ratio in the campus environment, while performing similarly to it in the conference environment.

Nikhil N. Gondaliya, Dhaval Kathiriya, Mehul Shah
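The two relay-selection rules can be sketched as below; the contact-history layout, node names, and numbers are illustrative assumptions, not values from the paper.

```python
# Hypothetical per-node contact histories: for each encountered peer we keep
# how many times the node met it (count) and the cumulative contact time (s).
candidates = [
    {"id": "a", "contacts": {"dst": {"count": 5, "duration": 120.0}}},
    {"id": "b", "contacts": {"dst": {"count": 9, "duration": 60.0}}},
    {"id": "c", "contacts": {}},   # never met the destination
]

def pick_relay_by_frequency(candidates, dest):
    """Contact-frequency approach: prefer the node that has met the
    destination most often."""
    return max(candidates,
               key=lambda n: n["contacts"].get(dest, {}).get("count", 0))

def pick_relay_by_duration(candidates, dest):
    """Contact-duration approach: prefer the node that has spent the most
    total time in contact with the destination."""
    return max(candidates,
               key=lambda n: n["contacts"].get(dest, {}).get("duration", 0.0))

print(pick_relay_by_frequency(candidates, "dst")["id"])  # b (9 contacts)
print(pick_relay_by_duration(candidates, "dst")["id"])   # a (120 s total)
```

Note that the two rules can disagree, as here: node b meets the destination more often, but node a spends more time with it, which is exactly why the paper evaluates both approaches.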

Satellite Communication, Antenna Research, and Cognitive Radio

Frontmatter

Optimal Structure Determination of Microstrip Patch Antenna for Satellite Communication

Microstrip antenna design has seen considerable development in recent years and continues to evolve. There are different kinds of microstrip antennas that can be used in many handheld and communication devices, such as satellite links, radar systems, radios and cellular mobiles. In this paper the behavior of a microstrip antenna is analyzed with two types of electromagnetic band-gap (EBG) structure: the frequency selective surface (FSS) and the photonic band-gap (PBG) structure. The major characteristic of an EBG structure is its band-gap feature, which suppresses surface-wave propagation. This aspect helps to give better antenna performance, such as increased gain and reduced back radiation. In particular, the distance between the EBG cells and the patch is independent of the cell period, which can be arbitrarily selected, and the final setup offers a footprint reduction. With these factors, the gain, bandwidth, and voltage standing wave ratio (VSWR) are analyzed using the FEM-based EM simulator HFSS.

A. Sahaya Anselin Nisha

Multi-metric Routing Protocol for Multi-radio Wireless Mesh Networks

Wireless Mesh Networks (WMNs) are emerging as a low-cost technology for building broadband access networks. With the demand for real-time multimedia applications such as voice over IP and video on demand, providing better quality of service (QoS) in WMNs has become an important research issue. To address this, many cross-layer routing metrics have been proposed in the literature. However, using a single link-quality metric for different types of traffic within the network is insufficient to guarantee fine-grained QoS. Towards this, we propose a multi-metric routing protocol that finds the optimal path using one of two routing metrics, depending on the type of traffic. We use the Metric for Interference, Load and Delay (MILD) for TCP traffic and Contention Aware Transmission Time (CATT) for CBR traffic. MILD is a throughput-optimizing routing metric, whereas CATT is a delay-sensitive metric suitable for multimedia traffic. We implemented these metrics in the well-known proactive routing protocol Optimized Link State Routing (OLSR) in ns-2.34. The results indicate that the multi-metric protocol performs better in terms of throughput and average end-to-end delay compared to a single metric.

D. G. Narayan, Jyoti Amboji, T Umadevi, Uma Mudenagudi
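The core idea of picking a routing metric per traffic type can be sketched as follows; the link-metric values, node names, and path set are illustrative assumptions, not results from the paper.

```python
# Toy per-link metric tables (lower is better). MILD scores favor
# high-throughput links; CATT scores favor low-contention, low-delay links.
# All numbers are made up for illustration.
mild = {("s", "x"): 3.0, ("x", "d"): 3.0, ("s", "y"): 2.0, ("y", "d"): 5.0}
catt = {("s", "x"): 4.0, ("x", "d"): 4.0, ("s", "y"): 1.0, ("y", "d"): 2.0}

# Two candidate paths from source s to destination d.
paths = [[("s", "x"), ("x", "d")], [("s", "y"), ("y", "d")]]

def path_cost(path, metric):
    """Additive path cost: sum of the chosen metric over the path's links."""
    return sum(metric[link] for link in path)

def best_path(paths, traffic, mild, catt):
    """Select the metric by traffic type (MILD for TCP, CATT for CBR),
    then return the minimum-cost path under that metric."""
    metric = mild if traffic == "tcp" else catt
    return min(paths, key=lambda p: path_cost(p, metric))

print(best_path(paths, "tcp", mild, catt))  # via x: cost 6.0 vs 7.0
print(best_path(paths, "cbr", mild, catt))  # via y: cost 3.0 vs 8.0
```

The example shows why a single metric is insufficient: the same topology yields different best paths once the metric is matched to the traffic class.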

A Novel Dual Band Antenna for RADAR Application

We propose the design, performance and analysis of a novel dual band antenna for radar communication. The antenna operates at 1.55 and 6 GHz and is designed on a Rogers RT/Duroid 6202 laminate substrate (dielectric constant = 2.2). The length and width of the patch are respectively 15 and 20 mm, while the length (l) and width (w) of the slots are respectively 14 and 1 mm, with a 10 mm feed length. We used IE3D Zealand (Ver-12) to obtain the simulation results and analysis of our proposed antenna. The results are analyzed by investigating the current distribution, S(1,1), elevation pattern, directivity pattern and voltage standing wave ratio (VSWR). Our antenna covers both the L band (1–2 GHz) and the C band (4–8 GHz), where it can be used effectively for radar applications.

Kousik Roy, Debika Chaudhuri, Sukanta Bose, Atanu Nag

Backmatter
