
2017 | Book

Advanced Informatics for Computing Research

First International Conference, ICAICR 2017, Jalandhar, India, March 17–18, 2017, Revised Selected Papers

Editors: Dharm Singh, Balasubramanian Raman, Ashish Kumar Luhach, Pawan Lingras

Publisher: Springer Singapore

Book Series: Communications in Computer and Information Science


About this book

This book constitutes the refereed proceedings of the First International Conference on Advanced Informatics for Computing Research, ICAICR 2017, held in Jalandhar, India, in March 2017. The 32 revised full papers presented were carefully reviewed and selected from 312 submissions. The papers are organized in topical sections on computing methodologies, information systems, security and privacy, and network services.

Table of Contents

Frontmatter

Computing Methodologies

Frontmatter
Fuzzy Based Efficient Mechanism for URL Assignment in Dynamic Web Crawler

The World Wide Web (WWW) is a huge collection of unorganized documents. To build a database from this unorganized network, web crawlers are often used. A crawler that interacts with millions of web pages needs to be efficient in order to make a search engine powerful. This requirement necessitates the parallelization of web crawlers. In this work, a fuzzy-based technique for uniform resource locator (URL) assignment in a dynamic web crawler is proposed that utilizes the task-splitting property of the processor. To optimize the performance of the crawler, the proposed scheme addresses two important aspects: (i) creation of a crawling framework with load balancing among parallel crawlers, and (ii) speeding up the crawling process by using parallel crawlers with efficient network access. Several experiments are conducted to monitor the performance of the proposed scheme, and the results demonstrate its effectiveness.

Raghav Sharma, Rajesh Bhatia, Sahil Garg, Gagangeet Singh Aujla, Ravinder Singh Mann
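The fuzzy URL-assignment idea in the abstract above could be sketched roughly as follows. The paper's actual membership functions and rule base are not given in the abstract; the triangular membership over crawler queue length and the lowest-"busy"-score assignment rule used here are illustrative assumptions:

```python
def triangular(x, a, b, c):
    """Triangular fuzzy membership function on [a, c] with peak at b."""
    if x <= a or x >= c:
        return 0.0
    if x <= b:
        return (x - a) / (b - a)
    return (c - x) / (c - b)

def assign_url(url, crawler_loads, capacity=100):
    """Assign a URL to the parallel crawler whose fuzzy 'busy' degree
    is lowest. crawler_loads maps crawler id -> current queue length."""
    busy = {cid: triangular(load, 0, capacity, 2 * capacity)
            for cid, load in crawler_loads.items()}
    target = min(busy, key=busy.get)
    crawler_loads[target] += 1  # simulate enqueueing the URL
    return target

loads = {"c1": 20, "c2": 80, "c3": 50}
print(assign_url("http://example.com/page", loads))  # -> c1 (least busy)
```

A real dynamic crawler would also fold in network-access cost and host locality as further fuzzy inputs, which is where the load-balancing aspect of the scheme would come in.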
Towards Filtering of SMS Spam Messages Using Machine Learning Based Technique

The popularity of mobile devices is increasing day by day, as they provide a large variety of services at reduced cost. The Short Message Service (SMS) is one of the most widely used communication services. However, this popularity has led to an increase in attacks on mobile devices, such as SMS spam. In this paper, we present a novel approach that detects and filters spam messages using machine learning classification algorithms. We study the characteristics of spam messages in depth and identify ten features that can efficiently separate SMS spam messages from ham messages. Our proposed approach achieves a 96.5% true positive rate and a 1.02% false positive rate with the Random Forest classification algorithm.

Neelam Choudhary, Ankit Kumar Jain
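The ten features the paper feeds to its Random Forest classifier are not enumerated in the abstract; a sketch of the kind of lightweight, content-based features commonly used for SMS spam filtering might look like this (feature names and the spam-word list are assumptions, not the paper's):

```python
import re

def extract_features(msg):
    """Illustrative SMS spam features: message length, digit count,
    URL presence, uppercase ratio, and presence of typical spam words.
    These stand in for (but are not) the paper's ten features."""
    return {
        "length": len(msg),
        "num_digits": sum(ch.isdigit() for ch in msg),
        "has_url": int(bool(re.search(r"https?://|www\.", msg))),
        "upper_ratio": sum(ch.isupper() for ch in msg) / max(len(msg), 1),
        "has_spam_word": int(any(w in msg.lower()
                                 for w in ("free", "win", "prize", "claim"))),
    }

f = extract_features("WIN a FREE prize now! Visit www.spam.example")
print(f["has_url"], f["has_spam_word"])  # -> 1 1
```

Vectors of such features would then be passed to any standard Random Forest implementation for training and prediction.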
Intelligent Computing Methods in Language Processing by Brain

Language processing by the brain focuses on the experimental, theoretical and psychological study of brain functioning during language processing. Techniques of dynamic brain imaging and behavioral study require mathematical modeling and methods to explore this scenario. Intelligent computing methods model the observed behavior and process images to obtain a clear picture of the brain. This paper illustrates various models and methodologies of neurolinguistics, with special emphasis on intelligent computing methods in the field. Finally, a comparative study of ongoing research on aphasia and dyslexia is presented.

Ashish Ranjan, R. B. Mishra, A. K. Singh
Classification Algorithms for Prediction of Lumbar Spine Pathologies

Classification allows the classes in a dataset to be predicted correctly. In the present study, the Weka data mining tool is used to predict lumbar spine pathologies. The dataset is first classified using different algorithms, and it is then determined which classification algorithm performs best for predicting lumbar spine pathologies. Lumbar spine diseases are predicted by identifying symptoms in patients. We have evaluated and compared six classification algorithms using different evaluation criteria. For the present work, the multilayer perceptron algorithm gives the best results for predicting lumbar spine pathologies. This model can be used by radiologists for lumbar spine pathology prediction.

Rajni Bedi, Ajay Shiv Sharma
Keyword Based Identification of Thrust Area Using MapReduce for Knowledge Discovery

Keyword-based identification is widely used in applications such as web pages, query processing, and search interfaces, harnessing the power of data mining algorithms to work effectively and efficiently on large datasets. Keywords are the most important terms in documents or text fields for extracting interesting knowledge toward a discovery goal. The goal of this paper is to identify the thrust area for a particular searched keyword in the computer science field through the proposed interface. The paper uses the MapReduce framework, with some modifications, to search the keyword in a database and identify the thrust area. The proposed interface is mapped onto the processed query, yielding the relevant information extracted from the given datasets. MapReduce can perform keyword operations on large datasets, such as sorting and frequency counting, with high efficiency. Experimental work has also been carried out to analyze the performance on various parameters, such as the time taken by each input source to form clusters and identify thrust areas.

Nirmal Kaur, Manmohan Sharma
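The keyword-frequency use of MapReduce mentioned in the abstract above boils down to the classic map/reduce word-count pattern. A minimal single-process sketch (the paper's actual modifications to the framework are not described in the abstract):

```python
from collections import Counter
from itertools import chain

def mapper(document):
    """Map phase: emit a (keyword, 1) pair for every token."""
    return [(word.lower(), 1) for word in document.split()]

def reducer(pairs):
    """Reduce phase: sum the counts per keyword."""
    counts = Counter()
    for word, n in pairs:
        counts[word] += n
    return counts

docs = ["data mining mining", "mining big data"]
counts = reducer(chain.from_iterable(mapper(d) for d in docs))
print(counts["mining"])  # -> 3
```

In a real MapReduce deployment the mapper runs in parallel over dataset shards and the framework shuffles pairs by key before reduction; the logic per key is the same.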
An Efficient Genetic Algorithm for Fuzzy Community Detection in Social Network

A new fuzzy genetic algorithm is proposed for community identification in social networks. In this paper, we use matrix encoding, which enables traditional crossover between individuals, with mutation taking place in some of them. The matrix encoding determines which node belongs to which community, and using these concepts enhances the overall performance of evolutionary algorithms. In our experiments, we use the genetic algorithm with fuzzy concepts and compare it to existing methods such as the crisp genetic algorithm and the vertex-similarity-based genetic algorithm. We employ three real-world datasets in this work: Strike, Karate Club, and Dolphin. The usefulness and efficiency of the proposed algorithm are verified through accuracy and quality metrics, and the proposed algorithm is ranked using a multiple-criteria decision-making method.

Harish Kumar Shakya, Kuldeep Singh, Bhaskar Biswas
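The matrix encoding described in the abstract above (one row of fuzzy membership degrees per node, one column per community) admits a very simple crossover. A sketch under the assumption of row-wise uniform crossover; the paper's exact operators are not given in the abstract:

```python
import random

def random_membership_matrix(n_nodes, n_communities, rng):
    """Matrix encoding: row i holds node i's fuzzy membership degrees,
    one per community, normalised to sum to 1."""
    m = []
    for _ in range(n_nodes):
        row = [rng.random() for _ in range(n_communities)]
        s = sum(row)
        m.append([x / s for x in row])
    return m

def crossover(parent_a, parent_b, rng):
    """Row-wise uniform crossover: each node's membership row is taken
    whole from one parent, so rows stay valid fuzzy distributions."""
    return [row_a if rng.random() < 0.5 else row_b
            for row_a, row_b in zip(parent_a, parent_b)]

rng = random.Random(42)
a = random_membership_matrix(4, 2, rng)
b = random_membership_matrix(4, 2, rng)
child = crossover(a, b, rng)
print(len(child), len(child[0]))  # -> 4 2
```

Taking rows whole is what makes "traditional" crossover safe here: every offspring row remains a normalised membership vector without repair.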
Predicting the Outcome of Spanish General Elections 2016 Using Twitter as a Tool

In recent years, Twitter has become a popular microblogging tool among people the world over. Users post small messages depicting their likes and dislikes towards a certain entity, e.g. election results. These messages can be used to predict their opinion towards a political party in general or an individual candidate in particular. This paper considers numerous tweets related to the 2016 Spanish general elections. We then try to establish a relationship between the tweets gathered during the campaigning period and the actual vote share that the various political parties received in those elections. We developed a tool in ASP.NET, collected 90,154 tweets from 6 June to 26 June 2016, and computed our results based on these tweets. Finally, we compared our computations with the actual results and found a close correlation. The factors that may have caused the small variance are investigated minutely to give better insight and enable even closer predictions for future events.

Prabhsimran Singh, Ravinder Singh Sawhney, Karanjeet Singh Kahlon
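The tweet-to-vote-share mapping implied by the abstract above can be illustrated with a toy lexicon-based sentiment scorer; the paper's actual sentiment method and lexicon are not specified in the abstract, so everything below is an assumed stand-in:

```python
POSITIVE = {"good", "great", "support", "vote", "win", "love"}
NEGATIVE = {"bad", "corrupt", "lose", "hate", "fail"}

def sentiment_score(tweet):
    """+1 per positive token, -1 per negative token (toy lexicon)."""
    words = tweet.lower().split()
    return sum(w in POSITIVE for w in words) - sum(w in NEGATIVE for w in words)

def predicted_share(tweets_by_party):
    """Fraction of net-positive tweets per party, used as a naive
    proxy for that party's vote share."""
    positives = {p: sum(sentiment_score(t) > 0 for t in ts)
                 for p, ts in tweets_by_party.items()}
    total = sum(positives.values()) or 1
    return {p: n / total for p, n in positives.items()}

tweets = {"PartyA": ["great win", "love it", "corrupt fail"],
          "PartyB": ["bad lose"]}
share = predicted_share(tweets)
print(share)  # PartyA takes all the positive mass in this toy sample
```

Comparing such predicted shares against the official results is then a matter of computing the per-party error, which is where the paper's variance analysis would come in.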
Priority Based Service Broker Policy for Fog Computing Environment

With an increase in the number of services being provided over the Internet, the number of users of these services and the number of servers/datacentres providing them have also increased. The use of fog computing enhances the reliability and availability of these services owing to enhanced heterogeneity and an increased number of computing servers. However, users of cloud/fog devices have different device-type priorities based on the application they are using. Allocating the best datacentre to process a particular user's request and then balancing the load among available datacentres is a widely researched issue. This paper presents a new service broker policy for the fog computing environment that allocates the optimal datacentre based on users' priority. Comparative analysis of simulation results shows that the proposed policy performs significantly better than existing approaches in minimizing the cost, response time and datacentre processing time according to constraints specified by users.

Deeksha Arya, Mayank Dave
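A priority-based broker of the kind described in the abstract above can be sketched as a two-step rule: restrict to datacentres whose tier satisfies the request's priority class, then pick the cheapest. The tier/cost model and selection rule here are assumptions for illustration, not the paper's policy:

```python
def select_datacentre(request_priority, datacentres):
    """Service broker sketch: from the datacentres whose tier matches
    the user's priority class (tier 1 = fastest), pick the cheapest.
    A premium (priority-1) request is served only by tier-1 datacentres;
    lower-priority requests accept any tier up to their own number."""
    eligible = [d for d in datacentres if d["tier"] <= request_priority]
    if not eligible:
        raise ValueError("no datacentre satisfies this priority")
    return min(eligible, key=lambda d: d["cost"])["name"]

dcs = [{"name": "dc1", "tier": 1, "cost": 9},
       {"name": "dc2", "tier": 2, "cost": 4},
       {"name": "dc3", "tier": 3, "cost": 2}]
print(select_datacentre(1, dcs))  # -> dc1 (premium: only tier 1 eligible)
print(select_datacentre(3, dcs))  # -> dc3 (any tier: cheapest wins)
```

A full policy would additionally fold current load into the eligibility check, which is how the broker also balances load among datacentres.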
Software Remodularization by Estimating Structural and Conceptual Relations Among Classes and Using Hierarchical Clustering

In this paper, we present a technique for software remodularization that estimates the conceptual similarity among software elements (classes). The proposed technique uses structural and semantic coupling measurements together to obtain much more accurate coupling measures. In particular, the approach uses lexical information extracted from six main parts of a class's source code: comments, class names, attribute names, method signatures, parameter names and method body statements. Simultaneously, it counts the member functions of other classes used by a given class as a structural coupling measure among classes. Structural coupling among classes is measured using the information-flow-based coupling metric (ICP), and conceptual coupling is measured by tokenizing the source code and calculating cosine similarity. Clustering is performed with Hierarchical Agglomerative Clustering (HAC). The proposed technique is tested on three standard open-source Java software systems. The obtained results encourage remodularization by showing higher accuracy against the corresponding software gold standards.

Amit Rathee, Jitender Kumar Chhabra
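The cosine-similarity step named in the abstract above is standard: tokenize each class's text, build token-frequency vectors, and compare. A minimal sketch (the paper's tokenization and weighting details are not given in the abstract):

```python
import math
from collections import Counter

def cosine_similarity(tokens_a, tokens_b):
    """Cosine similarity between the token-frequency vectors of two
    classes: dot product over the product of Euclidean norms."""
    va, vb = Counter(tokens_a), Counter(tokens_b)
    dot = sum(va[t] * vb[t] for t in va)
    norm = (math.sqrt(sum(c * c for c in va.values())) *
            math.sqrt(sum(c * c for c in vb.values())))
    return dot / norm if norm else 0.0

class_a = "save account balance deposit".split()
class_b = "account balance withdraw".split()
print(round(cosine_similarity(class_a, class_b), 3))  # -> 0.577
```

The resulting conceptual-coupling scores, combined with the ICP structural measure, give the distance matrix that HAC then clusters into candidate modules.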
Requirements Traceability Through Information Retrieval Using Dynamic Integration of Structural and Co-change Coupling

Requirement Traceability (RT) links correlate requirements with their corresponding source code and help in better requirement understanding, reusability and other software maintenance activities. Since a major portion of software artifacts is in the form of text, Information Retrieval (IR) techniques based on textual similarity are widely adopted for finding these links. However, it is hard to find RT links when artifacts have little textual description. So, non-textual techniques, such as those based on structural information, co-change history, or ownership, are used together with IR to find these links indirectly. However, if the IR results contain false positives, the combined approach may increase them further. Therefore, instead of combining directly, this paper proposes an automatic RT technique that first improves the IR approach and then combines it with the non-textual techniques. We also present a new non-textual technique based on a weighted integration of structural coupling and change-history-based coupling of classes for retrieving indirect links. The results show that our proposed approach performs better than existing methods that use coupling information as a complement to IR.

Jyoti, Jitender Kumar Chhabra

Information Systems

Frontmatter
Bilingual Code-Mixing in Indian Social Media Texts for Hindi and English

Code mixing (CM) is an important but challenging topic in the area of Natural Language Processing (NLP). Many techniques are available for code mixing, but relatively little work has been done so far. In this paper we discuss the various approaches used for code mixing and classify existing code-mixing algorithms according to their techniques. Most people do not use only one language while chatting on Facebook, Gmail, Twitter, etc. If someone does not understand Hindi, it is very difficult for them to understand the meaning of code-mixed sentences. For correct Hindi words, we use a converter from Hindi words to English words. However, many of the words are not correct according to the dictionary, and code-mixed sentences also contain short forms, abbreviations, phonetic typing, etc. We therefore use character N-gram pruning, one of the most popular and successful NLP techniques, together with dictionary-based approaches for language identification in social media text. This paper proposes a scheme that improves translation by handling phonetic typing, abbreviations, shortcuts, Hindi words and emoticons.

Rajesh Kumar, Pardeep Singh
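Character N-gram language identification, as used in the abstract above, compares a word's character n-gram profile against per-language profiles. A toy sketch (the tiny training profiles and the overlap scoring rule are assumptions; the paper's pruning strategy is not detailed in the abstract):

```python
from collections import Counter

def char_ngrams(text, n=3):
    """Character trigram profile of a text, with boundary markers."""
    text = f"_{text.lower()}_"
    return Counter(text[i:i + n] for i in range(len(text) - n + 1))

def identify_language(word, profiles):
    """Score a word against per-language n-gram profiles and return
    the language with the largest n-gram overlap."""
    grams = char_ngrams(word)
    scores = {lang: sum(min(grams[g], prof[g]) for g in grams)
              for lang, prof in profiles.items()}
    return max(scores, key=scores.get)

profiles = {
    "en": char_ngrams("the good morning friend hello"),
    "hi": char_ngrams("namaste accha dhanyavad theek"),
}
print(identify_language("hello", profiles))  # -> en
print(identify_language("accha", profiles))  # -> hi
```

Per-word labels like these are what let a downstream converter decide which tokens of a code-mixed sentence to transliterate before translation.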
Performance Evaluation and Comparative Study of Color Image Segmentation Algorithm

In this research paper, the authors propose a color image segmentation algorithm using the HSI (hue, saturation and intensity) color model. The HSI color model is used to obtain the color information of a given image. The boundary of the image is extracted using an edge detection algorithm, and image regions are filled where the boundaries form a closure. HSI color information and edge detection are applied both separately and simultaneously. The color-segmented image is obtained by taking the union of the HSI color information and the edge detection result. The performance of the proposed algorithm is evaluated and compared with an existing region-growing algorithm using three parameters: precision (P), recall (R) and F1 score. The accuracy of the proposed algorithm is also measured using precision-recall (PR) and receiver operating characteristic (ROC) analysis. The efficiency of the proposed algorithm has been tested on more than 1500 images from the UCD (University College Dublin) image dataset and other resources. The experimental results show that the proposed algorithm is highly effective. MATLAB is used to implement it.

Rajiv Kumar, S. Manjunath
Electroencephalography Based Analysis of Emotions Among Indian Film Viewers

The film industry has been a major factor in the rapid growth of the Indian entertainment industry. While watching a film, viewers undergo an experience that evolves over time, grabbing their attention and triggering a sequence of perceptual, cognitive and emotional processes. Neurocinematics is an emerging field of research that measures the cognitive responses of film viewers. Neurocinematic studies to date have been performed using functional magnetic resonance imaging (fMRI); however, recent studies have suggested using advancements in electroencephalography (EEG) in neurocinematics to address the issues involved with fMRI. In this article, the emotions corresponding to two different genres of Indian films are captured from the real-time brainwaves of viewers using EEG and analyzed using the R language.

Gautham Krishna G, Krishna G, Bhalaji N
Fuel Assembly Height Measurements at the Nuclear Power Plant Unit Active Zone

The article studies a system for fuel assembly (FA) height measurement using a single CCD camera. The measurement algorithm involves moving the camera around a circle of radius 3 m and forming images from six locations. The suspension height of the camera provides simultaneous observation of a cell with 7 fuel assemblies. The optical axis of the camera at each location passes through the center of gravity of the central FA's upper surface. The camera's mathematical model and the developed software allow registering the discrete pixel coordinates that correspond to the centers of gravity of the upper surfaces of the FA heads. The paper gives a detailed analysis of three stereo pairs of images, obtained as combinations of measurements from four camera locations: two adjacent locations, two locations separated by one position, and two opposite locations. Analysis of the algorithm yields the minimum height-measurement error and a reasonable selection of camera locations for obtaining the most precise measurement results. The paper shows that restricting measurements to pairs of opposite camera locations is sufficient to reach the maximum FA height measurement precision.

Konstantin E. Rumyantsev, Sergey L. Balabaev, Irina Yu. Balabaeva
A Novel Approach to Segment Nucleus of Uterine Cervix Pap Smear Cells Using Watershed Segmentation

This paper presents an approach for segmenting the nuclei of uterine cervix Pap smear cells using watershed segmentation. The proposed approach consists of mainly three steps: preprocessing of Pap smear images, formation of an improved binary image, and application of watershed segmentation to extract the nucleus part of the images. The novelty of the approach is that it introduces a systematic way to create the improved binary image using thresholding and a series of morphological operations, as explained in the methodology in Sect. 2. Each single-cell image is segmented into three regions: the nucleus, the boundary between nucleus and background, and the background of the image.

Sanjay Kumar Singh, Anjali Goyal
Parametric Study of Various Direction of Arrival Estimation Techniques

An adaptive antenna array system consists of a number of antenna array elements together with a signal processing unit, which can adapt its radiation pattern to provide a maximum towards the desired user and a null towards the interferer. Hence, to locate the desired user, the Direction of Arrival (DOA) of the signal needs to be estimated. This paper presents a parametric study of various DOA estimation algorithms on a uniform linear array (ULA) and a comparative performance analysis regarding resolution, Signal-to-Noise Ratio (SNR), the number of snapshots, separation angle, etc. It starts with the traditional Minimum Variance Distortionless Response (MVDR) algorithm. The subspace-based techniques use the eigenstructure of the data covariance matrix; they include Multiple Signal Classification (MUSIC), Root-MUSIC, and Estimation of Signal Parameters via Rotational Invariance Techniques (ESPRIT). The simulation results show that the number of antenna array elements, the SNR, the number of snapshots, the coherent nature of the signals, and the separation angle between two sources can all affect the DOA estimation results. The results show that the MUSIC algorithm has comparatively better resolution than the MVDR, Root-MUSIC and ESPRIT algorithms.

Dharmendra Ganage, Y. Ravinder
Deep CNN-Based Method for Segmenting Lung Fields in Digital Chest Radiographs

Lung Field Segmentation (LFS) is an indispensable step for detecting severe lung diseases in various computer-aided diagnosis systems. This paper presents a deep learning-based Convolutional Neural Network (CNN) for segmenting lung fields in chest radiographs. The proposed network consists of three pairs of convolutional and rectified linear unit (ReLU) layers, followed by a fully connected layer. At each convolutional layer, 64 filters retrieve the representative features. The Japanese Society of Radiological Technology (JSRT) dataset is used for training and validation. Test results show 98.05% average accuracy, 93.4% average overlap, 96.25% average sensitivity, and 98.80% average specificity. The obtained results are promising and better than many existing state-of-the-art LFS techniques.

Simranpreet Kaur, Rahul Hooda, Ajay Mittal, Akashdeep, Sanjeev Sofat
Quality Assessment of a Job Portal System Designed Using Bout Design Pattern

Design patterns provide solutions to problems that are notably prevalent in software engineering. The paper highlights the importance of design patterns and also shows how design patterns uncover and fortify good object-oriented principles. A design pattern called Bout was devised to maintain sessions for a specific period of time. The design is a generic solution for implementing web portals by storing clients' session data on the server. The Bout pattern combines the design principles of the Singleton and Prototype patterns, thus guaranteeing a more reusable design, and is documented in the Gang of Four pattern description template. The Bout pattern was tested with a job portal system together with additional patterns (Factory Method, Decorator and Observer), showing significant improvement in object-oriented design metrics. The metrics showing significant enhancement were Depth of Inheritance Tree and McCabe Cyclomatic Complexity. The reusability of black-box components was analyzed for the job portal system, showing a marked rise in the metrics. The source code was analyzed for modularity traits such as size, complexity, cohesion and coupling, which in turn determine class quality, package quality and hence the modularity index. These quality metrics showed a marked upswing with the Bout pattern and supporting patterns. Thus software designers can enhance the quality of distributed systems by employing the Bout pattern.

G. Priyalakshmi, R. Nadarajan, Hironori Washizaki, Smriti Sharma
Analyzing Factors Affecting the Performance of Data Mining Tools

Data mining is the key technique for finding interesting patterns and hidden information in huge volumes of data. A wide range of tools is available with different algorithms and techniques for working on data. These data mining tools provide a generalized platform for applying machine learning techniques to a dataset to attain the required results. The tools are available both as open source and commercially, the latter providing more customizable options. Every tool has its own strengths and weaknesses, but there is no obvious consensus regarding the best one. This paper focuses on three tools: WEKA, Orange and MATLAB. The authors compare these tools on factors such as correctly classified accuracy, incorrectly classified accuracy and time, by applying four algorithms, i.e. Support Vector Machine (SVM), K-Nearest Neighbour (KNN), Decision Tree and Naive Bayes, to obtain performance results on two different datasets.

Balrajpreet Kaur, Anil Sharma
Privacy Preserving Data Mining Using Association Rule Based on Apriori Algorithm

Data mining is the process of extracting knowledge from large databases, which has made it a significant and functional emerging trend. Association rule mining is one of the most used data mining techniques for discovering hidden correlations in huge data sets. Among the several mining algorithms for association rules, Apriori is one of the most popular for extracting frequent item sets from databases and deriving association rules for knowledge discovery. The time required for generating frequent item sets plays an important role. Based on this algorithm, we compare sanitized data with the original data in terms of the number of iterations and the execution time. The experimental results show that both the number of iterations and the execution time are reduced for the sanitized data. Association rule generation on the sanitized items ensures the privacy of the dataset; in this way, the privacy of association rules along with data quality is well maintained.

Shabnum Rehman, Anil Sharma
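The Apriori frequent-itemset generation referred to in the abstract above works level by level: count 1-itemsets, keep those meeting minimum support, join survivors into larger candidates, and repeat. A compact sketch (the sanitization step compared in the paper is not shown here):

```python
def apriori(transactions, min_support):
    """Return all frequent itemsets (as frozensets) with their support
    counts, growing candidates one level (k -> k+1) at a time."""
    items = {frozenset([i]) for t in transactions for i in t}
    frequent = {}
    current = {s for s in items
               if sum(s <= t for t in transactions) >= min_support}
    k = 1
    while current:
        for s in current:
            frequent[s] = sum(s <= t for t in transactions)
        # candidate generation: join frequent k-itemsets into (k+1)-itemsets
        candidates = {a | b for a in current for b in current
                      if len(a | b) == k + 1}
        current = {c for c in candidates
                   if sum(c <= t for t in transactions) >= min_support}
        k += 1
    return frequent

txns = [frozenset(t) for t in
        [{"milk", "bread"}, {"milk", "bread", "eggs"},
         {"bread"}, {"milk", "eggs"}]]
freq = apriori(txns, min_support=2)
print(freq[frozenset({"milk", "bread"})])  # -> 2
```

Counting the `while` iterations for the original versus sanitized transactions is exactly the iteration-count comparison the abstract describes.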
Stable Feature Selection with Privacy Preserving Data Mining Algorithm

Data mining extracts previously unknown and valuable patterns and information from large archives of stored data. In the last few decades, advancements in Internet technologies have resulted in an enormous increase in the dimensionality of the datasets concerned. Feature selection is an important dimensionality reduction technique, as it improves the accuracy, efficiency and model interpretability of data mining algorithms. The stability of feature selection may be perceived as the robustness of the feature selection algorithm, i.e. its tendency to select the same or similar subsets of features under small perturbations of the dataset. The essential purpose of privacy preserving data mining is to modify original datasets in a way that preserves the privacy of individuals, and then to run subsequent data mining algorithms to extract information from them. This perturbation of the dataset will affect feature selection stability, so there is a correlation between privacy preserving data mining and feature selection stability. This paper explores this problem and introduces a privacy preserving algorithm that has less impact on feature selection stability as well as accuracy.

Mohana Chelvan P, Perumal K
Process Mining in Intrusion Detection-The Need of Current Digital World

In the current digital age, all users of the Internet and networks, as well as organizations, suffer from intrusions that result in data/information theft or loss. In this manuscript, the concept of an intrusion detection system (IDS) is discussed along with its types and basic approaches. It is found that signature analysis, expert systems, data mining, etc. are still being used for IDS. A survey of cybercrime incidents across various industry sectors is given. After analyzing the attacks on the networks of organizations in different industry sectors, it is found that attacks like DDoS are still not preventable. A comparison of data mining algorithms used for intrusion detection is also presented, and various methods to implement the algorithms, along with their advantages and disadvantages, are discussed in detail. Because of disadvantages such as overfitting, slow testing speed and unstable algorithms, intruders in networks are still active. To avert these shortcomings, there is a need to develop a real-time intrusion detection and prevention system through which data/information can be protected in real time before a severe loss is experienced. Real-time prevention is possible only if alerts are received instantly, without delay. For this purpose, process mining can be used: this technique gives instant alerts with real-time analysis so as to prevent intrusions and data loss.

Ved Prakash Mishra, Balvinder Shukla

Security and Privacy

Frontmatter
Public Network Security by Bluffing the Intruders Through Encryption Over Encryption Using Public Key Cryptography Method

Cryptography and network security concern the protection of the network and of data transmission. Security of data is important, as data is transferred over unreliable networks. Public-key cryptography provides optimal solutions to many of the existing security issues in communication systems. In our proposed algorithm, we have combined several of these secure algorithms in order to produce a highly secure algorithm that deals with their various flaws. RSA, the rail-fence algorithm, and substitution and transposition algorithms have been used in the proposed algorithm. Encryption over encryption is performed in order to confuse the intruder, and the proposed algorithm thus provides multilevel security.

Vishu Madaan, Dimple Sethi, Prateek Agrawal, Leena Jain, Ranjit Kaur

Network Services

Frontmatter
Happiness Index in Social Network

The happiness index in social networking sites reflects the happiness of human beings. For this, a dataset is collected from the social network Twitter, and various sentiment analysis techniques are applied to it. Positive and negative sentiment are calculated from the tweets, which are retrieved both location-wise (latitude and longitude) and country-wise. Consequently, maps are plotted from these sentiments, location-wise and country-wise. Individual users are also tracked, and their happiness is monitored over time to get an idea of how frequently their mood changes and of their general happiness pattern. The proposed algorithm detects changes in the pattern of happiness. A user's past tweets are also monitored, and topic detection and subjectivity extraction techniques are applied to get a more concrete idea of this pattern.

Kuldeep Singh, Harish Kumar Shakya, Bhaskar Biswas
An Efficient Routing Protocol for DTN

Delay Tolerant Networks (DTNs) differ from conventional ad hoc networks in that they support information exchange in the presence of long latency in end-to-end path availability. Hence, they use a store-carry-forward approach for message exchange, unlike other ad hoc networks that use a carry-forward approach. As DTNs have evolved from MANETs, they inherit many MANET properties, such as sparse structure, mobility and network partitioning. However, route discovery in DTNs is more challenging because of the above-mentioned difference, making routing in DTNs an interesting research topic. As end-to-end path availability is not guaranteed, the nodes communicate in an opportunistic manner. The performance of the store-carry-forward approach is heavily dependent on the probability of path availability and the storage space at each node. Therefore, we propose a probabilistic method of routing using efficient buffer management. The performance of the proposed protocol has been evaluated with the help of simulation experiments.

Aikta Arya, Awadhesh Kumar Singh
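The abstract above does not specify the probability metric used, but a PRoPHET-style delivery predictability is one standard formulation of probabilistic DTN routing, sketched here with conventional parameter values (the encounter boost, aging factor and forwarding rule below are that scheme's, not necessarily this paper's):

```python
P_INIT, GAMMA = 0.75, 0.98  # conventional PRoPHET-style constants

def on_encounter(pred, a, b):
    """Direct update when nodes a and b meet: P(a,b) grows toward 1."""
    old = pred.get((a, b), 0.0)
    pred[(a, b)] = old + (1 - old) * P_INIT

def age(pred, elapsed_units):
    """Predictabilities decay over time while nodes do not meet."""
    for key in pred:
        pred[key] *= GAMMA ** elapsed_units

def should_forward(pred, carrier, peer, dest):
    """Store-carry-forward rule: hand the message to a peer with a
    higher delivery predictability toward the destination."""
    return pred.get((peer, dest), 0.0) > pred.get((carrier, dest), 0.0)

pred = {}
on_encounter(pred, "B", "D")                 # B has recently met destination D
print(should_forward(pred, "A", "B", "D"))   # -> True: A hands the message to B
```

Buffer management then decides which stored messages to drop first when a node's queue fills, typically the ones with the lowest predictability of eventual delivery.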
Impact of Higher Order Impairments on QoS Aware Connection Provisioning in 10 Gbps NRZ Optical Networks

The invention of the optical amplifier has increased the data traffic carried by each fiber to the point that even a brief disruption could result in enormous data loss. As we move towards transparent networks, connection provisioning is becoming quite a challenging task due to the additive nature of physical layer impairments and the decreasing number of O-E-O conversions taking place along the routes. In static connection provisioning, the cost function representing signal quality must consider not only the linear impairments but also the effect of higher-order impairments, as they significantly influence the quality of transmission. The present work describes the design of a system capable of considering the effect of higher-order impairments on OSNR (optical signal-to-noise ratio) and BER (bit error rate) for connection provisioning in optical networks, with a focus on the comprehensiveness, transparency and scalability of the system. The processing is carried out in the time domain by offline digital signal processing using eye-diagram analysis.

Karamjit Kaur, Hardeep Singh, Anil Kumar
Super-Router: A Collaborative Filtering Technique Against DDoS Attacks

The DDoS attack is one of the well-known cyber attacks of the Internet era, affecting the availability of the network. Although the Computer Incident Advisory Capability (CIAC) reported the first ever DDoS attack in 1999, the first major DDoS attack was recorded in the year 2000 against some of the big websites, e.g. Yahoo, Amazon, CNN and eBay, whose services went offline for a few hours, with huge revenue losses recorded. Since then, DDoS attacks have become favourite attacks of antagonists. Many different defense techniques are available to detect and filter malicious traffic, but none of them can adequately filter it out. In this context, this paper proposes a new filtering scheme, Super-router, which uses a collaborative filtering technique to filter malicious traffic. More specifically, Super-router uses unicast communication between filters, which reduces the communication overhead and response time of individual filters. This makes Super-router an effective defense against DDoS attacks in high-speed networks.

Akshat Gaurav, Awadhesh Kumar Singh
A Systematic Survey on Congestion Mechanisms of CoAP Based Internet of Things

The IoT is a ubiquitous intelligent technology that enables interaction among intelligent devices via the Internet. It consists of sensors connected over low-power and lossy networks (LLNs), with limited power and bandwidth. For the messaging protocol, the IETF has standardized a lightweight RESTful application layer protocol known as CoAP, which runs over the unreliable UDP layer and handles communication, reliability and congestion control. As the number of transmissions increases, congestion in the network increases, and it becomes difficult to overcome it using the conventional CoAP mechanism. Therefore, a new paradigm has been proposed, known as CoAP Congestion Control/Advanced (CoCoA). In this paper, a systematic survey of traditional CoAP along with the new approaches and their algorithms is presented. A brief introduction to their working and limitations is also given, and evaluation parameters such as throughput, latency and re-transmissions are taken into consideration for the survey.

Ananya Pramanik, Ashish Kr. Luhach, Isha Batra, Upasana Singh
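The contrast between the two congestion mechanisms the survey covers can be sketched briefly. The CoAP defaults (ACK_TIMEOUT = 2 s, ACK_RANDOM_FACTOR = 1.5, MAX_RETRANSMIT = 4, binary exponential backoff) come from RFC 7252; the variable backoff factor follows the CoCoA proposal. This is an illustrative sketch, not an implementation of the surveyed algorithms:

```python
import random

ACK_TIMEOUT = 2.0
ACK_RANDOM_FACTOR = 1.5
MAX_RETRANSMIT = 4

def coap_schedule(initial_rto):
    """Default CoAP: binary exponential backoff of the initial RTO."""
    rtos, rto = [], initial_rto
    for _ in range(MAX_RETRANSMIT):
        rtos.append(rto)
        rto *= 2
    return rtos

def cocoa_backoff(rto):
    """CoCoA's variable backoff factor: small RTOs back off aggressively,
    large RTOs grow gently so late retransmissions are not pushed too far out."""
    if rto < 1.0:
        return rto * 3
    if rto <= 3.0:
        return rto * 2
    return rto * 1.5

# Default CoAP: initial RTO drawn uniformly from [ACK_TIMEOUT, ACK_TIMEOUT * 1.5].
initial = random.uniform(ACK_TIMEOUT, ACK_TIMEOUT * ACK_RANDOM_FACTOR)
print(coap_schedule(initial))

# CoCoA starting from a small RTT-derived RTO of 0.5 s.
rto, schedule = 0.5, []
for _ in range(MAX_RETRANSMIT):
    schedule.append(rto)
    rto = cocoa_backoff(rto)
print(schedule)   # [0.5, 1.5, 3.0, 6.0]
```

The sketch shows why CoCoA helps: default CoAP always starts from a fixed 2–3 s timeout regardless of the measured RTT, while CoCoA can start low and still avoid retransmission bursts through its variable backoff.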
Increasing Route Availability in Internet of Vehicles Using Ant Colony Optimization

A Smart City, where a large number of self-configurable and intelligent devices communicate with each other, provides a platform for collaborative decision-making processes affecting virtually every other device in the ecosystem. The Internet of Vehicles (IoV) forms a major part of this ecosystem, comprising mobile vehicles capable of generating, storing, and processing the data flowing through the system. The vehicles continuously communicate with each other and with the external environment to collect and process real-time information. This collaboration provides a means to make optimized routing decisions that can reduce the overall congestion suffered by the network. In this paper, we apply two optimization algorithms, Ant Colony Optimization and Firefly Optimization, to real-time data collected from various vehicular sources to provide vehicles with optimized, congestion-free routes. Various road parameters that may affect the selection of a particular route toward the destination have been considered. Experiments conducted using two open-source simulators, NS2 and SUMO, show a marked improvement, reducing the average travelling time of the vehicles in the system under consideration.

Nitika Chowdhary, Pankaj Deep Kaur
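The Ant Colony Optimization step can be sketched in miniature. The travel times, evaporation rate, and pheromone deposit rule below are illustrative assumptions, not the parameters of the paper's NS2/SUMO experiments: ants pick among candidate routes with probability proportional to pheromone times a heuristic (inverse travel time), pheromone evaporates each round, and faster routes receive larger deposits.

```python
import random

random.seed(7)
routes = {"A": 12.0, "B": 8.0, "C": 20.0}   # route -> current travel time (min)
pheromone = {r: 1.0 for r in routes}
RHO, ANTS, ITERS = 0.1, 20, 50              # evaporation rate, ants, iterations

def pick_route():
    # Desirability combines pheromone with a heuristic (inverse travel time);
    # roulette-wheel selection over the weights.
    weights = {r: pheromone[r] * (1.0 / routes[r]) for r in routes}
    x = random.uniform(0, sum(weights.values()))
    for r, w in weights.items():
        x -= w
        if x <= 0:
            return r
    return r                                # float-rounding fallback

for _ in range(ITERS):
    chosen = [pick_route() for _ in range(ANTS)]
    for r in pheromone:                     # evaporation on every route
        pheromone[r] *= (1 - RHO)
    for r in chosen:                        # deposit: shorter trip, more pheromone
        pheromone[r] += 1.0 / routes[r]

best = max(pheromone, key=pheromone.get)
print(best)
```

The positive feedback between pheromone and route choice concentrates traffic recommendations on the fastest route; in a live IoV deployment the travel times would themselves shift as vehicles follow the recommendations, which is what the real-time data collection addresses.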
Classification Based Network Layer Botnet Detection

Botnets have emerged as a major cyber security menace faced by institutions and individuals around the world. They have matured into the primary carrier for launching serious threats such as DDoS attacks, spam distribution, and theft of users' sensitive information (banking and credit card details, etc.). The community of common users is generally unaware of security standards, which makes them even more susceptible to bot attacks. A substantial amount of research on botnet detection and analysis has been done, but little work has contributed a community-oriented tool for detecting bots. We propose to perform filtering and classification on the data received by Botflex, which can reduce the processing overhead and improve the throughput of the IDS. Botflex has a limited set of detection parameters, which our proposed approach extends.

Shivangi Garg, R. M. Sharma
An Efficient and Reliable Data Gathering Protocol Based on RSU Identification

Vehicular Ad hoc Networks (VANETs) can act as an essential platform for providing various types of safety and non-safety applications by collecting the data sensed by vehicles during their journeys. Collecting this data efficiently without causing network congestion is a very challenging task. In this paper, an efficient data gathering protocol based on Road Side Unit (RSU) identification is proposed for collecting the essential data sensed by vehicles during their journeys without network congestion. The proposed protocol reduces the communication overhead and average delay and increases the packet delivery ratio. Moreover, it is compared with the RSU-activated data gathering protocol based on beacon broadcast. Simulation results show that the proposed protocol considerably improves the efficiency of data collection compared with the RSU-activated beacon-broadcast protocol in terms of communication overhead, average delay, and packet delivery ratio.

Arun Malik, Babita Pandey
A Neural Network Model for Intrusion Detection Using a Game Theoretic Approach

The problem of intrusion detection in computer networks is not new, and various methodologies have been formulated to address it. A game-theoretic formulation based on the minimax algorithm, one of the oldest game-playing techniques, has also been proposed. It exploits the adversarial situation between the intruder and the Intrusion Detection System (IDS); the essence of this approach lies in the assumption that the intruder and the IDS have complete knowledge of the network and of each other's strategies. The game-theoretic solution gives the probability with which the IDS can detect malicious packets on a given network when the probabilities with which the intruder sends malicious packets on the various paths leading to the target are known to the IDS. In a real-world scenario, however, the roles of the intruder and the IDS are dynamic: if an attack is detected, or goes undetected, the intruder tries to breach the network again with a different approach, and the IDS in turn defends the network with a different strategy. The next strategy for either player can be learnt from experience, and this paper therefore models an artificial neural network to represent the game-theoretic formulation. The modelled neural network outputs the detection probability of an attack when the probabilities of sending malicious packets on the various paths leading the intruder to the target are given as an input pattern to the network.

Pallavi Kaushik, Kamlesh Dutta
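The input/output mapping the abstract describes can be sketched with a toy network. Everything below is an illustrative assumption, not the paper's model: a fixed IDS monitoring strategy Q is assumed, so the true detection probability for an intruder strategy p is the dot product of p and Q, and a single linear neuron is trained by stochastic gradient descent to recover that mapping from examples.

```python
import random

random.seed(1)
Q = [0.5, 0.3, 0.2]                 # assumed fixed IDS monitoring strategy

def true_detection(p):
    # Detection probability under the assumed model: sum of p_i * q_i.
    return sum(pi * qi for pi, qi in zip(p, Q))

def random_strategy():
    # Random intruder strategy: path probabilities that sum to 1.
    xs = [random.random() for _ in Q]
    s = sum(xs)
    return [x / s for x in xs]

# Single linear neuron trained by SGD on squared error.
w = [0.0] * len(Q)
lr = 0.5
for _ in range(5000):
    p = random_strategy()
    y_hat = sum(wi * pi for wi, pi in zip(w, p))
    err = y_hat - true_detection(p)
    w = [wi - lr * err * pi for wi, pi in zip(w, p)]

test_p = [0.6, 0.3, 0.1]            # intruder favours the first path
pred = sum(wi * pi for wi, pi in zip(w, test_p))
print(round(pred, 3), round(true_detection(test_p), 3))
```

Because the target is exactly linear in p, the learned weights converge toward Q, and the trained neuron predicts the detection probability for intruder strategies it has never seen; the paper's network plays this role for the full game-theoretic formulation.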
Backmatter
Metadata
Title
Advanced Informatics for Computing Research
Editors
Dharm Singh
Balasubramanian Raman
Ashish Kumar Luhach
Pawan Lingras
Copyright Year
2017
Publisher
Springer Singapore
Electronic ISBN
978-981-10-5780-9
Print ISBN
978-981-10-5779-3
DOI
https://doi.org/10.1007/978-981-10-5780-9
