
Proceedings of the 5th International Conference on Frontiers in Intelligent Computing: Theory and Applications

FICTA 2016, Volume 2

  • 2017
  • Book

About this Book

The book is a collection of high-quality peer-reviewed research papers presented at the International Conference on Frontiers of Intelligent Computing: Theory and Applications (FICTA 2016), held at the School of Computer Engineering, KIIT University, Bhubaneswar, India, during 16-17 September 2016. The book presents theories, methodologies, new ideas, experiences, and applications in all areas of intelligent computing and its applications to engineering disciplines such as computer science, electronics, electrical engineering, and mechanical engineering.

Table of Contents

Frontmatter
Information Retrieval for Gujarati Language Using Cosine Similarity Based Vector Space Model

Retrieving the most relevant documents from the web for a user query is a crucial task in Information Retrieval (IR) systems for resource-poor languages. This paper presents a Cosine Similarity Based Vector Space Document Model (VSDM) for Information Retrieval in the Gujarati language. VSDM is widely used in information retrieval and document classification, where each document is represented as a vector and each dimension corresponds to a separate term. The relevance of documents to the user query is measured using cosine similarity in the vector space, where the set of documents is treated as a set of vectors. The present work considers the user query as free-order text, i.e., the word sequence does not affect the results of the IR system. Technically, this is a Natural Language Processing (NLP) application in which stop-word removal, Term Frequency (TF) calculation, Normalized Term Frequency (NF) calculation and Inverse Document Frequency (IDF) calculation were performed for 1360 files in Text and PDF formats, and precision and recall values of 78% and 86%, respectively, were recorded. To the best of our knowledge, this is the first IR task in the Gujarati language using cosine-similarity-based calculations.

Rajnish M. Rakholia, Jatinderkumar R. Saini
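The TF-IDF weighting and cosine ranking described in the abstract above can be sketched in a few lines. The tokenization, the smoothed-IDF formula, and the toy corpus below are illustrative assumptions, not the authors' exact Gujarati pipeline (which also performs stop-word removal on Text and PDF sources):

```python
import math
from collections import Counter

def tf_idf_vectors(docs):
    """Build TF-IDF vectors over a shared vocabulary for tokenized documents."""
    n = len(docs)
    # document frequency of each term
    df = Counter(term for doc in docs for term in set(doc))
    vocab = sorted(df)
    vectors = []
    for doc in docs:
        tf = Counter(doc)
        length = len(doc)
        # normalized term frequency times a smoothed inverse document frequency
        vec = [(tf[t] / length) * math.log(1 + n / df[t]) for t in vocab]
        vectors.append(vec)
    return vocab, vectors

def cosine(u, v):
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(b * b for b in v))
    return dot / (nu * nv) if nu and nv else 0.0

def rank(query, docs):
    """Rank documents by cosine similarity to a free-order (bag-of-words) query."""
    vocab, vectors = tf_idf_vectors(docs + [query])
    qvec = vectors[-1]
    scores = [(cosine(qvec, dv), i) for i, dv in enumerate(vectors[:-1])]
    return sorted(scores, reverse=True)
```

Because the query is treated as a bag of words, reordering its tokens leaves the ranking unchanged, which matches the free-order assumption in the abstract.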
BLDC Motor Drive with Power Factor Correction Using PWM Rectifier

Efficiency and cost are the major constraints of a motor drive system. Commutation in conventional DC motors is carried out by a commutator, a rotating part placed on the rotor, together with brushes. Due to these mechanical parts, conventional DC motors incur high losses. Brushless DC (BLDC) motors are very extensively used these days because of their advantages over conventional DC motors. In a BLDC motor, commutation is carried out by solid-state switches instead of a mechanical commutator, which improves the performance of the motor. However, a BLDC motor draws non-linear currents from the source, affecting the loads connected at the source point due to harmonic production. This harmonic production reduces system efficiency and stresses the loads connected at the source point. A BLDC drive system with power factor (PF) correction is discussed in this paper: a BLDC drive with a normal AC-DC diode bridge rectifier, and the performance of a BLDC drive with a PWM rectifier for power factor correction. The BLDC drive system with a PWM rectifier for power factor correction was validated by considering different cases: a BLDC motor without power factor correction, and a BLDC drive with PF correction at starting, at steady state, and with a step change in DC link voltage. Torque ripple in the BLDC motor drive was compared for these cases. Models were developed and results obtained using Matlab/Simulink software.

P. Sarala, S. F. Kodad, B. Sarvesh
Threshold Based Clustering Algorithm Analyzes Diabetic Mellitus

Diabetes Mellitus is caused by disorders of metabolism; it is one of the most common diseases in the world today, and its incidence is growing. A Threshold Based Clustering Algorithm (TBCA), applied to medical data received from practitioners, is presented in this paper. Medical data consist of various attributes. TBCA is formulated to effectually compute the impactful attributes related to Diabetes Mellitus, for further decisions. TBCA's primary focus is the computation of threshold values to enhance the accuracy of the clustering results.

Preeti Mulay, Rahul Raghvendra Joshi, Aditya Kumar Anguria, Alisha Gonsalves, Dakshayaa Deepankar, Dipankar Ghosh
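The abstract does not spell out TBCA's exact steps, so the following is only a generic leader-style threshold clustering sketch: a point joins an existing cluster when it lies within a distance threshold of that cluster's centroid, otherwise it seeds a new cluster. The Euclidean metric and the incremental centroid update are assumptions, not the paper's formulation:

```python
import math

def threshold_cluster(points, threshold):
    """Leader-style clustering: a point joins the first cluster whose
    centroid lies within `threshold`; otherwise it seeds a new cluster."""
    clusters = []  # each cluster: {"centroid": [...], "members": [...]}
    for p in points:
        for c in clusters:
            if math.dist(p, c["centroid"]) <= threshold:
                c["members"].append(p)
                m = len(c["members"])
                # incremental (running-mean) centroid update
                c["centroid"] = [(cc * (m - 1) + pc) / m
                                 for cc, pc in zip(c["centroid"], p)]
                break
        else:
            clusters.append({"centroid": list(p), "members": [p]})
    return clusters
```

The threshold value directly controls how many clusters emerge, which is why the paper emphasizes computing it carefully.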
Bio-Inspired Algorithms for Mobile Location Management—A New Paradigm

Mobile location management (MLM) has gained a new aspect in today's cellular wireless communication scenario. It has two perspectives, location registration and location search, and a trade-off between the two gives the optimal cost for location management. An outline of the prominent solutions for cost optimization in location management using various bio-inspired computations is surveyed. More and more such bio-inspired algorithms are being explored for solving complex optimization problems in various engineering applications, along with incremental improvements to existing algorithms. This paper surveys and discusses potential approaches for cost optimization using fifteen bio-inspired algorithms, from the Artificial Neural Network and Genetic Algorithm to the newly developed Flower Pollination Algorithm and Artificial Plant Optimization. Finally, we survey the potential application of these bio-inspired algorithms to cost optimization in mobile location management as reported in the recent literature, and point out the motivation for using bio-inspired algorithms in cost optimization and the design of optimal cellular networks.

Swati Swayamsiddha, Smita Parija, Sudhansu Sekhar Singh, Prasanna Kumar Sahu
TKAR: Efficient Mining of Top-k Association Rules on Real-Life Datasets

Data mining is an important facet for discovering association rules among a large scope of itemsets. Association rule mining (ARM) is one of the techniques in data processing, with two sub-processes: identifying frequent itemsets, and generating association rules from them. Frequent itemset mining has developed into a major issue in data mining and plays an essential part in various data mining tasks, for example, association analysis and classification. In the framework of frequent itemset mining, the outcomes are itemsets that are frequent in the entire database. Researchers have developed many algorithms for finding frequent itemsets and association rules. However, depending on the choice of thresholds, present algorithms become very slow and produce either an extremely large number of outcomes or very few, omitting usable information. Furthermore, it is well known that a large proportion of the association rules produced are redundant. This is a significant issue in practice, because users do not have many resources for analyzing the outcomes and need to find a certain number of results within a limited time. To address this issue, we propose an algorithm called top-k association rules (TKAR) to mine the top-ranked rules from a dataset, where k is the number of rules the user wants to mine. The proposed algorithm uses a novel technique for generating association rules, offers good performance and scalability, and is a beneficial alternative to traditional association rule mining algorithms.

O. Gireesha, O. Obulesu
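The top-k idea, as opposed to threshold-driven mining, can be illustrated with a brute-force sketch: enumerate candidate rules and keep only the k highest-confidence ones in a min-heap. This is not the TKAR algorithm itself (which uses a more efficient rule-generation strategy); the cap on itemset size is an assumption to keep the sketch short:

```python
import heapq
from itertools import combinations

def top_k_rules(transactions, k, min_support=1):
    """Keep the k highest-confidence rules instead of mining every
    rule above fixed thresholds."""
    # support counts for all itemsets up to size 3 (small cap for the sketch)
    counts = {}
    for t in transactions:
        items = sorted(set(t))
        for size in (1, 2, 3):
            for combo in combinations(items, size):
                counts[combo] = counts.get(combo, 0) + 1
    heap = []  # min-heap of (confidence, antecedent, consequent)
    for itemset, sup in counts.items():
        if len(itemset) < 2 or sup < min_support:
            continue
        for i in range(1, len(itemset)):
            for ante in combinations(itemset, i):
                conf = sup / counts[ante]
                cons = tuple(x for x in itemset if x not in ante)
                if len(heap) < k:
                    heapq.heappush(heap, (conf, ante, cons))
                elif conf > heap[0][0]:
                    # evict the currently weakest of the k kept rules
                    heapq.heapreplace(heap, (conf, ante, cons))
    return sorted(heap, reverse=True)
```

Because only k rules are retained, the user never has to tune a minimum-confidence threshold, which is exactly the usability problem the abstract describes.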
Predicting Defect of Software System

A study on software quality with all desirable attributes of software products can be treated as complete and perfect only if the software is free of defects. Defects continue to be an emerging problem that leads to failure and unexpected behaviour of the system. Predicting defects in a software system at an initial stage may help to a great extent in finding defects and making the software system efficient, defect-free, and of improved overall quality. To analyze and compare the work done by researchers on predicting defects of software systems, it is necessary to look at their varied work. The most frequently used methodologies for predicting defects in software systems are highlighted in this paper, and it has been observed that the use of public datasets was considerably more common than the use of private datasets. On the basis of the overall findings, the key analyses and challenging issues have been identified, which will help and encourage further work in this field with the application of newer and more effective methodologies.

Soumi Ghosh, Ajay Rana, Vineet Kansal
Fusion of Medical Image Using STSVD

Medical image fusion is the process of uniting medical images taken from different imaging modalities into a single image. This is performed to increase the image information content and to reduce randomness and redundancy for clinical applicability. In this paper, a new method is proposed in which the Shearlet Transform (ST) is applied to the images together with the Singular Value Decomposition (SVD) to improve their information content. Two different images, from Positron Emission Tomography (PET) and Magnetic Resonance Imaging (MRI), are taken for fusion. Initially, the ST is applied to the two input images; then the SVD method is used to fuse the low-frequency coefficients, while a different method is applied to the high-frequency coefficients. The fused low- and high-frequency coefficients are then combined, and the Inverse Shearlet Transform (IST) is applied to reconstruct the fused image. Three benchmark images are used to carry out the experiments, and the method is compared with progressive techniques. The results show that the proposed method exceeds many progressive techniques.

K. N. Narasimha Murthy, J. Kusuma
Modified Cuckoo Search Algorithm for Fittest Relay Identification in Microgrid

A microgrid is a group of interconnected generating units and loads at the distribution level which operates in two modes: grid-connected mode and islanded mode. Fault clearance in a microgrid is a key challenge for protection engineers. This paper aims to identify the fittest relay for a microgrid using a modified cuckoo search algorithm based on key parameters such as current rating, Time Multiplier Setting (TMS), Plug Setting Multiplier (PSM) and time of operation (t_op). The algorithm aids in providing suitable relay coordination in the microgrid and effectively isolates the faulty portion of the network from the healthy portion.

O. V. Gnana Swathika, Santanab Mukhopadhyay, Yatharth Gupta, Arka Das, S. Hemamalini
Optimization of Overcurrent Relays in Microgrid Using Interior Point Method and Active Set Method

A microgrid is an aggregate of generating units and loads at the distribution level. It operates in two modes: grid-connected mode and islanded mode. Fault clearance in a microgrid is a key challenge for protection engineers. This paper aims to identify optimized values for the time of operation of overcurrent relays in a microgrid network. The optimization problem is solved using two methods, the Interior Point method and the Active Set method. Three types of relays are also compared to determine which relay works best under similar constraint environments. These methods aid in providing suitable relay coordination in the microgrid and quickly isolate the faulty portion of the network from the healthy portion.

O. V. Gnana Swathika, Arka Das, Yatharth Gupta, Santanab Mukhopadhyay, S. Hemamalini
Series Smart Wire—Managing Load and Congestion in Transmission Line

Nowadays, congestion management is a major problem in power system deregulation. With the continuous increase in load demand, there is a continuous requirement for new technologies resulting in advanced network operation. This paper presents a solution for congestion management by developing a series smart wire module which operates as the load increases. The circuitry is bypassed if no congestion is detected in the line; otherwise, upon detection of congestion, the series smart wire module is operated. This method improves the reliability of the system by reducing active power losses. The effectiveness of the module is demonstrated on the standard IEEE 15-bus system model using MATLAB/Simulink, and the results are presented with graphical representations.

Abhishek, Divya Asija, Pallavi Choudekar, Yogasree Manganuri
Gene Ontology Based Function Prediction of Human Protein Using Protein Sequence and Neighborhood Property of PPI Network

Predicting the functions of a protein from its amino acid sequence and interacting protein partners is one of the major challenges of the post-genomic era, as the alternative biological wet-lab techniques are costly and time-consuming. In drug discovery, target protein identification is an important step, as inhibiting the target may disturb the activities of the pathogen. The knowledge of protein function is therefore necessary to inspect the cause of diseases. In this work, we propose two function prediction methods, FunPred1.1 and FunPred1.2, which use neighbourhood analysis of an unknown protein empowered with amino acid physico-chemical properties. The basic objective and working of the two methods are almost identical, but FunPred1.1 works on the entire neighbourhood graph of the unknown protein, whereas FunPred1.2 does the same with greater efficiency on the densely connected neighbourhood graph, considering the edge clustering coefficient. In terms of time and performance, FunPred1.2 performs better than FunPred1.1. All the relevant data, source code and detailed performance on test data are available for download as FunPred-1.

Sovan Saha, Piyali Chatterjee, Subhadip Basu, Mita Nasipuri
PLoc-Euk: An Ensemble Classifier for Prediction of Eukaryotic Protein Sub-cellular Localization

Protein sub-cellular localization is very important information, as it plays a crucial role in protein function. Thus, prediction of protein sub-cellular localization has become a promising and challenging problem in the field of Bioinformatics. Recently, a number of computational methods have been proposed based on amino acid composition, functional domains, or sorting signals, but they lack the contextual information of the protein sequence. In this paper, an ensemble classifier, PLoc-Euk, is proposed to predict the sub-cellular location of eukaryotic proteins; it uses multiple physico-chemical properties of amino acids along with their composition. PLoc-Euk predicts protein sub-cellular localization in eukaryotes across five different locations, namely Cell Wall, Cytoplasm, Extracellular, Mitochondrion, and Nucleus. The classifier is applied to the dataset extracted from http://www.bioinfo.tsinghua.edu.cn/~guotao/data/ and achieves 73.37% overall accuracy.

Rajkamal Mitra, Piyali Chatterjee, Subhadip Basu, Mahantapas Kundu, Mita Nasipuri
Drive-by-Download Malware Detection in Hosts by Analyzing System Resource Utilization Using One Class Support Vector Machines

Drive-by-download is the unintentional download of malware onto a user's system. Detection of drive-by-download based malware infection in a host is a challenging task due to the stealthy nature of this attack. The user of the system is not aware that a malware infection has occurred, as it happens in the background, and signature-based antivirus systems are not able to detect zero-day malware. Most detection has been performed either by signature matching, by reverse engineering the binaries, or by running the binaries in a sandbox environment. In this paper, we propose a One-Class SVM based supervised learning method to detect drive-by-download infection. The features comprise system RAM and CPU utilization details. The experimental setup for data collection contains machine specifications matching four user profiles, namely Designer, Gamer, Normal User and Student. The proposed experimental system was evaluated using precision, recall and F-measure.

Prabaharan Poornachandran, S. Praveen, Aravind Ashok, Manu R. Krishnan, K. P. Soman
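The one-class idea behind the paper can be shown with a stdlib-only stand-in: train on normal (CPU%, RAM%) samples alone and flag points that fall outside a learned boundary. The paper itself uses a One-Class SVM (e.g. scikit-learn's `OneClassSVM`); the spherical boundary and 95th-percentile radius below are simplifying assumptions, not the authors' model:

```python
import math
import statistics

class OneClassBoundary:
    """Train on clean (CPU%, RAM%) samples only and flag points that fall
    outside a learned spherical boundary -- a stdlib stand-in for a
    one-class SVM's enclosing decision surface."""

    def fit(self, samples, quantile=0.95):
        dims = list(zip(*samples))
        self.center = [statistics.fmean(d) for d in dims]
        dists = sorted(math.dist(s, self.center) for s in samples)
        # radius covering `quantile` of the (normal) training data
        self.radius = dists[min(len(dists) - 1, int(quantile * len(dists)))]
        return self

    def predict(self, point):
        """+1 = normal utilization, -1 = anomalous (possible infection)."""
        return 1 if math.dist(point, self.center) <= self.radius else -1
```

As in the paper's setting, no malicious samples are needed at training time; anything far from the learned profile of normal resource usage is reported as suspicious.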
Privacy Preserving Data Mining: A Parametric Analysis

With the technological revolution, a huge amount of data is being collected, and as a consequence the need to mine knowledge from this data has been triggered. But data in its raw form comprises sensitive information, and advances in data mining techniques have increased the risk of privacy breaches. Due to socio-technical transformations, most countries have laid down guidelines and policies for publishing certain data. As a result, a new area known as Privacy Preserving Data Mining (PPDM) has emerged. The goal of PPDM is to extract valuable information from data while retaining the privacy of this data. The paper focuses on exploring PPDM in different aspects, such as types of privacy, PPDM scenarios and applications, and methods of evaluating PPDM algorithms. The paper also presents a parametric analysis and comparison of different PPDM techniques. The goal of this study is to facilitate a better understanding of these PPDM techniques and boost fruitful research in this direction.

Darshana Patel, Radhika Kotecha
Low Power 14T Hybrid Full Adder Cell

The performance of an adder is entirely influenced by the performance of its basic modules. In this paper, a new hybrid 1-bit 14-transistor full adder design is proposed. The proposed circuit is implemented using pass-gate as well as CMOS logic, hence the name hybrid. The main design objectives for this circuit are low power consumption and full voltage swing at a low supply voltage. As a result, the proposed adder cell remarkably improves power consumption and power-delay product, and has less parasitic capacitance when compared to the 16T design. It also improves layout area by 7-8% over its peer design. All simulations are performed at 90 and 45 nm process technologies using Synopsys tools.

Chauhan Sugandha, Sharma Tripti
Improved Dynamic Time Warping Based Approach for Activity Recognition

Dynamic Time Warping (DTW) has been a very efficient tool for matching two time series, and in the past much work has been done on modifying DTW to enhance its efficiency and further broaden its application areas. In this paper we propose an enhanced version of DTW that calculates the mean and standard deviation of the minimum warping path, which increases the efficiency of DTW in detecting different human activities. We also introduce a new fusion of DTW with Histogram of Gradients (HOG), as it helps in extracting both temporal and spatial information of the activity; this fusion works very effectively for depicting human activities. We used Random Forest as the classification tool, giving a highest accuracy of 88% on the Weizmann dataset.

Vikas Tripathi, Sharat Agarwal, Ankush Mittal, Durgaprasad Gangodkar
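Baseline DTW, before the paper's mean/standard-deviation modification of the minimum warping path, is the classic dynamic program below; absolute difference as the local cost is an assumption:

```python
def dtw_distance(a, b):
    """Classic dynamic-programming DTW between two 1-D sequences."""
    inf = float("inf")
    n, m = len(a), len(b)
    d = [[inf] * (m + 1) for _ in range(n + 1)]
    d[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(a[i - 1] - b[j - 1])
            # extend the cheapest of the three admissible warping steps
            d[i][j] = cost + min(d[i - 1][j], d[i][j - 1], d[i - 1][j - 1])
    return d[n][m]
```

The warping lets a sequence absorb local time stretching (e.g. a repeated sample) at zero cost, which is why DTW suits activity signals of unequal speed.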
STRIDE Based Analysis of the Chrome Browser Extensions API

Chrome browser extensions have become very popular among users of Google Chrome, and hence they are used by attackers to perform malicious activities which lead to loss of users' sensitive data or damage to their systems. In this study, we analyze the security of the Chrome extension development APIs. We use the STRIDE approach to identify the possible threats of the Chrome-specific APIs used for extension development. The analysis shows that 23 out of the 63 Chrome-specific APIs are exposed to various threats as per the STRIDE approach; information disclosure is the threat faced by the most APIs, followed by tampering. This threat analysis can serve as a reference for a tool that detects whether an extension is malicious by deeply analysing how the threat-prone APIs are used in the extension code.

P. K. Akshay Dev, K. P. Jevitha
Some Properties of Rough Sets on Fuzzy Approximation Spaces and Applications

The notion of rough sets introduced by Pawlak has been extended in many directions to enhance its modelling power. One such approach is to relax the restriction that the base relation be an equivalence relation. Adding the flavour of fuzzy sets, a fuzzy proximity relation was used by De et al. in 1999 to generate a fuzzy approximation space, and hence rough sets on fuzzy approximation spaces could be generated. These are much more general than basic rough sets and also than rough sets defined on proximity relations. However, some of the results established in this direction by De et al. have been found to be faulty. In this paper we show through examples that the results are indeed faulty and provide their correct versions. We also establish some further properties of these rough sets. A real-life application is provided to illustrate the results.

B. K. Tripathy, Suvendu Kumar Parida
Design and Optimization of Fractal Antenna for UWB Application

UWB applications are increasing day by day. As communication is now mostly wireless, suitable antenna design is a major challenge for researchers. In this paper we attempt to partially meet this challenge. The antenna is of microstrip type, based on fractal geometry. Initially, the fractal antenna is designed and its performance is evaluated in terms of bandwidth. The parameters of the antenna are then optimized to obtain better performance than the un-optimized antenna, as shown in the results section. Particle Swarm Optimization (PSO), a viable evolutionary improvement strategy, is used for optimizing the proposed antenna. This method provides better results in the design of the antenna, and the effect of various design parameters, such as the ground plane, feed line width, and middle triangle radius, is also analyzed. The improvement in the results has been incorporated into the design, which is found to be suitable for UWB communication applications.

Arati Behera, Sarmistha Satrusallya, Mihir Narayan Mohanty
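A minimal PSO loop in the textbook form. The antenna objective (bandwidth or S11 from an EM simulation of the geometry parameters) is replaced here by a sphere test function, and the inertia and acceleration coefficients are common defaults rather than the paper's settings:

```python
import random

def pso(objective, bounds, n_particles=20, iters=100,
        w=0.7, c1=1.5, c2=1.5, seed=1):
    """Minimal particle swarm optimizer minimizing `objective` over box bounds."""
    rng = random.Random(seed)
    dim = len(bounds)
    pos = [[rng.uniform(lo, hi) for lo, hi in bounds] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]
    pbest_val = [objective(p) for p in pos]
    g = min(range(n_particles), key=lambda i: pbest_val[i])
    gbest, gbest_val = pbest[g][:], pbest_val[g]
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                # inertia + cognitive pull (own best) + social pull (swarm best)
                vel[i][d] = (w * vel[i][d]
                             + c1 * rng.random() * (pbest[i][d] - pos[i][d])
                             + c2 * rng.random() * (gbest[d] - pos[i][d]))
                pos[i][d] = min(max(pos[i][d] + vel[i][d],
                                    bounds[d][0]), bounds[d][1])
            val = objective(pos[i])
            if val < pbest_val[i]:
                pbest[i], pbest_val[i] = pos[i][:], val
                if val < gbest_val:
                    gbest, gbest_val = pos[i][:], val
    return gbest, gbest_val
```

In the antenna setting, each particle position would encode design parameters (e.g. feed line width, middle triangle radius) and each evaluation would call the EM solver.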
Multi-objective IT Professionals’ Utilization Problems Using Fuzzy Goal Programming

This paper presents a fuzzy goal programming approach to solve IT professionals' utilization problems for software firms. These problems involve multiple objectives and binary decision variables. The fuzzy goal programming approach helps to quantify the uncertainty of the objectives of the problem. With the help of membership functions, the problem is converted to its equivalent deterministic form. A case study demonstrates the effectiveness of the approach.

R. K. Jana, Manas Kumar Sanyal, Saikat Chakrabarti
Convex Hyperspectral Unmixing Algorithm Using Parameterized Non-convex Penalty Function

Unmixing of hyperspectral data is an area of major research because the information it provides is utilized in a plethora of fields. The year 2006 witnessed the emergence of the Compressed Sensing algorithm, which was later used to spearhead research on unmixing problems. Later, the notion of $$\ell _p$$ norms $$0< p < 1$$ and other non-smooth, non-convex penalty functions were used in place of the traditional convex $$\ell _1$$ penalty. Dealing with optimization problems with a non-convex objective function is rather difficult, as most methodologies often get stuck at local optima. In this paper, a parameterised non-convex penalty function is used to induce sparsity in the unknown. The parameters of the penalty function can be adjusted so as to make the objective function convex, thus making it possible to find a globally optimal solution. The ADMM algorithm is utilized to arrive at the final iterative algorithm for the unmixing problem. The algorithm is tested on a synthetic dataset generated from the spectral library provided by the US Geological Survey. Different parametric penalty functions, such as $$\log $$ and $$\arctan $$, are used in the algorithm and compared with the traditional $$\ell _1$$ penalty in terms of the performance measures RSNR and PoS. It was observed that the non-convex penalty functions outperform the $$\ell _1$$ penalty in terms of the aforementioned measures.

K. HariKumar, K. P. Soman
Novel Techniques for Detection of Anomalies in Brain MR Images

With the significant growth in the field of medical imaging, the analysis of brain MR images is a constantly evolving and challenging field. MR images are widely used for medical diagnosis and in numerous clinical applications. In brain MR image study, image segmentation is mostly used for determining and visualizing the brain's anatomical structures. Parallel research results have articulated enhancements in brain MR image segmentation by combining varied methods and techniques, yet precise results have not been proposed and established in comparable research. Thus, this work presents an accuracy analysis for brain disorder detection using the widely accepted Watershed and Expectation Maximization-Gaussian Mixture (EM-GM) methods. A bilateral filter is employed with the Watershed and EM-GM methods to improve image edges for better segmentation and detection of brain anomalies in MR images. The comparative performance of the Watershed and EM-GM methods is also demonstrated with the help of multiple MR image datasets.

K. Bhima, A. Jagan
Cryptanalysis of Secure Routing Among Authenticated Nodes in MANETs

Secure routing (SR) is one of the most important issues in Mobile Ad hoc Networks (MANETs). In 2013, Zhao et al. proposed an efficient routing integrated framework for MANETs. They claimed that their scheme distributes the system parameter only to authenticated nodes before the network setup phase. However, based on cryptanalysis, we have found that unauthenticated nodes are also able to obtain the original system parameter and behave like malicious nodes in the network. Thus, their scheme fails to provide an authenticated distribution mechanism in real-life applications. As a countermeasure, this paper presents an efficient authenticated distribution mechanism that can be incorporated very efficiently into their scheme. Our proposed technique is shown to be secure under the hardness of the Computational Diffie-Hellman (CDH) assumption.

Rajeev Ranjan
EEG Based Oscitancy Classification System for Accidental Prevention

Drowsiness and alcohol consumption have always been root causes of road mishaps. Excessive consumption of alcohol gives rise to many complications: it prevents healthy thinking and slows down reflex actions. So, to determine a person's capability to do a job, tracking his oscitancy (drowsiness) is very important. In this paper, we classify EEG signals taken from 50 drunk and 50 non-drunk people. Band decomposition of the data was performed using the Discrete Wavelet Transform (DWT), and the resulting features were trained using an Artificial Neural Network (ANN). We further suggest an intelligent system which monitors a driver and decides whether he should be allowed to drive the vehicle, based on his drowsiness classification, which can prevent accidents involving drunken drivers.

Jay Sarraf, Satarupa Chakrabarty, Prasant Kumar Pattnaik
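One level of the sub-band decomposition can be sketched with the Haar wavelet; the paper does not state its mother wavelet, and Haar is chosen here only because it needs no external library. Per-band energies are shown as a plausible feature vector for the ANN stage, which is omitted:

```python
import math

def haar_dwt(signal):
    """One level of the Haar DWT: returns (approximation, detail)
    coefficients, halving the band as in EEG sub-band decomposition."""
    s = math.sqrt(2)
    approx = [(signal[i] + signal[i + 1]) / s for i in range(0, len(signal) - 1, 2)]
    detail = [(signal[i] - signal[i + 1]) / s for i in range(0, len(signal) - 1, 2)]
    return approx, detail

def band_energies(signal, levels=4):
    """Detail-band energy per decomposition level, normalized by total
    energy -- a simple feature vector one could feed to an ANN."""
    feats = []
    approx = list(signal)
    for _ in range(levels):
        approx, detail = haar_dwt(approx)
        feats.append(sum(d * d for d in detail))
    total = sum(feats) + sum(a * a for a in approx)
    return [f / total for f in feats] if total else feats
```

The Haar pair is orthonormal, so the energy of a signal is preserved across the approximation and detail coefficients, which makes the normalized band energies well defined.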
Robust Flare Phase Auto Landing of Aircrafts, Based on Modified ASBO

The presented research work focuses on automatic flight landing control of an aircraft, synthesizing an optimal flare phase with the dynamic deflection angle as the control parameter. The behavior of the aircraft is modeled by four first-order differential equations, and an explicit one-step Runge-Kutta method of order 4(5) is applied to solve them. A computational intelligence approach, modified adaptive social behavior optimization (mASBO), is applied to estimate the optimal deflection angles at discrete times to deliver the desired flare-phase performance. The proposed modification of adaptive social behavior optimization achieves a better balance between exploration and exploitation by providing a competitive environment for leader selection, resulting in faster convergence. The height-based ascent-rate controlling function presented here provides an adaptive control mechanism to obtain the desired landing performance when the actual starting landing altitude differs from the predefined reference altitude due to poor visibility or wind disturbance.

G. Parimala Gandhi, Nagaraj Ramrao, Manoj Kumar Singh
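A classical fixed-order RK4 step (a simplification of the adaptive order-4(5) scheme mentioned above), demonstrated on a one-state decay equation rather than the paper's four-state aircraft model:

```python
def rk4_step(f, t, y, h):
    """One classical 4th-order Runge-Kutta step for y' = f(t, y),
    where y is a list of state variables."""
    k1 = f(t, y)
    k2 = f(t + h / 2, [yi + h / 2 * ki for yi, ki in zip(y, k1)])
    k3 = f(t + h / 2, [yi + h / 2 * ki for yi, ki in zip(y, k2)])
    k4 = f(t + h, [yi + h * ki for yi, ki in zip(y, k3)])
    return [yi + h / 6 * (a + 2 * b + 2 * c + d)
            for yi, a, b, c, d in zip(y, k1, k2, k3, k4)]

def integrate(f, t0, y0, t_end, h):
    """March the state from t0 to t_end with fixed step h."""
    t, y = t0, list(y0)
    while t < t_end - 1e-12:
        step = min(h, t_end - t)  # shorten the final step to land on t_end
        y = rk4_step(f, t, y, step)
        t += step
    return y
```

In the flare-phase setting, f would encode the four aircraft state equations with the deflection angle supplied at each discrete time by the mASBO optimizer.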
RSentiment: A Tool to Extract Meaningful Insights from Textual Reviews

Every system needs continuous improvement, and feedback from different stakeholders plays a crucial role here. From the literature, the need for textual feedback analysis for an academic institute is well established. In fact, it has been perceived that textual feedback is often more informative, more open-ended, and more effective in producing actionable insights for decision makers than the more common score-based (on a scale from 1 to n) feedback. However, getting this information from textual feedback is not possible through traditional means of data analysis. Here we have conceptualized a tool which applies text mining techniques to elicit insights from textual data; it has been published as an open-source package for broader use by practitioners. Appropriate visualization techniques are applied for intuitive understanding of the insights. For evaluation, we used a real dataset consisting of alumni feedback from a top engineering college in Kolkata.

Subhasree Bose, Urmi Saha, Debanjana Kar, Saptarsi Goswami, Amlan Kusum Nayak, Satyajit Chakrabarti
A Purely Localized Random Key Sequencing Using Accelerated Hashing in Wireless Ad-Hoc Networks

Wireless ad hoc networks represent a form of cooperative networking through peer-to-peer behavior with other nodes in the network. Hop-by-hop communication is the default mode of communication. Most communications are localized, and interaction among local nodes requires local security provisioning. In the absence of any centralized certification authority and of viable localization and synchronization hardware, schematic localization and periodic refreshing have proved to be a feasible solution. Several solutions have exploited GPS-based localization and periodic refreshing cycles to provide viable security for wireless ad hoc networks. In this paper, we propose an accelerated hashing mechanism with schematic localization based on the variable or multiple transmission ranges of a few nodes. The solution has been evaluated mathematically for performance parameters such as connectivity, storage overhead and computation efficiency.

Amit Kumar, Vijay K. Katiyar, Kamal Kumar
GOASREP: Goal Oriented Approach for Software Requirements Elicitation and Prioritization Using Analytic Hierarchy Process

Software requirements elicitation is a valuable process for identifying software requirements according to the needs of different types of stakeholders. There are different methods for eliciting software requirements, such as traditional methods, group elicitation methods, and goal-oriented methods. Among these, goal-oriented methods have received much recognition from the software requirements engineering community. On the basis of our literature review, we identify that goal-oriented requirements elicitation processes do not support selecting and prioritizing requirements using the analytic hierarchy process on the basis of cost and effort criteria. Therefore, to address this issue, we propose a method, GOASREP, for the elicitation of software requirements using a goal-oriented approach and the prioritization of the elicited requirements using the analytic hierarchy process. In the proposed method, the function point analysis approach is used to estimate the cost of each requirement, and the COCOMO model is applied to estimate its effort. Finally, the usage of GOASREP is illustrated with an Institute Examination System case study.

Nikita Garg, Mohd. Sadiq, Pankaj Agarwal
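The AHP prioritization step can be sketched with the normalized-column-average approximation of the principal eigenvector, plus Saaty's consistency check. The random-index table is the standard one (truncated at n = 5); the cost and effort values coming from function points and COCOMO are outside this sketch:

```python
def ahp_priorities(matrix):
    """Priority vector from a pairwise-comparison matrix via the
    normalized-column-average (approximate eigenvector) method,
    together with the consistency ratio CR."""
    n = len(matrix)
    col_sums = [sum(row[j] for row in matrix) for j in range(n)]
    norm = [[matrix[i][j] / col_sums[j] for j in range(n)] for i in range(n)]
    # priority of item i = average of row i in the column-normalized matrix
    w = [sum(norm[i]) / n for i in range(n)]
    # lambda_max estimate: average of (A w)_i / w_i
    aw = [sum(matrix[i][j] * w[j] for j in range(n)) for i in range(n)]
    lam = sum(aw[i] / w[i] for i in range(n)) / n
    ci = (lam - n) / (n - 1) if n > 1 else 0.0
    ri = {1: 0.0, 2: 0.0, 3: 0.58, 4: 0.9, 5: 1.12}.get(n, 1.12)
    cr = ci / ri if ri else 0.0
    return w, cr
```

A CR below about 0.1 is conventionally taken to mean the stakeholder's pairwise judgments are consistent enough to trust the resulting requirement ranking.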
Confidentiality and Storage of Data in Cloud Environment

In this paper, we provide secure storage of data on a cloud server using Identity-Based Encryption, where the data owner shares the stored data with cloud users on the basis of the pay-as-you-use principle. The cloud user requests the encrypted data stored in the cloud server's database (by applying encryption to the keyword), and the server verifies the request by performing a test on the encrypted data, so that illegal users or unauthorized servers cannot attack the data.

Prerna Mohit, G. P. Biswas
A Method for the Selection of Agile Methods Using AHP

There are different types of lightweight methods for the development of software, like eXtreme Programming (XP), Scrum, agile modeling, etc. These methods are also referred to as agile methods. Different criteria are involved in the selection of agile methods, so we visualize the agile method selection problem as a multi-criteria decision making problem. Selecting an appropriate agile method according to the needs of the project is an important research issue. Therefore, in order to address this issue, we present a method for the selection of agile methods using the Analytic Hierarchy Process (AHP). The following criteria have been used for the selection of agile methods: positive response to dynamic requirements (PRDR), incorporation of requirements changes (IRC), communication with the customer (CWC), and the size of the development team (SDT). Finally, a case study is given to explain the proposed method.

Bushra Sayed, Zeba Shamsi, Mohd. Sadiq
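The AHP step that both of the preceding papers rely on can be sketched in a few lines. This is a minimal illustration using the standard geometric-mean approximation of the principal eigenvector; the Saaty-scale pairwise judgments below are invented for the example and are not taken from either paper.

```python
import math

def ahp_priorities(pairwise):
    """Normalized priority weights from a pairwise-comparison matrix,
    via the geometric-mean approximation of the principal eigenvector."""
    n = len(pairwise)
    geo = [math.prod(row) ** (1.0 / n) for row in pairwise]
    total = sum(geo)
    return [g / total for g in geo]

# Hypothetical comparison of the four criteria PRDR, IRC, CWC, SDT.
matrix = [
    [1,     3,     5,     7],
    [1 / 3, 1,     3,     5],
    [1 / 5, 1 / 3, 1,     3],
    [1 / 7, 1 / 5, 1 / 3, 1],
]
weights = ahp_priorities(matrix)
```

The resulting weight vector ranks the criteria; multiplying each alternative's per-criterion scores by these weights yields the AHP priority of each agile method or requirement.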
Load Balancing with Job Switching in Cloud Computing Network

Cloud computing, described as distributed online computing, is a kind of Internet-based computing that provides pooled web resources and applications to connected servers and other machines on user demand. It is a web system for enabling ubiquitous, on-demand access to a shared pool of configurable computing resources which can be rapidly provisioned and released with minimal management effort. Load balancing is an important issue in cloud computing, which comprises many web resources whose management plays a vital role in executing a user's request. Under these conditions, load balancing algorithms should be very efficient in allocating user requests and should ensure that resources are used intelligently, so that underutilization of resources does not occur and preemptive resource management is available in the cloud environment. Cloud computing uses different types of nodes connected to the cloud to assist the execution of a great number of tasks, so selecting a suitable node or machine for executing a task can improve the performance of the cloud computing system. Job switching is the transfer of a job from one machine to another to minimize the overall completion time. In this paper, we propose a load balancing algorithm combining minimum-completion-time and load balancing strategies with job switching.

Ranjan Kumar Mondal, Subhranshu Roy, Palash Samanta, Payel Ray, Enakshmi Nandi, Biswajit Biswas, Manas Kumar Sanyal, Debabrata Sarddar
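A minimal sketch of the minimum-completion-time strategy with one job-switching pass, as the abstract describes. The greedy rule and the switching criterion here are standard textbook versions, assumed for illustration rather than the authors' exact algorithm.

```python
def mct_assign(jobs, n_machines):
    """Greedy minimum-completion-time assignment: each job goes to the
    machine that would finish it earliest."""
    loads = [0.0] * n_machines
    placement = []
    for t in jobs:
        m = min(range(n_machines), key=lambda i: loads[i] + t)
        loads[m] += t
        placement.append(m)
    return placement, loads

def switch_once(jobs, placement, loads):
    """One job-switching pass: move a job off the busiest machine to the
    least loaded one if that lowers the makespan."""
    busiest = max(range(len(loads)), key=loads.__getitem__)
    idlest = min(range(len(loads)), key=loads.__getitem__)
    for j, m in enumerate(placement):
        if m == busiest and loads[idlest] + jobs[j] < loads[busiest]:
            loads[busiest] -= jobs[j]
            loads[idlest] += jobs[j]
            placement[j] = idlest
            break
    return placement, loads

jobs = [2, 2, 3]
placement, loads = mct_assign(jobs, 2)
makespan_before = max(loads)   # 5: machine 0 holds jobs 0 and 2
placement, loads = switch_once(jobs, placement, loads)
makespan_after = max(loads)    # 4 after moving one job across
```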
An Efficient Detection and Mitigation of Selfish and Black Hole Attack in Cognitive Mobile Ad-hoc Network

A cognitive radio network (CRN) is one where unused licensed bands can be redistributed among users demanding spectrum who have no access to the licensed bands; it is essentially a programmable software radio. Such an architecture suffers from different types of attacks, e.g. the selfish attack and the black hole attack. In this work, the problem is solved by providing an integrated solution for detecting selfish and black hole attacks; once an attack is detected, it is communicated to all secondary users (SUs), so that each SU can blacklist the attacking node. The results show that such attack detection and mitigation improve the network quality significantly by allowing nodes to blacklist the attacker and re-utilize the spectrum among the SU nodes.

Deepa, Jayashree Agarkhed
An Innovative Method for Load Balanced Clustering Problem for Wireless Sensor Network in Mobile Cloud Computing

Mobile Cloud Computing is a revolutionary direction in which the world is progressing in a massive way, and connecting wireless sensor networks with mobile cloud computing is a novel idea in this era. In recent years, several studies have demonstrated how to integrate wireless sensor networks (WSNs) with mobile cloud computing, so that cloud computing can be exploited to process the sensory data collected by WSNs and deliver these data to mobile clients in a fast, reliable and secure way. To raise the lifetime of a wireless sensor network, minimizing energy consumption is an important factor, and clustering the sensor nodes is one of the effective solutions. The cluster heads of a cluster-based WSN bear an excessive load when collecting huge amounts of data, aggregating them and communicating them to the base station. Particle Swarm Optimization (PSO) is an efficient solution to this problem.

Debabrata Sarddar, Enakshmi Nandi, Anil Kumar Sharma, Biswajit Biswas, Manas Kumar Sanyal
Dengue Fever Classification Using Gene Expression Data: A PSO Based Artificial Neural Network Approach

A mosquito-borne pathogen called Dengue virus (DENV) has emerged as one of the most fatal threats in recent times. Infection can take two main forms, namely DF (Dengue Fever) and DHF (Dengue Hemorrhagic Fever), and an efficient detection method for both fever types is a significant task. Thus, in the present work, a novel application of a Particle Swarm Optimization (PSO) trained Artificial Neural Network (ANN) has been employed to separate patients having Dengue fever from those who are recovering from it or do not have DF. The ANN's input weight vector is optimized using PSO to achieve the expected accuracy and to avoid premature convergence toward local optima. A publicly available gene expression dataset (GDS5093) is used, containing gene expression data for DF, DHF, convalescent and healthy control patients from a total of 56 subjects. A greedy forward selection method has been applied to select the most promising genes for identifying DF, DHF and normal (either convalescent or healthy control) patients. The proposed system's performance was compared to a multilayer perceptron feed-forward neural network (MLP-FFN) classifier. The results proved the dominance of the proposed method, with an achieved accuracy of 90.91 %.

Sankhadeep Chatterjee, Sirshendu Hore, Nilanjan Dey, Sayan Chakraborty, Amira S. Ashour
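The PSO-trained-ANN idea above can be illustrated with a minimal particle swarm minimizing a stand-in training loss. The swarm parameters (inertia 0.7, cognitive/social coefficients 1.5) and the toy single-neuron loss are assumptions for this sketch, not the paper's configuration; a real ANN loss over the gene-expression features would plug in where `loss` is defined.

```python
import random

def pso_minimize(loss, dim, n_particles=20, iters=60, seed=7):
    """Minimal global-best particle swarm with standard textbook
    coefficients (inertia 0.7, cognitive/social 1.5)."""
    rng = random.Random(seed)
    pos = [[rng.uniform(-1, 1) for _ in range(dim)] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]
    pbest_val = [loss(p) for p in pos]
    g = min(range(n_particles), key=pbest_val.__getitem__)
    gbest, gbest_val = pbest[g][:], pbest_val[g]
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                vel[i][d] = (0.7 * vel[i][d]
                             + 1.5 * rng.random() * (pbest[i][d] - pos[i][d])
                             + 1.5 * rng.random() * (gbest[d] - pos[i][d]))
                pos[i][d] += vel[i][d]
            v = loss(pos[i])
            if v < pbest_val[i]:
                pbest[i], pbest_val[i] = pos[i][:], v
                if v < gbest_val:
                    gbest, gbest_val = pos[i][:], v
    return gbest, gbest_val

# Toy "training loss": fit the two weights of a single linear neuron so
# that w0*x0 + w1*x1 matches invented targets (exact optimum w = (0.5, 0.5)).
data = [((1.0, 2.0), 1.5), ((2.0, 1.0), 1.5), ((0.0, 1.0), 0.5)]
def loss(w):
    return sum((w[0] * x0 + w[1] * x1 - y) ** 2 for (x0, x1), y in data)

weights, best = pso_minimize(loss, dim=2)
```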
Modified Non Linear Diffusion Approach for Multiplicative Noise

Synthetic Aperture Radar (SAR) is a useful coherent imaging tool for extracting information in various fields such as astronomy and meteorology. SAR images are often corrupted by granular noise known as speckle, which follows a multiplicative model. Speckle in homogeneous as well as heterogeneous areas obscures the contrast between the target of interest and its surroundings. This paper proposes a modified non-linear diffusion approach for despeckling SAR images. The essence is to develop an approach that can suppress speckle and preserve structural content, as an improvement over conventional anisotropic diffusion filtering.

Vikrant Bhateja, Aditi Sharma, Abhishek Tripathi, Suresh Chandra Satapathy
L-Slotted Microstrip Fed Monopole Antenna for Triple Band WLAN and WiMAX Applications

In this paper, a monopole antenna is presented for triple band WLAN and WiMAX applications. The antenna consists of four L-slots on a radiating rectangular patch and a truncated ground plane. The multiband characteristic of the antenna is achieved by the rectangular patch with four L-slots, and the bandwidth of the antenna is improved by cutting slots in the truncated ground plane. The entire volume of the antenna is 29 × 34 × 0.8 mm3, which is very compact, with operating bands of (2.276–2.58 GHz)/2.411 GHz, (3.585–3.623 GHz)/3.609 GHz and (5.508–5.765 GHz)/5.56 GHz, covering the operating bands for WLAN as per IEEE 802.11 a/b/g/n standards with 12 %, 1 % and 4.5 % impedance bandwidth respectively.

Chandan, Toolika Srivastava, B. S. Rai
Performance Analysis of Fully Depleted SOI Tapered Body Reduced Source (FD-SOI TBRS) MOSFET for Low Power Digital Applications

The fully depleted silicon-on-insulator metal oxide semiconductor field effect transistor (FD-SOI MOSFET) has been considered a promising candidate for extending the scaling of planar CMOS technology beyond 100 nm. This technology has been used to reduce leakage current, parasitic capacitances, and fabrication complexity as compared to planar CMOS technology at 50 nm gate length. This paper presents the performance analysis of the proposed Tapered Body Reduced Source (FD-SOI TBRS) MOSFET. The proposed structure consumes less chip area and offers better electrical performance than the conventional FD-SOI MOSFET, exhibiting a higher Ion to Ioff ratio. The structures were designed and simulated using the Cogenda device simulator.

Vimal Kumar Mishra, R. K. Chauhan
A Study and Analysis of Different Brain Tumor Segmentation Techniques

In this paper, a thorough study and quantitative analysis of different brain tumor segmentation techniques is presented. Several significant algorithms have been proposed in the literature for partitioning an MRI (Magnetic Resonance Imaging) brain image into multiple disjoint regions indicating tumor tissue and normal brain tissue, but benchmarking of brain tumor segmentation (BTS) techniques is rarely found. In this regard, we study, explore and create a benchmark for the most popular and widely accepted segmentation techniques, such as histogram thresholding, adaptive k-means clustering, region-based active contours, SVM and PCA-based K-NN classifiers. A detailed quantitative evaluation of the aforesaid techniques on MRI images from standard datasets as well as a collection of our own datasets is presented. The analysis of the reported results should have an excellent impact on current and future research efforts in brain tumor segmentation.

K. Mahantesh, B. V. Sandesh Kumar, V. N. Manjunath Aradhya
A Novel Representation for Classification of User Sentiments

In this paper we present a term-sequence-preserving representation for text documents, called a label matrix, for the classification of opinions. This is an efficient yet effective representation for opinion polarity that uses a feature set of only three parts of speech, viz. verb, adverb and adjective. To demonstrate the efficiency of the proposed technique, we conducted experiments on one publicly available polarity review dataset as well as on our own movie review and product review datasets, and explored a quantitative comparative analysis between existing classifiers and the proposed method.

Steven Lawrence Fernandes, S. N. Bharath Bhushan, Vinlal Vinod, M. J. Adarsh
A 3D Approach for Palm Leaf Character Recognition Using Histogram Computation and Distance Profile Features

Handwritten character recognition has been a well-known area of research for the last five decades and is an important application of pattern recognition in image processing. Generally, 2D scanning is used and the text is captured in the form of an image. In this work, instead of the regular scanning method, the X and Y co-ordinates are measured using a measuroscope at every pixel point. Further, a 3D feature, the depth of indentation Z, which is proportional to the pressure applied by the scriber at that point, is measured using a dial gauge indicator. The profile-based features extracted for palm leaf character recognition are 'histogram' and 'distance' profiles. The recognition accuracy obtained using the Z-dimension, a 3D feature, is very high, and the best result obtained is 92.8 % using the histogram profile algorithm.

Panyam Narahari Sastry, T. R. Vijaya Lakshmi, N. V. Koteswara Rao, Krishnan RamaKrishnan
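The histogram profile feature named above can be sketched directly: for a binary character image it is the count of foreground pixels along each row and column. The 5 × 5 glyph below is invented for illustration; the paper's inputs are measuroscope coordinates, not a toy bitmap.

```python
def profile_features(glyph):
    """Row and column histogram profiles of a binary character image:
    counts of foreground (1) pixels along each row, then each column."""
    rows = [sum(r) for r in glyph]
    cols = [sum(c) for c in zip(*glyph)]
    return rows + cols

# Toy rendering of a vertical stroke with a serif (invented data).
glyph = [
    [0, 0, 1, 0, 0],
    [0, 0, 1, 0, 0],
    [0, 0, 1, 0, 0],
    [0, 0, 1, 0, 0],
    [0, 1, 1, 1, 0],
]
feat = profile_features(glyph)  # 10-dimensional feature vector
```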
Feature Optimization to Recognize Telugu Handwritten Characters by Implementing DE and PSO Techniques

Recognizing Indian handwritten text is relatively complex compared to recognizing foreign languages such as English. In this work, optimization techniques are presented to recognize Telugu handwritten characters. After extracting cell-based directional features from these characters, optimum features are selected by implementing optimization algorithms such as differential evolution and particle swarm optimization. An improvement of 3.5 % in recognition accuracy is achieved using the differential evolution algorithm. The optimization techniques are compared with the existing hybrid approach for Telugu script.

T. R. Vijaya Lakshmi, Panyam Narahari Sastry, T. V. Rajinikanth
Writer Specific Parameters for Online Signature Verification

In this work, we present a new model capable of handling variations in signatures for better representation, employing a structure-preserving feature selection method and representing the selected data in an interval representation scheme. The proposed model represents each writer with a writer-dependent dimension and authentication threshold. Decisions on the number of features to be used for each writer and on the similarity threshold for deciding the authenticity of a given signature are made based on the minimum equal error rate (EER) criterion. Based on the symbolic representation, a method of verification is proposed. The proposed model is tested for its effectiveness on the benchmark MCYT (DB1) and MCYT (DB2) datasets, consisting of signatures of 100 and 330 writers respectively. The obtained results indicate the effectiveness of the proposed model.

K. S. Manjunatha, S. Manjunath, D. S. Guru
Word-Level Script Identification from Scene Images

Script identification on camera-captured bus sign boards is presented in this work. Text localization is achieved by a series of morphological operations. A number of texture features, such as Gabor features, log-Gabor features and wavelet features, are extracted from the segmented text images to classify them into three scripts: English, Kannada and Malayalam. Three different classifiers are evaluated and the results are reported.

O. K. Fasil, S. Manjunath, V. N. Manjunath Aradhya
Recognizing Human Faces with Tilt

Issues related to real-time face recognition persist even with many existing approaches, and generalizing these issues across different applications is tedious. In this paper, real-time issues such as tilt or rotation variation and the few-samples problem for face recognition are addressed and an efficient method is proposed. In preprocessing, an edge detection method using the Roberts operator is utilized to identify facial borders for cropping. The query images are axially tilted through different degrees of rotation. Both database and test images are segmented into one hundred fragments of 5 × 5 size each. Four different matrix characteristics are derived for each divided part of the image, and the corresponding attributes are added to yield the final matrix features. The final one hundred facial attributes are obtained by fusing diagonal features with the one hundred matrix features. The Euclidean distance between the final attributes of gallery and query images is then computed. The results on the Yale dataset show superior performance compared to existing approaches and are convincing on the dataset we created.

H. S. Jagadeesh, K. Suresh Babu, K. B. Raja
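The final matching step described above, nearest-neighbor search by Euclidean distance over fused attribute vectors, can be sketched as follows. The 4-dimensional vectors and subject names are invented stand-ins for the paper's one hundred fused features per face.

```python
import math

def euclidean(a, b):
    """Euclidean distance between two equal-length feature vectors."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def recognize(query, gallery):
    """Return the gallery identity whose feature vector lies closest to
    the query vector in Euclidean distance."""
    return min(gallery, key=lambda name: euclidean(query, gallery[name]))

# Hypothetical gallery of fused attribute vectors.
gallery = {"subject_a": [0.9, 0.1, 0.4, 0.7],
           "subject_b": [0.2, 0.8, 0.6, 0.1]}
match = recognize([0.85, 0.15, 0.35, 0.65], gallery)
```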
Cloud Computing Technology as an Auto Filtration System for Cost Reduction While Roaming

The cloud computing environment appears to be indispensable in ensuring economical telecommunication. When a person changes location (nationally or internationally), the mobile service providers change their respective charges, adding incoming call charges as well. Further, at the telecom service provider's end, no filtering of unnecessary calls can be done. In this paper, a cloud-based system is proposed that reduces the communication cost between two users when one of them is roaming. Making use of this technology, a robust system has been developed that automatically identifies a roaming mobile number and blocks any unknown number while notifying either end. The paper provides an analysis of roaming service considering the ground realities of international mobile roaming in both industry and market.

S. Biswas, A. Mukherjee, M. Roy Chowdhury, A. B. Bhattacharya
Design of Visual Cryptography Scheme Using C(m, w) Combinations

A Visual Cryptography Scheme (VCS) is a perfectly secure technique that allows easy concealment of images without any cryptographic computation; the encrypted image can be recovered by the human visual system. The scheme was proposed by Naor and Shamir for binary images and is called k-out-of-n VCS, where any k of the n participants can recover the image. Subsequently, a number of efficient VCS models have been proposed that enhance different VCS features. This work proposes a new model for k = 2, 3 that generates shares using C(m, w), where a bit string of length m with w < m ones is taken for constructing a share. On analysis, it has been found that our scheme realizes 2-out-of-n VCS more efficiently than the scheme proposed by Naor et al. Our design for 3-out-of-n VCS also shows improvement, and supports access structures similar to Ateniese et al. Simulation shows that our VCS performs satisfactorily.

Anindya Kr. Biswas, S. Mukhopadhyay, G. P. Biswas
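The classical 2-out-of-2 construction of Naor and Shamir that this paper builds on can be sketched as follows. This is a minimal version with two subpixels per secret pixel; the paper's C(m, w) share construction generalizes it and is not shown here.

```python
import random

def make_shares(secret, seed=1):
    """2-out-of-2 VCS: each secret pixel expands to a pair of subpixels.
    White (0): both shares carry the same random pair; black (1): the
    shares carry complementary pairs, so stacking turns fully dark."""
    rng = random.Random(seed)
    s1, s2 = [], []
    for pixel in secret:
        pattern = rng.choice([(0, 1), (1, 0)])
        s1.append(pattern)
        s2.append(pattern if pixel == 0 else tuple(1 - b for b in pattern))
    return s1, s2

def stack(s1, s2):
    """Stacking = subpixel-wise OR, as done optically by overlaying foils."""
    return [tuple(a | b for a, b in zip(p, q)) for p, q in zip(s1, s2)]

secret = [1, 0, 1, 0]          # 1 = black, 0 = white
share1, share2 = make_shares(secret)
recovered = stack(share1, share2)
```

After stacking, black pixels show two dark subpixels while white pixels show only one, which is the contrast the human visual system exploits; each share alone is a uniformly random pattern and reveals nothing.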
A Secured Digital Signature Using Conjugacy and DLP on Non-commutative Group over Finite Field

In the present paper, we propose a secure digital signature scheme connecting both the conjugacy problem and the discrete logarithm problem, based on a non-commutative group generated over a finite field. For this, we define a non-commutative group of matrices with elements from a finite field, such that the conjugacy and discrete logarithm problems can be executed together proficiently. By doing so, we can formulate signature structures using conjugacy and discrete logarithms through a non-commutative group; in some domains, the combination reduces completely to the discrete logarithm problem. The digital signature scheme is formulated over the general linear group GLn(Fq). The security of the signature protocol depends on the complexity of the problems associated with conjugacy and the discrete logarithm. The security analysis of the proposed digital signature protocol is presented with the aid of order of complexity, existential forgery and signature repudiation.

L. Narendra Mohan, G.S.G.N. Anjaneyulu
A Novel Edge Based Chaotic Steganography Method Using Neural Network

This paper provides a method of hiding sensitive information in a digital image. We introduce a chaotic edge-based steganography technique based on an artificial neural network. First, we find the edges of the image using the artificial neural network approach given by Jinan Gu et al. Secondly, a key-based chaotic scheme is used to disperse the bits of the secret message randomly into the edge pixels of the image, producing a stego image that takes advantage of edge detection techniques. The experimental results show high PSNR values, indicating that there is no perceptible difference between the original and stego images, and demonstrate the satisfactory performance of the proposed method. Since the proposed algorithm depends on the key, it is robust and can protect the secret data from theft.

Shahzad Alam, Tanvir Ahmad, M. N. Doja
A Latest Comprehensive Study on Structured Threat Information Expression (STIX) and Trusted Automated Exchange of Indicator Information (TAXII)

One of the important challenges in threat intelligence is using efficiently the intelligence obtained from both external and internal sources. The need for organizations to have cyber threat intelligence is growing, and a basic component of any such capability is sharing threat intelligence between trusted partners, which helps target and process the large volume of cyber security information. This paper briefly explains how threat information can be shared in a form that is both human and machine readable, using Structured Threat Information Expression (STIX) and Trusted Automated Exchange of Indicator Information (TAXII).

M. Apoorva, Rajesh Eswarawaka, P. Vijay Bhaskar Reddy
Preventive Maintenance Approach for Storage and Retrieval of Sensitive Data

Securing user data is one of the challenges in all applications. Users' sensitive data on a storage server are expected to be highly available, secure and easily accessible from anywhere, on demand. This paper provides a preventive maintenance approach to accessing the data even when the storage server containing the sensitive data fails. The approach uses a simultaneous copying technique to save the data on both the storage server and a backup server during the upload process; thus, when the server fails, the data needs of the user can be served by the backup server. Instead of directly uploading the sensitive data file, the file is broken into data blocks, which are encrypted and stored on the storage server. Thus, when an intruder gains access to the storage server and tries to access the data, retrieval of the data file is not possible, since the mapping of files onto data blocks is random and encrypted. This supports preventive maintenance of data, which not only secures the data but also reduces the risk and cost of recovery.

V Madhusudhan, V Suma
Join Operations to Enhance Performance in Hadoop MapReduce Environment

Analyzing large data sets is gaining importance because of its wide variety of applications in parallel and distributed environments. The Hadoop environment gives programmers more flexibility in parallel computing, and one of its advantages is query evaluation over large datasets, in which join operations play a major role. This paper ferrets out the earlier solutions, extends them and recommends a new approach for the implementation of joins in Hadoop.

Pavan Kumar Pagadala, M. Vikram, Rajesh Eswarawaka, P Srinivasa Reddy
Adaptive Spatio-Temporal Filtering with Motion Estimation for Mixed Noise Removal and Contrast Enhancement in Video Sequence

Naturally occurring noise in videos is complex, but fortunately it can be broadly classified into Gaussian and impulse noise. Most available models for noise removal emphasize only one kind of noise, so an optimal model for mixed noise removal is still a challenge. This paper describes the removal of video flickering and artifacts due to sensor motion, unprofessional recording behavior, device defects, poor lighting conditions and high dynamic exposure. The adaptive spatio-temporal filter gives excellent results for mixed (Gaussian and impulse) noise removal, and dense optical flow is introduced to reduce motion blur and enhance the video. The PSNR and SSIM values are compared with existing methods such as Non-Local Means and BM3D, and the results are tabulated. The histogram graph shows a better intensity distribution in the frames; thus the proposed method works well even for low-illumination or night-vision surveillance videos.

S. Madhura, K. Suresh
Social and Temporal-Aware Personalized Recommendation for Best Spreaders on Information Sharing

As the growth of online information sharing and online shopping is tremendous, social networking sites and Online Shops (OSs) have become potential information sources for the Recommender System (RS). The RS provides either services or products to users based on their preferences. An Online Social Network (OSN) enables users to share information with their social neighbors, while an OS furnishes products to customers based on their requirements. Conventional context-aware recommender systems are inept at predicting a new user's preferences and a user's recent preferences. Usually, customers' preferences drift over time due to the evolution of the products in the OS. Hence, jointly considering the key parameters of social popularity and temporal dynamics is crucial for modeling the RS. This paper presents the Social and Temporal-Aware personalized Recommendation for the best Spreaders (STARS) approach, which recommends products based on social influence and recent context information about the user. It employs collaborative filtering and incorporates three phases: influential user identification using the OSN, user preference identification using the OS, and recent-preference-based recommendation using temporal dynamics. Initially, STARS identifies the best spreader in the OSN by applying the Eigen Vector Centrality (EVC) measurement in the k-shell structure. Secondly, it analyzes the customer's explicit as well as implicit feedback information using user-item matrix factorization and the Pearson correlation measurement. Finally, STARS recommends appropriate products to users by predicting each user's recent preferences, read from the context-aware explicit and implicit feedback information. The experimental results show that STARS significantly outperforms conventional context-aware recommender systems.

Ananthi Sheshasaayee, Hariharan Jayamangala
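The best-spreader identification step can be illustrated with plain power iteration for Eigen Vector Centrality on a toy adjacency matrix; the k-shell restriction used by STARS is omitted from this sketch, and the network below is invented.

```python
def eigenvector_centrality(adj, iters=100):
    """Power iteration on the adjacency matrix; the dominant eigenvector
    gives each node's EVC score (normalized here to unit sum)."""
    n = len(adj)
    x = [1.0] * n
    for _ in range(iters):
        y = [sum(adj[i][j] * x[j] for j in range(n)) for i in range(n)]
        norm = sum(y)
        x = [v / norm for v in y]
    return x

# Toy undirected network: node 0 is connected to every other node,
# so it should come out as the best spreader.
adj = [
    [0, 1, 1, 1],
    [1, 0, 1, 0],
    [1, 1, 0, 0],
    [1, 0, 0, 0],
]
scores = eigenvector_centrality(adj)
best_spreader = max(range(len(scores)), key=scores.__getitem__)
```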
A Combined System for Regionalization in Spatial Data Mining Based on Fuzzy C-Means Algorithm with Gravitational Search Algorithm

The proposed new hybrid approach to data clustering initially exploits spatial fuzzy c-means to cluster the vertices into homogeneous regions. To further improve on fuzzy c-means and its segmentation performance, we make use of the gravitational search algorithm, which is inspired by Newton's law of gravity. In this paper, a modified modularity measure to optimize the clusters is presented. The technique is evaluated under the standard metrics of accuracy, sensitivity, specificity, MAP, RMSE and MAD. From the results, we can infer that the proposed technique obtains good results.

Ananthi Sheshasaayee, D. Sridevi
Design and Functional Verification of Axi2OCP Bridge for Highly Optimized Bus Utilization and Closure Using Functional Coverage

Given the density of current SoCs, bridge designs are used to connect interconnects working at different frequencies and with different protocols and bus widths. AXI and OCP are very commonly used protocols in industry, given that they support a wide range of features. An AXI2OCP bridge is used to connect two interconnects, one working on the AXI protocol and the other on the OCP protocol. The expected functioning of the bridge design is obtained by the process of verification; without proper verification the system may show unexpected behavior, and the performance of the bus can be increased through effective bus utilization. The main ideas of this paper are building a verification environment for the AXI2OCP bridge using SystemVerilog; generating and simulating test cases for various features of AXI and OCP; measuring the bus utilization parameter for the AXI 3.0 protocol; and implementing functional coverage and assertions with the proposed integrated verification environment using the Questa-Sim tool.

N. Shalini, K. P. Shashikala
Person Recognition Using SURF Features and Viola-Jones Algorithm

Face recognition is one of the prominent biometric software applications; it can identify a specific person in a digital image by analysing a few parameters and comparing them. This type of recognition is commonly used in security systems but is used increasingly in a variety of other applications. Non-static conditions such as facial hair can pose a serious problem for a recognition system. The three stages of a face recognition system are face detection, feature extraction and classification. A novel approach is proposed for enhancing face recognition from video sequences under varying occlusion and pose. This face identification system makes use of the Viola-Jones algorithm for face detection and SURF (Speeded-Up Robust Features) for feature extraction. Classification of the face images is done using an SVM (Support Vector Machine) classifier with an RBF (Radial Basis Function) kernel.

S. Shwetha, Sunanda Dixit, B. I. Khondanpur
2D Shape Representation and Analysis Using Edge Histogram and Shape Feature

An image has many components that convey its visual information. Shape descriptors characterize and describe the shape features and properties of these components. Shape is an important property for representing an image, expressed in 2D or 3D in the Euclidean plane, and many methods and techniques, such as Canny edge detection, are available to represent it. The major aim of this paper is to find the shape of an object by comparing it with the mathematical formulas and properties of 2D shapes under different orientations.

G. N. Manjula, Muzameel Ahmed
Automatic Classification of Lung Nodules into Benign or Malignant Using SVM Classifier

Carcinoma of the lungs is among the cancers causing the highest number of deaths all over the world, so it is very important to improve detection methods and thereby increase the rate of survival. In this paper, a new algorithm is proposed to segment the lung regions using the Active Contour method. Once nodules are detected, the Gray-Level Co-occurrence Matrix (GLCM) is used to calculate texture features: Haralick texture features are calculated and the dominant features are extracted. Classification of the nodules is done using a Support Vector Machine (SVM) classifier, and satisfactory results have been obtained. The lung CT scan images are taken from the LIDC-IDRI database.

B Sasidhar, G Geetha, B. I. Khodanpur, D. R. Ramesh Babu
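The GLCM-based texture step above can be sketched for a single offset; the Haralick contrast computed below is one of the classical features the abstract mentions. The 4 × 4 quantized patch is invented toy data, not LIDC-IDRI imagery.

```python
def glcm(img, dx=1, dy=0, levels=4):
    """Gray-Level Co-occurrence Matrix for one pixel offset (dx, dy),
    normalized so the entries sum to 1."""
    m = [[0.0] * levels for _ in range(levels)]
    h, w = len(img), len(img[0])
    count = 0
    for y in range(h):
        for x in range(w):
            ny, nx = y + dy, x + dx
            if 0 <= ny < h and 0 <= nx < w:
                m[img[y][x]][img[ny][nx]] += 1
                count += 1
    return [[v / count for v in row] for row in m]

def haralick_contrast(p):
    """Haralick contrast: sum of (i - j)^2 * P(i, j) over the GLCM."""
    return sum((i - j) ** 2 * p[i][j]
               for i in range(len(p)) for j in range(len(p)))

# Toy 4-level patch standing in for a quantized nodule ROI.
patch = [
    [0, 0, 1, 1],
    [0, 0, 1, 1],
    [2, 2, 3, 3],
    [2, 2, 3, 3],
]
contrast = haralick_contrast(glcm(patch))
```

In practice several offsets and the full Haralick feature set (energy, homogeneity, correlation, etc.) would be computed and fed to the SVM.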
Automatic Detection of Diabetic Retinopathy Using Two Phase Tophat Transformations—A Novel Approach

Diabetes is a common disease which occurs when the pancreas fails to produce enough insulin; it gradually affects the retina of the human eye. As the disease aggravates, the vision of the patient starts deteriorating, ending in Diabetic Retinopathy (DR). 80 % of all patients who have had diabetes for more than 10 years are affected by DR, which can also lead to vision loss. In this regard, early detection of DR is hoped to help patients avoid vision loss. In this paper, an attempt is made to propose a system for the automatic classification of normal and abnormal retinal fundus images by detecting exudates and microaneurysms. Other features, such as the area of exudates, the number of microaneurysms, entropy, homogeneity, contrast and energy, are also calculated. The extracted features are fed to an SVM classifier for automatic classification. The paper is based on secondary data gathered from different sources.

A S Akshaya, Sunanda Dixit, Ngangom Priyobata Singh
Evaluation of Heart Rate Using Reflectance of an Image

Observing heart rate with electrocardiogram and oximetry sensors may cause skin irritation in some patients. To avoid this, heart rate evaluation from facial reflectance analysis is carried out. The procedure begins by capturing the facial reflectance and analysing it in detail for the changes it reflects in the hemoglobin of the blood vessels. The green channel is selected because hemoglobin absorbs mainly green light. The Hilbert-Huang transform is utilized to procure the heart rate while reducing light variations, and EMD is then applied to reduce noise. The proposed methodology obtains the heart rate with some variations; with this approach, accuracy of up to 70 % is achievable.

Neha Deshmukh, Sunanda Dixit, B. I. Khondanpur
Analysis of Logs by Using Logstash

The key functionality of the proposed system is its ability to collect, handle and analyse huge volumes of different kinds of log data. When deployed in a network, it facilitates the collection of logs from different nodes across the network. This paper explains the proposed system, which collects the logs using Logstash; Logstash is capable of handling many types of log data, which helps to identify malicious activity in the network.

Sushma Sanjappa, Muzameel Ahmed
Estimation of Degree of Connectivity to Predict Quality of Software Design

The main goal of Object Oriented Methodology is to deliver software that is maintainable. Post-development quality requirements such as software maintainability depend on design quality. Coupling and Cohesion (C&C) are two measurable design-quality factors, and both are influenced by the structure of a class, the basic unit of Object Oriented design. Defining the class structures and their relationships measures the design quality, which in turn is an indicator for quality requirements such as maintainability, reusability and scalability. This paper explores how different dependency types between classes add to design complexity, and hence affect software quality, by proposing a model that calculates the Degree of Connectivity (DC) between classes and a Coupling Index (CI) for the overall software. It can thus be inferred that design quality depends not only on class structure, but also on the level of relationships such as inheritance, aggregation, composition and association present in the software.

S. D. Thilothame, M. Mashetty Chitralekha, U. S. Poornima, V. Suma
Multipath Load Balancing and Secure Adaptive Routing Protocol for Service Oriented WSNs

Existing multipath routing methods in Wireless Sensor Networks (WSNs) distribute traffic effectively over multiple paths to accomplish the required quality of service (QoS). However, failure of any link in a WSN affects the data transmission performance, security of data, scalability and reliability of the network. Hence, considering reliability, scalability, security and congestion over multiple paths, it is necessary to design and develop a service-oriented multipath routing scheme that provides high failure tolerance and effective routing. This paper puts forth a Secure Multipath AODV (SMAODV) protocol in which the RSA algorithm is used for secure data transmission, and a path vacant ratio is calculated to discover the link-disjoint paths from the source sensor node to the destination sensor node among all available paths in the network. Load-balancing metrics and techniques such as congestion detection, congestion notification and congestion control are used to fine-tune and balance the load over the multiple paths. Data packets are split into multiple fragments and sent to the destination node through the multipath based on the path vacant ratio.
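
The final load-balancing step can be sketched as follows: split the payload into fragments and assign them to the discovered paths in proportion to each path's vacant ratio. The ratios and fragment size are illustrative; SMAODV's path discovery and the RSA encryption of fragments are omitted:

```python
def distribute(payload, vacant_ratios, frag_size=4):
    """Assign fixed-size fragments to paths proportionally to vacant ratio."""
    frags = [payload[i:i + frag_size] for i in range(0, len(payload), frag_size)]
    total = sum(vacant_ratios)
    shares = [r / total for r in vacant_ratios]     # target traffic fractions
    assignment = [[] for _ in vacant_ratios]
    for frag in frags:
        # pick the path whose assigned share is currently most under-used
        path = min(range(len(shares)),
                   key=lambda p: len(assignment[p]) / (shares[p] * len(frags) + 1e-9))
        assignment[path].append(frag)
    return assignment

paths = [0.5, 0.3, 0.2]        # vacant ratios of three discovered paths
out = distribute(b"0123456789abcdef0123", paths)
print([len(a) for a in out])   # fragments per path, roughly 5:3:2
```

In the full protocol each fragment would carry a sequence number so the destination can reassemble the payload regardless of per-path delivery order.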

C. P. Anandkumar, A. M. Prasad, V. Suma
Visual Based Information Retrieval Using Voronoi Tree

Content retrieval from large databases needs an efficient approach due to the increasing growth of digital images, and content-based image retrieval in particular is an extensive research area. It mainly involves retrieving similar images from a large dataset based on extracted features; the extracted feature content can be texture, colour, shape, etc. An efficient method for image retrieval based on shape features is proposed in this paper. Shape features such as the boundary and mode are computed using morphological operations, the Harris corner detector and the Voronoi diagram. Matching decisions can be made by different classification models; an SVM classifier is used in this research work to obtain the best-matched images during retrieval. The proposed algorithm is evaluated on JPEG images and achieves an accuracy of about 90 %.
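
The Harris step locates the interest points from which the shape signature is built. A minimal plain-NumPy sketch of the Harris response, using finite-difference gradients and a 3x3 box filter instead of Gaussian smoothing (both simplifications are assumptions, not the paper's settings):

```python
import numpy as np

def harris_response(img, k=0.05):
    """Harris corner response: det(M) - k*trace(M)^2 of the structure tensor."""
    I = img.astype(float)
    Ix = np.zeros_like(I); Iy = np.zeros_like(I)
    Ix[:, 1:-1] = (I[:, 2:] - I[:, :-2]) / 2     # central differences
    Iy[1:-1, :] = (I[2:, :] - I[:-2, :]) / 2
    def box(a):                                  # 3x3 box filter via shifted sums
        p = np.pad(a, 1)
        return sum(p[i:i + a.shape[0], j:j + a.shape[1]]
                   for i in range(3) for j in range(3))
    Sxx, Syy, Sxy = box(Ix * Ix), box(Iy * Iy), box(Ix * Iy)
    return (Sxx * Syy - Sxy ** 2) - k * (Sxx + Syy) ** 2

# toy shape: a bright square; the strongest responses sit at its corners
img = np.zeros((32, 32)); img[8:24, 8:24] = 1.0
R = harris_response(img)
y, x = np.unravel_index(np.argmax(R), R.shape)
print(y, x)
```

The detected corner points would then seed the Voronoi diagram (e.g. via `scipy.spatial.Voronoi`), whose cell structure serves as the shape descriptor matched by the SVM.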

Megha Biradar, Muzameel Ahmed
Gradient Magnitude Based Watershed Segmentation for Brain Tumor Segmentation and Classification

MRI is one of the tools for detecting a tumor in any part of the body, but precise tumor segmentation from such Magnetic Resonance Imaging (MRI) scans is a difficult and time-consuming task. To overcome this difficulty, this work proposes a very simple, efficient and automatic segmentation and classification of brain tumors. The proposed system is composed of four stages to segment, detect and classify a tumor as benign or malignant. Pre-processing is carried out in the first stage, after which the watershed segmentation technique is applied to segment the image in the second stage. The segmented image then undergoes post-processing to remove unwanted segments so as to obtain only the tumor region. In the last stage, the gray-level co-occurrence matrix (GLCM) is used to extract features, which are given as input to a Support Vector Machine (SVM) to classify the brain tumor. Experimental results show that the proposed method accurately segments and classifies brain tumors in MR images.
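
The segmentation stage floods the gradient-magnitude image from markers, letting low-gradient regions be claimed first. A minimal marker-driven watershed by priority flooding, with a toy image in place of an MRI slice (4-connectivity and the marker placement are assumptions of this sketch):

```python
import heapq
import numpy as np

def watershed(grad, markers):
    """Priority-flood watershed: lower-gradient pixels are claimed first
    by whichever labelled front reaches them."""
    labels = markers.copy()
    heap = [(grad[y, x], y, x) for y, x in zip(*np.nonzero(markers))]
    heapq.heapify(heap)
    h, w = grad.shape
    while heap:
        g, y, x = heapq.heappop(heap)
        for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            ny, nx = y + dy, x + dx
            if 0 <= ny < h and 0 <= nx < w and labels[ny, nx] == 0:
                labels[ny, nx] = labels[y, x]           # claim the pixel
                heapq.heappush(heap, (grad[ny, nx], ny, nx))
    return labels

# toy image: dark 'tumor' blob on a bright background
img = np.full((20, 20), 200.0); img[6:14, 6:14] = 50.0
gy, gx = np.gradient(img)
grad = np.hypot(gx, gy)                      # gradient magnitude
markers = np.zeros_like(img, dtype=int)
markers[10, 10] = 1                          # seed inside the blob
markers[1, 1] = 2                            # seed in the background
seg = watershed(grad, markers)
print((seg == 1).sum())                      # pixels assigned to the blob
```

The flood fronts meet on the high-gradient boundary ring, which is exactly where the watershed line forms; post-processing and GLCM feature extraction would then operate on the label-1 region.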

Ngangom Priyobata Singh, Sunanda Dixit, A. S. Akshaya, B. I. Khodanpur
Information Retrieval Through the Web and Semantic Knowledge-Driven Automatic Question Answering System

The rising popularity of the Information Retrieval (IR) field has created a high demand for services that enable web users to rapidly and reliably retrieve the most pertinent information. A Question Answering (QA) system is one such service, providing adequate sentences as answers to specific natural language questions. Despite its importance, QA often fails to provide an accurate answer with adequate, significant information while the degree of ambiguity among candidate answers increases. The proposed WAD approach encompasses three phases to enhance QA performance using the web as well as semantic knowledge: first, it defines context-aware candidate sentences using a query expansion technique and an entity linking method; second, it ranks the sentences by exploiting the conditional probability between the query and the candidate sentences; third, it identifies the precise answer, including reasonable and adequate information, by optimal answer type identification and validation using conditional probability and an ontology structure. The WAD methodology answers a posted query with greater accuracy than the baseline method.
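
The second phase can be illustrated with a toy ranking sketch: score each candidate by the smoothed probability of generating the query terms from the sentence's unigram distribution. This is a generic stand-in for the conditional probability the paper uses, with an assumed vocabulary size:

```python
from collections import Counter

def score(query, sentence, vocab_size=10_000):
    """Add-one-smoothed unigram probability of the query given the sentence."""
    words = sentence.lower().split()
    counts = Counter(words)
    p = 1.0
    for term in query.lower().split():
        p *= (counts[term] + 1) / (len(words) + vocab_size)
    return p

query = "who invented the telephone"
candidates = [
    "Alexander Graham Bell invented the telephone in 1876.",
    "The telephone changed long-distance communication.",
    "Bell Labs was founded decades later.",
]
best = max(candidates, key=lambda s: score(query, s))
print(best)
```

The full system would apply this after query expansion and entity linking, then validate the top sentence's answer type against the ontology.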

Ananthi Sheshasaayee, S. Jayalakshmi
Level Set Based Liver Segmentation and Classification by SVM

Liver segmentation from CT images is a key research task in representing the liver and has an enormous effect on the examination of liver disorders. Hence, numerous computer-aided segmentation approaches have been proposed over the past many years to partition the liver region from medical images automatically. A liver segmentation method is proposed here by consolidating a level set based method with Pseudo-Zernike moments and GLDM features. The objective of the proposed algorithm is to solve the segmentation issue created by indistinguishable intensities between the liver region and its adjacent tissues. A Radial Basis Function SVM is used in this work to classify the type of tumor.

Mallikarjun Kesaratti, Sunanda Dixit, B. I. Khodanpur
An Exploratory Study of RDF: A Data Model for Cloud Computing

The Semantic web is an extension of the web which focuses on the meaning of data content rather than its structure, and it promotes many standards of the World Wide Web Consortium (W3C). The Resource Description Framework (RDF) is a data interchange standard widely used by the semantic web community and a semantic data model for cloud computing. Cloud computing is a computing paradigm which involves outsourcing of computing resources with resource scalability and on-demand provisioning at little or no up-front IT infrastructure investment cost. This paper analyzes RDF in terms of its present status, compares RDF with traditional data models, and examines its usage in semantic web data management, an overview of semantic web rule languages, and finally its limitations in representing concepts.
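
RDF's data model is simply a set of (subject, predicate, object) triples queried by patterns. A self-contained toy illustration (the resource names are invented; a real store would use rdflib or a SPARQL engine):

```python
# a tiny RDF-style graph describing hypothetical cloud resources
graph = {
    ("ex:vm1", "rdf:type",    "ex:VirtualMachine"),
    ("ex:vm1", "ex:memoryGB", "16"),
    ("ex:vm1", "ex:hostedOn", "ex:cloudA"),
    ("ex:vm2", "ex:hostedOn", "ex:cloudA"),
}

def match(graph, s=None, p=None, o=None):
    """Return triples matching a pattern; None acts as a wildcard."""
    return [(ts, tp, to) for ts, tp, to in graph
            if s in (None, ts) and p in (None, tp) and o in (None, to)]

print(match(graph, p="ex:hostedOn", o="ex:cloudA"))   # both VMs on cloudA
```

Unlike a fixed relational schema, new predicates can be added per triple without migration, which is one reason RDF suits the heterogeneous, evolving metadata of cloud resources.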

A. Clara Kanmani, T. Chockalingam, N. Guruprasad
Domain Independent Approach for Aspect Oriented Sentiment Analysis for Product Reviews

Sentiment analysis of text documents is an emerging field of research in Natural Language Processing (NLP) and text mining. Feature-specific opinion matters more than the overall opinion: given a collection of review texts, the goal is to detect the individual product aspects commented on by reviewers and to decide whether the comments are positive or negative. In this research paper, an unsupervised approach for domain-independent, feature-specific sentiment analysis is proposed. The SentiWordNet lexical resource is used to determine the polarity of the identified features. The work has shown promising results over previously used approaches based on SentiWordNet, and the newly introduced SentiWordNet 3.0 has proved to be an important lexical resource.
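
The aspect-polarity idea can be sketched with a toy lexicon standing in for SentiWordNet: locate an aspect term in a review and aggregate the signed scores of nearby opinion words. The lexicon values and window size are assumptions of this sketch, not the paper's:

```python
# toy stand-in for SentiWordNet polarity scores
LEXICON = {"excellent": 0.9, "good": 0.6, "poor": -0.7, "terrible": -0.9}

def aspect_polarity(review, aspect, window=2):
    """Sum opinion-word scores within `window` words of each aspect mention."""
    words = review.lower().split()
    idx = [i for i, w in enumerate(words) if w == aspect]
    score = 0.0
    for i in idx:
        for w in words[max(0, i - window): i + window + 1]:
            score += LEXICON.get(w, 0.0)
    return score

review = "the battery is excellent but the camera is terrible"
print(aspect_polarity(review, "battery"))   # positive
print(aspect_polarity(review, "camera"))    # negative
```

A real system would replace the toy lexicon with SentiWordNet synset scores and use parse-based dependency distance rather than a flat word window.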

Nilesh Shelke, Shriniwas Deshpande, Vilas Thakare
Approach for Emotion Extraction from Text

Emotion extraction from text is the categorization of given pieces of text (reviews/comments) into different emotions using NLP techniques. Nowadays, the internet is flooded with individuals' social interaction, and there are emotionally rich environments online where close friends share their emotions, feelings and thoughts. Emotion extraction has many applications in the next generation of human-computer interfaces. The experimentation aims at evaluating the efficiency of the proposed KEA algorithm for emotion extraction from text on the ISEAR dataset as well as on user-defined comments. Fuzzy rules have also been incorporated in the algorithm.

Nilesh Shelke, Shriniwas Deshpande, Vilas Thakare
Performance of Multiple String Matching Algorithms in Text Mining

Ever since the evolution of the Internet, information retrieval has been performed by surfers in large amounts, and the data grows every day as users' thirst for knowledge increases. Raw data needs to be processed for use, which increases its potential value in all major areas such as education and business. Text mining is therefore an emerging area in which unstructured information is turned into relevant information. The text mining process can be divided into information extraction, topic tracking, summarization, categorization, clustering, concept linkage and information visualization, but all of these can be applied to text only once it is properly extracted from the web, using pattern matching or string matching algorithms to retrieve proper results from the sea of information. In this paper we discuss three such algorithms: Aho-Corasick, Wu-Manber and Commentz-Walter. The performance of the algorithms is assessed by implementing them in the Python language, and the algorithm most suitable for extracting information is identified.
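
The first of the three algorithms can be sketched compactly: Aho-Corasick builds a trie of all patterns, adds failure links by BFS, then scans the text once, reporting every match. A minimal sketch (not the paper's implementation):

```python
from collections import deque

def aho_corasick(patterns, text):
    """Return (start_index, pattern) for every occurrence of any pattern."""
    goto, out, fail = [{}], [set()], [0]
    for pat in patterns:                       # 1) trie construction
        s = 0
        for ch in pat:
            if ch not in goto[s]:
                goto.append({}); out.append(set()); fail.append(0)
                goto[s][ch] = len(goto) - 1
            s = goto[s][ch]
        out[s].add(pat)
    q = deque(goto[0].values())                # 2) failure links via BFS
    while q:
        s = q.popleft()
        for ch, t in goto[s].items():
            q.append(t)
            f = fail[s]
            while f and ch not in goto[f]:
                f = fail[f]
            fail[t] = goto[f].get(ch, 0)
            out[t] |= out[fail[t]]             # inherit suffix matches
    matches, s = [], 0                         # 3) single pass over the text
    for i, ch in enumerate(text):
        while s and ch not in goto[s]:
            s = fail[s]
        s = goto[s].get(ch, 0)
        for pat in out[s]:
            matches.append((i - len(pat) + 1, pat))
    return matches

print(aho_corasick(["he", "she", "his", "hers"], "ushers"))  # she, he, hers
```

The scan is linear in the text length regardless of how many patterns are loaded, which is why Aho-Corasick is the usual baseline against which Wu-Manber and Commentz-Walter are compared.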

Ananthi Sheshasaayee, G. Thailambal
Backmatter
Title
Proceedings of the 5th International Conference on Frontiers in Intelligent Computing: Theory and Applications
Edited by
Suresh Chandra Satapathy
Vikrant Bhateja
Siba K. Udgata
Prasant Kumar Pattnaik
Copyright Year
2017
Publisher
Springer Singapore
Electronic ISBN
978-981-10-3156-4
Print ISBN
978-981-10-3155-7
DOI
https://doi.org/10.1007/978-981-10-3156-4
