
2016 | Book

Information Systems Design and Intelligent Applications

Proceedings of Third International Conference INDIA 2016, Volume 1

Edited by: Suresh Chandra Satapathy, Jyotsna Kumar Mandal, Siba K. Udgata, Vikrant Bhateja

Publisher: Springer India

Book series: Advances in Intelligent Systems and Computing

About this book

The Third International Conference on INformation Systems Design and Intelligent Applications (INDIA 2016) was held in Visakhapatnam, India during January 8-9, 2016. The book covers all aspects of information system design, computer science and technology, general sciences, and educational research. Following a double-blind review process, a number of high-quality papers were selected and collected in this book, which is composed of three volumes and covers a variety of topics, including natural language processing, artificial intelligence, security and privacy, communications, wireless and sensor networks, microelectronics, circuits and systems, machine learning, soft computing, mobile computing and applications, cloud computing, software engineering, graphics and image processing, rural engineering, e-commerce, e-governance, business computing, molecular computing, nano-computing, chemical computing, intelligent computing for GIS and remote sensing, and bio-informatics and bio-computing. These fields are of interest not only to computer researchers but also to those in mathematics, chemistry, biology, bio-chemistry, engineering, statistics, and all other fields in which computer techniques may assist.

Table of Contents

Frontmatter
Study and Analysis of Subthreshold Leakage Current in Sub-65 nm NMOSFET

As technology scales down, subthreshold leakage increases exponentially, leading to a dramatic rise in static power consumption, especially in nanoscale devices. Consequently, it is very important to understand and estimate this leakage current so that various leakage minimization techniques can be devised. In this paper we therefore estimate the subthreshold leakage current in an NMOSFET at the 16, 22, 32 and 45 nm technology nodes. Various factors which affect subthreshold leakage, such as temperature, drain-induced barrier lowering (DIBL) and other short-channel effects, are also explored. All measurements are carried out through extensive simulation with the HSPICE circuit simulator at the various technology nodes.

Krishna Kumar, Pratyush Dwivedi, Aminul Islam
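
For reference, a standard first-order model of the mechanism the paper measures (a textbook expression, not necessarily the exact formulation used by the authors) writes the subthreshold current with a DIBL-lowered threshold voltage as:

```latex
I_{\mathrm{sub}} = I_0 \exp\!\left(\frac{V_{GS}-V_{th}}{n V_T}\right)
                   \left(1 - \exp\!\left(-\frac{V_{DS}}{V_T}\right)\right),
\qquad V_{th} = V_{th0} - \lambda\, V_{DS}
```

where $V_T = kT/q$ is the thermal voltage (hence the strong temperature dependence), $n$ is the subthreshold swing coefficient, and $\lambda$ is the DIBL coefficient.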
Implementations of Secure Reconfigurable Cryptoprocessor: A Survey

One of the several challenges in applied cryptography is not just devising a secure cryptographic algorithm but also managing its secure and efficient implementation on hardware and software platforms. Cryptographic algorithms are in widespread use for every conceivable purpose, so secure implementation is essential in order to thwart side-channel attacks. Moreover, most cryptographic algorithms rely on modular arithmetic, algebraic operations and mathematical functions and hence are computation-intensive. Consequently, these algorithms may be isolated and implemented on a separate, secure cryptographic unit.

Rajitha Natti, Sridevi Rangu
Mitigating and Patching System Vulnerabilities Using Ansible: A Comparative Study of Various Configuration Management Tools for IAAS Cloud

In an organization, configuration management is an essential technique for ensuring that the desired configurations remain intact at all times. Configuration management oversees the consistency of a software product's versions, updates, and so on. Currently, many system administrators manage and maintain their systems using collections of batch scripts. Ansible replaces this plodding approach and makes application deployment over the cloud very simple. In this paper we explore Ansible's true potential, write playbooks for patching system vulnerabilities, and examine the advantages of using Ansible.

Sanjeev Thakur, Subhash Chand Gupta, Nishant Singh, Soloman Geddam
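
For flavor, a minimal patching playbook of the kind the paper describes, driven from Python through the ansible-playbook CLI. The host group, inventory file, and update policy are illustrative assumptions; the apt module options shown are standard Ansible, but the authors' actual playbooks are not reproduced in the abstract.

```python
import subprocess
import tempfile

# Illustrative playbook: apply all pending upgrades on Debian/Ubuntu hosts
# in a hypothetical "webservers" inventory group.
PLAYBOOK = """\
---
- hosts: webservers
  become: yes
  tasks:
    - name: Refresh package cache and apply pending upgrades
      apt:
        update_cache: yes
        upgrade: dist
"""

with tempfile.NamedTemporaryFile("w", suffix=".yml", delete=False) as f:
    f.write(PLAYBOOK)
    playbook_path = f.name

# Equivalent to running: ansible-playbook -i inventory.ini patch.yml
subprocess.run(["ansible-playbook", "-i", "inventory.ini", playbook_path],
               check=True)
```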
Adaptive Fractal Image Compression Based on Adaptive Thresholding in DCT Domain

The encoding procedure for fractal image compression consumes a huge amount of time due to the enormous number of search operations over the domain pool: the sequential search continues until the best-matching domain block is found for each range block. Domain pool reduction, an adaptive fractal image encoding method, is one solution to this computational complexity. Using variance-based domain selection in the DCT domain, with fewer operations per domain-range comparison, is fruitful. In this paper, the variances of range and domain blocks are compared, and only domain blocks within a threshold value are retained. To further optimize performance, adaptive thresholding in quadtree partitioning is applied in combination with the variance-based domain selection approach. Experimental results show that running the proposed method on grayscale images shortens the encoding time and improves the PSNR.

Preedhi Garg, Richa Gupta, Rajesh K. Tyagi
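
A minimal sketch of the variance-based pruning idea. Block size, the threshold tau, and the MSE matching criterion are assumptions for illustration, not the paper's exact settings (in practice domain blocks are also downsampled to range-block size before comparison).

```python
import numpy as np

def reduce_domain_pool(range_block, domain_blocks, tau=50.0):
    """Keep only domain blocks whose variance is close to the range
    block's variance, shrinking the search space."""
    v_r = np.var(range_block)
    return [d for d in domain_blocks if abs(np.var(d) - v_r) < tau]

def best_match(range_block, domain_blocks):
    """Search the reduced pool for the block with lowest MSE."""
    candidates = reduce_domain_pool(range_block, domain_blocks)
    if not candidates:  # fall back if pruning was too aggressive
        candidates = domain_blocks
    return min(candidates, key=lambda d: np.mean((d - range_block) ** 2))

rng = np.random.default_rng(0)
domains = [rng.normal(128, 20, (8, 8)) for _ in range(500)]
print(best_match(rng.normal(128, 20, (8, 8)), domains).shape)  # (8, 8)
```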
Improved Resource Exploitation by Combining Hadoop MapReduce Framework with VirtualBox

MapReduce is a framework for processing huge volumes of data in parallel on large groups of nodes. Processing enormous data requires fast coordination and allocation of resources, and the emphasis is on achieving maximum performance with optimal resources. This paper presents a technique for accomplishing better resource utilization. The main objective of the work is to incorporate virtualization into the Hadoop MapReduce framework and measure the performance enhancement. To realize this, the master node is set up on a physical machine, and slave nodes are set up as virtual machines (VMs) on a common physical machine by cloning a Hadoop-configured VM image. To further enhance performance, the Hadoop virtual cluster is configured to use the capacity scheduler.

Ramanpal Kaur, Harjeet Kaur, Archu Dhamija
Density Based Outlier Detection Technique

Outlier detection has become an emerging branch of research in the field of data mining, and detecting outliers in a pattern is a popular problem. Detected outliers can be very informative, interesting and useful, and very destructive if left unexplored. We propose a novel density-based approach which uses a statistical measure, the standard deviation, to decide whether a data point is an outlier. A large variety of solutions has been researched in recent years, and selecting among them is sometimes hard: no single solution is better than the others, but each is suitable for specific types of datasets. Therefore, when choosing an outlier detection method for a new problem, it is important to consider the particularities of the dataset to which the method will be applied. To test the validity of the proposed approach, it has been applied to the Wisconsin Breast Cancer dataset and the Iris dataset.

Raghav Gupta, Kavita Pandey
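
One simple way to realize a density-plus-standard-deviation rule of the kind described (an illustrative reading, not the authors' exact formulation): score each point by its mean distance to its k nearest neighbours, then flag points whose score lies more than t standard deviations above the average.

```python
import numpy as np

def density_outliers(X, k=5, t=2.0):
    """Flag points whose mean k-NN distance exceeds the dataset
    average by more than t standard deviations."""
    D = np.sqrt(((X[:, None, :] - X[None, :, :]) ** 2).sum(-1))
    knn_dist = np.sort(D, axis=1)[:, 1:k + 1].mean(axis=1)  # skip self
    mu, sigma = knn_dist.mean(), knn_dist.std()
    return np.where(knn_dist > mu + t * sigma)[0]

X = np.vstack([np.random.default_rng(1).normal(0, 1, (100, 2)),
               [[8.0, 8.0]]])        # one planted outlier
print(density_outliers(X))           # -> [100]
```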
Systematic Evaluation of Seed Germination Models: A Comparative Analysis

Agriculture has recently adopted various techniques to boost production and to monitor seed germination in a professional manner. Temperature is the key factor affecting the germination of non-dormant seeds. This paper surveys the computational prediction models developed for calculating the seed germination growth rate and analyzes their merits and demerits. These models are helpful in monitoring and controlling the seed germination process, enabling higher-quality storage and good crop yields.

Lalita Chaudhary, Rajesh, Vikas Deep, Preeti Chawla
Business Modeling Using Agile

The selection of methodology is of great importance in business process modeling, as it has a great impact on customer satisfaction. The aim of this paper is to fill the gaps in an existing model using agile methodology. Various aspects of business modeling have been proposed by scholars, such as market-oriented aspects, value aspects, product-oriented aspects, and actors in business. The existing notion of business agility has various shortcomings; keeping these shortcomings in mind, a model is proposed and its advantages are derived.

Lalita Chaudhary, Vikas Deep, Vishakha Puniyani, Vikram Verma, Tajinder Kumar
An Efficient Hybrid Encryption Technique Based on DES and RSA for Textual Data

Data security in almost every field is a challenging concern all around the globe, with application areas as wide as banking, the internet, networks and mobile data. The main focus of this paper is to secure text data and to provide a comparison across different parameters. DES and RSA are used for the comparison, and a hybrid approach based on the combination of the DES and RSA algorithms is proposed. The comparison is done on the basis of size, key length, number of keys, and encryption and decryption time. The overall results favor the hybrid approach for the encryption and decryption process.

Smita Chourasia, Kedar Nath Singh
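
The abstract does not spell out the construction, so here is the generic hybrid pattern such schemes usually follow, sketched with the PyCryptodome library: encrypt the payload with DES and protect the DES session key with RSA. Treat it as an illustration of the pattern, not the authors' exact scheme.

```python
from Crypto.Cipher import DES, PKCS1_OAEP
from Crypto.PublicKey import RSA
from Crypto.Random import get_random_bytes
from Crypto.Util.Padding import pad, unpad

rsa_key = RSA.generate(2048)              # receiver's RSA key pair

# Sender: DES encrypts the text, RSA wraps the DES key.
des_key = get_random_bytes(8)             # DES uses 64-bit keys
cipher = DES.new(des_key, DES.MODE_CBC)
ciphertext = cipher.encrypt(pad(b"textual data to protect", DES.block_size))
wrapped_key = PKCS1_OAEP.new(rsa_key.publickey()).encrypt(des_key)

# Receiver: RSA unwraps the DES key, DES decrypts the text.
session_key = PKCS1_OAEP.new(rsa_key).decrypt(wrapped_key)
plain = unpad(DES.new(session_key, DES.MODE_CBC, cipher.iv).decrypt(ciphertext),
              DES.block_size)
print(plain)  # b'textual data to protect'
```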
Application of Machine Learning on Process Metrics for Defect Prediction in Mobile Application

This paper studies process metrics in detail for predicting defects in open-source mobile applications, in continuation of our previous study (Moser et al., Software Engineering, 2008). Advanced modeling techniques are applied to a vast dataset of mobile applications to show that process metrics are better predictors of defects than code metrics for mobile applications. The mean absolute error, correlation coefficient and root mean squared error are determined using different machine learning techniques. In each case it was concluded that process metrics are significantly better predictors than code metrics for bug prediction. Process-metrics-based defect prediction models are shown to be better for mobile applications across all regression-based techniques, machine learning techniques and neuro-fuzzy modelling. Therefore, a separate model based only on process metrics has been created with a large dataset of mobile applications.

Arvinder Kaur, Kamaldeep Kaur, Harguneet Kaur
Automatic Insurance and Pollution Challan Generator System in India

Today, paper is wasted in issuing pollution and insurance certificates, and document checks by traffic police cause traffic jams on the road. This paper proposes the use of information technology to save paper, curb corruption and reduce the possibility of misuse of power through proper monitoring, by introducing smart chips in the number plates of vehicles that provide the details necessary for monitoring. Everything is done automatically just by reading the vehicle's number plate, which reduces traffic jams as well as paper consumption.

Harleen, Shweta Rana, Naveen Garg
Effective Implementation of KNN-Search on Multidimensional Data Using Quadtree

Modern systems such as location-based and radio frequency identification (RFID) based network systems use different techniques to collect or capture data and represent positions by longitude and latitude; such data is called multidimensional data. The study of multidimensional data is therefore attracting increasing attention in research. In this paper we focus on effective nearest neighbor search. Data is collected using a capturing/scanning device; the captured region is divided into quad format twice, and an index is applied for fast and accurate searching of objects. Considering different factors, we propose a new nearest neighbor technique with an effective cost model. Comprehensive experiments show that the quad-region technique gives better performance for nearest neighbor search on multidimensional objects.

B. D. Jitkar, S. D. Raut, S. T. Patil
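
A toy sketch of the indexing idea: two quad subdivisions of the region give a 4x4 grid of cells, and a query inspects its own cell plus the neighbouring cells first. Cell granularity and the candidate-expansion rule are illustrative assumptions, not the paper's cost model.

```python
import math
from collections import defaultdict

def cell_of(pt, xmin=0.0, ymin=0.0, size=1.0, levels=2):
    """Two quad subdivisions of the unit square -> a 4x4 grid."""
    n = 2 ** levels
    cx = min(int((pt[0] - xmin) / size * n), n - 1)
    cy = min(int((pt[1] - ymin) / size * n), n - 1)
    return cx, cy

def build_index(points):
    index = defaultdict(list)
    for p in points:
        index[cell_of(p)].append(p)
    return index

def nearest(q, index):
    cx, cy = cell_of(q)
    # candidates: the query's cell plus its 8 neighbours
    cand = [p for dx in (-1, 0, 1) for dy in (-1, 0, 1)
            for p in index.get((cx + dx, cy + dy), [])]
    pool = cand or [p for cell in index.values() for p in cell]
    return min(pool, key=lambda p: math.dist(p, q))

idx = build_index([(0.1, 0.1), (0.9, 0.9), (0.45, 0.55)])
print(nearest((0.5, 0.5), idx))   # (0.45, 0.55)
```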
Edge Detectors Based Telegraph Total Variational Model for Image Filtering

To address the existing issues of edge blur and uncertainty in parameter selection during image filtering, a novel telegraph total variation PDE model based on an edge detector is proposed. We use the image structure tensor as an edge detector to control the smoothing process and keep more detail features. The proposed model takes advantage of both the telegraph and the total variation model, and is edge-preserving and robust to noise. Experimental results illustrate the effectiveness of the proposed model and demonstrate that our algorithm competes favorably with state-of-the-art approaches in terms of producing better denoising results.

Subit K. Jain, Rajendra K. Ray
Cloud Based K-Means Clustering Running as a MapReduce Job for Big Data Healthcare Analytics Using Apache Mahout

The increase in data volume and the need for analytics have driven the innovation of big data. Models like NoSQL have emerged to speed up query responses. Virtualized platforms using commodity hardware and running Hadoop help small and mid-sized companies use cloud environments, decreasing the cost of data processing and analytics. As healthcare generates high volumes and variety of data, parallel algorithms that can handle petabytes of data using Hadoop and MapReduce are required; K-means clustering is one method amenable to such parallelism. Building an accurate system requires large datasets to be considered, but memory requirements grow and algorithms become slow as datasets grow. Mahout's scalable algorithms work better with huge datasets and improve system performance; Mahout is open source and can be used to solve problems arising with huge datasets. This paper proposes cloud-based K-means clustering running as a MapReduce job. We cluster healthcare data on the cloud and compare the results across various distance measures to conclude which measure best determines the number of vectors in a given cluster.

Sreekanth Rallapalli, R. R. Gondkar, Golajapu Venu Madhava Rao
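
The core of one K-means iteration in map/reduce form, as a minimal local simulation of what Mahout executes as a Hadoop job (the data, dimensionality, and Euclidean distance measure are placeholders; Mahout also supports other measures, which is what the paper compares):

```python
import numpy as np

def kmeans_map(points, centroids):
    """Map: emit (nearest-centroid-id, (point, 1)) for every point."""
    for p in points:
        cid = int(np.argmin([np.linalg.norm(p - c) for c in centroids]))
        yield cid, (p, 1)

def kmeans_reduce(pairs, k, dim=2):
    """Reduce: average the points assigned to each centroid id."""
    sums, counts = [np.zeros(dim) for _ in range(k)], [0] * k
    for cid, (p, n) in pairs:
        sums[cid] += p
        counts[cid] += n
    return [s / max(c, 1) for s, c in zip(sums, counts)]

data = np.random.default_rng(0).normal(size=(200, 2))
centroids = list(data[:3])
for _ in range(10):                 # one MapReduce job per iteration
    centroids = kmeans_reduce(kmeans_map(data, centroids), k=3)
print(np.round(centroids, 2))
```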
Symbolic Decision Tree for Interval Data—An Approach Towards Predictive Streaming and Rendering of 3D Models

3D content streaming and rendering systems have attracted significant attention from both academia and industry. However, these systems struggle to provide quality comparable to that of locally stored and rendered 3D data. Since the 3D content rendered on the client machine is controlled by the users, their interactions have a strong impact on the performance of the streaming and rendering system; considering user behaviour in these systems could therefore bring significant performance improvements. To achieve this, we propose a symbolic decision tree that captures all attributes that are part of user interactions. The symbolic decision trees are built by pre-processing the attribute values gathered as the user interacts with a dynamic 3D object. We validate the constructed symbolic tree through another set of interactions over the same dynamic 3D object by the same user. The validation shows that our symbolic decision tree model can learn the user interactions and predict several of them from a very limited set of summarized symbolic interval data, and thus could help optimize a 3D content streaming and rendering system for better performance.

V. Vani, S. Mohan
Predictive 3D Content Streaming Based on Decision Tree Classifier Approach

3D content streaming and rendering systems have attracted significant attention from both academia and industry. However, these systems struggle to provide quality comparable to that of locally stored and rendered 3D data. Since the 3D content rendered on the client machine is controlled by the users, their interactions have a strong impact on the performance of the streaming and rendering system; considering user behaviour in these systems could therefore bring significant performance improvements. To this end, we propose a decision tree that captures all parameters that are part of user interactions. The decision trees are built from information gathered while different sets of users interact with various types of 3D content, which may be a static or dynamic 3D object or scene. We validate our model through another set of interactions over the 3D content by the same set of users. The validation shows that our model can learn the user interactions and predict several of them, thus helping optimize these systems for better performance. We also propose various approaches, based on traces collected from the same or different users, to accelerate the learning process of the decision tree.

S. Mohan, V. Vani
Quantitative Characterization of Radiographic Weld Defect Based on Ground Truth Radiographs Made on Stainless Steel Plates

This paper presents a new approach for the quantification of radiographic defects, based on calculating the pixel size using the known image quality indicator present in the radiographic image. The method is first applied to ground-truth shapes whose sizes are known in advance, and is then validated on a defect (porosity), which is quantified accurately. The image processing techniques applied to the radiographic image are contrast enhancement, noise reduction and image segmentation. The image processing algorithms are validated using the image quality parameters Mean Square Error (MSE) and Peak Signal-to-Noise Ratio (PSNR).

P. Chitra
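
The two validation metrics named in the abstract are the standard ones; for an $M \times N$ reference image $I$ and processed image $K$:

```latex
\mathrm{MSE} = \frac{1}{MN}\sum_{i=1}^{M}\sum_{j=1}^{N}\bigl(I(i,j)-K(i,j)\bigr)^{2},
\qquad
\mathrm{PSNR} = 10\,\log_{10}\frac{\mathrm{MAX}_I^{2}}{\mathrm{MSE}}
```

where $\mathrm{MAX}_I$ is the maximum possible pixel value (255 for 8-bit images).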
Research and Topology of Shunt Active Filters for Quality of Power

The utilization of distorting loads has been increasing exponentially, causing power quality issues in electrical power systems. Transmitting pollution-free power is an important task for power engineers, since power quality issues can affect end-user equipment such as electronic goods and digital meters, resulting in spoiled products. To nullify power quality concerns, custom power devices play a determinant role in present-day power systems. This paper demonstrates research on, and the topology of, Shunt Active Power Filters (SAF) to improve the quality of power in sensitive industrial loads, electrical distribution networks, transmission, and power generation systems.

Lakshman Naik Popavath, K. Palanisamy, D. P. Kothari
Development of 3D High Definition Endoscope System

Recent advances in technology have paved the way for the 3D endoscope, which has propelled the progress of minimally invasive surgical methods. Conventional two-dimensional endoscopy-based Minimally Invasive Surgery (MIS) can be performed only by experienced surgeons, and the inability to perceive depth was the main cause of migration to the 3D endoscope. In this paper, a prototype of a stereo endoscopic system is presented. Problems pertaining to the stereo endoscope, such as ease of use, inhomogeneous illumination and severe lens distortion, are eliminated in the proposed system. Moreover, stereo calibration and rectification have been performed for 3D visualization, and a polarization technique is used for depth perception. The proposed system also provides surgeons with a real-time HD view.

Dhiraj, Zeba Khanam, Priyanka Soni, Jagdish Lal Raheja
Excel Solver for Deterministic Inventory Model: INSOLVER

An Excel solver is proposed for obtaining optimized results for deterministic inventory models. Seven inventory models, including price-break models, can be solved using this Excel solver, which is designed using a yes-no algorithm. INSOLVER has the benefit of solving even quantity-discount inventory models with different price breaks, which makes it unique.

Pratiksha Saxena, Bhavyanidhi Vats
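
The classical deterministic model underlying such solvers is the economic order quantity (EOQ); the sketch below shows EOQ with an all-units quantity discount check. The yes-no algorithm and the seven specific models are not detailed in the abstract, so this is only the textbook baseline.

```python
import math

def eoq(demand, order_cost, holding_cost):
    """Classical economic order quantity: Q* = sqrt(2DS/H)."""
    return math.sqrt(2 * demand * order_cost / holding_cost)

def best_price_break(demand, order_cost, holding_rate, breaks):
    """All-units discount: for each price break, order the EOQ (or the
    break quantity if the EOQ falls below it) and keep the cheapest
    total annual cost."""
    best = None
    for min_qty, unit_price in breaks:    # breaks: [(min_qty, price), ...]
        h = holding_rate * unit_price
        q = max(eoq(demand, order_cost, h), min_qty)
        total = demand * unit_price + demand * order_cost / q + h * q / 2
        if best is None or total < best[0]:
            best = (total, q, unit_price)
    return best  # (annual cost, order quantity, unit price)

print(best_price_break(1000, 50, 0.2, [(0, 5.0), (200, 4.8), (500, 4.6)]))
```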
DSL Approach for Development of Gaming Applications

This paper concentrates on introducing the DSL (Domain Specific Language) approach to developing gaming applications. The DSL approach hides the lower-level implementation in C, C#, C++ and Java, and provides a higher level of abstraction that is less error-prone and easier to develop with. The aim of this paper is to propose an approach for using GaML (Gamification Modelling Language, a form of DSL for gaming) efficiently for complex Unity-based games. The paper does not focus on the hows and whys of GaML usage, but rather on run-time enforcement. At the end of the paper, a survey of total lines of code and time invested in coding is made using a case study. The case study shows that the DSL approach of automated code generation is better than manual coding.

Aadheeshwar Vijayakumar, D. Abhishek, K. Chandrasekaran
Digital Forensic Architecture for Cloud Computing Systems: Methods of Evidence Identification, Segregation, Collection and Partial Analysis

The various advantages offered by the cloud computing business model have made it one of the most significant of current computing trends, such as the personal, mobile, ubiquitous, cluster, grid, and utility computing models. These same advantages have created complex issues for forensic investigators and practitioners conducting digital forensic investigations in cloud computing environments. In the past few years, many researchers have contributed to identifying forensic challenges and to designing forensic frameworks and data acquisition methods for cloud computing systems. However, to date there is no universally accepted forensic process model for acquiring and analyzing the data available in a cloud computing environment. This paper contributes in three specific areas to expedite research in this emerging field: first, a digital forensic architecture for cloud computing systems; second, evidence source identification, segregation and acquisition; and finally, methods for the partial analysis of evidence within and outside of a virtual machine (VM).

Digambar Povar, G. Geethakumari
Brushing—An Algorithm for Data Deduplication

Deduplication is mainly used to solve the problem of space and is known as a space-efficient technique. A two-step algorithm called 'brushing' is proposed in this paper for single-file deduplication. The main aim of the algorithm is to overcome the space problem while also taking care of time complexity, and it has extremely low RAM overhead. The first phase of the algorithm checks for similar entities and removes them, grouping only unique entities; in the second phase, while the unique file is hashed, the unique entities are represented as index values, reducing the size of the file to a great extent. Test results show that if a file contains 40-50 % duplicate data, this technique reduces the file to about two-thirds of its original size. The algorithm achieves a high deduplication throughput on the file system.

Prasun Dutta, Pratik Pattnaik, Rajesh Kumar Sahu
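
A compact sketch of the two-phase idea as described. What counts as an 'entity' is not specified in the abstract, so whitespace-separated words are used here purely for illustration.

```python
import hashlib

def brush(text):
    """Phase 1: collect unique entities. Phase 2: re-encode the file as
    index values into the unique-entity table and hash the unique file."""
    table, encoded = {}, []
    for entity in text.split():
        if entity not in table:          # phase 1: keep one copy only
            table[entity] = len(table)
        encoded.append(table[entity])    # phase 2: entity -> index value
    unique_file = " ".join(table)        # dicts preserve insertion order
    digest = hashlib.sha256(unique_file.encode()).hexdigest()
    return table, encoded, digest

table, encoded, digest = brush("to be or not to be")
print(len(table), encoded)               # 4 [0, 1, 2, 3, 0, 1]
```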
A New Approach for Integrating Social Data into Groups of Interest

Our daily life is connected to various large-scale public social network sites such as Google+, WhatsApp, Twitter and Facebook, and people increasingly turn to these services for sharing and publishing. Social network sites have therefore become a powerful source of content of interest, part of which may fall within the scope of interests of a given group. Yet no solution has been proposed for a group of interest to tap into social data. We therefore propose an approach for integrating social data into groups of interest. The method aggregates the social data of a group's members and extracts from these data the information relevant to the group's topics of interest. Moreover, it follows a user-centered design, allowing each member to personalize his/her sharing settings and interests within their respective groups. We describe in this paper the conceptual and technical components of the proposed approach.

Abdul Ahad, Suresh Babu Yalavarthi, Md. Ali Hussain
Estimation of Shape Parameter of the Radial Distribution of Cosmic Ray Shower Particles

The problem of treating the lateral distributions of secondary cosmic ray (CR) charged particles is central to ground-based extensive air shower (EAS) experiments. Analytical approaches to obtaining the spectra of EAS electrons (e±), muons (μ±) and hadrons ($h(\bar{h})$) in cascade theory suffer from restricted applicability due to the several assumptions or approximations adopted in the theory. Estimating the shape parameter of the radial distribution of shower particles from simulated data can bypass these restrictions and thereby improve the reliability of the method, even though it retains the usual dependence on the hadronic interactions implemented in the simulation. Various profile functions describe the radial distribution of EAS particles in terms of observables called shape, slope or age parameters, which determine how the number of shower particles, or the radial density, changes with atmospheric depth or core distance. A more judicious estimation of such observables is made in this work by defining the local age or segmented slope parameter (LAP or SSP). Using simulated/experimental data, the radial dependence of the LAP/SSP for e±, μ± and $h(\bar{h})$ particles is investigated, aiming at the measurement of the chemical composition of primary cosmic rays (PCRs).

Rajat K. Dey, Sandip Dam, Manabindu Das, Sabyasachi Roy
Network Security in Big Data: Tools and Techniques

With time, Big Data has become a core competitive factor for enterprises to develop and grow. Some enterprises, such as information-industry enterprises, will focus on technology or product innovation to solve the challenges of big data, i.e., capture, storage, analysis and application. Manufacturing, banking and other enterprises will also benefit from analyzing and managing big data, and will gain opportunities for management, strategy or marketing innovation. High-performance network capacity provides the backbone for high-end computing systems, which play a vital role in Big Data. Persistent and sophisticated targeted network attacks have challenged today's enterprise security teams. By exploring each aspect of high-performance network capacity, the major objective of this research contribution is to present the fundamental theoretical aspects in an analytical way, with a deep focus on the possibilities, impediments and challenges of network security in Big Data.

Pushpak Verma, Tej Bahadur Chandra, A. K. Dwivedi
Design of Ripple Carry Adder Using 2-Dimensional 2-Dot 1-Electron Quantum-Dot Cellular Automata

Quantum-Dot Cellular Automata (QCA) is an important name among the emerging technologies in the nanotechnology domain, as it overcomes serious technical limitations of CMOS. In this article, we first design a full adder using two-dimensional two-dot one-electron QCA cells. We then use this full adder to design a ripple carry adder. Finally, we discuss the energy and power required to drive the proposed architecture.

Kakali Datta, Debarka Mukhopadhyay, Paramartha Dutta
Advanced Congestion Control Techniques for MANET

A Mobile Ad hoc Network (MANET) is a wireless, infrastructure-less network in which nodes communicate with each other by establishing a temporary network. One of the most important issues in MANETs is congestion, which leads to degradation of network performance. TCP can be used to transmit real-time data reliably in a MANET, but the congestion control techniques used by TCP are inadequate for such networks because of node mobility and dynamic topology, and the routing protocols designed for MANETs do not handle congestion efficiently either. In this paper we propose a new TCP congestion control scheme called TCP-R for detecting congestion, and ADV-CC (Ad hoc Distance Vector with Congestion Control), a new dynamic routing algorithm to control congestion in MANETs. ADV-CC improves network performance over AODV thanks to its additional congestion-status attribute. The main strength of this paper is the performance results and analysis of TCP-R and ADV-CC.

M. D. Sirajuddin, Ch. Rupa, A. Prasad
A Method of Baseline Correction for Offline Handwritten Telugu Text Lines

In the field of offline handwritten character recognition, baseline correction is an important step at the preprocessing stage. In this paper, we propose a baseline correction method for offline handwritten Telugu text lines that deals with both baseline skew and fluctuating characters. The method is developed based on the main component of the character, and has been tested on various handwritten Telugu text lines written by different people. Experimental results show the effectiveness of the proposed method in correcting the baselines of offline handwritten Telugu text lines.

Ch. N. Manisha, E. Sreenivasa Reddy, Y. K. Sundara Krishna
A Technique for Prevention of Derailing and Collision of Trains in India

Railways are a widely used mode of transportation in India because they are affordable to every class of society, serve as a great way to cover large geographical distances in a short time, and contribute substantially to the Indian economy. However, unexpected delays and accidents make them less reliable. In the era of innovation and technology, India must implement a railway transport mechanism that is more reliable and attractive for investment: a system in which the loco pilot can easily view the railway track and assess the presence of another train on the same track. This paper proposes a technique that helps the loco pilot view the tracks and the coordinates of trains, which could avoid derailments, delays and collisions.

Shweta Shukla, Sneha Sonkar, Naveen Garg
Temporal Scintimetric Characterization of Skeletal Hotspots in Bone Scan by Dr. V. Siva’s Retention Ratio

A scintimetric characterization of the distribution of the radiotracer Tc-99m MDP in skeletal tissues such as the spine has been carried out for the early detection of metastatic spread in cases of prostatic carcinoma. Temporal scintimetry measures the counts in the region of interest in scans taken at two time intervals, 4 and 24 h. A 4/24 h retention ratio, Dr. V. Siva's retention ratio, derived using temporal scintimetry for the characterization of skeletal hotspots in benign and malignant skeletal conditions, is proposed and discussed.

V. Sivasubramaniyan, K. Venkataramaniah
Executing HIVE Queries on 3-Node Cluster and AWS Cluster—Comparative Analysis

A Cloud Database Management System (CDBMS) is one of the potential services provided by various cloud service providers. Cloud providers cope with different users, different data, and the processing or analysis of that data. Traditional database management systems are insufficient to handle such a variety of data, users and requirements; hence, at the conceptual layer of a CDBMS, traditional SQL, Oracle and many other database languages are insufficient to provide proper services. HIVE and Pig are languages suitable for the cloud environment that can handle such huge amounts of data. In this paper, the performance of a 3-node cluster and a cloud-based cluster provided by Amazon Web Services (AWS) is compared. We compare the processing of structured data using different queries in the HIVE tool on the 3-node cluster and the AWS cluster, and conclude that HIVE queries on the AWS cluster give better results than on the 3-node cluster.

Shweta Malhotra, Mohammad Najmud Doja, Bashir Alam, Mansaf Alam, Aishwarya Anand
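
A sketch of how such a timing comparison might be scripted with the PyHive client. The host names, table and query are hypothetical; the authors' actual queries are not given in the abstract.

```python
import time
from pyhive import hive    # pip install 'pyhive[hive]'

def run_query(host, hql):
    """Execute one HiveQL statement and return (rows, elapsed seconds)."""
    conn = hive.Connection(host=host, port=10000, username="hadoop")
    cur = conn.cursor()
    start = time.time()
    cur.execute(hql)
    rows = cur.fetchall()
    return rows, time.time() - start

HQL = "SELECT dept, COUNT(*) FROM employees GROUP BY dept"
for label, host in [("3-node cluster", "master.local"),
                    ("AWS cluster", "emr-master.example.com")]:
    _, secs = run_query(host, HQL)
    print(f"{label}: {secs:.1f}s")
```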
Performance of Speaker Independent Language Identification System Under Various Noise Environments

Language identification has gained significant importance in recent years, in both research and the commercial marketplace, demanding an improvement in the ability of machines to distinguish languages. Although methods like Gaussian Mixture Models, Hidden Markov Models and Neural Networks are used for identifying languages, the problem of language identification in noisy environments has not been addressed so far. This paper addresses the capability of an Automatic Language Identification (LID) system in clean and noisy environments. The language identification studies are performed using the IITKGP-MLILSC (IIT Kharagpur Multilingual Indian Language Speech Corpus) database, which consists of 27 languages.

Phani Kumar Polasi, K. Sri Rama Krishna
Multi-level Fusion of Palmprint and Dorsal Hand Vein

A novel multi-level fusion of palmprint and dorsal hand vein is developed in this work. First, feature-level fusion is performed on the left- and right-hand palmprints to obtain a feature-fused vector (FFV). Next, the scores of the FFV and the veins are calculated, and score-level fusion is performed to identify the person; hence, feature-level and score-level fusion techniques are used in a hybrid fashion. Feature fusion rules are proposed to control the dimension of the FFV. For palmprints, the IIT Delhi Palmprint Image Database version 1.0 is used, which was acquired using a completely touchless imaging setup, and feature-level fusion of both the left and right hand is applied. For dorsal hand veins, the Bosphorus Hand Vein Database is used because of the stability and uniqueness of hand vein patterns. The improved results verify the success of our multi-level fusion approach.

Gopal Chaudhary, Smriti Srivastava, Saurabh Bhardwaj
A Comparative Study on Multi-view Discriminant Analysis and Source Domain Dictionary Based Face Recognition

Human face images captured in real-world scenarios using surveillance cameras do not always contain a single view; they usually contain multiple views, and recognizing multi-view faces is still a challenging task. Multi-view Discriminant Analysis (MDA) and the Source Domain Dictionary (SSD) are two techniques which we develop and analyze in this paper to recognize faces across views. In MDA, the faces collected from various views are projected to a common discriminant space using view-specific transforms. SSD, on the other hand, is based on sparse representation, which efficiently builds a dictionary model of the source data and represents each class of data discriminatively. Both techniques are validated on the CMU Multi-PIE face database, which contains 337 people recorded under 15 different view positions and 19 different illumination conditions.

Steven Lawrence Fernandes, G. Josemin Bala
Mining Maximal Efficient Closed Itemsets Without Any Redundancy

Mining relevant itemsets from various information repositories is an essential task in knowledge discovery from data; it identifies itemsets with high interestingness measures (support and confidence). Given the amount of data available over the Internet, mining may return a huge number of itemsets to the user, which degrades performance and increases time complexity. This paper proposes a framework called Analyzing All Maximal Efficient Itemsets to provide a condensed and lossless representation of data in the form of association rules. We propose two algorithms, Apriori-MC (Apriori with Maximal Closed itemsets) and AAEMIs (Analyzing All Efficient Maximal Itemsets), which delete non-closed itemsets. The proposed AAEMIs method regains the complete set of relevant itemsets from a group of efficient Maximal Closed Itemsets (MCIs) without a user-specified constraint, and overcomes redundancy.

L. Greeshma, G. Pradeepini
An Interactive Freehand ROI Tool for Thyroid Uptake Studies Using Gamma Camera

Thyroid uptake is a procedure that requires an injection of a radiotracer/radio-isotope into the patient's bloodstream. After injecting 2 millicuries of Technetium-99m pertechnetate, thyroid images are acquired with a special-purpose camera called a gamma camera. A thyroid uptake study provides both functional and structural information and is used for the diagnosis of various thyroid disorders. Thyroid uptake is calculated from counts, i.e., the total of the intensity values present in the selected region of interest (ROI). A LEAP (Low Energy All Purpose) collimator, which can handle only photons of lower energies, is used in the gamma camera; Technetium-99m pertechnetate has an emission energy of 140 keV. A thyroid uptake study using a gamma camera has to be calibrated at each organization. In our super-specialty hospital, an uptake value greater than 2.5 % is considered hyperthyroidism, a value between 0.5 and 2.5 % normal, and a value less than 0.5 % hypothyroidism. An interactive freehand ROI tool was developed in Matlab R2013a as an alternative to the software existing in the Department of Nuclear Medicine, SSSIHMS. The tool also throws light on understanding the image data and on calculating the Glomerular Filtration Rate (GFR), which is computed using Gates' formula. The tracer uptake is obtained from the left and right thyroid lobes by manually drawing a separate ROI for each. The developed tool was tested on 30 real-time thyroid cases with suspected thyroid disorders, and the uptake values obtained were compared with the values from the existing software at SSSIHMS.

Palla Sri Harsha, A. Shiva, Kumar T. Rajamani, Siva Subramanyam, Siva Sankar Sai
Literature Survey on Intrusion Detection Systems in MANETs

Mobile ad hoc networks are wireless networks consisting of mobile nodes with no fixed boundary; nodes are free to move, and the network is dynamic. The unique features of these networks serve as benefits as well as drawbacks, and give chances to attackers. Intrusion occurs when a malicious node tries to enter the network and misuse its resources. Several attacks and intrusion detection techniques are discussed in this paper.

Pooja Kundu, Neeti Kashyap, Neha Yadav
Application of the Sum of Difference Method to Predict Students' Placement Using Educational Data Mining

The purpose of higher education organizations is to offer superior education to their students, and the ability to forecast students' achievement is valuable in ways that affect the whole education system of an organization. The scores students obtain in exams can be used to build training sets for supervised learning algorithms. With academic attributes of students such as internal marks, lab marks, age, etc., their performance can be easily predicted. Once predicted results are available, improvements in student performance can be pursued by providing the desirable assistance to students. Educational Data Mining (EDM) offers such information to educational organizations from educational data and provides various methods for predicting students' performance, which improve students' future results. In this paper, EDM is used to predict the placement of final-year students using attributes such as academic records, age and achievements. As a result, higher education organizations can offer superior education to their students.

L. Ramanathan, Angelina Geetha, M. Khalid, P. Swarnalatha
Robust Color Image Multi-thresholding Using Between-Class Variance and Cuckoo Search Algorithm

Multi-level image thresholding is a well-known pre-processing procedure, commonly used in a variety of image-related domains. The segmentation process classifies the pixels of the image into various groups based on the threshold levels and intensity values. In this paper, colour image segmentation using the Cuckoo Search (CS) algorithm is proposed. The performance of the proposed technique is validated against Bacterial Foraging Optimization (BFO) and Particle Swarm Optimization (PSO). The qualitative and quantitative investigation is carried out using parameters such as CPU time and the between-class variance value, and image quality measures such as the Mean Structural Similarity Index (MSSIM), Normalized Absolute Error (NAE), Structural Content (SC) and PSNR. The robustness of the implemented segmentation procedure is also verified using an image dataset smeared with Gaussian Noise (GN) and Speckle Noise (SN). The study shows that CS-based multi-level segmentation offers better results than BFO and PSO.

Venkatesan Rajinikanth, N. Sri Madhava Raja, Suresh Chandra Satapathy
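
The objective that CS, BFO and PSO all maximize here is the between-class variance; a minimal sketch of that fitness function for one candidate threshold vector follows (a grayscale histogram is used for simplicity; a colour image would use per-channel histograms).

```python
import numpy as np

def between_class_variance(hist, thresholds):
    """Otsu-style objective for multi-level thresholding: the sum over
    classes of w_k * (mu_k - mu_total)^2, to be maximized."""
    p = hist / hist.sum()
    levels = np.arange(len(hist))
    mu_total = (p * levels).sum()
    bounds = [0, *sorted(thresholds), len(hist)]
    var = 0.0
    for lo, hi in zip(bounds[:-1], bounds[1:]):
        w = p[lo:hi].sum()
        if w > 0:
            mu = (p[lo:hi] * levels[lo:hi]).sum() / w
            var += w * (mu - mu_total) ** 2
    return var

hist = np.bincount(np.random.default_rng(0).integers(0, 256, 100_000),
                   minlength=256)
print(between_class_variance(hist, [85, 170]))  # fitness of one candidate
```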
Social Media: A Review

This paper is an insight into online social media, which is the most common and most widely used form of media nowadays. It throws light on the types of social media, the various types of users, and its major functional blocks.

Gauri Jain, Manisha Sharma
Bayesian-Fuzzy GIS Overlay to Construe Congestion Dynamics

Complex systems such as transportation are influenced by several factors that are subject to uncertainty, and therefore have non-linear characteristics. Consequently, a transition in congestion degree from one class to another can be abrupt and unpredictable. If the possibility of such a transition can be determined, adequate measures can be taken to make the transportation system more robust and sustainable. The present paper demonstrates the efficacy of a Bayesian-fuzzy GIS overlay in construing congestion dynamics on a test data set consisting of congestion indicators, such as the Average Speed (AS) and Congestion Index Value (CIV) of different routes in the test study area. The results succeed in representing the probable transitions in congestion degree from one class to another, with the respective Bayesian probabilities for the different classes.

Alok Bhushan Mukherjee, Akhouri Pramod Krishna, Nilanchal Patel
A Hybrid Clustering Technique to Improve Big Data Accessibility Based on Machine Learning Approaches

Big data refers to data larger or more complex than traditional data, and in many cases unstructured. Accessing a specific value in huge data that is not sorted or organized can be time-consuming and require heavy processing. With the growth of data, clustering has become a most important unsupervised approach for finding structure in data. In this paper, we demonstrate two approaches to cluster data with high accuracy; we then sort the data by implementing the merge sort algorithm and, finally, use binary search to find a data value within a specific range of the data. This research presents a highly efficient combined method for big data using genetic algorithms and k-means. After clustering with k-means, the total sum of Euclidean distances is 3.37233e+09 for 4 clusters; after the genetic algorithm, this number is reduced to 0.0300344 for the best fit. In the second and third stages we show that, after this implementation, a particular data item can be accessed much faster and more accurately than with older methods.

E. Omid Mahdi Ebadati, Mohammad Mortazavi Tabrizi
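
The access-path part of the pipeline is plain merge sort plus binary search; a minimal sketch with synthetic data follows (the clustering stage is omitted).

```python
def merge_sort(a):
    """Standard top-down merge sort."""
    if len(a) <= 1:
        return a
    mid = len(a) // 2
    left, right = merge_sort(a[:mid]), merge_sort(a[mid:])
    out, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            out.append(left[i]); i += 1
        else:
            out.append(right[j]); j += 1
    return out + left[i:] + right[j:]

def binary_search(a, x):
    """Return an index of x in sorted list a, or -1 if absent."""
    lo, hi = 0, len(a) - 1
    while lo <= hi:
        mid = (lo + hi) // 2
        if a[mid] == x:
            return mid
        if a[mid] < x:
            lo = mid + 1
        else:
            hi = mid - 1
    return -1

data = merge_sort([42, 7, 19, 3, 7, 88, 15])
print(data, binary_search(data, 19))  # [3, 7, 7, 15, 19, 42, 88] 4
```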
Efficient Iceberg Query Evaluation in Distributed Databases by Developing Deferred Strategies

With the rapid increase in distributed databases, fast retrieval of data from these databases plays an important role. The bitmap index is a well-known data structure for providing fast search over large collections, and iceberg queries are frequently used where a small output is required from large inputs. Recently, bitmap indices with the WAH compression technique have been utilized for efficient iceberg query evaluation; however, results for these techniques are available only for standalone databases. In this paper, we present effective iceberg query evaluation by developing a deferred strategy for distributed databases, which reduces the AND operations performed excessively in the existing algorithm. The proposed algorithm executes with both data shipping and query shipping. The experimental results are verified and compared with the existing algorithm.

Vuppu Shankar, C. V. Guru Rao
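
The bitmap mechanics behind iceberg evaluation, in miniature: Python integers stand in for compressed WAH bitmaps, and the pre-pruning step hints at why avoiding unnecessary ANDs matters (the paper's deferred strategy goes further than this sketch).

```python
from collections import defaultdict
from itertools import product

# Iceberg query: SELECT a, b, COUNT(*) FROM R GROUP BY a, b HAVING COUNT(*) >= T
rows = [("x", 1), ("x", 1), ("x", 2), ("y", 1), ("x", 1)]
T = 2

def bitmaps(values):
    """One bitmap (a Python int used as a bit vector) per distinct value."""
    bm = defaultdict(int)
    for i, v in enumerate(values):
        bm[v] |= 1 << i
    return bm

ba = bitmaps([r[0] for r in rows])
bb = bitmaps([r[1] for r in rows])

# Prune bitmaps that cannot reach the threshold on their own,
# so fewer AND operations are performed afterwards.
ba = {v: b for v, b in ba.items() if bin(b).count("1") >= T}
bb = {v: b for v, b in bb.items() if bin(b).count("1") >= T}

for (va, a), (vb, b) in product(ba.items(), bb.items()):
    cnt = bin(a & b).count("1")
    if cnt >= T:
        print(va, vb, cnt)   # -> x 1 3
```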
Unique Identifier System Using Aneka Platform

This paper describes the development of a console application named 'Unique Identifier System (UIS)', used to create a unique identification number (UID) for a person at the time of his/her birth. Instead of holding various identity proofs like a PAN card, driving license, Aadhaar card, voter ID card, ration card, etc., the person is recognized by this UID all over the world. The UID is stored on a cloud so that it can be accessed across the globe. For hosting the application on the cloud, we use the Aneka tool, a cloud application development platform (http://www.manjrasoft.com/manjrasoft_downloads.html), which makes all UIDs available globally.

Karishma Varshney, Rahul Johari, R. L. Ujjwal
A Novel Graphical Password Authentication Mechanism for Cloud Services

Passwords provide high security and confidentiality for data and prevent unauthorized access. The most popular authentication method is the alphanumeric password, a string of letters and digits. Due to various drawbacks of text-based passwords, graphical password authentication has been developed as an alternative, in which the password is based on a set of images. For users it is easier to remember images than text, and graphical passwords also provide more security than text-based ones. There are two techniques for graphical passwords: the recognition-based technique and the recall-based technique. To provide more security to users, this paper proposes a new idea combining the recognition-based and recall-based techniques.

M. Kameswara Rao, T. Usha Switha, S. Naveen
A Conceptual Framework for Big Data Implementation to Handle Large Volume of Complex Data

Globally, industries, businesses, people and governments produce and consume vast amounts of data on a daily basis, and it has become challenging for the IT world to deal with the variety and velocity of such large volumes of data. To overcome these bottlenecks, Big Data is taking a big role in catering for the data capturing, organizing and analyzing process in an innovative and faster way. Big Data software and services foster organizational growth by generating value and ideas out of voluminous, fast-moving and heterogeneous data, and by enabling a completely new and innovative Information Technology (IT) ecosystem that was not possible before. The ideas and values are derived from this IT ecosystem through advanced data analysis on top of IT servers, system architecture or networks, and the virtualization of physical objects. In this research paper, the authors present a conceptual framework for solving the problem of processing huge volumes of data using different Big Data technology stacks. The proposed model provides a solution through capturing data, organizing data, analyzing data, and finally producing value and decisions for the stakeholders concerned.

Manas Kumar Sanyal, Sajal Kanti Bhadra, Sudhangsu Das
Path Reliability in Automated Test Case Generation Process

In software testing, the reliability of a path is an important factor and must be calculated in order to go forward in the testing process. This paper extends earlier work (Cuckoo Search in Test Case Generation and Conforming Optimality using Firefly Algorithm, 2015) [1], in which test cases were generated using cuckoo search. The reliability is calculated with the help of control flow graphs and mathematical calculations for all the paths specified in the flow graph (Integrating Path Testing with Software Reliability Estimation Using Control Flow Graph, 2008) [2]. We consider various test cases that traverse different paths and compute the reliability of the paths accordingly.

Kavita Choudhary, Payal Rani, Shilpa
A Study on Wii Remote Application as Tangible User Interface in Elementary Classroom Teaching

Elementary classroom teaching has always been a challenge, since the cognitive understanding of children (4-11 years of age) is very different from that of a mature person. Research and studies on children's behavior and problem-solving ability (Froebel, The Pedagogics of the Kindergarten, 2001; Montessori, The Montessori Method: Scientific Pedagogy as Applied to Child Education in "The Children's Houses", 1992) [1, 2] reveal that learning with actual physical objects produces better results than abstract representations. In this study, we explore the benefits of TUIs (Tangible User Interfaces) in children's thinking and learning process using the Wii Remote. By providing both visual and physical representations, TUIs help reduce the cognitive load of thinking among children and increase their learning capabilities. The Nintendo Wii Remote, a low-cost and effective tool, is rapidly finding a place in the field of learning technologies. The analysis conducted in this paper shows that a learning environment for children aided by the Wii Remote can significantly affect the learning outcome.

Mitali Sinha, Suman Deb, Sonia Nandi
Enhanced Understanding of Education Content Using 3D Depth Vision

A perspective representation of three-dimensional physical objects and printed scientific content makes educational material easily understandable by the learner, yet creating abstract content on any subject for teaching purposes is a hard task for teachers. For an enhanced teaching and learning experience, a very popular gaming sensor named Kinect is therefore tried in a different way to help students understand educational content easily and with interest. In this paper, we attempt to visualize a perspective view of the two-dimensional content printed in books in a real-life scenario, generating educational content as engaging as gaming content for better understanding, encouraging students and helping them grasp knowledge efficiently.

Sonia Nandi, Suman Deb, Mitali Sinha
AHP-Based Ranking of Cloud-Service Providers

The paper presents an approach to rank various cloud service providers (CSPs) on the basis of four parameters, namely Performance, Availability, Scalability and Accuracy, using the Analytic Hierarchy Process (AHP). The CSPs were monitored by the cloud storage company Nasuni, which conducted tests to evaluate the CSPs on the four parameters mentioned above; this paper makes use of the data provided by Nasuni to propose a method for ranking the CSPs. AHP has been selected for this purpose because it uses the foundations of mathematics and psychology to enable one to make complicated decisions. Instead of recommending a single correct decision, AHP provides decision makers with the opportunity to select the option most befitting their goals.

Rajanpreet Kaur Chahal, Sarbjeet Singh
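
The AHP machinery used for such a ranking can be sketched in a few lines: build a pairwise comparison matrix on Saaty's 1-9 scale, take its principal eigenvector as the weight vector, and check consistency. The matrix values below are made up for illustration, not Nasuni's data.

```python
import numpy as np

# Hypothetical pairwise comparisons of Performance, Availability,
# Scalability, Accuracy (A[i][j] = importance of criterion i over j).
A = np.array([[1,   3,   5,   3],
              [1/3, 1,   3,   1],
              [1/5, 1/3, 1,   1/3],
              [1/3, 1,   3,   1]])

eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)
w = np.abs(eigvecs[:, k].real)
w /= w.sum()                              # AHP priority weights

n = len(A)
CI = (eigvals[k].real - n) / (n - 1)      # consistency index
CR = CI / 0.90                            # random index RI = 0.90 for n = 4
print(np.round(w, 3), f"CR = {CR:.3f}")   # CR < 0.1 is conventionally acceptable
```

The same weight vector, multiplied by each CSP's normalized per-criterion scores, yields the final ranking.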
Hardware Prototyping for Enhanced Resilient Onboard Navigation Unit

Dependability of safety-critical systems through the fault-tolerant design approach uses redundant architectures to tolerate hardware faults. The Navigation, Guidance and Control units of the onboard computers in Indian satellite launch vehicles rely on hot-standby dual redundancy. Effective use of computational resources is desirable in such applications, where weight, size, power and volume are critical. Resource augmentation based on task criticality can achieve an increased slack margin, which in turn is used for handling software and transient faults and for improving system performance. In this paper, the design and development of a hardware prototype with fault injection on an LPC1768 ARM processor is presented, for validating and testing the fault-tolerant, resource-augmented scheduling of onboard computers. The resource-augmented system, with its added flexibility, has been evaluated for improved performance and superior management of faults. The system provides a better slack margin and resource utilization, which allows an increased number of transient and software faults to be tolerated.

Archana Sreekumar, V. Radhamani Pillay
Web Data Analysis Using Negative Association Rule Mining

Today's era combines information and communication technology (ICT): everyone wants to share and store their information through the internet, so a huge amount of data is searched every day and a large quantity of web data is collected every second. With the help of web usage mining, useful patterns can be discovered from web databases. Analyzing this huge amount of web data is a useful part of web site management, in which useful patterns and information are discovered in the web database. Here we use the concept of negative association rule mining to analyze web log files and to find strong associations among the web data.

Raghvendra Kumar, Prasant Kumar Pattnaik, Yogesh Sharma
Cloud Based Thermal Management System Design and Its Analysis

We propose an advanced architecture that senses the real-time temperature of a particular location and transmits the data to a cloud database. Current data are analysed on the basis of previously recorded data; if any abnormal data is observed, the system produces an alarm message to the concerned authorities. The analytical data guide users in solving real-time problems by observing anomalies in the system.

Namrata Das, Anirban Kundu
ICC Cricket World Cup Prediction Model

The paper aims to predict the winner of the Cricket World Cup by taking into consideration the various factors which play an important role in deciding the final outcome of a game. Of the several factors, five have been taken into consideration: whether the team wins the toss, whether the team bats first, whether the match is a day match or a day/night match, whether the team is playing at its home ground or away from home, and at what round of the tournament the match is played. This paper uses the Analytic Hierarchy Process (AHP) to compare the different parameters and come to the final conclusion.

Avisek Das, Ashish Ranjan Parida, Praveen Ranjan Srivastava
Towards Distributed Solution to the State Explosion Problem

In the life cycle of any software system, a crucial phase, formalization and validation through verification or testing, involves identifying the errors introduced during its design. This is achieved through verification by model checking. A model checking algorithm is based on two steps: the construction of the state space of the system specification, and the verification of this state space. However, these steps are limited by the state explosion problem, which occurs when models are large. In this paper, we propose a solution to this problem that improves performance in execution time and memory space by performing the exploration of the state space on a distributed architecture consisting of several machines.

Lamia Allal, Ghalem Belalem, Philippe Dhaussy
Real Time Bus Monitoring System

The real-time bus monitoring system is designed to serve as a tracking system for frequent local bus travelers, using a GPS (Global Positioning System) device and the GPRS (General Packet Radio Service) system. This paper focuses on a system that helps passengers locate the current position of buses and the expected arrival time of the buses at their nearest bus stop. The location and ETA (Estimated Time of Arrival) are shown in a mobile app and can also be received through SMS (Short Message Service). The location can also be tracked by the network administrator through a web application which keeps the complete location history of the buses.

Jay Sarraf, Ishaani Priyadarshini, Prasant Kumar Pattnaik
High Performance DFT Architectures Using Winograd Fast Fourier Transform Algorithm

This paper presents area- and latency-aware designs of Discrete Fourier Transform (DFT) architectures using the Winograd Fast Fourier Transform algorithm (WFFT), one of the fast Fourier algorithms that computes prime-sized DFTs. The main components of DFT architectures are adders and multipliers. This paper presents DFT architectures using WFFT with a carry look-ahead adder and an add/shift multiplier, and also with semi-complex multipliers. Different prime-size DFTs are calculated using the polynomial-based WFFT as well as the conventional algorithm, with area and latency measured in the Xilinx synthesizer. The polynomial WFFT involves the Chinese Remainder Theorem, whose complexity increases at higher orders. This paper mainly focuses on prime-size 5-point and 7-point WFFT architectures, implemented in Verilog and simulated using Xilinx ISE 13.1. Each sub-module is designed using dataflow style, and the top-level integration is done using structural modeling. DFT architectures have a wide range of applications in various domains, including the Digital Terrestrial/Television Multimedia Broadcasting standard.

Shubhangi Rathkanthiwar, Sandeep Kakde, Rajesh Thakare, Rahul Kamdi, Shailesh Kamble
Dynamic Voltage Restorer Based on Neural Network and Particle Swarm Optimization for Voltage Mitigation

Dynamic Voltage Restorer (DVR) is one of the most widely implemented power devices used to mitigate voltage unbalance in the grid. The performance of a DVR largely depends upon its control strategy, in which the controller plays an important part. The literature has shown that the commonly used proportional integral (PI) and neural network (NN) controllers have many inherent disadvantages, including high total harmonic distortion (THD) and high delay time. In this paper, we replace the PI controller with a neural controller whose weights are trained using Particle Swarm Optimization (PSO). A comparative analysis of DVR performance is done in the MATLAB SIMULINK environment for three controllers (PI, NN with back propagation, and NN with PSO) under 30 and 80 % voltage sag, 40 % voltage swell, and unbalanced voltage (1-ɸ). The results show that the hybrid neural controller with PSO has the least distortion and is the most robust of the three.
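
A minimal sketch of the PSO-trains-the-weights idea, with an invented toy network and reference signal standing in for the DVR controller; the paper's actual network topology and fitness function are not given in the abstract.

import numpy as np

def pso(loss, dim, n_particles=30, iters=200, w=0.7, c1=1.5, c2=1.5):
    # Standard global-best PSO over a flat weight vector.
    x = np.random.uniform(-1, 1, (n_particles, dim))
    v = np.zeros_like(x)
    pbest, pbest_f = x.copy(), np.array([loss(p) for p in x])
    g = pbest[pbest_f.argmin()].copy()
    for _ in range(iters):
        r1, r2 = np.random.rand(*x.shape), np.random.rand(*x.shape)
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (g - x)
        x = x + v
        f = np.array([loss(p) for p in x])
        better = f < pbest_f
        pbest[better], pbest_f[better] = x[better], f[better]
        g = pbest[pbest_f.argmin()].copy()
    return g

# Toy controller: a one-hidden-layer net tracking a reference signal.
t = np.linspace(0, 1, 50); ref = np.sin(2 * np.pi * t)
def loss(wv):
    w1, w2 = wv[:10].reshape(1, 10), wv[10:20].reshape(10, 1)
    out = np.tanh(t[:, None] @ w1) @ w2
    return float(np.mean((out.ravel() - ref) ** 2))
w_opt = pso(loss, dim=20)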

Monika Gupta, Aditya Sindhu
Analysis of Norms in Adaptive Algorithm on Application of System Identification

System identification is an important area of signal processing research. It aims to retrieve a system's unknown specification from its output alone. The technique has a wide variety of applications in engineering and control, industry, and medicine. Typically, the identified models are expressed as mathematical equations; linear, non-linear, non-parametric, and hybrid models are among the deciding factors on which different system identification techniques rely. In this paper, we discuss the LMS and NLMS algorithms in detail. In particular, various types of norms are incorporated into the LMS algorithm, and the NLMS algorithm is modified according to these norms. Considering different norms in the LMS algorithm, we analyze the system identification application and verify it for both linear and non-linear models. Finally, a Wilcoxon-norm-based approach to non-linear system identification is proposed. The results and comparison show that the Wilcoxon norm performs better than the other norms when applied to system identification, demonstrating its efficacy.
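
For concreteness, here are the standard L2-norm LMS and NLMS update rules applied to an FIR system identification toy problem; the filter length, step sizes, and plant are illustrative assumptions, not the paper's settings.

import numpy as np

def lms(x, d, taps=4, mu=0.01):
    # Classic LMS: w <- w + mu * e * u
    w = np.zeros(taps)
    for n in range(taps - 1, len(x)):
        u = x[n - taps + 1:n + 1][::-1]
        e = d[n] - w @ u
        w += mu * e * u
    return w

def nlms(x, d, taps=4, mu=0.5, eps=1e-6):
    # NLMS: step size normalised by the input power ||u||^2.
    w = np.zeros(taps)
    for n in range(taps - 1, len(x)):
        u = x[n - taps + 1:n + 1][::-1]
        e = d[n] - w @ u
        w += (mu / (eps + u @ u)) * e * u
    return w

# Identify an unknown FIR "plant" from noisy input/output data.
h = np.array([0.8, -0.4, 0.2, 0.1])
x = np.random.randn(5000)
d = np.convolve(x, h)[:len(x)] + 0.01 * np.random.randn(len(x))
print(np.round(nlms(x, d), 2))      # converges close to h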

Sarthak Panda, Mihir Narayan Mohanty
SKT: A New Approach for Secure Key Transmission Using MGPISXFS

Cryptography is used to secure communication between two parties, whether they are two persons in the same building or in different organizations across the world. Cryptography is said to be fully secure, yet attackers continually try to break it, and one weak point is key transmission, the most sensitive transmission in the field of cryptography. Many techniques have been proposed over time to secure key transmission; some have shown good results to an extent but are not fully secure. In this paper we propose a key transmission technique to enhance security, confidentiality, and integrity. We describe a detailed algorithm for performing key transmission and compare the proposed algorithm with existing algorithms.
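
The MGPISXFS construction is not reproduced here; as a generic baseline for the key transmission problem the paper addresses, the sketch below shows textbook Diffie-Hellman key agreement with toy-sized parameters.

import secrets

# Textbook-sized parameters for illustration only; real systems use
# standardized 2048-bit groups (e.g. RFC 3526).
p, g = 23, 5

a = secrets.randbelow(p - 2) + 1        # Alice's private key
b = secrets.randbelow(p - 2) + 1        # Bob's private key
A, B = pow(g, a, p), pow(g, b, p)       # public values sent over the wire

# Both sides derive the same shared key without ever transmitting it.
assert pow(B, a, p) == pow(A, b, p)
print("shared key:", pow(B, a, p))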

Kamal Kumar Gola, Vaibhav Sharma, Rahul Rathore
A Comparative Study of Different Approaches for the Speaker Recognition

A comparative study of different approaches to speaker recognition is presented in this paper. In this study, the speaker's speech signal is normalized; the normalized signal is then windowed with different functions, tested against several parameters, and compared with the original signal. The window functions tested are the Hamming, Gaussian, Blackman-Harris, Bartlett-Hanning, Chebyshev, Kaiser, Hann, and Parzen functions; all of these functions are tested and compared in this paper.
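
All eight window functions named above are available in SciPy; a minimal sketch, assuming an amplitude-normalized speech frame, of applying each one.

import numpy as np
from scipy.signal import windows

N = 256
wins = {
    "hamming":        windows.hamming(N),
    "gaussian":       windows.gaussian(N, std=N / 8),
    "blackmanharris": windows.blackmanharris(N),
    "barthann":       windows.barthann(N),     # Bartlett-Hanning
    "chebyshev":      windows.chebwin(N, at=100),
    "kaiser":         windows.kaiser(N, beta=8.6),
    "hann":           windows.hann(N),
    "parzen":         windows.parzen(N),
}

speech = np.random.randn(N)             # stand-in for a speech frame
speech /= np.max(np.abs(speech))        # amplitude normalisation
framed = {name: speech * w for name, w in wins.items()}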

Kanaka Durga Returi, Vaka Murali Mohan, Praveen Kumar Lagisetty
A Time Efficient Leaf Rust Disease Detection Technique of Wheat Leaf Images Using Pearson Correlation Coefficient and Rough Fuzzy C-Means

In the agricultural sector, diagnosis of crop disease is an important issue, since it has a marked influence on a nation's agricultural production. It is essential to diagnose a disease at an early stage in order to control it and reduce crop losses. This paper presents a time-efficient technique to detect the presence of leaf rust disease in wheat leaves using image processing, rough sets, and fuzzy c-means. The proposed technique is tested on one hundred standard diseased and non-diseased wheat leaf images and achieves 95 % and 94 % success rates, respectively, using the three most dominant features and the single most dominant feature, Ratio of Infected Leaf Area (RILA). These dominant features are selected out of ten features by the Pearson correlation coefficient. A significant point of the proposed method is that all features are converted into size-invariant features.
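
A minimal sketch of the feature ranking step, assuming features are columns of a matrix and labels are binary; the data here are random placeholders, not the paper's wheat leaf features.

import numpy as np

def pearson_rank(features, labels):
    # |r| between each feature column and the class label; the
    # highest-ranked columns are the "dominant" features.
    r = [abs(np.corrcoef(features[:, j], labels)[0, 1])
         for j in range(features.shape[1])]
    return np.argsort(r)[::-1]

# 100 leaf images, 10 candidate features each (e.g. RILA, colour moments).
X = np.random.rand(100, 10)
y = np.random.randint(0, 2, 100)        # 1 = diseased, 0 = healthy
print("features by dominance:", pearson_rank(X, y)[:3])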

Dhiman Mondal, Dipak Kumar Kole
Task Scheduling Algorithms with Multiple Factor in Cloud Computing Environment

Optimized task scheduling concepts can meet user requirements efficiently by using priorities. Increasing resource utilization and reducing cost are both factors that must be balanced in task scheduling algorithms for cloud computation when executing many tasks. As the technology evolves, cloud computing offers many new features such as fault tolerance, high resource utilization, expandability, flexibility, reduced overhead for users, reduced cost, and on-demand services. This paper discusses task scheduling algorithms based on priorities for virtual machines and tasks. The algorithm balances the load well, but is not effective in terms of cost. A comparative study between various scheduling algorithms has also been carried out using the CloudSim simulator.
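
A minimal sketch of priority-based scheduling onto VMs, using a min-heap keyed on each VM's next-available time; the task and VM attributes loosely mirror CloudSim's cloudlet-length/MIPS style but are invented for the example.

import heapq

def schedule(tasks, vms):
    # Highest-priority task first, onto the VM that frees up earliest.
    ready = sorted(tasks, key=lambda t: -t["priority"])
    vm_free = [(0.0, vm["id"]) for vm in vms]
    heapq.heapify(vm_free)
    plan = []
    for t in ready:
        free_at, vm_id = heapq.heappop(vm_free)
        mips = next(v["mips"] for v in vms if v["id"] == vm_id)
        finish = free_at + t["length"] / mips
        plan.append((t["id"], vm_id, finish))
        heapq.heappush(vm_free, (finish, vm_id))
    return plan

vms = [{"id": 0, "mips": 500}, {"id": 1, "mips": 1000}]
tasks = [{"id": i, "length": 4000, "priority": i % 3} for i in range(6)]
print(schedule(tasks, vms))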

Nidhi Bansal, Amit Awasthi, Shruti Bansal
The State of the Art in Software Reliability Prediction: Software Metrics and Fuzzy Logic Perspective

Every day a large volume of software is developed by industry to fulfill customer and user requirements. This has certainly increased the available facilities, but it has also increased the probability of errors, faults, and failures, as well as system complexity, which in turn reduces the understandability of the software and makes it more error-prone, highly complex, and less reliable. As reliability is a critical issue in software-based systems, its prediction is of great importance. In this paper, the state of the art in software reliability prediction is presented from two perspectives: software metrics and fuzzy logic. The overall idea of the paper is to present, analyze, investigate, and discuss the various approaches and reliability prediction models that are based on reliability-relevant metrics, fuzzy logic, or both. At the end, the paper presents a list of critical findings identified during the literature review of the various prediction models.
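
As one concrete instance of the fuzzy logic perspective the survey covers, a tiny Sugeno-style predictor mapping two reliability-relevant metrics to a reliability score; the membership functions and rule outputs are invented for illustration.

import numpy as np

def tri(x, a, b, c):
    # Triangular membership function with support [a, c] and peak b.
    return np.maximum(np.minimum((x - a) / (b - a), (c - x) / (c - b)), 0)

def predict_reliability(complexity, fault_density):
    # Two metric inputs with LOW/HIGH fuzzy sets, defuzzified by a
    # weighted average of the rule outputs.
    low_c, high_c = tri(complexity, 0, 0.2, 0.6), tri(complexity, 0.4, 0.8, 1.0)
    low_f, high_f = tri(fault_density, 0, 0.2, 0.6), tri(fault_density, 0.4, 0.8, 1.0)
    # Rules: LOW metrics -> reliability 0.9, HIGH metrics -> reliability 0.3.
    w_good = min(low_c, low_f)
    w_bad = max(high_c, high_f)
    return (0.9 * w_good + 0.3 * w_bad) / (w_good + w_bad + 1e-9)

print(round(predict_reliability(0.3, 0.25), 2))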

S. W. A. Rizvi, V. K. Singh, R. A. Khan
An Indexed Approach for Multiple Data Storage in Cloud

A cloud-based data storage technique is proposed in this research. The technique stores multiple data items within a particular memory location using indexing, so that searching and storing data consume less time. Data analysis, data transmission, and data acquisition in the cloud are introduced, and dynamic memory space allocation in the cloud is demonstrated. The paper introduces data searching and indexing techniques; time and space analyses are presented graphically, and the hit ratio in a real-time scenario is demonstrated in our work. The proposed cloud-based multiple data storage technique reduces memory access time, and comparisons are shown to illustrate the time difference.
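
A minimal sketch of the indexing idea, assuming an in-memory index mapping keys to (slot, offset) pairs and a simple hit-ratio counter; the paper's actual data structures are not specified in the abstract.

class IndexedStore:
    # Multiple records share one physical slot; the index maps a key
    # to (slot, offset), so lookups avoid scanning memory.
    def __init__(self):
        self.slots, self.index = [], {}
        self.hits = self.lookups = 0

    def put(self, key, value, slot=None):
        if slot is None:
            self.slots.append([])
            slot = len(self.slots) - 1
        self.slots[slot].append(value)
        self.index[key] = (slot, len(self.slots[slot]) - 1)
        return slot

    def get(self, key):
        self.lookups += 1
        if key in self.index:
            self.hits += 1
            s, o = self.index[key]
            return self.slots[s][o]
        return None

store = IndexedStore()
s = store.put("a", 1); store.put("b", 2, slot=s)   # two records, one slot
store.get("a"); store.get("missing")
print("hit ratio:", store.hits / store.lookups)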

Saswati Sarkar, Anirban Kundu
Design New Biorthogonal Wavelet Filter for Extraction of Blood Vessels and Calculate the Statistical Features

The World Health Organization estimated that in 2012 about 347 million people worldwide had diabetes, with more than 80 % of diabetes deaths occurring in low- and middle-income countries, and it projects that diabetes will be the 7th leading cause of death in 2030. Diabetic retinopathy is caused by leakage of blood or fluid from the retinal blood vessels, which damages the retina. For extraction of retinal blood vessels we have designed a new wavelet filter; the proposed filter gives better extraction results than existing wavelet filters. The proposed algorithm extracts retinal blood vessel features such as area, diameter, length, thickness, mean, tortuosity, and bifurcations. It is tested on 1191 fundus images and achieves a sensitivity of 98 %, specificity of 92 %, and accuracy of 95 %.
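
The authors' custom filter is not reproduced here; the sketch below substitutes a standard biorthogonal wavelet from PyWavelets to show the decomposition-and-threshold pattern on a placeholder image.

import numpy as np
import pywt

fundus_green = np.random.rand(256, 256)     # stand-in for a fundus green channel
cA, (cH, cV, cD) = pywt.dwt2(fundus_green, "bior2.2")

# Vessel-like structures show up in the detail sub-bands; a simple
# threshold gives a crude vessel map to compute features from.
detail = np.abs(cH) + np.abs(cV) + np.abs(cD)
vessels = detail > detail.mean() + 2 * detail.std()
print("vessel area (pixels):", int(vessels.sum()))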

Yogesh M. Rajput, Ramesh R. Manza, Rathod D. Deepali, Manjiri B. Patwari, Manoj Saswade, Neha Deshpande
Demand Side Management Using Bacterial Foraging Optimization Algorithm

Demand side management (DSM) is one of the most significant functions of the smart grid; it gives customers the opportunity to make suitable decisions related to their energy consumption, which assists energy suppliers in reducing the peak load demand and reshaping the load profile. Existing demand side management strategies not only use specific techniques and algorithms but are also restricted to a small range of controllable loads. The proposed demand side management strategy uses a load shifting technique to handle a large number of loads, and the bacterial foraging optimization algorithm (BFOA) is implemented to solve the resulting minimization problem. Simulations were performed on a smart grid comprising different types of loads in residential, commercial, and industrial areas. The simulation results show that the proposed strategy attains substantial savings and reduces the peak load demand of the smart grid.
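
A minimal sketch of the load shifting objective, with a greedy one-unit shift standing in for the BFOA search; the hourly load profile and target curve are invented.

import numpy as np

def load_shift_cost(schedule, objective):
    # DSM load-shifting objective: squared gap between the scheduled
    # hourly load curve and the utility's objective curve.
    return float(np.sum((schedule - objective) ** 2))

hours = 24
objective = np.full(hours, 100.0)                 # flat target profile
load = np.full(hours, 80.0); load[18:22] = 180.0  # evening peak

# Greedy shift: move one unit from the peak hour to the valley hour
# while it reduces the objective (BFOA searches this space globally).
while True:
    peak, valley = load.argmax(), load.argmin()
    trial = load.copy(); trial[peak] -= 1; trial[valley] += 1
    if load_shift_cost(trial, objective) >= load_shift_cost(load, objective):
        break
    load = trial
print("peak after shifting:", load.max())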

B. Priya Esther, K. Shivarama Krishna, K. Sathish Kumar, K. Ravi
An Optimized Cluster Based Routing Technique in VANET for Next Generation Network

Over the last few years, research in the field of vehicular networking has gained much attention and popularity in industry and academia, and devising an intelligent approach to this technology remains a challenge. In this paper, we attempt to optimize the routing algorithm for vehicular ad hoc networks (VANETs). Ant Colony Optimization (ACO) is an optimization technique, applied here on top of a clustering technique, to improve safety and efficiency and to develop an intelligent transport system over wireless technology. A VANET is a special type of MANET: even though MANET protocols are feasible, they cannot provide the optimum throughput required by a fast-changing vehicular ad hoc network. The positions of the vehicles define zones, and the optimization is zone-based; the ant colony algorithm is combined with a zone-based clustering algorithm to improve the results, combining the advantages of both techniques. Routing overhead is compared between the AODV, MARDYMO, and TACR protocols and depicted in graphical plots.
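
A minimal sketch of the ACO primitives involved (probabilistic next-hop choice and pheromone evaporation/deposit); the neighbor names and heuristic values are placeholders, not the TACR protocol itself.

import random

def aco_next_hop(neighbors, pheromone, eta, alpha=1.0, beta=2.0):
    # Probabilistic hop selection: tau^alpha * eta^beta, where eta is a
    # heuristic such as inverse distance to the destination zone.
    weights = [pheromone[n] ** alpha * eta[n] ** beta for n in neighbors]
    return random.choices(neighbors, weights=weights)[0]

def deposit(pheromone, route, q=1.0, rho=0.1):
    # Evaporate everywhere, then reinforce the links of a delivered route.
    for n in pheromone:
        pheromone[n] *= (1 - rho)
    for n in route:
        pheromone[n] += q / len(route)

pheromone = {"v1": 1.0, "v2": 1.0, "v3": 1.0}
eta = {"v1": 0.5, "v2": 0.9, "v3": 0.2}      # e.g. closeness to destination
hop = aco_next_hop(list(pheromone), pheromone, eta)
deposit(pheromone, [hop])
print(hop, pheromone)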

Arundhati Sahoo, Sanjit K. Swain, Binod K. Pattanayak, Mihir N. Mohanty
AgroKanti: Location-Aware Decision Support System for Forecasting of Pests and Diseases in Grapes

Grape is an important crop in Indian agriculture, and many pests occurring on grapes cause huge yield losses to farmers. Grape development is driven mainly by temperature, and many pests are directly related to temperature. We propose a decision support system named AgroKanti for managing pests on table grapes, such as powdery mildew and anthracnose. The decision support system is location-aware, i.e., the farmer is provided with details of pests considering the current weather conditions at the farmer's location. We support farmers with pest details such as symptoms and management techniques, and we provide the system as a mobile phone application. The knowledge base of pests is stored as an ontology in OWL format. We have also developed a black box for agricultural experts, with which they can generate pest ontologies from text descriptions; we use NLP techniques and the AGROVOC library to extract pest details from the text descriptions and generate the ontology.

Archana Chougule, Vijay Kumar Jha, Debajyoti Mukhopadhyay
A Unified Modeling Language Model for Occurrence and Resolving of Cyber Crime

In the current scenario, distributed computing systems play a significant role in accessing various kinds of internet services. Different handheld devices such as palmtops, laptops, and mobiles can be connected across a distributed network, and people use social networking, online purchasing, and online transaction websites in their daily routine. On the other hand, hackers constantly watch the activities of authorized users connected across the globe. The present work proposes a model, based on object-oriented technology, of how cyber crime occurs across a distributed network. The well-known Unified Modeling Language is used, so one can easily write code implementing the model in any object-oriented programming language. A UML model is then proposed for filing an FIR online against a cyber crime. The activities in this procedure are represented by a UML activity diagram, which is finally validated through the concept of a finite state machine.
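
A minimal sketch of validating an activity sequence against a finite state machine; the states and events below are hypothetical stand-ins for those in the paper's activity diagram.

def accepts(transitions, start, finals, events):
    # Walks the FSM extracted from the UML activity diagram and checks
    # that the event sequence ends in an accepting state.
    state = start
    for e in events:
        if (state, e) not in transitions:
            return False
        state = transitions[(state, e)]
    return state in finals

# Hypothetical states for the online-FIR workflow.
t = {("idle", "attack_detected"): "report",
     ("report", "fir_filed"): "investigate",
     ("investigate", "resolved"): "closed"}
print(accepts(t, "idle", {"closed"},
              ["attack_detected", "fir_filed", "resolved"]))  # True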

Singh Rashmi, Saxena Vipin
Cross Lingual Information Retrieval (CLIR): Review of Tools, Challenges and Translation Approaches

Today's Web spreads all over the world, and worldwide communication over the internet leads to globalization, which makes it necessary to find information in any language. Since no single language is understood by all people across the world, many people use their regional languages to express their needs, and this language diversity becomes a great barrier. Cross Lingual Information Retrieval (CLIR) provides a solution to that barrier: it allows a user to ask a query in a native language and then retrieve documents in a different language. This paper discusses the CLIR challenges, query translation techniques and approaches for many Indian and foreign languages, and briefly analyses the available CLIR tools.
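
As one concrete instance of the query translation approaches such a review covers, a dictionary-based sketch; the mini-lexicon (transliterated Hindi) is purely illustrative.

# Dictionary-based query translation, the simplest CLIR approach.
hi_to_en = {"mausam": ["weather", "climate"], "samachar": ["news"]}

def translate_query(query, lexicon):
    # Each source term expands to all its target candidates (the
    # classic translation-ambiguity problem in CLIR).
    terms = []
    for tok in query.lower().split():
        terms.extend(lexicon.get(tok, [tok]))  # keep OOV terms as-is
    return " ".join(terms)

print(translate_query("mausam samachar", hi_to_en))
# -> "weather climate news"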

Vijay Kumar Sharma, Namita Mittal
Minimizing the Cost of Losses Due to Cyber Attack Through B. B. (Branch and Bound) Technique

With the advancement of computers and the digitization of information systems, cyber crime is becoming one of the most significant challenges in our society. The threat of cyber crime is a growing danger to industry, business, and the economy, which are influenced by cyber criminals, as is the common person in our society. Since cyber crime is often an aspect of more complex criminological realms such as money laundering, trafficking, and cyber terrorism, the true damage it causes to society may be unknown. This paper presents the Branch and Bound (B&B) technique to minimize the losses due to cyber crime. Branch and Bound is an effective technique for solving assignment problems: although numerous choices exist for solving each type of problem, B&B is an algorithmic technique that provides a solution for each specific problem instance.
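
A minimal branch and bound sketch for an assignment-style problem, assuming a cost matrix of losses; the bounding rule (sum of row minima) is a standard choice, not necessarily the paper's.

def assignment_bnb(cost):
    # Branch on which countermeasure handles the next threat; bound
    # each partial assignment optimistically with row minima.
    n = len(cost)
    best = {"cost": float("inf"), "assign": None}

    def bound(row, used, acc):
        return acc + sum(min(cost[r][c] for c in range(n) if c not in used)
                         for r in range(row, n))

    def branch(row, used, acc, assign):
        if row == n:
            if acc < best["cost"]:
                best["cost"], best["assign"] = acc, assign[:]
            return
        if bound(row, used, acc) >= best["cost"]:
            return                          # prune this subtree
        for c in range(n):
            if c not in used:
                assign.append(c)
                branch(row + 1, used | {c}, acc + cost[row][c], assign)
                assign.pop()

    branch(0, frozenset(), 0.0, [])
    return best["cost"], best["assign"]

loss = [[9, 2, 7], [6, 4, 3], [5, 8, 1]]   # loss[i][j]: measure j on threat i
print(assignment_bnb(loss))                 # -> (9.0, [1, 0, 2])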

Narander Kumar, Priyanka Chaudhary
Denoising Knee Joint Vibration Signals Using Variational Mode Decomposition

Analysis of knee joint vibration (VAG) signals using signal processing, feature extraction, and classification techniques has shown promise for the non-invasive diagnosis of knee joint disorders. However, for such techniques to yield reliable results, the digitally acquired signals must be accurately denoised. This paper presents a novel method for denoising VAG signals using variational mode decomposition followed by Wiener entropy thresholding and filtering. Standard metrics (mean squared error, mean absolute error, signal to noise ratio, peak signal to noise ratio, and CPU consumption time) have been calculated to assess the performance of our method. The normalized root mean squared error has also been evaluated to estimate the effectiveness of our method in denoising synthetic VAG signals containing additive white Gaussian noise. The proposed method yielded superior performance in denoising raw VAG signals in comparison to previous methods such as wavelet soft thresholding, empirical mode decomposition with detrended fluctuation analysis, and ensemble empirical mode decomposition filtering. It also performed better in denoising synthetic VAG signals than methods such as wavelet and wavelet packet soft thresholding, the wavelet matching pursuit algorithm, empirical mode decomposition with detrended fluctuation analysis, and ensemble empirical mode decomposition filtering. Although computationally more complex, the proposed method yields the most accurate denoising.
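
A minimal sketch of the Wiener entropy (spectral flatness) criterion used for thresholding, assuming decomposed modes are kept when their spectrum looks structured; the threshold value is an assumption, and the VMD step itself is not reproduced.

import numpy as np

def wiener_entropy(x, eps=1e-12):
    # Spectral flatness: geometric mean over arithmetic mean of the
    # power spectrum; near 1 for noise, near 0 for structured signal.
    p = np.abs(np.fft.rfft(x)) ** 2 + eps
    return np.exp(np.mean(np.log(p))) / np.mean(p)

def denoise(modes, threshold=0.5):
    # Keep only decomposed modes that look structured, then reconstruct.
    keep = [m for m in modes if wiener_entropy(m) < threshold]
    return np.sum(keep, axis=0) if keep else np.zeros_like(modes[0])

t = np.linspace(0, 1, 1024)
sig, noise = np.sin(2 * np.pi * 12 * t), 0.5 * np.random.randn(1024)
print(wiener_entropy(sig), wiener_entropy(noise))   # low vs high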

Aditya Sundar, Chinmay Das, Vivek Pahwa
Data Size Reduction and Maximization of the Network Lifetime over Wireless Sensor Network

The main aim of this research is to increase the network lifetime and decrease the data size over a wireless sensor network. To realize this idea we propose novel techniques that provide reliable, energy-efficient routes and maximize the network lifetime by finding routes that minimize the total energy for packet traversal. We also use a data compression model that reduces the size of the data, jointly balancing the nodes and optimizing dynamic compression to improve the lifetime of the network. The compression is completed in several steps: raw data is broken into branches and compressed at distinct levels; this compressed data can be decompressed at a certain level and compressed again at a distinct level, to be forwarded directly or via a base station. To transmit data from a source node to the base station, the nodes are clustered and one cluster head (CH) is chosen per cluster; the CH is the node with more energy than all the other nodes in its cluster. The CH collects messages from its neighboring nodes and transmits them to the base station. For source-to-destination data transmission, the nodes search for the shortest path, which incurs high computational complexity.
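
A minimal sketch of residual-energy cluster head election as described above; the node records and cluster assignments are placeholder data.

def elect_cluster_heads(nodes, clusters):
    # In each cluster the node with the highest residual energy becomes
    # CH; members send to the CH, and the CH relays to the base station.
    heads = {}
    for cid in clusters:
        members = [n for n in nodes if n["cluster"] == cid]
        heads[cid] = max(members, key=lambda n: n["energy"])["id"]
    return heads

nodes = [
    {"id": 0, "cluster": 0, "energy": 0.9},
    {"id": 1, "cluster": 0, "energy": 0.4},
    {"id": 2, "cluster": 1, "energy": 0.7},
    {"id": 3, "cluster": 1, "energy": 0.8},
]
print(elect_cluster_heads(nodes, {0, 1}))   # -> {0: 0, 1: 3}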

Venu Madhav Kuthadi, Rajalakshmi Selvaraj, Tshilidzi Marwala
Backmatter
Metadata
Title
Information Systems Design and Intelligent Applications
edited by
Suresh Chandra Satapathy
Jyotsna Kumar Mandal
Siba K. Udgata
Vikrant Bhateja
Copyright year
2016
Publisher
Springer India
Electronic ISBN
978-81-322-2755-7
Print ISBN
978-81-322-2753-3
DOI
https://doi.org/10.1007/978-81-322-2755-7
