
2016 | Book

Computational Intelligence, Cyber Security and Computational Models

Proceedings of ICC3 2015

Edited by: Muthukrishnan Senthilkumar, Vijayalakshmi Ramasamy, Shina Sheen, C. Veeramani, Anthony Bonato, Lynn Batten

Publisher: Springer Singapore

Book series: Advances in Intelligent Systems and Computing

About this book

This book presents high-quality research by researchers and practitioners from academia and industry at the International Conference on Computational Intelligence, Cyber Security, and Computational Models (ICC3 2015), organized by PSG College of Technology, Coimbatore, India, during December 17–19, 2015. It collects innovations in the broad research areas of computational modeling, computational intelligence, and cyber security. These emerging interdisciplinary research areas have helped solve multifaceted problems and have attracted considerable attention in recent years. The book covers both theory and applications, addressing the design, analysis, and modeling of these key areas.

Table of Contents

Frontmatter

Keynotes

Frontmatter
The Game of Wall Cops and Robbers

Wall Cops and Robbers is a new vertex pursuit game played on graphs, inspired by both the game of Cops and Robbers and Conway’s Angel Problem. In the game, the cops are free to move to any vertex and build a wall; once a vertex contains a wall, the robber may not move there. Otherwise, the robber moves from vertex to vertex along edges. The cops capture the robber if the robber is surrounded by walls. The wall capture time of a graph G, written $$W_{c_t}(G)$$, is the least number of moves it takes for one cop to capture the robber in G. In the present note, we focus on the wall capture time of certain infinite grids. We give upper bounds on the wall capture time for Cartesian, strong, and triangular grids, while giving the exact value for hexagonal grids. We conclude with open problems.

Anthony Bonato, Fionn Mc Inerney
Smartphone Applications, Malware and Data Theft

The growing popularity of smartphone devices has led to the development of increasing numbers of applications, which have subsequently become targets for malicious authors. Analysing applications in order to identify malicious ones is a major current concern in information security; an additional problem connected with smartphone applications is that their many advertising libraries can lead to loss of personal information. In this paper, we review the current methods of detecting malware on smartphone devices and discuss the problems caused by malware as well as advertising.

Lynn M. Batten, Veelasha Moonsamy, Moutaz Alazab
Towards Evolutionary Multitasking: A New Paradigm in Evolutionary Computation

The design of population-based search algorithms of evolutionary computation (EC) has traditionally been focused on efficiently solving a single optimization task at a time. It is only very recently that a new paradigm in EC, namely, multifactorial optimization (MFO), has been introduced to explore the potential of evolutionary multitasking (Gupta A et al., IEEE Trans Evol Comput [1]). The nomenclature signifies a multitasking search involving multiple optimization tasks at once, with each task contributing a unique factor influencing the evolution of a single population of individuals. MFO is found to leverage the scope for implicit genetic transfer offered by the population in a simple and elegant manner, thereby opening doors to a plethora of new research opportunities in EC, dealing, in particular, with the exploitation of underlying synergies between seemingly unrelated tasks. A strong practical motivation for the paradigm is derived from the rapidly expanding popularity of cloud computing (CC) services. It is noted that CC characteristically provides an environment in which multiple jobs can be received from multiple users at the same time. Thus, assuming each job to correspond to some kind of optimization task, as may be the case in a cloud-based on-demand optimization service, the CC environment is expected to lend itself nicely to the unique features of MFO. In this talk, the formalization of the concept of MFO is first introduced. A fitness landscape-based approach towards understanding what is truly meant by there being underlying synergies (or what we term as genetic complementarities) between optimization tasks is then discussed. Accordingly, a synergy metric capable of quantifying the complementarity, which shall later be shown to act as a “qualitative” predictor of the success of multitasking, is also presented (Gupta A et al., A study of genetic complementarity in evolutionary multitasking [2]).
With the above in mind, a novel evolutionary algorithm (EA) for MFO is proposed, one that is inspired by bio-cultural models of multifactorial inheritance, so as to best harness the genetic complementarity between tasks. The salient feature of the algorithm is that it incorporates a unified solution representation scheme which, to a large extent, unites the fields of continuous and discrete optimization. The efficacy of the proposed algorithm, and of the concept of MFO in general, is finally substantiated via a variety of computational experiments in intra- and inter-domain evolutionary multitasking.
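As a toy sketch of the unified-representation idea above (an illustrative simplification, not the algorithm from the talk: skill factors are fixed at initialization and there is no crossover-based genetic transfer), a single population in a unified [0, 1]^D space can serve tasks of different dimensionality. The functions `sphere` and `shifted_sphere` and all parameter values are invented for illustration:

```python
import random

random.seed(0)

# Two toy minimization tasks of different dimensionality (invented examples).
def sphere(x):          # 2-D task, optimum at 0.5 in each unified gene
    return sum((xi - 0.5) ** 2 for xi in x)

def shifted_sphere(x):  # 3-D task, optimum at 0.2 in each unified gene
    return sum((xi - 0.2) ** 2 for xi in x)

TASKS = [(sphere, 2), (shifted_sphere, 3)]
D = max(d for _, d in TASKS)        # dimensionality of the unified search space

def evaluate(ind, t):
    f, d = TASKS[t]
    return f(ind[:d])               # each task decodes only the genes it needs

pop = [[random.random() for _ in range(D)] for _ in range(30)]
skill = [i % len(TASKS) for i in range(30)]   # fixed per-individual skill factor

for gen in range(150):
    for i, ind in enumerate(pop):
        # Gaussian mutation in the unified space, greedy survival judged
        # only on the individual's own task.
        child = [min(1.0, max(0.0, g + random.gauss(0, 0.1))) for g in ind]
        if evaluate(child, skill[i]) < evaluate(ind, skill[i]):
            pop[i] = child

best = [min(evaluate(ind, t) for i, ind in enumerate(pop) if skill[i] == t)
        for t in range(len(TASKS))]
```

In the actual MFEA, skill factors are reassigned from factorial ranks and assortative mating lets genetic material transfer implicitly between tasks; this sketch only shows how one population can encode and serve multiple tasks at once.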

Yew-Soon Ong
Generating a Standardized Upper Ontology for Security of Information and Networks

A usable functional interface between ontology and security, integrating related information, is needed for security engineering as well as for creating secure systems. That in turn necessitates first ontologizing security of information and networks, and then standardization. Having been involved in the fields of semantic technology and information assurance, I have striven to facilitate establishing an interface between them; for this reason, the SIN Conference Series I created included all interest areas of semantics, metadata and ontology. In a keynote talk and its proceedings paper at SIN 2014, I took up this subject and argued that a generic ontology for security of information and networks is timely and should be standardized. In the present paper I argue, through examples where available, that a standard upper ontology for security may be developed through community sourcing and then standardized through competent agencies.

Atilla Elçi

Computational Intelligence

Frontmatter
Analysis of Throat Microphone Using MFCC Features for Speaker Recognition

In this paper, a visual aid system has been developed to help people with sight loss (the visually impaired) distinguish among several speakers. We have analyzed the performance of a speaker recognition system based on features extracted from speech recorded using a throat microphone in clean and noisy environments. In general, clean speech performs better for speaker recognition. For speaker recognition in a noisy environment, a transducer held at the throat yields a signal that remains clean even in the presence of noise. The characteristics are extracted by means of Mel-Frequency Cepstral Coefficients (MFCC). Radial Basis Function Neural Network (RBFNN) and Autoassociative Neural Network (AANN) are the two modeling techniques used to capture the features and identify the speakers in clean and noisy environments; both models are trained to minimize the mean square error among the feature vectors. The proposed work also compares the performance of RBFNN with AANN; the comparison shows that AANN performs better than RBFNN in terms of accuracy using MFCC features.
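The mel-frequency warping at the heart of MFCC extraction can be sketched in a few lines (a minimal illustration of the standard mel-scale formula, not the paper's full MFCC pipeline; the filter count and frequency range below are arbitrary choices):

```python
import math

def hz_to_mel(f):
    # Standard mel-scale mapping used when spacing MFCC filterbank centres.
    return 2595.0 * math.log10(1.0 + f / 700.0)

def mel_to_hz(m):
    # Exact inverse of hz_to_mel.
    return 700.0 * (10.0 ** (m / 2595.0) - 1.0)

# Centre frequencies for a small triangular filterbank between 0 Hz and 4 kHz:
# equally spaced on the mel scale, hence denser at low frequencies in Hz.
lo, hi, n_filters = hz_to_mel(0.0), hz_to_mel(4000.0), 10
centers = [mel_to_hz(lo + i * (hi - lo) / (n_filters + 1))
           for i in range(1, n_filters + 1)]
```

The full MFCC computation would apply this filterbank to a short-time power spectrum, take logs, and apply a DCT; only the perceptual frequency warping is shown here.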

R. Visalakshi, P. Dhanalakshmi, S. Palanivel
Single-Pixel Based Double Random-Phase Encoding Technique

A new encryption technique based on single-pixel compressive sensing along with Double Random-Phase Encoding (DRPE) is proposed. Compared with the conventional approach to image compression, where the image information is first captured and then compressed, single-pixel compressive sensing collects only a few large coefficients of the data and discards the rest, which gives a scrambled effect on the image. Further, to enhance the complexity of the image data, double random phase encoding along with a fractional Fourier transform is applied to re-encrypt it. The single-pixel compressive sensing, the DRPE and the fractional Fourier transform act as secret keys. At the receiver end, the original image is reconstructed by applying the inverse of the double random phase process and an $$l_{1}$$-minimization approach. The peak signal-to-noise ratio and the minimum number of compressive sensing measurements needed to reconstruct the image are used to analyze the quality of the decrypted image. The numerical results demonstrate the system to be highly complex, robust and secure.

Nitin Rawat
Kernel Online Multi-task Learning

Many real-world learning problems can be divided into a number of dependent subtasks. The conventional machine learning strategy considers each learning problem as a single unit and does not incorporate information associated with the tasks that are closely related to it. The Multi-Task Learning (MTL) paradigm rectifies this by learning a modeling problem jointly with its associated tasks; such modeling strategies improve the generalization capacity of the model. In this paper we propose a mathematical framework for multi-task learning using a kernel online learning approach. We applied the proposed algorithm on a synthetic dataset as well as a real dataset, and the results were found to be promising.
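A minimal example of the kernel online learning idea (a plain single-task kernel perceptron on invented XOR data, not the authors' multi-task framework): each mistake adds the current example as a kernel centre, so a nonlinear decision boundary is built up in a streaming fashion.

```python
import math

def rbf(x, y, gamma=2.0):
    # Gaussian (RBF) kernel between two equal-length vectors.
    return math.exp(-gamma * sum((a - b) ** 2 for a, b in zip(x, y)))

class KernelPerceptron:
    """Mistake-driven online learner: every misclassified example is stored
    as a kernel centre with its label as the coefficient."""
    def __init__(self, gamma=2.0):
        self.gamma, self.sv, self.alpha = gamma, [], []

    def decision(self, x):
        return sum(a * rbf(s, x, self.gamma)
                   for s, a in zip(self.sv, self.alpha))

    def update(self, x, y):            # y in {-1, +1}; returns True on mistake
        if y * self.decision(x) <= 0:
            self.sv.append(x)
            self.alpha.append(y)
            return True
        return False

# XOR labels: not linearly separable, but easy with an RBF kernel.
data = [([0, 0], -1), ([1, 1], -1), ([0, 1], 1), ([1, 0], 1)]
model = KernelPerceptron()
for _ in range(20):                    # stream over the data a few times
    if sum(model.update(x, y) for x, y in data) == 0:
        break                          # no mistakes in a full pass
```

A multi-task extension in the spirit of the paper would maintain such kernel expansions per task while coupling their updates; that coupling is the paper's contribution and is not reproduced here.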

S. Sumitra, A. Aravindh
Performance Evaluation of Sentiment Classification Using Query Strategies in a Pool Based Active Learning Scenario

In order to perform Sentiment Classification in scenarios where huge amounts of unlabelled data are available (as in Tweets and other big data applications), human annotators are required to label the data, which is very expensive and time consuming. This is resolved by adopting the Active Learning (AL) approach, which creates labelled data from the available unlabelled data by greedily choosing the most appropriate or most informative instances and submitting them to a human annotator for annotation. Active learning thus reduces the time, cost and effort needed to label huge amounts of unlabelled data. AL improves over passive learning by reducing the amount of data to be used for learning, producing higher-quality labelled data, reducing the running time of the classification process, and improving the predictive accuracy. Different query strategies have been proposed for choosing the most informative instances from the unlabelled data. In this work, we have performed a comparative performance evaluation of Sentiment Classification in a pool-based Active Learning scenario adopting the query strategies Entropy Sampling (in Uncertainty Sampling) and Kullback-Leibler divergence and Vote Entropy (in Query By Committee), using the evaluation metrics Accuracy, Weighted Precision, Weighted Recall, Weighted F-measure, Root Mean Square Error, Weighted True Positive Rate and Weighted False Positive Rate. We have also calculated different time measures in an Active Learning process, viz. accumulative iteration time, iteration time, training time, instance selection time and test time. The empirical results reveal that the Uncertainty Sampling query strategy showed better overall performance than Query By Committee in the Sentiment Classification of a movie reviews dataset.
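The two families of query strategies compared above can be sketched in a few lines (the toy posteriors and committee votes below are invented for illustration):

```python
import math

def entropy(probs):
    # Shannon entropy (natural log) of a probability distribution.
    return -sum(p * math.log(p) for p in probs if p > 0)

def entropy_sampling(pool_probs):
    """Uncertainty sampling: pick the unlabelled instance whose predicted
    class distribution has maximum entropy."""
    return max(range(len(pool_probs)), key=lambda i: entropy(pool_probs[i]))

def vote_entropy(committee_votes, n_classes):
    """Query-by-committee: entropy of the committee's hard-vote distribution;
    high values mean the committee disagrees, so the instance is informative."""
    total = len(committee_votes)
    counts = [committee_votes.count(c) for c in range(n_classes)]
    return entropy([c / total for c in counts])

# Classifier posteriors for three pool instances (positive vs negative):
pool = [[0.9, 0.1], [0.5, 0.5], [0.7, 0.3]]
picked = entropy_sampling(pool)        # the most uncertain instance
```

In a full AL loop, the picked instance would be sent to the annotator, added to the training set, and the model retrained before the next query.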

K. Lakshmi Devi, P. Subathra, P. N. Kumar
An Enhanced Image Watermarking Scheme Using Blocks with Homogeneous Feature Distribution

This paper proposes a gray-scale image watermarking scheme based on the Discrete Cosine Transform (DCT). In this scheme, the host image is divided into non-overlapping blocks. The pixel values within the blocks are permuted and the DCT is applied to each block separately. The permutation makes the distribution of the DCT coefficients uniform within the block, and hence embedding the watermark in any region of the block produces similar results. Since the embedding is done blockwise, multiple embeddings are also possible. Experimental results show that the scheme performs well against various image processing operations and cropping attacks.
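The key-driven intra-block permutation step can be sketched as follows (an assumed implementation using a seeded shuffle; the paper does not specify its permutation generator, and the block values are invented):

```python
import random

def permute_block(block, key):
    """Permute pixel values inside a block with a key-seeded shuffle;
    the same key regenerates the same permutation, allowing exact inversion."""
    idx = list(range(len(block)))
    random.Random(key).shuffle(idx)
    return [block[i] for i in idx]

def unpermute_block(permuted, key):
    # Regenerate the permutation from the key and scatter values back.
    idx = list(range(len(permuted)))
    random.Random(key).shuffle(idx)
    out = [0] * len(permuted)
    for dst, src in enumerate(idx):
        out[src] = permuted[dst]
    return out

block = [10, 20, 30, 40, 50, 60, 70, 80]   # a flattened 8-pixel block
scrambled = permute_block(block, key=1234)
restored = unpermute_block(scrambled, key=1234)
```

In the scheme itself, the DCT would be applied to the permuted block and the watermark embedded in its coefficients; extraction then inverts the permutation with the shared key.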

Kavitha Chandranbabu, Devaraj Ponnaian
Performance Analysis of ApEn as a Feature Extraction Technique and Time Delay Neural Networks, Multi Layer Perceptron as Post Classifiers for the Classification of Epilepsy Risk Levels from EEG Signals

Epilepsy, a very common and chronic neurological disorder, has a debilitating effect on the lives of human beings. The seizures in epilepsy are due to unexpected and transient electrical disturbances in the cortical regions of the brain. Analysis of Electroencephalography (EEG) signals helps to understand the detection of epilepsy risk levels in a better perspective. This paper deals with Approximate Entropy (ApEn) as a feature extraction technique, followed by the use of a Time Delay Neural Network (TDNN) and a Multi Layer Perceptron (MLP) as post classifiers for the classification of epilepsy risk levels from EEG signals. The analysis is done in terms of benchmark parameters such as Performance Index (PI), Quality Values (QV), sensitivity, specificity, time delay and accuracy.
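Approximate Entropy itself is compact enough to sketch directly (the standard ApEn definition with window length m and tolerance r; the parameter values and toy signals below are illustrative, not taken from the paper):

```python
import math, random

def apen(series, m=2, r=0.2):
    """Approximate Entropy: a regularity statistic comparing how often
    patterns of length m repeat versus patterns of length m + 1.
    Low values mean the signal is regular/predictable."""
    def phi(m):
        n = len(series) - m + 1
        patterns = [series[i:i + m] for i in range(n)]
        counts = []
        for p in patterns:
            # Chebyshev distance <= r counts as a pattern match (self included).
            c = sum(1 for q in patterns
                    if max(abs(a - b) for a, b in zip(p, q)) <= r)
            counts.append(c / n)
        return sum(math.log(c) for c in counts) / n
    return phi(m) - phi(m + 1)

regular = [0, 1] * 30                  # perfectly periodic toy signal
random.seed(7)
noisy = [random.random() for _ in range(60)]   # irregular toy signal
```

A periodic signal yields ApEn near zero, while an irregular one yields a clearly larger value, which is why ApEn is useful as a discriminative EEG feature.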

Sunil Kumar Prabhakar, Harikumar Rajaguru
Suspicious Human Activity Detection in Classroom Examination

The proposed work aims at developing a system that analyzes and detects suspicious activities that often occur in a classroom environment. Video analytics provides an optimal solution for this, as it helps in pinpointing an event and retrieving the relevant information from the recorded video. The system framework consists of three parts to monitor student activity during examinations. Firstly, the face regions of the students are detected and monitored using Haar feature extraction. Secondly, hand contact is detected via grid formation when two students exchange papers or other foreign objects. Thirdly, hand signaling by a student is recognized using the convex hull, and an alert is given to the invigilator. The system is built using C/C++ and the OpenCV library and shows good performance on real-time video frames.

T. Senthilkumar, G. Narmatha
H∞ State Estimation of Discrete Time Delayed Neural Networks with Multiple Missing Measurements Using Second Order Reciprocal Convex Approach

This paper focuses on H∞ state estimation for a general class of discrete-time nonlinear neural network systems with time-varying delays and multiple missing measurements, described by a unified model. The H∞ performance of the systems described by the unified model is analyzed using a sector decomposition approach together with Lyapunov stability theory. By constructing a triple Lyapunov-Krasovskii functional, a new sufficient condition is established to ensure the asymptotic mean square stability of discrete-time delayed neural networks. A second order reciprocal convex technique is incorporated to deal with the partitioned double summation terms, and the conservatism of the conditions for the state estimator synthesis is reduced efficiently. Finally, a numerical example is given to demonstrate the effectiveness of the proposed design method.

K. Maheswari
A Fuzzy Methodology for Clustering Text Documents with Uncertain Spatial References

A Fuzzy ERC (Extraction, Resolving and Clustering) architecture is proposed for handling uncertain spatial information: it can be queried explicitly by the user, and the system can also cluster documents based on the spatial keywords present in them. This research work applies fuzzy logic techniques along with information retrieval methods to resolve spatial uncertainty in text and to find the spatial similarity between two documents, in other words, the degree to which two or more documents talk about the same spatial location. An experimental analysis is performed with the Reuters dataset. The results obtained from the experiment provide empirical evidence for clustering documents based on the spatial references present in them. It is concluded that the proposed work will provide users a new way of retrieving documents that have similar spatial references.

V. R. Kanagavalli, K. Raja
A Novel Feature Extraction Algorithm from Fingerprint Image in Wavelet Domain

The robustness of a fingerprint authentication system depends on the quality of the features extracted from the fingerprint image. For extracting good-quality features, the quality of the image has to be improved through denoising and enhancement. In this paper, a set of invariant moment features is extracted from the approximation coefficients in the wavelet domain. Initially, the fingerprint image is denoised using the Stationary Wavelet Transform (SWT), a threshold based on the Golden Ratio, and a weighted median. The denoised image is then enhanced using the Short Time Fourier Transform (STFT). A unique core point is detected in the enhanced image using complex filters, and a Region of Interest (ROI) centered on it is determined. The ROI is then decomposed using one level of SWT with a Daubechies wavelet filter for extracting efficient features. The decomposed image is partitioned into four sub-images to reduce the effects of noise and nonlinear distortions. Finally, four sets of seven invariant moment features are extracted from the four partitioned sub-images of the ROI of the approximation coefficients, as these contain the low-frequency components. To measure the similarity between the feature vectors of an input fingerprint and the template stored in the database, the Euclidean distance is employed on the FVC2002 dataset. Using such a simple distance measure substantially reduces the computational complexity of the system.
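Invariant moment features of the kind extracted from each sub-image can be sketched as follows (only the first two Hu moments on a toy binary patch; the paper's exact seven-moment set and wavelet pre-processing are omitted, and the test patches are invented):

```python
def central_moment(img, p, q):
    # img: 2-D list of grey values; moments are taken about the centroid,
    # which makes them translation invariant by construction.
    m00 = sum(sum(row) for row in img)
    xbar = sum(x * v for row in img for x, v in enumerate(row)) / m00
    ybar = sum(y * v for y, row in enumerate(img) for v in row) / m00
    return sum((x - xbar) ** p * (y - ybar) ** q * v
               for y, row in enumerate(img) for x, v in enumerate(row))

def hu_first_two(img):
    """First two Hu invariant moments (translation and scale invariant)."""
    mu00 = central_moment(img, 0, 0)
    def eta(p, q):
        # Scale-normalized central moment.
        return central_moment(img, p, q) / mu00 ** (1 + (p + q) / 2)
    h1 = eta(2, 0) + eta(0, 2)
    h2 = (eta(2, 0) - eta(0, 2)) ** 2 + 4 * eta(1, 1) ** 2
    return h1, h2

# The same pattern placed at two positions in a 6x6 patch:
points = [(1, 1), (1, 2), (2, 1), (3, 3)]
base = [[0] * 6 for _ in range(6)]
shift = [[0] * 6 for _ in range(6)]
for y, x in points:
    base[y][x] = 1
    shift[y + 2][x + 2] = 1
```

Because the moments are computed about the centroid and normalized by mu00, the two patches yield identical feature values, which is what makes them robust matching features.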

K. Sasirekha, K. Thangavel
Motor Imagery Classification Based on Variable Precision Multigranulation Rough Set

In this work, classification based on a Variable Precision Multigranulation Rough Set is proposed for a motor imagery dataset. The accurate design of a BCI (Brain Computer Interface) depends upon efficient classification of the motor imagery movements of patients. In the first phase, pre-processing is carried out with a Chebyshev Type II filter in order to remove the noise that may enter the signal during acquisition. The Daubechies wavelet is used to extract features from the EEG signal. Finally, classification is done with the Variable Precision Multigranulation Rough Set. Experimental results show higher accuracy as the alpha and beta values of the Variable Precision Multigranulation Rough Set are varied.

K. Renuga Devi, H. Hannah Inbarani
Fault Tolerant and Energy Efficient Signal Processing on FPGA Using Evolutionary Techniques

In this paper, an energy-efficient approach using field-programmable gate array (FPGA) partial dynamic reconfiguration (PDR) is presented to realize autonomous fault recovery in mission-critical (space/defence) signal processing applications at runtime. A genetic algorithm (GA) based on adaptive search space pruning is implemented for reducing repair time, thus increasing availability. The proposed method utilizes dynamic fitness function evaluation, which reduces the test patterns required for fitness evaluation. Hence, the scalability issue and the large recovery time associated with refurbishment of larger circuits are addressed and improved. Experiments with case study circuits prove successful repair in a minimum number of generations when compared to a conventional GA. In addition, an autonomous self-healing system for FPGA-based signal processing is proposed using the presented pruning-based GA for intrinsic evolution, with the goals of reduced power consumption and faster recovery time.

Deepa Jose, Roshini Tamilselvan
A Two Phase Approach for Efficient Clustering of Web Services

Sorting out a desired web service is a demanding concern in service-oriented computing, as the default keyword search options provided by UDDI registries are not very effective. This paper presents a novel approach that employs an unsupervised neural-network-based clustering algorithm, ART (Adaptive Resonance Theory), for service clustering. The input to the algorithm includes both functional characteristics, which are quantified using the basic user requirements in phase 1, and non-functional characteristics, which are derived by means of swarm-based techniques through appropriate mapping of metadata to swarm factors, updating the input in phase 2. Taking advantage of being an unsupervised clustering algorithm, ART effectively groups services, eliminating many of the irrelevant services returned by a normal search, and facilitates rearranging the registry. Clustering depends on a threshold value, the vigilance parameter, which is set between 0 and 1. Flocking of birds is the swarm behaviour considered.

I. R. Praveen Joe, P. Varalakshmi
Elimination of Redundant Association Rules—An Efficient Linear Approach

Association rule mining plays an important role in data mining and knowledge discovery. Market basket analysis, medical diagnosis, protein sequence analysis and social media analysis are some prospective research areas of association rule mining. These types of datasets contain huge numbers of features/itemsets. Traditional association rule mining algorithms generate many rules based on the support and confidence values, and many of the rules thus generated are redundant. The quality of the mined information is degraded by redundant association rules; therefore, it is essential to eliminate them. The proposed algorithm removes redundant association rules to improve the quality of the rules and decrease the size of the rule list. It also reduces memory consumption for further processing of association rules. The experimental results show that our proposed method effectively removes the redundancy.
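A simple form of the redundancy test can be sketched as follows (one common definition, assumed here: a rule is redundant when a more general rule with the same consequent has at least the same confidence; the paper's exact criterion and its linear-time organization may differ, and the example rules are invented):

```python
def prune_redundant(rules):
    """Remove a rule (X -> Y, conf) when a more general rule (X' -> Y) with
    X' a subset of X has at least the same confidence.
    Processing rules by increasing antecedent size means every potential
    'generalizer' is already in `kept` when a rule is examined."""
    kept = []
    for ante, cons, conf in sorted(rules, key=lambda r: len(r[0])):
        covered = any(a <= ante and c == cons and k >= conf
                      for a, c, k in kept)
        if not covered:
            kept.append((frozenset(ante), cons, conf))
    return kept

rules = [
    (frozenset({"bread"}), "butter", 0.8),
    (frozenset({"bread", "milk"}), "butter", 0.8),  # redundant: general rule is as confident
    (frozenset({"bread", "jam"}), "butter", 0.9),   # kept: strictly more confident
]
pruned = prune_redundant(rules)
```

Only the generality/confidence check is shown; a production pruner would also index kept rules by consequent to avoid the inner scan.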

Akilandeswari Jeyapal, Jothi Ganesan
Clustering Techniques from Significance Analysis of Microarrays

Microarray technology is a prominent tool that analyzes many thousands of gene expressions in a single experiment and helps reveal the primary genetic causes of various human diseases. There are abundant applications of this technology, but its datasets are high-dimensional, making it difficult to analyze whole gene sets. In this paper, the SAM (Significance Analysis of Microarrays) technique is used on the Golub microarray dataset to identify significant genes. The identified genes are then clustered using three clustering techniques, namely Hierarchical, k-means and Fuzzy C-means clustering. Clustering forms groups that share similar characteristics, which is useful when an unknown dataset is analyzed. The results show that hierarchical clustering performs well, exactly forming 27 samples in the first cluster (ALL) and 11 samples in the second cluster (AML). The clusters provide insight into the characteristics of the dataset.

K. Nirmalakumari, R. Harikumar, P. Rajkumar
Breast Region Extraction and Pectoral Removal by Pixel Constancy Constraint Approach in Mammograms

Breast cancer can nowadays be detected early with an automated Computer Aided Diagnosis (CAD) system, the best available technique to assist radiologists. For developing such an efficient computer-aided diagnosis system, it is necessary to pre-process the mammogram images. Hence, this paper proposes a method for effective pre-processing of mammogram images. The method consists of two phases: (i) breast region extraction and (ii) pectoral removal. In the first phase, adaptive local thresholding is used to binarize the image, followed by morphological operations for removing labels and artifacts. The breast region is then extracted by identifying and retaining the largest connected component of the mammogram. The pectoral muscle, the predominant-density region of the mammogram, carries no useful information and can affect the diagnosis, and is therefore removed in the second phase. A new method called the Pixel Constancy Constraint, applied in a multi-resolution approach, is introduced for pectoral removal. The proposed method is evaluated on the Mini-MIAS database (Mammographic Image Analysis Society, London, U.K.) and yields promising results when compared with existing approaches.

S. Vidivelli, S. Sathiya Devi
Bridging the Semantic Gap in Image Search via Visual Semantic Descriptors by Integrating Text and Visual Features

To facilitate access to the enormous and ever-growing number of images on the web, existing image search engines use different image re-ranking methods to improve the quality of image search. Existing search engines retrieve results based on the keyword provided by the user. A major challenge is that, using only the query keyword, one cannot correlate the similarities of low-level visual features with an image's high-level semantic meaning, which induces a semantic gap. The proposed image re-ranking method identifies the visual semantic descriptors associated with different images, and images are then re-ranked by comparing their semantic descriptors. Another limitation of current systems is that duplicate images sometimes show up as similar images, which reduces search diversity. The proposed work overcomes this limitation through the use of perceptual hashing. Better results have been obtained for image re-ranking on a real-world image dataset collected from a commercial search engine.

V. L. Lekshmi, Ansamma John
Adaptive Equalization Algorithm for Electrocardiogram Signal Transmission

The proposed work is based on the mathematical analysis of a steepest-descent stochastic gradient weight-updating method for the adaptive cancellation of intersymbol interference in Electrocardiogram (ECG) signal transmission over wireless networks. Physiological signal transmission over high-data-rate, band-limited digital communication systems is prone to Intersymbol Interference (ISI): adjacent symbols at the output overlap each other, which results in irreducible errors. This work investigates the performance of the proposed weight-updating-based adaptive equalization technique for estimating the original transmitted ECG signal from the noise-corrupted channel output. Adaptive filters are designed and implemented to minimize the error at the receiver side, thus making the data error free. The results are investigated to validate operational parameters such as Mean Square Error (MSE), computational complexity, correlation coefficient and convergence rate of the proposed method, and its comparative performance against other equalization methods. Simulation results indicate that the proposed adaptive linear equalization method has better extraction performance than other nonlinear and blind equalization methods.

L. Priya, A. Kandaswamy, R. P. Ajeesh, V. Vignesh
An Efficient Approach for MapReduce Result Verification

Hadoop follows a master-slave architecture and can process massive amounts of data using the MapReduce paradigm. A major problem associated with MapReduce is the correctness of the results generated: results can be corrupted by colluding malicious slave nodes. Credibility-based result verification is one of the effective methods to detect such malicious nodes and wrong results. The major limitation of that approach is that it depends on the complete results of long-running jobs to identify malicious nodes and hence holds valuable resources. In this paper, we propose a new protocol called Intermediate Result Collection and Verification (IRCV), which prunes out unnecessary computation by collecting results for verification earlier in the execution line. In addition, unlike the previous approach, IRCV uses only a subset of nodes for the purpose. Our simulation experiments suggest that the new approach has improved performance and leads to better utilization of resources.
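The flavour of verifying intermediate results across replicated slave nodes can be sketched as follows (a majority-vote simplification with invented task and digest names, not the IRCV protocol or its credibility model):

```python
from collections import Counter

def verify_intermediate(replica_results, quorum=2):
    """Each map task runs on several slave nodes; a result digest agreed on by
    at least `quorum` replicas is accepted, and nodes that disagree with the
    accepted digest are flagged as suspicious for further auditing."""
    accepted, suspicious = {}, set()
    for task, node_digests in replica_results.items():
        counts = Counter(node_digests.values())
        digest, votes = counts.most_common(1)[0]
        if votes >= quorum:
            accepted[task] = digest
            suspicious |= {n for n, d in node_digests.items() if d != digest}
    return accepted, suspicious

# Digests of intermediate outputs reported per node (invented values):
results = {
    "map-0": {"nodeA": "d41d", "nodeB": "d41d", "nodeC": "ffff"},  # nodeC deviates
    "map-1": {"nodeA": "9e10", "nodeB": "9e10"},
}
accepted, suspicious = verify_intermediate(results)
```

Because the check runs on intermediate digests rather than final job outputs, misbehaving nodes can be flagged early, which is the resource-saving idea the abstract describes.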

K. Jiji, M. Abdul Nizar
Improving Lifetime of Memory Devices Using Evolutionary Computing Based Error Correction Coding

Error correction coding (ECC) plays an important role in the reliability improvement of circuits with applications in space and mission-critical computing, low-power CMOS design, microprocessor-based computing, and nanotechnology-based systems. Conventional ECC is not suitable for multiple-bit detection and correction. A memory circuit holds both the instructions and the data of a given system and is susceptible to multiple-bit soft errors. To mitigate such problems in memory circuits, an evolutionary-computing-based ECC called reconfigurable matrix code (RMC) is proposed in this paper. The proposed RMC is evaluated in terms of error correction coverage. The results show that the proposed RMC technique can increase the Mean-Error-To-Failure (METF) and Mean-Time-To-Failure (MTTF) by up to 50 %, and hence the lifetime of memory devices is longer compared to memories based on conventional coding techniques.

A. Ahilan, P. Deepa
Comparison of Machine Learning Techniques for the Identification of the Stages of Parkinson’s Disease

Parkinson’s Disease (PD) is a degenerative disease of the central nervous system. This work performs a four-class classification using the motor assessments of subjects obtained from the Parkinson’s Progressive Markers Initiative (PPMI) database and a variety of techniques such as Deep Neural Networks (DNN), Support Vector Machines (SVM) and Deep Belief Networks (DBN). The effect of using feature selection was also studied. Due to the skewness of the data, metrics such as precision, recall and F1-score were used alongside accuracy when evaluating classifier performance. The best classification performance was obtained when a feature selection technique based on Joint Mutual Information (JMI) was used to select the features that were then used as input to a classification algorithm such as SVM. The combination of SVM and JMI-based feature selection yielded an average classification accuracy of 87.34 % and an F1-score of 0.84.

P. F. Deena, Kumudha Raimond
Security Constrained Unit Commitment Problem Employing Artificial Computational Intelligence for Wind-Thermal Power System

In this article, an effective hybrid of nodal ant colony optimization (NACO) and a real-coded clustered gravitational search algorithm (CGSA) is employed to produce a corrective/preventive contingency dispatch over a specified period for a wind-integrated thermal power system. High wind penetration affects power system reliability. Hence, the reliability-based security-constrained unit commitment (RSCUC) problem is proposed and solved using a bi-level NACO-CGSA hybrid approach. The RSCUC problem comprises reliability-constrained unit commitment (RCUC) as the master problem and security-constrained economic dispatch (SCED) as the subproblem. NACO solves the master problem and the subproblem is solved by the real-coded CGSA. The objective of the RSCUC model is to obtain an economical operating cost while maintaining system security. The proposed solution for the hourly scheduling of generating units is based on the hybrid NACO-CGSA. Case studies with the IEEE 118-bus test system are presented in detail.

K. Banumalar, B. V. Manikandan, K. Chandrasekaran
Human Gait Recognition Using Fuzzy Logic

In this paper, an efficient technique has been implemented for gait-based human identification. The paper proposes a human identification system based on gait signatures extracted using topological analysis and properties of body segments. The gait features extracted are the height, hip, neck and knee positions of the human silhouette, and a model-based feature, i.e., the area under the Hermite curve of the hip and knees. The experiments were conducted on the SOTON covariate database, which comprises eleven subjects. The database also takes into account different factors that vary in terms of apparel, carried objects, etc. Subject classification is performed using fuzzy logic and compared against the nearest-neighbour method. From the experimental results it can be concluded that the stated approach is successful in human identification, while further analysis shows that a specific number of input variables and membership functions helps to elevate the accuracy level.

Parul Arora, Smriti Srivastava, Abhishek Chawla, Shubhkaran Singh
Detection and Diagnosis of Dilated and Hypertrophic Cardiomyopathy by Echocardiogram Sequences Analysis

Automating the detection and diagnosis of cardiovascular diseases using echocardiogram sequences is a challenging task because of speckle noise, limited information and the movement of chambers. In this paper an attempt has been made to classify normal hearts and hearts affected by dilated cardiomyopathy (DCM) and hypertrophic cardiomyopathy (HCM) by automating the segmentation of the left ventricle (LV). Only the LV segmented from the diastolic frames of the echocardiogram sequences is used for extracting features. Statistical features and Zernike moment features are obtained from the extracted diastolic LV and classified using support vector machine (SVM), back-propagation neural network (BPNN) and probabilistic neural network (PNN) classifiers. An extensive evaluation over 60 echocardiogram sequences reveals that the proposed method performs well in classifying normal hearts and hearts affected by DCM and HCM. Among the classifiers used, the BPNN with Zernike moment features gave the highest accuracy of 92.08 %.

G. N. Balaji, T. S. Subashini, N. Chidambaram, E. Balasubramaiyan
An Elitist Genetic Algorithm Based Extreme Learning Machine

Extreme Learning Machine (ELM) has been shown to be exceptionally fast and to achieve more generalized performance when learning Single-hidden-Layer Feedforward Neural networks (SLFN). In this paper, a Genetic Algorithm (GA) is proposed to choose the initial weights, biases and number of hidden neurons that minimize the classification error. The proposed GA incorporates a novel elitism approach to avoid local optima and to speed up the GA on multi-modal functions. The experimental results indicate the superior performance of the proposed algorithm, with lower classification error.
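A minimal sketch of the elitist GA idea, assuming a real-coded population and a generic fitness function to be minimized; the ELM-specific encoding of weights, biases and hidden-neuron counts (and the authors' particular elitism variant) is omitted:

```python
import random

def elitist_ga(fitness, dim=3, pop_size=20, gens=50, elite_k=2, seed=1):
    """Minimize `fitness` over real vectors; the elite_k best individuals
    survive each generation unchanged (elitism)."""
    rng = random.Random(seed)
    pop = [[rng.uniform(-5, 5) for _ in range(dim)] for _ in range(pop_size)]
    for _ in range(gens):
        pop.sort(key=fitness)
        elite = [ind[:] for ind in pop[:elite_k]]        # elites survive unchanged
        children = []
        while len(children) < pop_size - elite_k:
            p1, p2 = rng.sample(pop[:pop_size // 2], 2)  # mate within the best half
            cut = rng.randrange(1, dim)                  # one-point crossover (dim >= 2)
            child = p1[:cut] + p2[cut:]
            child[rng.randrange(dim)] += rng.gauss(0, 0.3)  # Gaussian mutation
            children.append(child)
        pop = elite + children
    return min(pop, key=fitness)
```

Because the elites are copied through untouched, the best fitness seen so far can never worsen from one generation to the next.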

Vimala Alexander, Pethalakshmi Annamalai
Formulation and Enhancement of User Adaptive Access to the Learning Resources in E-Learning Using Fuzzy Inference Engine

The learning resources used in the teaching and learning process are highly important and are therefore considered primary building blocks of the learning environment. This paper proposes a methodology to support and enhance adaptive access to the knowledge residing in a learning resource based on varying user contexts. In the proposed approach, users and learning resources are represented using ontologies, and fuzzy logic is employed to identify the knowledge level of the student in order to improve the adaptability of access to the learning resource. Experimental results show that this approach enhances adaptive access to the knowledge in a learning resource and outperforms the baseline approach that uses ontology mapping.

V. Senthil Kumaran, RM. Periakaruppan

Cyber Security

Frontmatter
A Robust User Anonymity Preserving Biometric Based Multi-server Authenticated Key Agreement Scheme

With the rapid evolution of IoT (Internet of Things) technology, everyday objects from currency notes to bicycles will form a vital part of the Internet, exchanging information with their surrounding environments. IoT will consequently bring an increased number of users connecting to remote servers via the insecure public Internet. Connecting through resource-constrained, portable devices requires a lightweight, identity-preserving and highly secure key agreement and authentication mechanism. In this context, Mishra et al. recently proposed an anonymity-preserving biometric-based authentication scheme and claimed that it resists all major cryptographic attacks. In this manuscript we demonstrate that, on thorough analysis, the scheme of Mishra et al. is vulnerable to a forward-secrecy attack, on the basis of which an attacker can mount all key attacks. As our contribution, we propose a lightweight, strongly secure user authentication and key agreement scheme that is well suited to the IoT environment.

Mrudula Sarvabhatla, M. Chandra Mouli Reddy, Kodavali Lakshmi Narayana, Chandra Sekhar Vorugunti
Extended Game Theoretic Dirichlet Based Collaborative Intrusion Detection Systems

Security has always been one of the key issues of any man-made system; this paved the way for submodules, applications or devices that monitor a system for malicious activities, known as Intrusion Detection Systems (IDS). As technology evolves, so do the associated threats, and thus intrusion detection systems need to evolve as well. Game theory offers a different perspective that has not been explored much: it provides a way of mathematically formalizing the decision-making process of policy establishment and execution. Notions from game theory can assist an intrusion detection system in defining and dynamically reconfiguring security policies given the severity of attacks. We attempt to formulate a robust model for the theoretical limits of a game-theoretic approach to IDS. The most important flaw of game theory here is that it assumes the adversary's rationality and does not take multiple simultaneous attacks into consideration. Therefore, a collaborative trust- and Dirichlet-distribution-based robust game-theoretic approach is proposed to resolve this issue. Reinforcement learning approaches using Markov Decision Processes are utilized to make it robust against multiple simultaneous attacks.

Sayan Paul, Tushar Makkar, K. Chandrasekaran
Implementation of ECDSA Using Sponge Based Hash Function

Elliptic Curve Cryptography (ECC) is a public-key cryptographic technique in which encryption and decryption are performed in a finite field, either prime or binary. The goal of this work is to design a lightweight and fast message authentication algorithm. The digital signature algorithm used with ECC, the Elliptic Curve Digital Signature Algorithm (ECDSA), uses SHA-1 for generating the hash code. In this paper we propose a technique that uses a sponge hash for generating the hash code and signs the message with the newly generated hash code. This approach reduces the bytes-per-cycle cost of the algorithm used to generate the hash code for authentication; when the bytes-per-cycle cost is reduced, both the energy consumption and the computation time are reduced in resource-constrained environments.
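The sponge construction the paper builds on absorbs padded message blocks into the "rate" part of an internal state, permutes, then squeezes output. The sketch below shows the structure only; the byte-rotation permutation, the rate/capacity sizes and the round count are illustrative assumptions with no cryptographic strength:

```python
def _permute(state: bytes) -> bytes:
    """Toy state permutation: rotate each byte left and mix with its neighbour.
    Illustrative only -- NOT cryptographically secure."""
    n = len(state)
    return bytes((((s << 1) | (s >> 7)) & 0xFF) ^ state[(i + 1) % n]
                 for i, s in enumerate(state))

def sponge_hash(msg: bytes, rate=8, capacity=8, out_len=16, rounds=4) -> bytes:
    """Sponge construction sketch: pad, absorb rate-sized blocks, squeeze."""
    state = bytes(rate + capacity)
    # simple pad10* variant: append 0x80 then zeros up to a multiple of the rate
    msg = msg + b"\x80" + b"\x00" * ((-len(msg) - 1) % rate)
    for i in range(0, len(msg), rate):                       # absorb phase
        block = msg[i:i + rate]
        state = bytes(s ^ m for s, m in zip(state[:rate], block)) + state[rate:]
        for _ in range(rounds):
            state = _permute(state)
    out = b""
    while len(out) < out_len:                                # squeeze phase
        out += state[:rate]
        for _ in range(rounds):
            state = _permute(state)
    return out[:out_len]
```

A real deployment would replace `_permute` with a vetted permutation such as Keccak-f.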

M. Lavanya, V. Natarajan
Contrast-Enhanced Visual Cryptography Schemes Based on Perfect Reconstruction of White Pixels and Additional Basis Matrix

The existing pixel patterns for visual cryptography schemes are based on the Perfect Reconstruction of Black Pixels (PRBP). Mathematically, in PRBP the white pixels are represented by 0 and the black pixels by 1. In a typical binary image, the number of white pixels is much larger than the number of black pixels, so the perfect reconstruction of black pixels in visual cryptography schemes can decrease the contrast. Here, a visual cryptography scheme focused on the Perfect Reconstruction of White Pixels (PRWP), which can therefore provide better clarity, is presented. As in all existing binary image file formats, PRWP represents a white pixel by 1 and a black pixel by 0. A visual cryptography scheme with PRWP can improve the clarity of reconstructed images. However, analysis of the experimental results shows that the number of black pixels in the reconstructed image is much smaller than in the original image; increasing the black pixels in the reconstructed image can therefore improve the contrast. To increase the black pixels, an Additional Basis Matrix (ABM) is used to represent the new pixel pattern.
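One way to illustrate the PRWP idea is the dual of the classical 2-subpixel (2, 2) construction, sketched below with 1 = white; the patterns are an assumed illustration, not necessarily the authors' exact basis matrices. White secret pixels receive complementary subpixel patterns, so superimposing the shares reconstructs white perfectly, while black pixels come out half-toned:

```python
import random

def make_shares(secret_row, rng=random):
    """secret_row: list of pixels with 1 = white, 0 = black. Returns two share
    rows, each pixel expanded to 2 subpixels."""
    s1, s2 = [], []
    for px in secret_row:
        pat = rng.choice([[1, 0], [0, 1]])
        if px == 1:
            q = [1 - b for b in pat]   # white: complementary patterns
        else:
            q = pat[:]                 # black: identical random patterns
        s1 += pat
        s2 += q
    return s1, s2

def stack(s1, s2):
    """Superimpose shares: a subpixel shows white if it is white in either share."""
    return [a | b for a, b in zip(s1, s2)]
```

After stacking, every white pixel becomes [1, 1] (fully white), while each black pixel keeps exactly one white subpixel, which is the contrast loss the ABM is meant to reduce.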

Thomas Monoth, P. Babu Anto
Hash Based Two Gateway Payment Protocol Ensuring Accountability with Dynamic ID-Verifier for Digital Goods Providers

Mobile web payment for online shopping is currently done using only a single gateway. If a customer wants to use the balances of two different bank accounts in a single transaction, the customer has to transfer funds to one of the accounts before starting the transaction, which is time consuming. In this situation, the facility of making the payment via two gateways in a single transaction is more convenient for the customer. Our work is devoted to designing an efficient payment protocol for portable payment devices that makes the payment via two gateways for digital goods providers. The proposed scheme is analyzed with the automated tool CPSA with respect to various security features and computation overhead. Moreover, the computation overhead of this protocol is less than that of other schemes.

Venkatasamy Sureshkumar, R. Anitha, N. Rajamanickam
Decrypting Shared Encrypted Data Files Stored in a Cloud Using Dynamic Key Aggregation

Cloud file sharing is a topic that has long been in the spotlight and will remain so in the near future. Many methods, such as access control lists, attribute-based encryption, identity-based encryption and proxy re-encryption, have been introduced for sharing data stored in a cloud. Their drawbacks led to key-aggregation-based file sharing, where the content provider generates an aggregate key for the files to be shared and distributes it among the users. Even though this mechanism is very effective, it has the drawback that once a user obtains from the data owner an aggregate key corresponding to a set of files, the data owner cannot revoke that user's permission to access updated versions of those files. The scheme in this paper extends this existing work with dynamic generation of the aggregate key, and thereby prevents any unauthorised access to the updated files in the cloud.

Maria James, Chungath Srinivasan, K. V. Lakshmy, M. Sethumadhavan
A Lock and Key Share (2, m, n) Random Grid Visual Secret Sharing Scheme with XOR and OR Decryptions

The security of secret sharing schemes has become a significant issue as secret communication is now widely used. Random-grid-based visual secret sharing (RG-VSS) schemes have drawn greater attention owing to their specific advantages. A (2, n) RG-VSS produces n shares with equal priority, and any two of those shares can be combined to restore the original image. The proposed system boosts the security of (2, n) RG-VSS by integrating a lock-and-key share concept. A lock-and-key share (2, m, n) scheme generates m lock shares and n key shares, and retrieval of the original secret is possible if and only if one share is selected from the set of m lock shares and the other from the set of n key shares. Both OR-based and XOR-based decryptions are implemented. Further, the necessary theoretical proofs are supplied for the correctness of the implementation.
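The two decryption modes can be contrasted with a minimal bit-level (2, 2) random-grid sketch (a simplification, not the full (2, m, n) lock-and-key construction):

```python
import random

def rg_shares(secret_bits, seed=None):
    """Random-grid (2, 2) sketch: the 'lock' share is uniformly random and the
    'key' share is chosen so that XOR-stacking recovers the secret exactly."""
    rng = random.Random(seed)
    lock = [rng.randint(0, 1) for _ in secret_bits]
    key = [s ^ l for s, l in zip(secret_bits, lock)]
    return lock, key

def decrypt_xor(lock, key):
    """XOR decryption: lossless reconstruction of the secret."""
    return [l ^ k for l, k in zip(lock, key)]

def decrypt_or(lock, key):
    """OR decryption (physical stacking): positions where the secret bit is 1
    always come out 1; positions where it is 0 come out as the random lock bit."""
    return [l | k for l, k in zip(lock, key)]
```

This is why XOR-capable devices recover the secret perfectly while transparency stacking (OR) yields a noisier, lower-contrast result.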

T. Farzin Ahammed, Sulaiman Mundekkattil
Multilevel Multimedia Security by Integrating Visual Cryptography and Steganography Techniques

Multimedia security facilitates the protection of information in multiple forms of digital data such as text, image, audio and video. Many approaches are available for protecting digital data, including cryptography, steganography and watermarking. A novel approach is proposed for multimedia data security by integrating steganography and Visual Cryptography (VC), with the goal of improving security, reliability and efficiency. The proposed technique consists of two phases. The first phase hides the message dynamically in Cover Image1 by changing the number of bits hidden in the RGB channels based on an indicator value. VC schemes conceal Cover Image2 in two or more images called shares. In the second phase, two shares are created from Cover Image2 and the stego image created in the first phase is hidden in these two shares. The shares are safe, as they reveal nothing about the multimedia content. Cover Image2, the stego image and the hidden message can be recovered from the shares without any complex computation. Experimental results show that the new scheme is simple, retrieves multiple multimedia contents and accomplishes a high level of security.

M. Mary Shanthi Rani, G. Germine Mary, K. Rosemary Euphrasia
K Out of N Secret Sharing Scheme with Steganography and Authentication

With the communication, preservation and processing of information becoming digital, the security of such transactions is a matter of prime concern; this has driven the development of encryption and cryptography. The degree of security of confidential data depends on the number of people involved: at times security is guaranteed when a single hand is involved, while at other times secrets kept in parts by many ensure security. This has given rise to secret sharing, which has emerged as a hot area of research in computer science in the past decade. Existing secret sharing schemes suffer from pixel expansion and degradation in the visual quality of the recovered secret and cover images. This paper develops a steganography and authenticated image sharing (SAIS) scheme for gray and color images with meaningful shares and the ability to detect the manipulation of shadow images. The main feature of this scheme is the use of simple Boolean and arithmetic operations, which reduces the computational complexity from O(n log2 n) to O(n).

P. Mohamed Fathimal, P. Arockia Jansi Rani
A Centralized Trust Computation (CTC) Model for Secure Group Formation in Military Based Mobile Ad Hoc Networks Using Stereotypes

The self-organized, distributed nature of mobile nodes, frequent changes in topology, the shared wireless medium and other characteristics distinguish MANETs from other networks and encourage military communications that run on top of a MANET. At the same time, these characteristics pose security threats to military communication. As trust computations play a vital role in decision making and are suitable for resource-constrained devices, in this paper we propose a Centralized Trust Computation (CTC) model to ensure authentication, based on commanders evaluating team members' own experiences, recommendations, and social and sense-making trust, creating efficient and secure teams based on a stereotypes model. Adversaries who intentionally harm the mission, and selfish members constrained by a lack of resources, can thus be identified and isolated from the network; in this way authentication is achieved and a secure group can be formed.

S. Sivagurunathan, K. Prathapchandran
Cost Effective Rekeying Approach for Dynamic Membership Changes in Group Key Management

Security is an important requirement for reliable group communication over open networks in order to prevent intruder attacks. A common secret key, called the group key, is generated for encrypting the group information. A distributed key management methodology based on a dynamic decentralized group key agreement protocol is required to handle this issue. Rekeying, the generation of a new group key, is either membership-driven or time-driven: it is needed whenever a new member joins the group or an existing member leaves. Individual rekeying operations in a large group of users increase the rate of message exchanges, which degrades performance. This paper investigates the communication rounds, computation complexity and key storage of existing key management schemes and introduces an enhanced decentralized Sub Group Key Management (SGKM) approach that is efficient for rekeying. In the proposed work, a group is fragmented into subgroups, and each subgroup is managed by its respective manager using a one-encryption-key, multi-decryption-key protocol. The performance of the proposed algorithm is analyzed under different join and leave scenarios.

Raja Lavanya, K. Sundarakantham, S. Mercy Shalinie
A Multipath Routing Protocol Which Preserves Security and Anonymity of Data in Mobile Ad Hoc Networks

Mobile Ad Hoc Networks (MANETs) have become a likely target for a wide range of threats. Every routing protocol in a MANET should preserve the anonymity of mobile devices and data, along with the unlinkability of data packets. Aiming to improve routing efficiency, many routing protocols either partially sacrificed anonymity or disregarded unlinkability. Here, we propose a multipath routing protocol that enhances both security and routing efficiency in MANETs. Anonymity and unlinkability are completely preserved through keyed hash chains and per-hop alteration of data packet appearance. Routing efficiency is further enhanced through a route selection parameter called the Link Stability Metric (LSM) and the deployment of an elaborate route maintenance phase. The proposed system has been simulated and tested, and the results indicate that both security and routing efficiency are enhanced considerably.
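A keyed hash chain of the kind mentioned above can be sketched with HMAC-SHA256 (an illustrative choice of primitive, not necessarily the paper's): a node publishes the final link as an anchor, and any earlier link can later be verified by re-hashing forward.

```python
import hashlib
import hmac

def hash_chain(key: bytes, seed: bytes, n: int):
    """Build a keyed hash chain v0 -> v1 -> ... -> vn with
    v_{i+1} = HMAC(key, v_i); links[n] serves as the public anchor."""
    links = [seed]
    for _ in range(n):
        links.append(hmac.new(key, links[-1], hashlib.sha256).digest())
    return links

def verify(key: bytes, revealed: bytes, anchor: bytes, steps: int) -> bool:
    """Check that hashing `revealed` forward `steps` times reaches the anchor."""
    v = revealed
    for _ in range(steps):
        v = hmac.new(key, v, hashlib.sha256).digest()
    return hmac.compare_digest(v, anchor)
```

Because HMAC is one-way, revealing link v_i authenticates it against the anchor without exposing any earlier link.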

J Jaisooraj, M. K. Sulaiman, Manu J. Pillai
An Efficient Continuous Auditing Methodology for Outsourced Data Storage in Cloud Computing

In spite of enormous advancement in cloud data storage and computation technologies, data security remains challenging. This paper presents an efficient auditing method for integrity verification of outsourced data. The proposed scheme combines scheduling with data integrity verification mechanisms. Experimental results on computation and communication cost have also been obtained, showing that the proposed scheme is an efficient and intuitive choice for cloud storage.

Esther Daniel, N. A. Vasanthi

Computational Models

Frontmatter
A Study on Building Seamless Communication Among Vehicles in Vanet Using the Integrated Agent Communication Model (IACM)

The main objective of this research paper is to propose an Integrated Agent Communication Model for vehicular ad hoc networks. The model integrates five predefined mobile agents that aid in performing specific tasks to establish congestion-free, cooperative, energy-conserving and optimized communication, with efficient channel utilization, between nodes in vehicular ad hoc networks. This architecture significantly improves the ability of the control unit to coordinate the agents and to balance the load of the mobile agents that carry the information. The approach is to integrate all the agents in a single architecture and apply size reduction techniques to create lightweight mobile agents for efficient performance and improved Quality of Service in VANETs.

N. Sudha Bhuvaneswari, K. Savitha
A Hybrid Approach for Data Hiding Through Chaos Theory and Reversible Integer Mapping

Steganography is the science of hiding a message so that an intruder cannot even detect its existence. This paper proposes an efficient approach that embeds data into a cover image in the frequency domain. The DCT is applied to the cover image using integer mapping, and the encrypted secret is embedded into it using 3-3-2 LSB substitution. The encryption is achieved through chaos theory and the Caesar cipher technique. Integer mapping transforms a given image into its DCT representation and back into the spatial domain without any loss. The proposed method provides a high level of security, since the secret data is encrypted using chaos theory and embedded into the DCT-transformed cover image.
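The 3-3-2 LSB substitution step can be sketched at the pixel level as follows; the chaos-based encryption and DCT integer mapping are omitted here, and only a Caesar shift plus the bit packing are shown:

```python
def caesar(data: bytes, shift: int) -> bytes:
    """Byte-wise Caesar shift (mod 256)."""
    return bytes((b + shift) % 256 for b in data)

def embed_byte(pixel, byte):
    """Hide one secret byte in an (R, G, B) pixel with 3-3-2 LSB substitution:
    3 bits in R, 3 bits in G, 2 bits in B."""
    r, g, b = pixel
    r = (r & ~0b111) | (byte >> 5)            # top 3 bits -> R LSBs
    g = (g & ~0b111) | ((byte >> 2) & 0b111)  # middle 3 bits -> G LSBs
    b = (b & ~0b011) | (byte & 0b011)         # bottom 2 bits -> B LSBs
    return (r, g, b)

def extract_byte(pixel):
    """Recover the hidden byte from the channel LSBs."""
    r, g, b = pixel
    return ((r & 0b111) << 5) | ((g & 0b111) << 2) | (b & 0b011)
```

Putting only 2 bits into the blue channel exploits the eye's lower sensitivity pattern differently per channel while keeping each channel's distortion within its LSBs.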

S. S. V. Nithin Kumar, Gunda Sai Charan, B. Karthikeyan, V. Vaithiyanathan, M. Rajasekhar Reddy
Fluid Queue Driven by an $$ M/E_{2}/1 $$ Queueing Model

This paper deals with the stationary analysis of a fluid queue driven by an $$ M/E_{2}/1 $$ queueing model. The underlying system of differential difference equations that governs the process is solved using a generating function methodology. Explicit expressions for the joint steady-state probabilities of the state of the background queueing model and the buffer content are obtained in terms of a new generalisation of the modified Bessel function of the second kind. Numerical illustrations are added to depict the convergence of the joint probabilities to the steady-state solutions of the underlying queueing models.

K. V. Vijayashree, A. Anjuka
An Effective Tool for Optimizing the Number of Test Paths in Data Flow Testing for Anomaly Detection

Software testing plays an important role in the development process employed in industry and is a critical element of quality assurance. Structure-oriented testing methods, which define test cases on the basis of the internal program structure, are widely used. We have designed a structural-testing-based tool named EFTAD (Effective Tool for Anomaly Detection) that detects the most effective test paths, namely those containing data flow anomalies. The tool uses an ant colony algorithm for optimizing statically detected paths; it minimizes the number of paths to be tested while covering the maximum number of anomalies, thereby helping to ensure a reliable system. A recursive traversal of the control flow graph with artificial ants is performed, providing an efficient set of paths by implementing the all-def-uses (ADU) strategy of data flow testing. The tool also provides a test suite for the evaluation of test results. The test paths are prioritized based on the number of visits and the summed value over the iterations performed, so that unexpected behaviour of the data in a program is noted effectively.

M. Prabu, D. Narasimhan, S. Raghuram
Venus Flytrap Optimization

In this paper, we devise a novel nature-inspired meta-heuristic algorithm named Venus Flytrap Optimization (VFO), suitable for solving various optimization problems. The algorithm is based on the rapid closure behavior of the leaves of the Venus flytrap (Dionaea muscipula), which is triggered by the continuous stimulation of the trigger hairs by the fast movement of prey. An empirical study of the proposed algorithm is carried out using various test functions.

R. Gowri, R. Rathipriya
Zumkeller Cordial Labeling of Graphs

In this paper we introduce a new graph labeling called Zumkeller cordial labeling of a graph G = (V, E). It is defined as an injective function f: V → N such that the induced function f*: E → {0, 1}, given by f*(xy) = 1 if f(x)f(y) is a Zumkeller number and 0 otherwise, satisfies the condition $$ \left| e_{f}^{*}(0) - e_{f}^{*}(1) \right| \le 1 $$. We make use of a technique for generating Zumkeller numbers and the concept of cordiality in the labeling of graphs.
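The Zumkeller test underlying the edge labeling reduces to a subset-sum check over the divisors: a number is Zumkeller when its positive divisors can be split into two sets with equal sums. A sketch:

```python
def is_zumkeller(n: int) -> bool:
    """True if the positive divisors of n can be partitioned into two
    disjoint sets with equal sums."""
    divs = [d for d in range(1, n + 1) if n % d == 0]
    total = sum(divs)
    if total % 2:
        return False            # an odd divisor sum cannot split evenly
    target = total // 2
    reachable = {0}             # subset-sum dynamic programming
    for d in divs:
        reachable |= {r + d for r in reachable if r + d <= target}
    return target in reachable

def edge_label(fx: int, fy: int) -> int:
    """Induced edge label f*(xy): 1 if f(x)f(y) is a Zumkeller number."""
    return 1 if is_zumkeller(fx * fy) else 0
```

For example, 6 is Zumkeller since its divisors {1, 2, 3, 6} split into {1, 2, 3} and {6}, each summing to 6.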

B. J. Murali, K. Thirusangu, R. Madura Meenakshi
Cuckoo Based Resource Allocation for Mobile Cloud Environments

The boom of Mobile Cloud Computing fosters a large volume of smart mobile applications that offload processing and data handling to remote cloud servers. The efficient allocation of resources to a large number of requests in a Mobile Cloud Computing environment is an important aspect that needs special attention in order to make the environment highly optimized. In this paper, a Cuckoo-based allocation strategy is proposed, and the allocation is treated as an optimization problem with the aim of reducing the makespan and the computational cost while meeting deadline constraints and achieving high resource utilization. The proposed approach is evaluated using the CloudSim framework, and the results indicate that the proposed model provides better quality of service to mobile cloud customers.
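A minimal Cuckoo Search sketch on a generic cost function; the cloud-specific makespan/cost model is omitted, and the heavy-tailed step is a simplified stand-in for a proper Lévy flight:

```python
import random

def cuckoo_search(fitness, dim=2, n_nests=15, iters=200, pa=0.25, seed=42):
    """Minimize `fitness`: Levy-like moves toward the best nest, plus
    abandoning a fraction pa of the worst nests each iteration."""
    rng = random.Random(seed)
    nests = [[rng.uniform(-5, 5) for _ in range(dim)] for _ in range(n_nests)]
    best = min(nests, key=fitness)
    for _ in range(iters):
        for i in range(n_nests):
            # heavy-tailed step via an inverse power of a uniform draw
            step = [0.01 * (rng.uniform(1e-3, 1) ** -1.5) * rng.gauss(0, 1)
                    * (x - b) for x, b in zip(nests[i], best)]
            cand = [x + s for x, s in zip(nests[i], step)]
            if fitness(cand) < fitness(nests[i]):      # greedy replacement
                nests[i] = cand
        nests.sort(key=fitness)                        # abandon the worst nests
        for i in range(n_nests - int(pa * n_nests), n_nests):
            nests[i] = [rng.uniform(-5, 5) for _ in range(dim)]
        best = min(nests + [best], key=fitness)
    return best
```

In a resource-allocation setting, each nest would encode a request-to-resource mapping and `fitness` would combine makespan and cost with deadline penalties.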

S. Durga, S. Mohan, J. Dinesh, A. Aneena
Transient Analysis of an M/M/c Queue Subject to Multiple Exponential Vacation

In this paper, we consider an M/M/c queueing model subject to multiple exponential vacations, wherein arrivals occur according to a Poisson process and the c servers provide service according to an exponential distribution. When the system is empty, all c servers go on vacation, and the vacation times are assumed to follow an exponential distribution. Arrivals are allowed to join the queue while the servers are on vacation. Explicit analytical expressions for the time-dependent probabilities of the number in the system are presented using the matrix geometric method.

K. V. Vijayashree, B. Janani
Fractional Filter Based Internal Model Controller for Non Linear Process

Recently, fractional calculus has found applications in the field of process automation, where the mathematical properties of fractional operators are utilized to control the process. This article presents the design of a fractional-filter-based internal model controller (IMC) for a nonlinear process. The chosen process, the pH process, plays a significant role in industrial applications; control of the pH process is highly challenging because of its severely nonlinear characteristics and process uncertainty, as reflected in the titration curve. An attempt has been made to design the fractional-filter-based IMC controller for the pH process for a desired phase margin and crossover frequency using MATLAB-SIMULINK, with the Oustaloup approximation technique used to approximate the fractional filter. The servo and regulatory performance of the process is analyzed through simulation, and the fractional-filter-based IMC is found to achieve satisfactory control actions.
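For reference, the Oustaloup recursive approximation replaces the fractional operator $s^{\alpha}$, $0 < \alpha < 1$, by a rational filter over a chosen frequency band $[\omega_b, \omega_h]$; the commonly used form is stated below, though the paper's exact parameterization may differ:

```latex
s^{\alpha} \approx K \prod_{k=-N}^{N} \frac{s + \omega_k'}{s + \omega_k},
\qquad
\omega_k' = \omega_b \left(\frac{\omega_h}{\omega_b}\right)^{\frac{k + N + \frac{1}{2}(1-\alpha)}{2N+1}},
\qquad
\omega_k = \omega_b \left(\frac{\omega_h}{\omega_b}\right)^{\frac{k + N + \frac{1}{2}(1+\alpha)}{2N+1}},
\qquad
K = \omega_h^{\alpha}.
```

Larger N gives a better fit across the band at the cost of a higher-order filter.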

P. Anbumalar, C. Barath Kanna, J. Janani
A Novel Approach for Solving Triangular and Trapezoidal Intuitionistic Fuzzy Games Using Dominance Property and Oddment Method

In this paper, we examine intuitionistic fuzzy game theory problems in which the cost coefficients are triangular and trapezoidal intuitionistic fuzzy numbers. In conventional game theory problems, the cost is always certain. This paper develops an approach to solving intuitionistic fuzzy games where the costs are not deterministic numbers but imprecise ones. Here, the elements of the cost (profit) matrix of the game are triangular and trapezoidal intuitionistic fuzzy numbers, and their membership and non-membership functions are defined. A ranking technique is used to compare the intuitionistic fuzzy numbers so that the dominance property for intuitionistic fuzzy games may be applied; the dominance property in the intuitionistic fuzzy oddments method is then used to solve the games. Numerical examples show that the intuitionistic fuzzy ranking technique offers an effective tool for handling intuitionistic fuzzy games.
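The crisp core of the dominance reduction (applicable once a ranking technique has converted the intuitionistic fuzzy payoffs into comparable numbers) can be sketched as follows, for a row-maximizing / column-minimizing payoff matrix:

```python
def reduce_by_dominance(matrix):
    """Repeatedly delete weakly dominated rows (row player maximizes) and
    weakly dominated columns (column player minimizes).
    Returns the indices of the surviving rows and columns."""
    rows = list(range(len(matrix)))
    cols = list(range(len(matrix[0])))
    changed = True
    while changed:
        changed = False
        # row i is dominated by row j if every kept entry of i is <= that of j
        for i in rows[:]:
            if any(j != i and all(matrix[i][k] <= matrix[j][k] for k in cols)
                   for j in rows):
                rows.remove(i)
                changed = True
        # column k is dominated by column l if every kept entry of k is >= that of l
        for k in cols[:]:
            if any(l != k and all(matrix[r][k] >= matrix[r][l] for r in rows)
                   for l in cols):
                cols.remove(k)
                changed = True
    return rows, cols
```

In the intuitionistic fuzzy setting, the `<=` and `>=` comparisons would be replaced by the chosen ranking order on the fuzzy payoffs.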

M. Joseph Robinson, S. Sheela, A. Sudha Rani
Backmatter
Metadata
Title
Computational Intelligence, Cyber Security and Computational Models
Edited by
Muthukrishnan Senthilkumar
Vijayalakshmi Ramasamy
Shina Sheen
C. Veeramani
Anthony Bonato
Lynn Batten
Copyright Year
2016
Publisher
Springer Singapore
Electronic ISBN
978-981-10-0251-9
Print ISBN
978-981-10-0250-2
DOI
https://doi.org/10.1007/978-981-10-0251-9