
2015 | Book

Artificial Intelligence and Evolutionary Algorithms in Engineering Systems

Proceedings of ICAEES 2014, Volume 2

Edited by: L Padma Suresh, Subhransu Sekhar Dash, Bijaya Ketan Panigrahi

Publisher: Springer India

Book series: Advances in Intelligent Systems and Computing


About this book

The book is a collection of high-quality peer-reviewed research papers presented at the International Conference on Artificial Intelligence and Evolutionary Algorithms in Engineering Systems (ICAEES 2014), held at Noorul Islam Centre for Higher Education, Kumaracoil, India. These research papers provide the latest developments in the broad area of artificial intelligence and evolutionary algorithms applied to engineering systems. The book discusses a wide variety of industrial, engineering, and scientific applications of the emerging techniques. It presents invited papers from the inventors/originators of new applications and advanced technologies.

Table of Contents

Frontmatter
Web Information Extraction on Multiple Ontologies Based on Concept Relationships upon Training the User Profiles

There is a need for personalized Web information extraction. Mining vast information across the Web is not an easy task; various reduction techniques are needed to remove unwanted data and to grab the useful information from Web resources. Ontology is the best way of representing the useful information. In this paper, we develop a model based on multiple ontologies. From the constructed ontologies, a taxonomy is built based on the mutual information among the concepts, and the relationships among the concepts are then calculated. Thereby, the useful information is extracted. An algorithm is proposed for the same. The results show that the computation time for data extraction is reduced as the size of the database increases. This shows a healthy improvement for quick access of useful data from a huge information resource like the Internet.

S. Vigneshwari, M. Aramudhan
A Study on Competent Crawling Algorithm (CCA) for Web Search to Enhance Efficiency of Information Retrieval

Today's Web is huge and continually evolving in a dynamic manner. Search engines are the interface to retrieve information from the huge repository of the World Wide Web. Due to the difficulty of accessing information from the massive storage of the Web, search engines depend on crawlers to locate and retrieve relevant Web pages. A Web crawler is a software system that systematically finds and retrieves Web pages from Web documents. Crawlers use many Web search algorithms for retrieving Web pages. This paper proposes a competent Web search crawling algorithm, derived from the PageRank and BFS Web search algorithms, to enhance the efficiency of relevant information search. In this paper, an attempt has been made to study and examine the working nature of crawlers and crawling algorithms in search engines for efficient information retrieval.

S. Saranya, B.S.E. Zoraida, P. Victor Paul
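The abstract above combines BFS order with a PageRank-style score to drive the crawl frontier. The following minimal sketch (not the authors' CCA algorithm) illustrates one such hybrid frontier: a heap ordered first by crawl depth (BFS) and then by an approximate in-link count; the toy in-memory link graph stands in for real HTTP fetching and parsing.

```python
import heapq

# Toy in-memory "Web": page -> outgoing links (stands in for real HTTP fetching).
WEB_GRAPH = {
    "/home":    ["/news", "/sports", "/about"],
    "/news":    ["/home", "/sports"],
    "/sports":  ["/home", "/news", "/cricket"],
    "/cricket": ["/sports"],
    "/about":   [],
}

def crawl(seed, max_pages=10):
    """Breadth-first crawl whose frontier is ordered by (depth, -in_links):
    shallower pages come first (BFS), ties broken by a crude popularity score."""
    in_links = {}                      # approximate in-link counts seen so far
    frontier = [(0, 0, seed)]          # (depth, -score, url)
    visited, order = set(), []
    while frontier and len(order) < max_pages:
        depth, _, url = heapq.heappop(frontier)
        if url in visited:
            continue
        visited.add(url)
        order.append(url)
        for link in WEB_GRAPH.get(url, []):   # a real crawler fetches and parses the page here
            in_links[link] = in_links.get(link, 0) + 1
            if link not in visited:
                heapq.heappush(frontier, (depth + 1, -in_links[link], link))
    return order

print(crawl("/home"))   # visit order: shallower and more-linked pages first
```

A real crawler would replace the dictionary lookup with page fetching and link extraction, and would periodically recompute the popularity score over the pages seen so far.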
High Payload Reversible Watermarking for Securing Medical Images in a Cloud Environment

This paper proposes a high payload reversible data hiding technique in the integer lifting transform domain, with a special application to medical images stored in a cloud-based medical enterprise archive. Owing to the nature of these image statistics, neighboring pixel values are mostly similar, and hence their differences are observed to be close or equal to zero. A histogram constructed from this difference factor is exploited for reversible data embedding. Further, data are embedded at multiple levels, and hence the proposed scheme facilitates higher payload capacity than conventional single-level histogram-based techniques. The distortion introduced by secret payload embedding is kept to a minimum, and hence the perceptual quality of the stego images is exceptional, since the embedding is done in the integer lifting wavelet transform domain. The experimental results with medical images demonstrate that the proposed scheme provides better security with larger payload and better image quality than some advanced prior schemes.

R. Sukanesh, N. Karthikeyan
Enhanced Security Framework for Data Integrity Using Third-party Auditing in the Cloud System

Cloud computing is an evolving paradigm that has resulted from the adoption of available technologies. Although cloud technology allows users to draw more benefit from available infrastructures, and virtualization, an enabling technology provided by the cloud, allows users to manage and use resources efficiently and easily, it does not guarantee the integrity and security of the resources stored in the cloud. Though many security frameworks have been developed for the cloud, there may still be loss of data or loss of control over data uploaded into the cloud. Also, many solutions for data integrity based on a third-party auditor (TPA) are available, but they are only semi-trustable, since a compromised TPA may gain unauthorized access to resources in the cloud. Hence, our proposed scheme focuses on an extended framework that guarantees data integrity by involving the data owner in auditing the outsourced data in the cloud. Our proposed scheme thus achieves data integrity and guarantees security to data owners for their resources in the cloud to a major extent. Therefore, this type of TPA approach makes the data owner aware of their resources and thereby guarantees data integrity for every resource stored in the cloud.

Balamurugan Balusamy, P. Venkatakrishna, Abinaya Vaidhyanathan, Meenakshi Ravikumar, Nirmala Devi Munisamy
An Intelligent Cloud Security System for Critical Applications

Cloud has proven to be a cost-effective technology for computing and storage in the IT industry. Cloud security is often referred to as app security and is used to solve the communication issues between the cloud user and the cloud. Cloud security technology therefore demands more attention. While moving a critical application into a cloud, there is a need for privacy and security. Cloud technology has not yet fully addressed security, especially the malicious insider threat. This paper introduces a novel idea, an intelligent agent system, which is used for automated security. Whenever there is an alert about hackers/snoopers, a secret intelligent agent is automatically generated to secure the data from malicious insider attacks. The approach is validated using a weighted undirected mathematical model.

Balamurugan Balusamy, P. Venkatakrishna, Gomathi Palani, Umamageshwari Ravikumar
An Efficient Framework for Health System Based on Hybrid Cloud with ABE-Outsourced Decryption

Cloud computing is the emerging paradigm that allows the user to access data using the Internet [1]. Growing medical records and difficult data management are pushing the health department to move toward the cloud. Security and privacy are major issues in the cloud. Mission-critical applications are limited in the cloud, though it hosts numerous small- and medium-sized business applications. We propose a hybrid cloud-based framework for an overall healthcare system with attribute-based encryption and verifiable outsourced decryption for efficient data access, and integration of hospitals using a community cloud. Secure data transfer and integration of hospitals are made possible using the hybrid cloud. Our framework overcomes security flaws by achieving data integrity and data confidentiality with secure authentication and authorization. Thus, the cloud provides scalable, efficient data access with a cost-effective approach.

B. Balamurugan, P. Venkata Krishna, N. Saravana Kumar, G. V. Rajyalakshmi
A Knowledgeable Feature Selection Based on Set Theory for Web Intrusion Detection System

Web intrusion detection systems (IDS) are security programs that decide whether events and activities occurring in a Web application or network are legitimate. The objective of a Web IDS is to identify intrusions with a high detection rate and low false alarms while consuming minimal resources. However, intelligent Web IDS face snags concerning efficiency, false positives, and false negatives, while today's advanced Web page creation approaches also face training/learning in the cloud, high false alarm rates, and low detection rates. In this paper, an efficient feature selection approach is proposed by selecting an optimum subset of features. A hybrid feature selection relevance algorithm that decouples relevance and redundancy analysis is used for optimum subset feature selection. Empirical results show that the new proposed system gives a better and more robust representation of an ideal intrusion detection system while having a reduced total number of features, reduced false alarms, a high detection rate, and the least computation cost.

Nalini Priya Ganapathi, Vivek Duraivelu
FLC-Based Adaptive Neuro-Fuzzy Inference System for Enhancing the Traveling Comfort

In this paper, a pioneering adaptive neuro-fuzzy inference system (ANFIS), trained with data obtained from the well-known intelligent control technique, the fuzzy logic controller (FLC), for a half-car (HC) model is proposed to improve traveling comfort. In automobile industries, the traveling performance of a vehicle is tested at the design stage by simulating the vehicle response to various road excitations under different loading conditions. In this work, the disturbance from the road is assumed to be a dual bump. Initially, an FLC is designed to give better performance. Secondly, a flexible machine learning approach, an artificial neural network (ANN) trained with the FLC data using the mean square error (MSE) as the performance measure, is designed and used. Finally, an ANFIS with the adaptive and generalizing features of the ANN and the intelligence of the FLC is used for control. In modeling the system, simulations with and without controllers are carried out in the MATLAB/Simulink environment. A comparison among the responses of the system with these controllers shows that the system with the FLC-based ANFIS gives a significant reduction of the body acceleration (BA) and thus improves traveling comfort.

K.K. Sneha, Lakshmi Ponnusamy, R. Kalaivani
State Variable Filter Design Using Improvised Particle Swarm Optimization Algorithm

State variable filter design using the particle swarm optimization algorithm proves to be better when compared to the conventional design method. It gives several solutions for the component values that are useful in designing the state variable filter. The automatic termination technique gives the best possible solution in less time. This technique has several advantages in terms of a quicker convergence rate and efficient computation toward the suitable output, with the added advantage of giving the user control over the output's precision. The performance parameter here can be defined as the trade-off between the convergence time and the accuracy of the resulting solution, which is determined by the precision value. The results also indicate that a solution with a predefined precision level can be obtained with the minimum number of iterations in minimum time.

Aakash Indoria, Varatharajan Varrun, Akshay, Murali Krishna Reddy, Tejaswi Sathyasai, Baskaran Anand, Nirmala M. Devi
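As a companion to the abstract above, here is a generic, hedged sketch of particle swarm optimization with a precision-controlled early stop. The objective shown (matching a filter cutoff frequency from hypothetical R and C component values) is an illustrative stand-in, not the authors' state variable filter formulation.

```python
import math
import random

def pso(objective, dim, bounds, n_particles=30, iters=200, tol=1e-9):
    """Plain particle swarm optimization; returns (best_position, best_value).
    Stops early once the best value drops below `tol` (automatic termination)."""
    w, c1, c2 = 0.7, 1.5, 1.5                  # inertia and acceleration weights
    lo, hi = bounds
    pos = [[random.uniform(lo, hi) for _ in range(dim)] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]
    pbest_val = [objective(p) for p in pos]
    g = min(range(n_particles), key=lambda i: pbest_val[i])
    gbest, gbest_val = pbest[g][:], pbest_val[g]
    for _ in range(iters):
        if gbest_val < tol:                    # precision-controlled termination
            break
        for i in range(n_particles):
            for d in range(dim):
                r1, r2 = random.random(), random.random()
                vel[i][d] = (w * vel[i][d]
                             + c1 * r1 * (pbest[i][d] - pos[i][d])
                             + c2 * r2 * (gbest[d] - pos[i][d]))
                pos[i][d] = min(hi, max(lo, pos[i][d] + vel[i][d]))
            val = objective(pos[i])
            if val < pbest_val[i]:
                pbest[i], pbest_val[i] = pos[i][:], val
                if val < gbest_val:
                    gbest, gbest_val = pos[i][:], val
    return gbest, gbest_val

# Toy objective: pick R (kOhm) and C (nF) so that f_c = 1/(2*pi*R*C) hits 1 kHz.
target_fc = 1000.0
objective = lambda x: (1.0 / (2 * math.pi * x[0] * 1e3 * x[1] * 1e-9) - target_fc) ** 2
print(pso(objective, dim=2, bounds=(1.0, 100.0)))
```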
A Novel Algorithm on IP Traceback to Find the Real Source of Spoofed IP Packets

With the availability of the Internet at people's doorsteps in recent years, there has been a wide range of invasions from strangers, such as distributed denial of service (DDoS) attacks. DDoS can be launched from any location, draining the resources of the victim machine or network. The original IP address of the attacker is more often than not spoofed; hence, an IP traceback scheme is needed to trace the source of a packet. In this paper, we propose a novel marking algorithm which provides single-packet traceback directly at the victim's location. The marking algorithm is simple to use, with negligible computation and no storage overhead compared to existing systems. Further, the traceback is convenient for the victim, as neither a full network traversal nor out-of-band messages are needed to identify the attack source.

M. Vijayalakshmi, N. Nithya, S. Mercy Shalinie
A Queueing Model for e-Learning System

Much has been written about e-Learning practice; however, little attention has been given to developing a mathematical model for e-Learning. As the lack of a proper mathematical model will hinder providing better service to customers, we study which of the existing mathematical models could fit e-Learning. We argue with statistical data that (M/M/C): (∞/FIFO) is one of the models which best fits e-Learning. This paper aims to provide inputs showing that the suggested queuing model can be used for an e-Learning system in real conditions.

T. Senthil Kumar, K. I. Ohhm Prakash
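The (M/M/C): (∞/FIFO) model cited in the abstract has standard closed-form results. The sketch below computes the Erlang C probability of waiting and the mean queueing time from textbook formulas; the arrival rate, service rate, and number of servers are illustrative values, not figures from the paper.

```python
from math import factorial

def mmc_metrics(lam, mu, c):
    """(M/M/C): (inf/FIFO) textbook results: probability that an arriving
    request must wait (Erlang C) and the mean waiting time in the queue."""
    a = lam / mu                          # offered load in Erlangs
    rho = a / c                           # server utilization; must be < 1
    assert rho < 1, "queue is unstable"
    p0 = 1.0 / (sum(a**n / factorial(n) for n in range(c))
                + a**c / (factorial(c) * (1 - rho)))
    p_wait = a**c / (factorial(c) * (1 - rho)) * p0   # Erlang C formula
    lq = p_wait * rho / (1 - rho)         # mean number of requests in the queue
    wq = lq / lam                         # mean waiting time (Little's law)
    return p_wait, wq

# Illustrative numbers: 30 learner requests/hour, 10-minute mean service, 6 tutors.
p_wait, wq = mmc_metrics(lam=30.0, mu=6.0, c=6)
print(f"P(wait) = {p_wait:.3f}, mean queueing time = {wq * 60:.1f} min")
```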
Development of Common Parallel Programming Platform for MPI and PVM

Parallel virtual machine (PVM) and message-passing interface (MPI) are the most successful message-passing libraries for mapping parallel algorithms onto parallel computing platforms. Configuration of MPI and PVM on the nodes present in a parallel computing environment (desktop PCs interconnected using an Ethernet LAN) is a time-consuming task for a user. This configuration procedure requires a lot of knowledge about the steps to be followed to make them work properly, and it becomes more difficult as the number of nodes in the parallel computing environment grows. Our work aims at developing a common parallel programming platform, which allows a user to get MPI and PVM on the nodes without any time-consuming configuration steps. This is done by integrating the recent version of PVM, PVM 3.4.6, into MPI's MPICH2 packages in order to simplify the time-consuming task of configuring them separately.

Veerabasavantha Swamy, S. Sampath, B. R. Nanjesh, Bharat Bhushan Sagar
Association Rule Mining and Refinement Using Shared Memory Multiprocessor Environment

Rules that represent an association between the values of certain attributes and those of others are called association rules. The process of extracting such rules from a given dataset is called association rule mining (ARM). The work aims at effective utilization of all the cores present in the system with less time wastage and also balances the workload among them. Full-fledged use of system resources and load balance can be achieved by perfect scheduling and providing efficient parallel algorithms. This paper discusses such a parallel ARM algorithm and rule generation based on the frequent combinations obtained. As the generated rules are numerous as well as redundant, insignificant, and unproductive, it is necessary to filter and refine the rules.

P. Asha, T. Jebarajan
Design of Low-Power Multiplier Using UCSLA Technique

Multiplication is one of the fundamental operations and key hardware blocks in any digital system. This paper presents a comparison of the VLSI design of the uniform carry select adder (UCSLA)-based multiplier technique with the variable carry select adder (VCSLA)-based multiplier technique. The analysis is carried out on different bit sizes of unsigned inputs, and the output results show that the area, power, and delay are reduced in the UCSLA-based multiplier technique compared to the VCSLA-based technique. The timing delay in the 64-bit VCSLA-based multiplier technique is 95.25 ns for performing the multiplication, which is reduced by 11.11 % in the UCSLA-based multiplier technique. In the same manner, area is reduced by 39.42 % and power by 19.28 % in the UCSLA-based multiplier technique. The simulations of the multipliers are carried out in Verilog HDL (ModelSim). After simulation, the results are obtained using the Cadence tool.

S. Ravi, Anand Patel, Md Shabaz, Piyush M. Chaniyara, Harish M. Kittur
Hyperspectral Image Compression Algorithms—A Review

Satellite-based remote sensing applications require collection of high volumes of image data of which hyperspectral images are a particular type. Hyperspectral images are collected by high-resolution instruments over a very large number of wavelengths on board a satellite/airborne vehicle and then sent onwards to a ground station for further processing. Compression of hyperspectral images is undertaken to reduce the on-board memory requirement, communication channel capacity, and the download time. Compression algorithms can be either lossless or lossy. The purpose of this paper is to review a number of compression techniques employed for onsite processing of hyperspectral image data, to reduce the transmission overhead. A review of the theory of hyperspectral images and the compression techniques employed therein with emphasis on recent research developments is presented. Recent research on video compression techniques for hyperspectral imaging (HSI) is also discussed.

K. Subhash Babu, V. Ramachandran, K. K. Thyagharajan, Geeta Santhosh
Reliable Virtual Cluster-Based Opportunistic Routing Protocol for Mobile Ad hoc Networks

Geographic opportunistic routing protocols overcome the shortcomings of traditional routing protocols, which fail due to the dynamic nature of mobile ad hoc networks. When link stability is considered for forwarder selection, the hop count and delay become unpredictable. Hence, a new protocol, the delay-aware cluster-based opportunistic routing (DACOR) protocol, has been proposed. In DACOR, nodes with high transmission range form loose clusters with surrounding nodes. When such high-power nodes are not available, nodes with lower transmission range enable the routing process by opportunistically selecting the best forwarder based on link stability. A routing hole handling mechanism is also proposed for the case when no neighbors are found. Simulation results show that the presence of high-transmission nodes reduces the hop count, thus ensuring low delay, and opportunistic routing guarantees reliable data delivery.

E. Sahaya Rose Vigita, E. Golden Julie, S. Tamil Selvi
Weighted Euclidean Distance Based Sign Language Recognition Using Shape Features

This paper proposes a real-time static hand gesture recognition system for American Sign Language alphabets. The input hand gestures against a simple background are captured by a camera, and an image database is created. The proposed system consists of four stages, namely preprocessing, segmentation, feature extraction, and classification. In the training phase, the hand region is detected and segmented from the gesture database images, and various shape-based features such as area, perimeter, and roundness are extracted. The extracted features form a unique feature vector for a particular gesture. In the testing phase, the feature vector of an input test image is compared with each of the feature vectors of the database images using the weighted Euclidean distance. The gesture with the smallest distance is taken as the recognized gesture. The system is tested using a dataset of twenty-four ASL alphabets with three different signers. The experimental results show that the proposed system achieves a recognition rate of 91.6 %.

S. Nagarajan, T. S. Subashini
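The classification step described above reduces to a nearest-neighbor search under a weighted Euclidean distance over shape features. A minimal sketch, with made-up feature values and weights (the paper's actual weights and feature scales are not reproduced here):

```python
import math

def weighted_euclidean(x, y, w):
    """Weighted Euclidean distance between two feature vectors."""
    return math.sqrt(sum(wi * (xi - yi) ** 2 for xi, yi, wi in zip(x, y, w)))

def recognize(test_vec, database, weights):
    """Return the label of the database gesture closest to the test vector."""
    return min(database, key=lambda label: weighted_euclidean(test_vec, database[label], weights))

# Hypothetical shape features (area, perimeter, roundness) and weights.
database = {"A": (5200.0, 310.0, 0.68), "B": (6100.0, 355.0, 0.61), "C": (4800.0, 290.0, 0.72)}
weights = (1e-6, 1e-4, 1.0)    # rescale features so no single one dominates the distance
print(recognize((5150.0, 305.0, 0.69), database, weights))   # -> 'A'
```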
Event Monitoring for Adaptive Multi-priority Streaming Time Sensitive-Based EDF Scheduling

Real-time systems are bound by strict time constraints; to meet them, task scheduling is needed. Earlier approaches are restricted to fixed priority scheduling policies, which follow a static priority algorithm: a priority is assigned statically and scheduling is done dynamically, so dynamic priority requests are not supported. To overcome this, preemptive earliest deadline first (EDF) scheduling, a dynamic priority scheduling algorithm, is used. It ensures that higher priority requests are executed first and experience lower mean waiting time, without starving lower priority requests. But preemptive EDF leads to increased runtime overhead. Hence, the proposed method uses limited preemption EDF scheduling, which assigns an approximate deadline to each request, and the requests are serviced with limited preemption. It splits a request into multiple jobs and assigns fixed preemption points (FPP) to each sub-job; preemption is allowed only at these FPPs. It is shown experimentally that the mean waiting times for higher and lower priority tasks are minimized with less runtime overhead.

P. Leela, S. Sathees babu, K. Balasubadra
How Does Consciousness Overcome Combinatorial Complexity?

Mind is a complex information-processing mechanism which causes consciousness—the processed information content of mind. Models are the basic units of information processed in the mind. Comparison is one of the operations that causes consciousness. When we compare two models, the problem of combinatorial complexity (C.C.) arises. In this paper, we describe the way our mind overcomes the difficulty of C.C. and its consequences.

Reji Kumar
Website Re-organization for Effective Latency Reduction Through Splay Trees and Concept-Based Clustering

Interest in the analysis of user behavior on the Web has been increasing rapidly. This increase stems from the realization that added value for the visitor of a Website is not gained merely through larger quantities of data on a site, but through easier access to the required information at the right time and in the most suitable form. Hence, understanding users' navigation on the Web is important for improving the quality of information and the speed of accessing large-scale Web data sources. As the interests of users change over time, a static Website will soon become outdated. Hence, the usage of the Website needs to be monitored and the structure of the Website has to be modified periodically to suit user requirements. In this paper, we propose a novel splay tree-based approach that reduces the latency in accessing Web pages by reorganizing the Website for a group of users' interests rather than a single user, such that the pages most recently and frequently accessed by a user group belonging to some concept/category are placed nearer to the root. Experimental results show that splaying along with concept-based clustering gives better performance for seasonal Websites that need to change periodically.

M. B. Thulase, G. T. Raju
Algorithms for Zumkeller Labeling of Full Binary Trees and Square Grids

Let G = (V, E) be a graph. An injective function f : V → N is said to be a Zumkeller labeling of the graph G, if the induced function f* : E → N defined as f*(xy) = f(x)f(y) is a Zumkeller number for all xy ∈ E, x, y ∈ V. A graph G = (V, E) which admits a Zumkeller labeling is called a Zumkeller graph. In this paper, we provide algorithms for Zumkeller labeling of full binary trees and grid graphs.

B. J. Balamurugan, K. Thirusangu, D. G. Thomas
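For readers unfamiliar with the underlying number-theoretic condition: a Zumkeller number is a positive integer whose divisors can be partitioned into two disjoint sets of equal sum, and every edge label f*(xy) in a Zumkeller labeling must satisfy it. A small subset-sum check, independent of the paper's labeling algorithms:

```python
def divisors(n):
    return [d for d in range(1, n + 1) if n % d == 0]

def is_zumkeller(n):
    """True if the divisors of n can be split into two sets with equal sums,
    e.g. 6 -> {1, 2, 3} and {6}."""
    divs = divisors(n)
    total = sum(divs)
    if total % 2:                        # an odd divisor sum can never split evenly
        return False
    target, reachable = total // 2, {0}
    for d in divs:                       # classic subset-sum over the divisors
        reachable |= {s + d for s in reachable if s + d <= target}
    return target in reachable

# Candidate edge labels f*(xy) = f(x)f(y) must all pass this test.
print([n for n in range(1, 61) if is_zumkeller(n)])   # 6, 12, 20, 24, 28, 30, ...
```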
Memetic Framework Application—Analysis of Corporate Customer Attitude in Telecom Sector

Natural and cultural evolutionary processes can be implemented in real-time applications using the memetic computing process. Popular research based on evolutionary processes has dealt with universal criteria, so the need for location-dependent population searches leads to research based on the cultural traits of the individual, i.e., memetic computational applications. In the telecom sector, the decision-making process of corporate customers is studied with applications based on memetic computation. This paper presents an innovative approach to analyze customer attitude with objective, subjective, and inter-subjective criteria in a multi-attribute deterministic environment. Two metrics, viz. value of business (VOB) and number of services (NOS), are taken as reference using the memetic attributes. Experimental analysis shows that, with respect to the telecom sector, the memetic framework has improved the corporate customer attitude toward the services, to the betterment of customer relationship management.

V. Balakumar, C. Swarnalatha
Contourlet-Based Multiband Image Fusion for Improving Classification Accuracy in IRS LISS III Images

Unsupervised classification plays a vital role in overseeing the transformations on the earth's surface. It has an indispensable role in an immense range of applications such as remote sensing, motion detection, environmental monitoring, medical diagnosis, damage assessment, agricultural surveys, and surveillance. In this paper, a novel method for unsupervised classification in multitemporal optical images based on image fusion and Gaussian RBF kernel K-means clustering is proposed. Here, the image is generated by performing contourlet-based multiband image fusion on the red, green, and near-IR images. On the finest image generated by collecting the information from the three bands, Gaussian RBF kernel K-means clustering is performed. In Gaussian RBF kernel K-means, nonlinear clustering is performed; as a result, the false alarm rate is reduced and the accuracy of the clustering process is enhanced. The aggregation of image fusion and RBF kernel K-means clustering is seen to be more effective in detecting the changes than its predecessors.

K. Venkateswaran, N. Kasthuri, K. Balakrishnan, K. Prakash
Computer-Aided Diagnosis of Breast Elastography and B-Mode Ultrasound

Ultrasound (US) elastography, a new technique that images the elasticity of tissues, is now entering the course of breast cancer diagnosis. The purpose of this study was to assess the diagnostic performance of a neural network using a combination of the US elastography technique and US B-mode. A back-propagation neural network (BPN) is used to classify breast masses as benign cyst, benign solid mass, or malignant solid mass using texture, strain, and morphological features computed from the segmented lesions. Sixty-two biopsy-proven breast lesions in US elastography and US B-scan images are examined. The classification accuracy using a combination of US elastography and B-scan images is 87.09 %, sensitivity 89.29 %, specificity 85.29 %, positive predictive value 83.33 %, and negative predictive value 90.63 %. With statistically significant features, the classification accuracy using a combination of US elastography and B-scan images is 82.25 %, with sensitivity 92.86 %, specificity 73.53 %, positive predictive value 74.29 %, and negative predictive value 92.59 %. The classification results indicate that US elastography in combination with US B-mode improves both sensitivity and specificity.

Shirley Selvan, S. Shenbagadevi, S. Suresh
A Convivial Energy Based Clustering (CEBC) Solution for Lifetime Enhancement of Wireless Sensor Networks

Wireless sensor networks require robust and energy-efficient communication protocols to minimize energy consumption as much as possible. Numerous energy-based cluster head election algorithms have been proposed and implemented. However, the capacities and workloads of the neighbors of cluster heads have not been considered in large wireless sensor networks. A convivial energy-based clustering (CEBC) head selection scheme, where cluster heads are elected based on the energy value of a node and the energy values of its neighbors, is proposed in this paper. This ensures that the neighbor nodes within one-hop range of a cluster head do not drain their energy while forwarding data to the other member nodes, especially in large networks. Simulations in a network simulator show that the CEBC cluster head selection scheme improves the lifetime of the network compared to the LEACH protocol.

Betty Madhurya Vallapuram, Gokul P. Nair, Kaliraja Thangamani
Wind Power in India—An Overview

The world, including India, is currently experiencing an increase in electrical power demand due to recent advancements in technology. Fossil fuels are predominantly used to meet the increase in demand, leading to their depletion. Hence, alternative sources of energy must be utilized along with fossil fuels to meet the demand. Renewable energy sources, an eco-friendly form of energy, could be used effectively in meeting the crises faced by the world in terms of power demand, emission of greenhouse gases, and so on. Wind energy, a type of renewable energy, could be used more effectively than other renewable energy sources to meet the crisis. The problem faced in dispatching and integrating wind energy with the grid is the randomness of the wind, which makes wind prediction a necessary factor to determine. This paper explores the difficulties and challenges faced in integrating and dispatching wind power generation with the existing grid in India.

T. V. Gowtham, G. R. Venkatakrishnan, R. Rengaraj
Series Resonant Converter-based Unified Power Quality Conditioner Using Fuzzy Controller

The improvement of power quality at the grid is widely achieved by one of the custom power devices, the unified power quality conditioner (UPQC). This paper presents a three-phase unified power quality conditioner based on fuzzy-controlled resonant converters, which has the capability to obtain good transient current and voltage responses. The fuzzy logic controller is used to control the resonant converter. Simulation results based on MATLAB/Simulink are given to illustrate the effectiveness of the proposed UPQC based on the resonant converter. The obtained results are compared with other previously proposed topologies.

R. Anand, P. Melba Mary
MRI Brain Image Classification Using Haar Wavelet and Artificial Neural Network

A combined approach with MRI brain image denoising and abnormality detection is proposed in this paper. The proposed technique comprises three stages, namely (i) image preprocessing, (ii) feature extraction, and (iii) image classification. Initially, in the preprocessing stage, denoising is performed on the input brain MRI image. The denoising of the input image increases the accuracy of the feature extraction stage. In the feature extraction phase, image features such as the mean, variance, and multilevel 2D Haar wavelet decomposition are extracted for classifying the images in the database into normal and abnormal. Using these extracted features, the MRI brain images are classified by the well-known feed-forward back-propagation neural network (FFBNN) classification technique. The implementation of the proposed method shows improvements in the classification of MRI images.

J. C. Smitha, S. Suresh Babu
Adaptive Modified Hysteresis Current Control for Switching Loss Reduction in Photovoltaic-Fed Dual-Function Grid-Connected Inverters

Current control logic plays a very important role in the overall performance of grid-connected inverters. Adaptive modified hysteresis current control is used in this work for switching loss reduction, optimization of the inverter switching frequency, and reduction of the total harmonic distortion of the supply current. The photovoltaic-fed grid-connected inverter injects real power from the photovoltaic array into the grid, controls real power flow in the grid, and functions as a shunt active filter. Good DC bus stabilization, reactive power compensation, satisfactory performance under unbalanced source and load conditions, and good dynamic response are also achieved.

Preethi Thekkath, S. U. Prabha
Low-complexity Power Spectral Density Estimation

This paper presents a method of feature extraction to detect seizures in epileptic patients. Epileptic seizures are characterized by high-amplitude, synchronized electroencephalogram (EEG) waveforms. The power spectral density (PSD) of the EEG signal plays an important role in the diagnosis of epilepsy. Many automated diagnostic systems for epileptic seizure detection have emerged in recent years. This paper proposes a method of extracting the PSD of EEG sub-bands using a low-complexity PSD estimation method, which reduces the complexity of the automatic diagnostic system and also enhances its speed. The low-complexity PSD estimation method was implemented on a digital signal processor (TMS320C6713), and the results were very similar to the traditional Welch PSD estimation method, with a 30 % reduction in computation time.

N. Balasaraswathy, R. Rajavel
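The paper benchmarks its low-complexity estimator against the traditional Welch method. The snippet below shows only that Welch baseline on a synthetic EEG-like signal using SciPy; the sampling rate, segment length, and alpha-band limits are assumptions, and the low-complexity method itself is not reproduced.

```python
import numpy as np
from scipy.signal import welch

fs = 256.0                                   # assumed EEG sampling rate in Hz
t = np.arange(0, 4.0, 1.0 / fs)
# Synthetic stand-in for an EEG trace: a 10 Hz alpha-like rhythm plus noise.
eeg = np.sin(2 * np.pi * 10 * t) + 0.5 * np.random.randn(t.size)

# Traditional Welch estimate: averaged periodograms of overlapping windowed segments.
f, pxx = welch(eeg, fs=fs, nperseg=256)

# Band power of a sub-band (here the 8-13 Hz alpha band) from the PSD.
band = (f >= 8) & (f <= 13)
alpha_power = np.trapz(pxx[band], f[band])
print(f"peak frequency: {f[np.argmax(pxx)]:.1f} Hz, alpha-band power: {alpha_power:.3f}")
```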
Intensified Scheduling Algorithm for Virtual Machine Tasks in Cloud Computing

Scheduling of jobs is essential, with distribution of load on processors and dynamic allocation of resources, in order to get maximum benefit in terms of makespan. In scheduling, the mapping of tasks is done based on their characteristics and user requirements. Many task parameters, such as cost, load, and the resources required for task completion, are to be considered while scheduling. In the cloud, resources should be utilized efficiently, and hence scheduling should consider resource utilization to reduce the execution time and thereby increase the throughput of the system. In this paper, we propose a new scheduling algorithm supporting load balancing in the cloud with respect to various types of quality of service based on resources. To evaluate the scheduling algorithm, performance metrics such as execution time, average execution time of each resource, and the number of tasks assigned to each resource are taken into consideration.

K. A. Saranu, Suresh Jaganathan
Design of Neural Network Model for Emotional Speech Recognition

Human–computer interaction (HCI) needs to be improved in the field of recognition and detection. In particular, emotion recognition has a major impact on social, engineering, and medical science applications. This paper presents an approach for emotion recognition from emotional speech based on a neural network. Linear predictive coefficients and a radial basis function network are used as the features and classification technique, respectively, for emotion recognition. Results reveal that the approach is effective in recognizing human speech emotions. Speech utterances are extracted directly from the audio channel, including background noise. In total, 75 utterances from 5 speakers were collected across five emotion categories. Fifteen utterances were used for training and the rest for testing. The proposed approach has been tested and verified on the newly developed dataset.

H. K. Palo, Mihir Narayana Mohanty, Mahesh Chandra
Intelligent Decision Making with Improved Energy Detection for Precise Spectrum Sensing in Cognitive Radios

Energy detection is best suited for spectrum sensing when prior knowledge about the licensed user is unavailable. The performance of this technique is primarily influenced by the available test statistic, the number of samples used to compute the test statistic, and the decision threshold, and is described in terms of the probabilities of detection and false alarm. This paper focuses on an intelligent sensing scheme with an improved energy detection algorithm in which the test statistic is computed using an arbitrary positive power instead of the squaring operation. The detection performance is found to be considerably improved compared to the traditional energy detection algorithm. Simulations are performed, and the results confirm the accuracy of the analysis.

K. Muthumeenakshi, S. Radha, R. Sudharsana, R. Tharini
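A hedged sketch of the improved energy-detection idea described above: the test statistic averages |x|^p for an arbitrary positive power p rather than p = 2, and the threshold is calibrated empirically for a target false-alarm rate. The SNR, sample count, and Gaussian signal model are illustrative assumptions rather than the paper's settings.

```python
import numpy as np

def test_statistic(samples, p=2.0):
    """Energy-detection statistic: mean of |x|^p. p = 2 is the classical energy
    detector; the improved scheme uses an arbitrary positive power p instead."""
    return np.mean(np.abs(samples) ** p)

rng = np.random.default_rng(0)
n, p, trials = 1000, 1.5, 2000
snr = 10 ** (-10.0 / 10)                      # -10 dB primary-user SNR (illustrative)

# Calibrate the threshold for a 10 % false-alarm rate from noise-only runs.
noise_stats = np.sort([test_statistic(rng.normal(0, 1, n), p) for _ in range(trials)])
threshold = noise_stats[int(0.9 * trials)]

# Estimate the detection probability with the primary-user signal present.
detections = sum(
    test_statistic(rng.normal(0, 1, n) + np.sqrt(snr) * rng.normal(0, 1, n), p) > threshold
    for _ in range(trials))
print(f"threshold = {threshold:.4f}, empirical P_d = {detections / trials:.3f}")
```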
Human Action Recognition by Employing DWT and Texture

Human action recognition is a very challenging task due to the great variability with which different people may perform the same action. It is involved in the development of applications such as automatic monitoring, surveillance, and intelligent human–computer interfaces. We propose an action recognition scheme to classify human actions based on the positive portion using a template-based approach from a video. We first define the accumulated motion image (AMI) using frame differences to represent the spatiotemporal features of occurring actions. Then, the direction of motion is found by computing the motion history image (MHI). Texture and spatial information are extracted from the AMI and MHI using the local binary pattern (LBP) and discrete wavelet transform (DWT), respectively. The detection of objects and extraction of moving objects are done by feature extraction over LBP and DWT. The feature vectors are computed by employing the seven Hu moments. The system is trained using a nearest neighbor classifier, and the actions are classified and labeled accordingly. The experiments are conducted on the Weizmann dataset.

V. Thanikachalam, K. K. Thyagharajan
Modeling and Performance Evaluation of Switched Reluctance Motor Drives in MATLAB/Simulink Atmosphere with Estimation of SRM Parameters Using Finite Element Analysis

This manuscript deals with the modeling and performance analysis of a switched reluctance motor drive using MATLAB. The per-phase equivalent circuit of the switched reluctance motor (SRM), which includes the voltage, current, torque, and electro-mechanical equations, is used to build the mathematical model of SRM drives. Finite element analysis (FEA, version 4.2) is used to predict the torque produced at various currents and rotor positions as well as to calculate the phase inductances. With the aid of the developed model, the speed and torque characteristics in addition to the voltages and currents of the SRM can be efficiently examined and evaluated. The proposed model can serve as a trouble-free design tool for the development of SRM drives.

P. Magdelin Jennifer Princy, S. Ram Prasath, P. Ramesh Babu
Classification of Right Bundle Branch Block and Left Bundle Branch Block Cardiac Arrhythmias Based on ECG Analysis

The heart is a vital organ of the human body which plays an important role in the circulation of blood throughout the body and also serves as the power source of the electrical impulses that generate the rhythmicity of the heart, resulting in successful circulation of the blood. Any disturbance in the proper functioning of the heart results in diseases termed cardiovascular diseases or arrhythmias. These diseases can be diagnosed and consequently treated. The diagnosis is done by an efficient technique known as the electrocardiogram (ECG). This paper focuses on the area of biomedical signal analysis, where a method for the detection of two types of cardiac arrhythmias, namely right bundle branch block (RBBB) and left bundle branch block (LBBB), is discussed. The signal processing and analysis have been carried out on data collected from the MIT-BIH database (Berbari in Principles of Electrocardiography, pp 2–11, 2000, [1]). Implementation is done on the MATLAB platform. Signal analysis is done through a number of steps such as preprocessing, feature extraction, and classification, and the results are generated in the form of waveforms, thus classifying the cardiac arrhythmias RBBB and LBBB.

Sukanta Bhattacharyya, U. Snekhalatha
Content-based Algorithm for Color Image Enhancement Using Fuzzy Technique
Research centre: LBS Institute for Science and Technology

Fuzzy technique offers an interesting and challenging framework for developing new methods in the field of image processing. Nowadays, enhancing color images is considered one such demanding task in image processing. This paper introduces the nonlinear and knowledge-based behavior of the fuzzy technique to enhance color images, referred to as the 'content-based algorithm.' The resulting image not only exposes the fine details but is also enhanced by processing the approximate components of the image in the human visual system with the content-based algorithm in the fuzzy domain. The knowledge-based characteristics of both 'fuzzy technique' and 'color' coincide effectively to give better experimental results in this field. Also, the subjective and objective evaluations listed here show that this algorithm performs better than other existing fuzzy and non-fuzzy approaches.

C. Reshmalakshmi, M. Sasikumar
Frequent Itemset Mining with Elimination of Null Transactions Over Data Streams

A data stream is massive input data that arrives at high speed and is unbounded. The sliding window model is used to extract the recent frequent patterns by adjusting the window size so that it contains only the recent transactions and eliminates the old transactions. Another acute challenge in frequent pattern mining is the presence of null transactions. A null transaction is a transaction which contains only a single item, and its presence does not contribute toward frequent pattern discovery. Most of the existing streaming algorithms do not consider the overhead of null transactions, and hence they fail to discover the frequent patterns quickly during the mining process. To overcome these issues, a new algorithm called frequent itemset mining using a variable size sliding window with elimination of null transactions (FIM-VSSW-ENT) is used for extracting recent frequent patterns from data streams. Experimental results using synthetic and real datasets show that our proposed algorithm gives better results in terms of processing time and memory storage.

B. Subbulakshmi, A. Periya Nayaki, C. Deisy
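To make the null-transaction idea concrete, here is a toy sketch that drops single-item transactions before they enter a sliding window and mines itemsets up to size 2 over the window. It uses a fixed-size window for brevity, whereas the paper's FIM-VSSW-ENT varies the window size; the item names and support threshold are made up.

```python
from collections import deque, Counter
from itertools import combinations

def frequent_itemsets(window, min_support):
    """Count all itemsets up to size 2 in the window (tiny Apriori-style pass)."""
    counts = Counter()
    for txn in window:
        items = sorted(set(txn))
        for size in (1, 2):
            counts.update(combinations(items, size))
    return {iset: c for iset, c in counts.items() if c >= min_support}

window, window_size = deque(), 5              # sliding window of recent transactions
stream = [["milk"], ["milk", "bread"], ["bread"], ["milk", "bread", "egg"],
          ["egg"], ["bread", "egg"], ["milk", "bread"]]

for txn in stream:
    if len(txn) < 2:                          # null transaction: single item only,
        continue                              # cannot contribute to patterns, so drop it
    window.append(txn)
    if len(window) > window_size:
        window.popleft()                      # slide: forget the oldest transaction

print(frequent_itemsets(window, min_support=2))
```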
Protograph-Based Design of Non-Binary LDPC Codes

This paper presents the construction of non-binary LDPC codes using the protograph method. A new class of LDPC codes is constructed from a template called a protograph, which serves as a blueprint for constructing LDPC codes of arbitrary size whose performance can be predicted by analyzing the protograph. The significance of this approach is the reduction of the number of computation nodes in the decoding process. Furthermore, protograph codes also benefit from low memory requirements, a simple design procedure, and hardware-friendly implementation. The ARA-based code is a kind of linear class code having self-correcting capabilities. It is used to transmit messages over a noisy transmission channel, so that the information loss can be made as small as possible. These codes constitute a subclass of LDPC codes with a very fast encoder structure. They also have a projected graph or protograph representation that allows for high-speed decoder implementation. Because of this unique feature, the ARA code is mainly used in supporting remote sensing, digital video broadcasting, and data delay applications. The decoding performance of the ARA-based LDPC codes with varied punctured patterns and repetition rates is analyzed. Computation of weighted enumerators for the design of non-binary LDPC codes is done. The simulation results of protograph-based LDPC codes are evaluated by bit error rate (BER) performance.

I. Divya, M. Anbuselvi
Extraction of Binary Sequences in a Frequency Shift Keying-Modulated Signal by Empirical Mode Decomposition Algorithm Against Ambient Noises in Underwater Acoustic Channel

Frequency shift keyed acoustic signals transmitted in underwater sensor networks are affected by numerous factors such as ambient noise, ocean interference, and other random sources which make them nonlinear and non-stationary in nature. In recent years, the use of empirical mode decomposition (EMD) technique to analyze modulated acoustic signals has gained importance. In this paper, an EMD-based approach is proposed to extract frequency shift keying (FSK)-modulated acoustic stationary signals in the underwater channel that are affected by above-mentioned factors in shallow water over a range of 100 Hz to 10 kHz. EMD is an experimental technique which decomposes a signal into a set of oscillatory modes known as intrinsic mode functions (IMF). The proposed algorithm makes use of FFT for the identification and extraction of oscillatory signal. To validate the proposed algorithm, computer simulation is carried out by exploiting the real-time data perturbed by ambient noises. It is observed from the simulation results that the proposed EMD approach identifies and extracts two FSK stationary signals against various ambient noises present in the channels of the underwater sensor network.

L. Suvasini, S. Prethivika, S. Sakthivel Murugan, V. Natarajan
A Model for the Effective Steganalysis of VoIP

The latest studies and applications in steganography are based on the Internet, particularly voice over IP (VoIP). VoIP has proved itself to be a perfect carrier for hidden data. Most of the approaches for VoIP steganalysis focus on detecting hidden data in the payload of the packets by extending audio steganalysis methods. Here, we propose an effective steganalysis method which considers the speech behavior as well as the network protocol structure to detect hidden communication. Common statistical features, such as the mean, covariance, etc., are also considered for effective classification.

N. Jayasree, P. P. Amritha
Automatic Genre Classification from Videos

In recent decades, there has been a huge growth in the amount of multimedia content stored in networked repositories. Many video hosting Websites exist today, such as YouTube, Metacafe, and Google Video, where people upload and download their videos. At present, indexing and categorization of these videos is a tiresome job. Either the system asks the user to suggest tags for the videos which they upload, or people are employed to tag the videos manually. Manual tagging has been done based on the views of users, search terms, etc. In order to eliminate this problem, this paper proposes a model that automatically categorizes videos based on their genres. The main aim of this work was to categorize videos broadly into major domains such as sports, music, and news using temporal, textural, motion, and color features. In sports, the videos have been classified further into cricket and football. A hierarchical SVM has been used for automatic training and selection of the genre of the video. A total of 350 videos from various Web sites have been used for training the classification system. This system achieves an overall average detection ratio of up to 98 % while maintaining a very low false detection rate of 2 %.

S. Karthick, S. Abirami, S. Murugappan, M. Sivarathinabala, R. Baskaran
Computer-aided Diagnosis of Breast Cancer by Hybrid Fusion of Ultrasound and Mammogram Features

Ultrasound images are increasingly being used as an important adjunct to X-ray mammograms for diagnosis of breast cancer. In this paper, a computer-aided diagnosis system that utilizes a hybrid fusion strategy based on canonical correlation analysis (CCA) is proposed for discriminating benign and malignant masses. The system combines information from three different sources, i.e., ultrasound and two views of mammogram, namely, mediolateral oblique (MLO) and craniocaudal (CC) views. CCA is employed on ultrasound-MLO and ultrasound-CC feature pairs to explore the hidden correlations between ultrasound and mammographic view. The two pairs of canonical variates are fused at the feature level and given as input to support vector machine (SVM) classifiers. Finally, decisions of the two classifiers are fused. Results show that the proposed system outperforms unimodal systems and state-of-the-art fusion strategies.

R. Lavanya, N. Nagarajan, M. Nirmala Devi
A Machine Learning Approach to Cluster the Users of Stack Overflow Forum

Online question and answer (Q&A) forums are emerging as excellent learning platforms for learners with varied interests. In this paper, we present our results on the clustering of Stack Overflow users into four clusters, namely naive users, surpassing users, experts, and outshiners. This clustering is based on various metrics available on the forum. We use the X-means and expectation maximization clustering algorithms and compare the results. The results have been validated using internal, external, and relative validation techniques. The objective of this clustering is to be able to trace and predict the activity of a user on this forum. According to our results, the majority of users (71 % of the 40,000 users under consideration) fall into the 'experts' category. This indicates that the users on Stack Overflow are of high quality, thereby making the forum an excellent platform for users to learn about computer programming.

J. Anusha, V. Smrithi Rekha, P. Bagavathi Sivakumar
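A rough sketch of the clustering comparison described above, using scikit-learn. Since scikit-learn ships no X-means implementation, plain K-means stands in for it alongside the EM-based Gaussian mixture model; the per-user metrics and the four synthetic user groups are hypothetical.

```python
import numpy as np
from sklearn.preprocessing import StandardScaler
from sklearn.cluster import KMeans
from sklearn.mixture import GaussianMixture

# Hypothetical per-user forum metrics: [reputation, answers_posted, accepted_ratio].
rng = np.random.default_rng(42)
users = np.vstack([
    rng.normal([50, 5, 0.1], [20, 3, 0.05], size=(100, 3)),            # naive-like users
    rng.normal([2000, 80, 0.4], [500, 20, 0.1], size=(100, 3)),        # surpassing-like
    rng.normal([10000, 400, 0.6], [2000, 80, 0.1], size=(100, 3)),     # expert-like
    rng.normal([50000, 1500, 0.8], [8000, 200, 0.05], size=(100, 3)),  # outshiner-like
])
X = StandardScaler().fit_transform(users)

kmeans_labels = KMeans(n_clusters=4, n_init=10, random_state=0).fit_predict(X)
em_labels = GaussianMixture(n_components=4, random_state=0).fit_predict(X)

# Agreement between the two clusterings, up to label permutation, via a contingency table.
table = np.zeros((4, 4), dtype=int)
for k, e in zip(kmeans_labels, em_labels):
    table[k, e] += 1
print(table)
```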
Data Storage Optimization in Cloud Environment

Data de-duplication is a process which stores a single copy of the data by eliminating redundant copies and providing a reference to the existing unique data. Meanwhile, cloud storage is growing day by day due to the large volumes of data generated every day. Users make use of the cloud to store the large amounts of data available with them. Many Internet services, such as blogs and social networks, which produce huge amounts of data, may contain a lot of redundancy. To store and manage such data efficiently, de-duplication comes into play. This paper applies a data de-duplication framework in the cloud environment and assesses the performance of the compressed storage area with respect to two de-duplication strategies, file level and chunk level. The combination of de-duplication with compression has also improved the compression rate of the storage device. This research achieves efficiency in terms of storage at large. It is also evident from the experiments that the performance of chunk-level de-duplication is better than file-level data de-duplication.

M. Deivamani, Rashmi Vikraman, S. Abirami, R. Baskaran
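The chunk-level strategy evaluated above can be illustrated with a toy in-memory store: files are split into fixed-size chunks, each chunk is fingerprinted with SHA-256, and a chunk is stored only the first time its fingerprint is seen. This is a simplified sketch, not the paper's framework (which also layers compression on top).

```python
import hashlib

CHUNK_SIZE = 4096          # fixed-size chunking; real systems often use content-defined chunks

class DedupStore:
    """Toy chunk-level de-duplicating store: each unique chunk is kept once,
    and files are recorded as ordered lists of chunk fingerprints."""
    def __init__(self):
        self.chunks = {}   # fingerprint -> chunk bytes (stored once)
        self.files = {}    # file name -> ordered list of fingerprints

    def put(self, name, data):
        refs = []
        for i in range(0, len(data), CHUNK_SIZE):
            chunk = data[i:i + CHUNK_SIZE]
            fp = hashlib.sha256(chunk).hexdigest()
            self.chunks.setdefault(fp, chunk)     # store the chunk only if unseen
            refs.append(fp)
        self.files[name] = refs

    def get(self, name):
        return b"".join(self.chunks[fp] for fp in self.files[name])

    def stored_bytes(self):
        return sum(len(c) for c in self.chunks.values())

store = DedupStore()
blob = b"A" * 10000 + b"B" * 10000
store.put("report_v1", blob)
store.put("report_v2", blob + b"C" * 100)         # mostly identical second file
assert store.get("report_v1") == blob
print("logical:", 2 * len(blob) + 100, "bytes, physical:", store.stored_bytes(), "bytes")
```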
Study on Fingerprint Images Using Delaunay Patterns to Identify Hereditary Relations Among Family Members of Three Generations

In this work, an effort is made to identify hereditary relations among intraclass family members using fingerprint minutiae features. These minutiae features include endings and bifurcations. The fingerprint images are collected from 324 subjects belonging to 54 different families of three generations, comprising three members in each family. As the quality of the fingerprint images needs to be improved for minutiae feature extraction from the thinned ridge image, a series of preprocessing and postprocessing steps is applied sequentially. Using structural elements, the thinned ridge pattern is obtained from the preprocessed image. Then, morphological operations are performed over the thinned ridge pattern to remove spurious elements such as spurs, bridges, breaks, and dots. After removing the spurious elements, valid minutiae points (endings and bifurcations) are obtained using morphological operators. The minutiae points are analyzed to study the hereditary relation with the help of Delaunay patterns formed from the coordinates of the minutiae features. The results show that intrafamily members have similar types of Delaunay patterns, which indicates the presence of a hereditary relation among intrafamily members.

C. Karthikeyini, V. Rajamani
Evaluation of Worst-Case Execution Time of Tasks on Multi-core Processor

In hard real-time systems, it is required to compute the worst-case execution time (WCET) of each task, which has become a difficult problem. The increasing complexity of modern processor architectures makes achieving this objective more and more challenging. To measure the WCET of a task, it is necessary to develop a framework on a multi-core platform considering the nature of the program and the shared resources. Path analysis, one of the static approaches, is used. It is important to take the analysis of paths into account, as it calculates the execution time of each basic block with respect to the cache behavior. The WCET estimates of this analysis are performed using the SimpleScalar simulator.

R. Suraj, P. Chitra, S. Madumidha
Back-Emf-Based Sensorless Field-Oriented Control of PMSM Using Neural-Network-Based Controller with a Start-Up Strategy

This paper describes back-emf-based sensorless field-oriented control (FOC) of permanent magnet synchronous motor (PMSM) of surface-mounted type, employing neural-network-based controller for current and speed control. The dynamic response is improved. Further, rotor position is estimated by back-emf method. To overcome the shortcoming of back-emf-based control in zero and low speed, a start-up strategy is proposed, to predict the initial rotor position. The PMSM drive model is simulated in MATLAB/Simulink environment, and the results show improved dynamic response with a start-up strategy.

V. S. Nagarajan, M. Balaji, V. Kamaraj
FPGA-based Intelligent Control of AC Motors

The most commonly used controller in the industrial field is the proportional-plus-integral (PI) controller, which requires a mathematical model of the system. The fuzzy logic controller (FLC) provides an alternative to the conventional PI controller, especially when the available system models are inexact or unavailable. Also, rapid advances in digital technologies have given designers the option of implementing controllers using field programmable gate arrays (FPGA), which rely on parallel programming. This method has many advantages over classical microprocessors. In this research work, an FLC fabricated on a modern FPGA card (Spartan-3A, Xilinx) is proposed to implement a prototype speed controller for a three-phase induction motor (squirrel cage type). The FLC and the PWM inverter strategies built in the FPGA showed fast speed response and good stability in controlling the three-phase induction motor. For comparison purposes, a conventional PI controller has been implemented on the same FPGA card to examine the performance of the FLC. These controllers have been tested using the MATLAB/Simulink program under various reference speeds. The performance of the designed FPGA-based closed-loop FLC is weighed against that of a conventional PI controller. The system has been simulated in MATLAB/Simulink, and the results are presented. The simulation and experimental results obtained using the FPGA-based conventional PI controller and the FPGA-based FLC have been compared in terms of settling time, and it has been found that the proposed FPGA-based FLC shows better performance than the conventional PI controller.

Valantina Stephen, L Padma Suresh
Secretly Shared QR Code and Its Applications

Quick response codes (QR codes) are widely used for various applications such as user authentication and advertisements. The data in QR codes can easily be read with the help of a QR code reader application installed on a smartphone with a camera. Here, we cannot neglect the possibility of an attacker tampering with the QR code data. For example, in authentication, the attacker can forge the text password. In order to avoid such attacks as well as hide this data from the public, we propose a secret sharing scheme with reversing. This scheme not only splits the QR code into different shares, but also allows us to perfectly reconstruct the original QR code.

G.S. Devisree, K. Praveen
Phish Indicator: An Indication for Phishing Sites

Phishing is a simple social engineering technique that functions by creating a fake Web site, often imitating a legitimate site. Although many anti-phishing tools have been developed, phishing attacks still erode trust in Internet security. In this paper, the indicator acts as an enhancement to the usability of cyber trust mechanisms. The phish indicator is developed as a browser extension, which detects and classifies a URL as a phishing or genuine site. The classification of the URL visited by the user is done using the Levenshtein algorithm and some heuristic criteria. It alerts the user with a message indicating whether the visited URL is a phishing site or a genuine site. Thus, the indication helps whenever the user attempts to give away information to a Web site that is considered untrusted.

S. Aparna, Kandasamy Muniasamy
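The Levenshtein component of the classification above can be sketched as follows: compute the edit distance between a visited domain and a whitelist of trusted domains, and flag domains that are close to, but not equal to, a trusted one. The trusted list, threshold, and helper names are illustrative, and the paper's additional heuristic criteria are omitted.

```python
def levenshtein(a, b):
    """Classic dynamic-programming edit distance between two strings."""
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        curr = [i]
        for j, cb in enumerate(b, 1):
            curr.append(min(prev[j] + 1,                # deletion
                            curr[j - 1] + 1,            # insertion
                            prev[j - 1] + (ca != cb)))  # substitution
        prev = curr
    return prev[-1]

# Flag a visited domain that is suspiciously close to (but not equal to) a trusted one.
TRUSTED = ["paypal.com", "google.com", "amazon.com"]

def looks_like_phish(domain, max_edits=2):
    return any(0 < levenshtein(domain, t) <= max_edits for t in TRUSTED)

print(looks_like_phish("paypa1.com"))   # True  -- one character substituted
print(looks_like_phish("example.com"))  # False
```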
Using Signature for Hidden Communication Prevention in IP Telephony

This paper addresses the prevention of the steganographic method for IP telephony called transcoding steganography (TranSteg). Typically, in TranSteg, it is the overt data that is compressed to make space for the steganogram: TranSteg finds an appropriate codec that results in voice quality similar to the original. In the signature-based system, the signature of the voice payload is appended to the packet to provide integrity. The signature of the data becomes invalid when there is any change in the voice data. This method detects hidden communication within a VoIP call. The concept of the signature-based scheme is described in this paper.

Ajvad Rahman, P. P. Amritha
Classification Using Fractal Features of Well-Defined Mammographic Masses Using Power Spectral Analysis and Differential Box Counting Approaches

Computer-aided diagnosis (CAD) of mammograms assists radiologists in detecting the presence of mammographic masses. Since the shapes of such breast masses vary widely, it is difficult to classify them as benign or malignant. One efficient approach to classifying them is fractal analysis of their shape features. Various methods have been proposed for computing the fractal dimension (FD) of a region of interest (ROI) in biomedical images. Among those, two methods, namely power spectral analysis (PSA) and the differential box counting method (DBCM), are used here for the FD computation of breast contour margins. Fractal analysis by the PSA method is a frequency-domain approach applied to the one-dimensional (1D) signatures of the two-dimensional (2D) breast mass contours, whereas the DBCM counts the smallest number of boxes needed to cover the whole image surface. Finally, a comparative analysis of the two methods shows that the PSA method yields better accuracy than the DBCM.
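
The sketch below gives a minimal differential box counting estimate of the fractal dimension of a grayscale ROI; the box sizes and the random test surface are illustrative assumptions, not the mammographic data or settings used in the paper.

```python
import numpy as np

# Minimal differential box-counting (DBC) sketch for a grayscale ROI.
# Box sizes and the gray-level handling below are illustrative choices.
def fractal_dimension_dbc(img: np.ndarray, sizes=(2, 4, 8, 16, 32)) -> float:
    img = img.astype(float)
    M = min(img.shape)
    img = img[:M, :M]                       # work on a square crop
    G = img.max() - img.min() + 1e-9        # intensity range of the surface
    counts = []
    for s in sizes:
        h = s * G / M                       # box height in intensity units
        n_boxes = 0
        for i in range(0, M - M % s, s):
            for j in range(0, M - M % s, s):
                block = img[i:i + s, j:j + s]
                # boxes of height h needed to cover this block's surface
                n_boxes += int(np.ceil((block.max() - block.min()) / h)) + 1
        counts.append(n_boxes)
    # FD is the slope of log(N_r) against log(1/r), with r = s / M
    r = np.array(sizes, dtype=float) / M
    slope, _ = np.polyfit(np.log(1.0 / r), np.log(counts), 1)
    return slope

rough = np.random.default_rng(1).integers(0, 256, size=(128, 128))
print("estimated FD: %.2f" % fractal_dimension_dbc(rough))  # rough noise gives FD well above 2
```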

P. S. Menon Dhanalekshmi, A. C. Phadke
Efficient Silhouette-based Input Methods for Reliable Human Action Recognition from Videos

In this paper, we address the problem of classifying activities in videos by modifying the usual input features so as to improve efficiency and reduce space and time consumption. For efficient action recognition, representing the important information reliably is essential, and the motivation here is to extract binary silhouette (shape) information from raw videos. Most researchers use a sequence matching scheme consisting of a set of successive silhouette frames, which consumes much time and space when training the system. Here, we adopt a recently proposed subspace learning method, spectral regression discriminant analysis (SRDA), for dimensionality reduction. The median Hausdorff distance is used as the similarity measure to match the embedded action trajectories, and action classification is then achieved in a nearest neighbor framework. For efficient learning using SRDA, we use as input features (i) the raw silhouette, (ii) images obtained by applying a distance transform (DT) to the raw silhouette, (iii) images obtained by extracting edges from the raw silhouettes (edge representation), (iv) the silhouette history image (SHI), and (v) the silhouette energy image (SEI), which captures shape and motion information. Using these different input methods, we achieved a 100 % correct recognition rate (CRR) in each case. The results show that SHI and SEI are more effective input methods than the others in terms of time and space consumption.
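
The sketch below illustrates the median Hausdorff matching step described above on synthetic low-dimensional trajectories: each action is a sequence of embedded frames, and the query is assigned to the nearest stored trajectory. The 3-D embeddings are placeholders for the SRDA output, not real data.

```python
import numpy as np
from scipy.spatial.distance import cdist

# Minimal sketch of median Hausdorff matching between embedded action
# trajectories (each row is one frame's low-dimensional embedding).
def median_hausdorff(A: np.ndarray, B: np.ndarray) -> float:
    D = cdist(A, B)                       # pairwise Euclidean distances
    d_ab = np.median(D.min(axis=1))       # median nearest-B distance for each a
    d_ba = np.median(D.min(axis=0))       # median nearest-A distance for each b
    return max(d_ab, d_ba)

rng = np.random.default_rng(0)
walk_query = rng.normal(size=(40, 3))                       # 40 frames, 3-D embedding
walk_train = walk_query + rng.normal(scale=0.1, size=(40, 3))
run_train = rng.normal(loc=2.0, size=(40, 3))

# nearest-neighbour classification over the stored trajectories
candidates = {"walk": walk_train, "run": run_train}
label = min(candidates, key=lambda k: median_hausdorff(walk_query, candidates[k]))
print("predicted action:", label)                           # expected: walk
```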

G. Glorindal Selvam, D. Gnanadurai
Vertical Handoff Scheme in Integrated WiMAX and WLAN Networks

This paper deals with a proactive vertical handoff method to provide QoS in integrated WLAN and WiMAX networks. The handoff process is controlled by a vertical handoff management scheme: for every neighboring node in the destination network, QoS metrics such as bandwidth and delay are measured, and the handoff is initiated based on these measurements. The work is simulated in the NS-2.34 network simulator, and the simulation results show a large performance gain as the node speed increases. The work also analyzes handoff performance in both directions, from WLAN to WiMAX and from WiMAX to WLAN; the analysis shows that the minimum delay is achieved in the WLAN-to-WiMAX direction and the highest bandwidth in the WiMAX-to-WLAN direction.

A. Anitha, J. JayaKumari
An Analysis of Various Edge Detection Techniques on Illuminant Variant Images

One of the key challenges of face recognition is varying illumination. Many feature-based methods have been developed in recent years to detect illuminant-invariant facial features. This paper focuses on the effect of various edge detection algorithms on illuminant variant images. The conventional Sobel and AdaBoost methods are applied, along with the proposed NSCT-integrated ant colony optimization (ACO) approach. The proposed method uses a normal shrink filter in the NSCT domain, which produces an illuminant-invariant representation of the given image. Then, to capture the important geometrical structures and reduce the feature dimensionality, the ACO algorithm is applied. This combined approach detects edges with improved quality. Finally, a graph matching algorithm is employed for recognition; it uses a group of feature points and explores their geometrical relationships in a graph structure. Experimental results on the YaleB database show that, of the three methods, the proposed work yields the better recognition.

S. H. Krishna Veni, L Padma Suresh
CBAODV: An Enhanced Reactive Routing Algorithm to Reduce Connection Breakage in VANET

A vehicular ad hoc network (VANET) enables a vehicle to send information to nearby vehicles, with messages routed from vehicle to vehicle. In VANETs, connection breakage (CB) is an important issue in existing routing protocols such as ad hoc on-demand distance vector (AODV), DSR, and DSDV. In this paper, an enhanced reactive routing algorithm called CBAODV, based on AODV, is proposed to reduce CB. CBAODV supports route discovery and route maintenance while reducing connection breakage. The experimental results are promising: the proposed algorithm exhibits superior performance over the traditional AODV routing algorithm in terms of QoS metrics such as end-to-end delay and packet delivery ratio. The NS2 and MOVE tools are used to simulate the network and to demonstrate the effectiveness of the work.

N. Arulkumar, E. George Dharma Prakash Raj
An Enhanced Associative Ant Colony Optimization Technique-based Intrusion Detection System

Several intrusion detection models have been presented to date, using approaches such as data mining, neural networks, and naïve Bayes to find intrusions, but there is still room for improvement. This paper focuses on the limitations of the traditional approaches and proposes a hybrid framework based on association rule mining (ARM) and ant colony optimization (ACO); combining association rules with ant colony behavior may provide better classification than previous methodologies. In our approach, we use the NSL-KDD dataset, which does not include redundant records and has reasonable test sets, as mentioned in [1]. We then take an equal-proportion sample of 10,000 records from the whole dataset and first divide it into two parts based on normal establishment and termination. Considering the normal dataset, we calculate the support value based on the matching factor to find intrusions, and then apply the ACO technique to check the global optimum value. If the value crosses the limit value, the node is added to the final attack category. Finally, the records are classified into the attack categories denial of service (DoS), user to root (U2R), remote to local (R2L), and probing (Probe). Our results support better classification in comparison with the previous techniques used in several research papers.

Chetan Gupta, Amit Sinhal, Rachana Kamble
Fuzzy-Based Sign Language Interpreter

Sign language is the language through which the deaf and mute community communicates; it is the only means of communication for these physically challenged people. However, hearing people rarely learn sign language, so deaf people cannot interact with them without a sign language interpreter. Sign language depends on sign patterns, i.e., the orientation and movements of the arm, to facilitate understanding between people. This paper presents a methodology that recognizes sign language using a fuzzy logic controller and translates it into normal text and speech. The system uses a glove fitted with flex sensors and an inertial measurement unit (IMU) to gather data on each finger's position and recognizes the sign symbol using a fuzzy control algorithm. After defuzzifying the output, it sends a unique set of signals to a PIC microcontroller with a SpeakJet IC that is preprogrammed to speak the desired sentences; a speech recognition module is also interfaced with the microcontroller to convert voice to text.

P. Anitha, S. Vijayakumar
Optimized Distributed Text Document Clustering Algorithm

Due to scientific progress, a variety of challenges exist in the field of information retrieval (IR). These challenges stem from the increased use of large volumes of data, which are available from large-scale distributed networks, and centralizing these data for analysis is difficult. There is therefore a need for distributed text document clustering algorithms that overcome the two main challenges of clustering accuracy and clustering quality. In this paper, an optimized distributed text document clustering algorithm is proposed that uses a distributed particle swarm optimization (DPSO) algorithm to optimize and generate the initial centroids for the distributed K-means (DKMeans) clustering algorithm, which improves the quality of clustering. Similarity is determined using the Jaccard coefficient, which generates coherent clusters and thus improves the accuracy of the proposed algorithm. Extensive simulation-based evaluations are carried out on the Reuters-21578 and 20 Newsgroups data sets to demonstrate the effectiveness of the algorithm.
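
The sketch below shows the Jaccard coefficient used as a similarity measure when assigning documents to cluster centroids; documents and centroids are represented as term sets, which is a simplification of the paper's distributed setup, and the toy data are invented for illustration.

```python
# Minimal sketch of Jaccard-based assignment of documents to centroids,
# with documents and centroids represented as sets of terms (a toy
# simplification of the distributed DPSO + K-means pipeline).
def jaccard(a: set, b: set) -> float:
    return len(a & b) / len(a | b) if a | b else 0.0

docs = {
    "d1": {"oil", "price", "market", "trade"},
    "d2": {"match", "team", "score", "goal"},
    "d3": {"stock", "market", "price", "share"},
}
centroids = {
    "finance": {"market", "price", "stock", "trade"},
    "sports": {"team", "goal", "match", "player"},
}

for name, terms in docs.items():
    best = max(centroids, key=lambda c: jaccard(terms, centroids[c]))
    print(name, "->", best, "(similarity %.2f)" % jaccard(terms, centroids[best]))
```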

J. E. Judith, J. Jayakumari
A Study on Medical Data Hiding Using Wireless Capsule Endoscopy

Recently, wireless body area networks (WBANs) have been designed to continuously monitor and collect a patient's vital parameters and send them to a remote healthcare center or hospital. Protecting these confidential data during transfer is very important, and different steganography techniques have been proposed for this purpose. In wavelet-based steganography, ECG signals were used to hide the patient's confidential data, but the system suffers degradation, and wavelets are not suited to capturing smoothness along image contours. Here, the contourlet transform is proposed, with an image used as the host medium to improve system performance; the scheme combines encryption and scrambling techniques to protect the patient's confidential data. The proposed method allows images to efficiently hide the patient's confidential data and other physiological information. Typically, the patient's biological signals and other physiological readings are collected using sensors, while the host image is obtained using wireless capsule endoscopy (WCE). Capsule endoscopy is a medical technique for recording images of the digestive tract: the capsule looks like a pill and contains a tiny camera, and after a patient swallows it, it takes pictures of the inside of the gastrointestinal tract. Encrypting the patient's confidential data prevents an unauthorized person without the shared key from accessing it. Therefore, the proposed contourlet technique guarantees secure medical data transfer.

R. Suji Pramila, A. Shajin Nargunam
Video Data Mining Information Retrieval Using BIRCH Clustering Technique

Nowadays, many applications handle massive amounts of data, straining storage capacity and processing time. Traditional data mining is not suitable for such applications, so existing algorithms must be tuned and changed or new ones designed. With advances in multimedia and networking technology, digital video content is widely available over the Web and is growing rapidly with the wide use of multimedia applications. Videos can be downloaded and played on devices such as cell phones, palmtops, and laptops using networking technologies such as Wi-Fi, HSDPA, UMTS, and EDGE, and popular Web sites such as Google Video, YouTube, and iTunes are used to download and upload them. In such a scenario, a tool is needed for video browsing, and many applications have recently been created for categorizing, indexing, and retrieving digital video content and for handling large quantities of it. The proposed method facilitates the discovery of natural and homogeneous clusters.

D. Saravanan, S. Srinivasan
Handwritten Tamil Character Recognition Using Zernike Moments and Legendre Polynomial

Optical character recognition (OCR) systems have been effectively developed for recognizing the printed characters of many non-Indian languages such as English and Chinese. In the early stages, few research works were carried out on recognizing handwritten characters; now, various efforts are under way to develop efficient systems for recognizing Indian languages, especially Tamil, a south Indian language widely used in Tamil Nadu, Puducherry, Singapore, and Sri Lanka. In this paper, an OCR system is developed for the recognition of basic handwritten Tamil characters that can handle different font sizes and types. Zernike moments and Legendre polynomials, which have been used in pattern recognition, are used in this system to extract the features of handwritten Tamil characters, and a comparative study of the two is performed. A feed-forward back-propagation neural classifier is used for the classification of the Tamil characters.

Amitabh Wahi, S. Sundaramurthy, P. Poovizhi
Encrusted CRF in Intrusion Detection System

An intrusion detection system aims to detect malicious activities in a network, but it must tackle many challenges in doing so. Intrusion detection is the act of detecting inappropriate, inaccurate, and abnormal activity in network data. Network security is a major concern in large organizations: data integrity, secrecy, and availability must be preserved to ensure it. In this paper, we address accuracy as the first issue and efficiency as the second, using conditional random fields (CRFs) and an encrusted (layered) method. The proposed technique performs better than well-known methods such as naïve Bayes and decision trees. Probe-layer attacks stop network services, and remote to local (R2L), user to root (U2R), and denial-of-service (DoS) attacks are widely known attacks that affect network assets. Improved attack detection accuracy is obtained through the CRF, and high efficiency by implementing the encrusted approach.

S. Vinila Jinny, J. Jaya Kumari
Effective Pattern Discovery and Dimensionality Reduction for Text Under Text Mining

Many data mining techniques have been used for mining useful patterns in text documents, and text mining can be used to extract the data in a document. How to effectively use and update the discovered patterns is still an open research question. Existing approaches are term based and suffer from the problems of polysemy and synonymy. In past years, pattern-based approaches have been hypothesized to perform better than term-based ones, but many experiments do not support this hypothesis. This paper presents a new, effective pattern discovery technique that involves pattern deploying and pattern evolving to improve the effectiveness of using and updating discovered patterns for finding relevant and useful information.

T. Vijayakumar, R. Priya, C. Palanisamy
A Cross-Layer Analysis for Providing Mobility in Wireless Body Area Networks

A wireless body area network consists of implanted and wearable sensor nodes that transmit biological information for wireless healthcare monitoring, elderly care, and many other innovative applications. To support mobility in these resource-constrained networks, suitable modifications have to be made at both the medium access control (MAC) and routing layers. Low-duty-cycle MAC protocols that avoid energy problems at the MAC layer are considered, and their performance with the RPL routing protocol is evaluated; RPL is a routing protocol proposed by the IETF ROLL working group for low-power and lossy networks. In this paper, the performance of XMAC and ContikiMAC is analyzed with respect to RPL parameters, and the results show that ContikiMAC consumes 49 % less power than XMAC.

R. Venkateswari, S. SubhaRani, V. Sudharshan, R. Umadharshini
Multiview Clustering in Heterogeneous Environment

Integrating multiview clusters is a crucial issue in heterogeneous environments. In multiview clustering, objects from multiple sources are often restricted to a homogeneous environment by sharing the same dimensions. In this paper, novel tensor-based methods are used: (1) multiview clustering based on the integration of the Frobenius-norm objective function (MC-FR-OI) and (2) matrix integration in the Frobenius-norm objective function (MC-FR-MI). These frameworks work by using tensor decomposition. Experimental results demonstrate that the proposed methods are effective for multiview data integration. Higher-order data are used here, and the performance with higher-order data is better than with two-dimensional data.

A. Bharathi, S. Anitha
Implementation of Isolated Two Inductor Boost Converter for Induction Motor Drive Applications

This paper proposes a modified two-inductor boost converter (TIBC) together with a three-phase inverter circuit for induction motor drive applications. The modified TIBC forms the first-stage DC/DC converter. The input current is distributed through two boost inductors, with the current ripple amplitude halved at twice the PWM frequency, and a voltage doubler is applied to reduce the turns ratio of the transformer. The design is further improved with a non-isolated recovery snubber and constant duty cycle control to improve efficiency. The main reason for using the snubber is that a plain TIBC is not suitable for a motor drive system: the motor demands low power at low speed and during start-up and stop conditions. Moreover, the TIBC requires a minimum operating load to maintain its output voltage; below a certain load, the energy transferred to the output capacitor is not completely delivered to the load, causing the output voltage to rise, because the inductors are charged even when there is no output current. The TIBC is able to turn on both active switches at zero voltage, reducing their switching losses and raising the conversion efficiency. Since the two parallel-operated boost units are identical, the operation, analysis, and design of the converter module become quite simple.

S. Lavanya, T. Annamalai
A Generic Bio-inspired Framework for Detecting Humans Based on Saliency Detection

Even with all the advances in technology, computer vision systems cannot compete with nature's gift, the brain, which organizes objects quickly and extracts the necessary information from huge amounts of data. A bio-inspired feature selection method is proposed for detecting humans using saliency detection. It works by tuning prominent features such as color, orientation, and intensity in a bottom-up approach to locate the probable candidate regions of humans in an image. The results are further improved in a detection phase that uses weights learned from training samples to reject non-human regions among the candidates. The overall system has an accuracy rate of 90 % for detecting human regions.

R. Aarthi, J. Amudha, Usha Priya
Efficient Approach to Detect and Localize Text in Natural Scene Images

Text in natural scene images may provide important information depending on the application. Detecting text from natural scenes needs to be effective, and segmenting text from natural scene images therefore requires a high-performance method. In this paper, an efficient segmentation and classification technique is used. The system takes natural scene images as input; after converting the color image to grayscale, histogram of oriented gradients (HOG) features are used to find the edge values. The image is segmented using Niblack local binarization, which identifies edges while suppressing the image background, and classified using a CRF, which marks out the text blocks in the natural scene image. The system provides better segmentation of text and classifies it with high detection accuracy.
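
A minimal sketch of the Niblack local binarization step is given below: each pixel is thresholded against the local mean plus k times the local standard deviation over a sliding window. The window size, k, and the toy image are illustrative assumptions rather than the paper's actual parameters.

```python
import numpy as np
from scipy.ndimage import uniform_filter

# Minimal Niblack local binarization sketch: T(x, y) = mean + k * std over a
# sliding window. Niblack alone is speckle-prone on flat backgrounds, which is
# one reason it is combined with other cues (HOG, CRF) in the paper.
def niblack_binarize(gray: np.ndarray, window: int = 25, k: float = -0.2) -> np.ndarray:
    gray = gray.astype(float)
    mean = uniform_filter(gray, size=window)
    sq_mean = uniform_filter(gray * gray, size=window)
    std = np.sqrt(np.maximum(sq_mean - mean * mean, 0.0))
    threshold = mean + k * std
    return (gray <= threshold).astype(np.uint8)   # 1 = dark (candidate text) pixel

# toy example: a dark text-like stroke on a noisy bright background
rng = np.random.default_rng(0)
img = 200 + rng.normal(scale=5, size=(64, 64))
img[20:44, 30:34] = 40                            # a vertical stroke
binary = niblack_binarize(img)
print("stroke pixels detected:", int(binary[20:44, 30:34].sum()), "of", 24 * 4)
```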

S. R. Surem Samuel, C. Seldev Christopher
Enhanced Certificate-Based Authentication for Distributed Environment

The unsecured open network is full of threats, viruses, and malicious Trojans, and digital certificates are one fundamental approach to providing sound online security. In this paper, we propose a system that enables an organization to act as a certificate authority (CA) and issue digital certificates to its clients, who use the certificates to access the organization's services. The advantage of the proposed system is that it is more secure and faster than traditional systems. The various attacks relating to digital certificates were analyzed, appropriate countermeasures were suggested, and these measures were considered in developing the proposed system. The paper also describes the procedure for generating, issuing, and revoking certificates and how it is implemented on the Java platform.

A. Jesudoss, N. P. Subramaniam
Diagnosis and Segmentation of Brain Tumor from MR Image

Abnormal growth of tissue inside the brain leads to the formation of a brain tumor. To decrease the mortality rate due to brain tumors, efficient techniques for their early detection are required. Present-day technological advancements, including magnetic resonance imaging (MRI), computed tomography (CT), and advanced X-rays, help in tumor detection; MRI scans are widely used nowadays because they are noninvasive and produce accurate images. This paper proposes a strategy for the efficient detection of brain tumors in MRI brain images and provides a handy tool for accurate prediction and segmentation. Gray-level co-occurrence matrix (GLCM) features and the spatial feature (mean) are combined with transform-domain discrete cosine transform (DCT) features extracted from the MR images to form the feature set. A support vector machine (SVM) is used as the classifier to label images as tumorous or non-tumorous. Once classification is done, only the tumorous images are subjected to watershed transform-based segmentation to extract the tumor region. The entire experiment is conducted on images of various patients acquired from MEDALL DIAGNOSTICS, Tiruchirappalli, India. The method gives an accuracy of over 99 %, improving the patients' chances of survival through proper detection of tumors at an early stage.
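
To make the GLCM step concrete, the sketch below computes a co-occurrence matrix by hand for a single offset (one pixel to the right) on a quantized stand-in image and derives contrast, energy, and homogeneity from it; the paper additionally combines the mean and DCT features and feeds everything to an SVM.

```python
import numpy as np

# Minimal hand-rolled GLCM sketch for one offset (right neighbour) on a
# quantized image; the random ROI is a stand-in for a real MR slice.
def glcm_features(img: np.ndarray, levels: int = 8):
    # quantize to a small number of gray levels
    q = (img.astype(float) / (img.max() + 1e-9) * (levels - 1)).astype(int)
    glcm = np.zeros((levels, levels), dtype=float)
    for i in range(q.shape[0]):
        for j in range(q.shape[1] - 1):
            glcm[q[i, j], q[i, j + 1]] += 1       # co-occurrence with right neighbour
    glcm /= glcm.sum()
    idx_i, idx_j = np.indices(glcm.shape)
    contrast = ((idx_i - idx_j) ** 2 * glcm).sum()
    energy = (glcm ** 2).sum()
    homogeneity = (glcm / (1.0 + np.abs(idx_i - idx_j))).sum()
    return contrast, energy, homogeneity

rng = np.random.default_rng(0)
roi = rng.integers(0, 256, size=(64, 64))          # stand-in for an MR ROI
print("contrast=%.3f energy=%.3f homogeneity=%.3f" % glcm_features(roi))
```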

S. V. Srinivasan, K. Narasimhan, R. Balasubramaniyam, S. Rishi Bharadwaj
Improving the Accuracy of Latent Fingerprint Matching Using Texture Descriptors

Fingerprint matching is the process of checking whether two fingerprints come from the same finger of a person. There are three types of fingerprints in law enforcement applications: rolled, plain, and latent. Latent fingerprints are partial fingerprints obtained from the surfaces of objects a person has touched, whether deliberately or accidentally. A latent fingerprint covers a small area compared with a full fingerprint, so a full fingerprint matching algorithm cannot be applied directly to latent matching. Matching between a latent and a rolled print is a complex task because fewer minutia points are available, and enhancement is necessary due to the low quality of latents and sensor noise. We perform latent fingerprint matching using the Hough transform algorithm. Experimental results on the NIST latent fingerprint database show an accuracy of 54.43 %, which we improve by incorporating texture-based features such as entropy, correlation, contrast, homogeneity, and energy.

V. Dhanusha, T. R. Swapna
Control Strategy for PQ Improvement in an Autonomous Microgrid Using Bacterial Foraging Optimization Algorithm

This paper presents a global optimization algorithm-based optimal power control strategy for an islanded microgrid. The foremost aim is to improve the power quality of the microgrid; the primary performance parameters considered are voltage regulation and frequency regulation, especially at the start of islanded operation. An inner current control loop and an outer power control loop are combined to form the proposed control strategy. The bacterial foraging optimization algorithm (BFOA), an intelligent search algorithm, is employed for self-tuning the control parameters. To validate the performance of the controllers, simulations are performed in MATLAB/Simulink.

N. Chitra, A. Senthil Kumar, P. Priyadharshini, K. M. Shobana
Maximum Intensity Block Code for Action Recognition in Video Using Tree-based Classifiers

Human action recognition is a broad research area in the computer vision community, where human actions are identified from characteristic body movements. In this paper, an action recognition approach based on identifying maximum motion is proposed, with the maximum intensity block code (MIBC) extracted as the feature. Experiments were carried out on the Weizmann action dataset considering nine activities, namely walking, running, jumping, side, bend, waving one hand, waving both hands, jump-in-place-on-two-legs (pjump), and skip, and various classifiers, namely random forest, naive Bayes, random tree, and decision tree (J48), were utilized. In the experimental results, the random forest classifier showed the best performance, with an overall accuracy of 95.3 %, outperforming the other algorithms.

J. Arunnehru, M. Kalaiselvi Geetha
An Efficient Coding Method for Indexing Hand-based Biometric Databases

Biometric identification systems capture biometric images (e.g., fingerprint, palm, and iris) and store them in a central database. During identification, the query biometric image is compared against all images in the central database. This exhaustive matching process (linear search) works well for small databases; however, biometric databases are usually huge, and the process increases the response time of the identification system. To address this problem, we present an efficient technique that computes a fixed-length index code for each biometric image and builds an index table from the indices of all individuals. During identification, a set of candidate images similar to the query is retrieved from the index table based on the query index values using a voting scheme, which takes constant time. The technique has been tested on the benchmark PolyU palm print database and the NTU vein pattern database and achieves lower penetration rates at a 100 % hit rate on both. These results show a better performance in terms of response time and search speed compared with state-of-the-art indexing methods.
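
The sketch below is a minimal illustration of the indexing-and-voting idea described above: every enrolled image gets a fixed-length index code, each code position/value pair points to its owners in an index table, and a query retrieves candidates by vote count. The codes here are toy vectors rather than the paper's hand-based features.

```python
from collections import defaultdict, Counter

# Minimal index-table-plus-voting sketch: enrollment fills the table keyed by
# (position, value) pairs of a fixed-length index code, and a query collects
# votes from matching entries to shortlist candidates.
index_table = defaultdict(set)          # (position, value) -> set of identities

def enroll(identity, code):
    for pos, val in enumerate(code):
        index_table[(pos, val)].add(identity)

def candidates(query_code, top_k=2):
    votes = Counter()
    for pos, val in enumerate(query_code):
        for identity in index_table[(pos, val)]:
            votes[identity] += 1
    return [name for name, _ in votes.most_common(top_k)]

enroll("alice", [3, 1, 4, 1, 5])
enroll("bob",   [2, 7, 1, 8, 2])
enroll("carol", [3, 1, 4, 2, 6])

print(candidates([3, 1, 4, 1, 6]))      # alice and carol share the most code values
```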

Ilaiah Kavati, Munaga V. N. K. Prasad, Chakravarthy Bhagvati
Optic Disk and Optic Cup Segmentation for Glaucoma Screening

Glaucoma is a chronic eye disease that leads to vision loss. As it cannot be cured, detecting the disease in time is important, but current tests using intraocular pressure (IOP) are not sensitive enough for population-based glaucoma screening. Optic nerve head assessment in retinal fundus images is more promising. This paper proposes optic disk and optic cup segmentation using superpixel classification for glaucoma screening. In optic disk segmentation, histograms and center surround statistics are used to classify each superpixel as disk or non-disk, and a self-assessment reliability score is computed to evaluate the quality of the automated segmentation. For optic cup segmentation, location information is included in the feature space in addition to the histograms and center surround statistics to boost performance. The proposed segmentation methods have been evaluated on a database of 650 images with optic disk and optic cup boundaries manually marked by trained professionals. Experimental results show average overlapping errors of 9.5 and 24.1 % in optic disk and optic cup segmentation, respectively, and the overlapping error increases as the reliability score decreases, which justifies the effectiveness of the self-assessment. The segmented optic disk and optic cup are then used to compute the cup-to-disk ratio for glaucoma screening. Our method achieves areas under the curve of 0.800 and 0.822 on two data sets, higher than other methods. The methods can be used for segmentation and glaucoma screening, and the self-assessment serves as an indicator of cases with large errors, enhancing the clinical deployment of the automatic segmentation and screening.
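
The sketch below shows the final screening step in its simplest form: once binary optic disk and cup masks are available, the vertical cup-to-disk ratio is computed and compared against a screening threshold. The masks and the 0.5 threshold are illustrative assumptions, not the paper's clinical criteria.

```python
import numpy as np

# Minimal cup-to-disk ratio (CDR) sketch on toy binary masks; a real system
# would use the superpixel-based segmentations described in the paper.
def vertical_cdr(cup_mask: np.ndarray, disk_mask: np.ndarray) -> float:
    cup_rows = np.where(cup_mask.any(axis=1))[0]
    disk_rows = np.where(disk_mask.any(axis=1))[0]
    cup_height = cup_rows.max() - cup_rows.min() + 1
    disk_height = disk_rows.max() - disk_rows.min() + 1
    return cup_height / disk_height

disk = np.zeros((100, 100), dtype=bool)
cup = np.zeros((100, 100), dtype=bool)
disk[20:80, 20:80] = True          # 60-pixel-tall disk (toy segmentation)
cup[35:70, 35:70] = True           # 35-pixel-tall cup

cdr = vertical_cdr(cup, disk)
print("CDR = %.2f ->" % cdr, "refer for examination" if cdr > 0.5 else "normal")
```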

G. Veerasenthilkumar, S. Vasuki, R. Rajkumar
Deadline-based Priority Management in Cloud

In cloud computing, the word "cloud" is used as a metaphor for "the Internet," so cloud computing means a type of Internet-based computing in which services such as servers, storage, and applications are delivered to an organization's computers and devices through the Internet. Today, cloud computing is on demand, meaning cloud providers offer their services in a pay-as-you-go manner, so all resources must be made available to requesting users efficiently to satisfy their needs. Given the resource allocation schemes offered by cloud computing, effective scheduling algorithms are important for utilizing these benefits; many scheduling algorithms exist, such as task grouping, priority-aware scheduling, and shortest job first (SJF), to reduce waiting time and maximize resource utilization. In this paper, a new algorithm, the deadline-first scheduling algorithm, is proposed; it treats the deadline as a crucial factor, since deadlines are a major concept in SLA negotiation in cloud computing to prevent penalties.
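
As a minimal illustration of a deadline-first policy, the sketch below keeps pending tasks in a priority queue ordered by deadline and dispatches the most urgent one first; the task fields and timings are invented for the example and are not the paper's exact scheduler.

```python
from dataclasses import dataclass, field
import heapq

# Minimal deadline-first sketch: pending cloudlets sit in a heap ordered by
# deadline, so the request closest to its deadline is dispatched first.
@dataclass(order=True)
class Task:
    deadline: float
    name: str = field(compare=False)
    length: float = field(compare=False)      # execution time on one VM

def deadline_first(tasks, start_time=0.0):
    heap = list(tasks)
    heapq.heapify(heap)                        # ordered by deadline
    clock, schedule = start_time, []
    while heap:
        task = heapq.heappop(heap)
        finish = clock + task.length
        schedule.append((task.name, clock, finish, finish <= task.deadline))
        clock = finish
    return schedule

jobs = [Task(9.0, "backup", 4.0), Task(3.0, "report", 2.0), Task(6.0, "index", 3.0)]
for name, start, end, met in deadline_first(jobs):
    print(f"{name}: {start}-{end}  deadline met: {met}")
```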

E. Iniya Nehru, Saswati Mukherjee, Abhishek Kumar
Countering Flood Attacks and Routing Misbehavior in DTN

Disruption-tolerant networks are prone to flood attacks, in which an attacker floods the network by injecting as many packets as possible, or replicas of packets, for selfish or malicious purposes. Selfish or malicious nodes may also drop received packets intentionally even when their buffers are not full. Such routing misbehavior reduces the packet delivery ratio and wastes system resources. The proposed approach detects flood attacks using a method called claim construction. To detect routing misbehavior, a node is required to keep a few signed contact logs of its previous contacts; each signed log records the number of packets in the node's buffer before that contact. Based on this contact log, the next contacted node can detect whether the node has dropped any packets.

J. Karunya Priyadarshini, P. Jesu Jayarin, J. Visumathi
Static ATC Estimation Using Fully Complex-Valued Radial Basis Function Neural Network

In deregulated power systems, the available transfer capability (ATC) must be known for secure and reliable operation. ATC mainly depends on the load for a particular transaction, and because of the complex nature of the load, it is desirable that the ATC estimator can handle this complexity. This paper presents a fully complex-valued radial basis function (FC-RBF) neural network approach for ATC estimation for bilateral transactions under normal conditions. The training patterns for the neural network are generated using a differential evolution algorithm (DEA). An important feature of the proposed method is the use of input reduction techniques to improve the performance of the developed network: a differential evolution feature selection (DEFS) technique is proposed to reduce the complexity and training time of the neural network. The proposed method is tested on the IEEE 118-bus system, and the results are compared with DEA and repeated power flow (RPF) results. The test results show that the proposed method reduces the training time and is suitable for online application.

M. Karuppasamypandiyan, R. Narmatha Banu, P. M. Manobalaa
Big Data and Cloud: A Survey

Today, the world has become closer due to the development of the Internet. More people communicate via the Internet, and the volume of data to be handled grows accordingly. Nowadays, we talk about peta- and zettabytes of data, and this volume needs to be processed and analyzed further, which has led to the research field of big data storage and analysis. Cloud computing is another emerging area, in which services such as infrastructure, storage, and software are provided to consumers on demand. In this paper, we discuss big data, cloud computing, and how big data are handled in a cloud computing environment.

K. S. Sangeetha, P. Prakash
An E-mail Application on Named Data Networking

Named data networking (NDN) is a network layer protocol that aims to replace the IP layer in the current Internet architecture. Currently, e-mails are sent from one mail server to another by identifying each server by its IP address. In this paper, we propose to develop an e-mail application that runs on NDN. The challenge in developing an e-mail application on NDN is to completely abandon the usage of IP addresses and concentrate on the ‘content’ to be sent or received. We intend to send a mail from sender’s computer to receiver’s computer over NDN. We propose to adapt the protocols used in e-mail transfer such as SMTP and POP to suit the characteristics of NDN.

Sandhya Giri, Anupama Josyula, V. Vetriselvi
Hierarchical Character Grouping and Recognition of Character Using Character Intensity Code

This paper presents an approach for grouping and recognizing handwritten characters using an efficient feature called the character intensity code (CIC). A hierarchical recognition methodology based on the structural details of the characters is adopted: at the first level, similarly structured characters are grouped together, and the second level is used for individual character recognition. A support vector machine is used for classification and achieves an overall accuracy of 93.61 %.

V. C. Bharathi, M. Kalaiselvi Geetha
Multi-view Face Expression Recognition—A Hybrid Method

Facial expressions play a significant role in human communication, and their automatic recognition has several applications, especially in human–computer interaction. Recognizing facial actions is very challenging due to uneven facial deformation, varying face pose angles, and ambiguous, uncertain face component measurements. This paper proposes a hybrid approach for face pose detection and facial expression recognition. To speed up the expression evaluation process, pose estimation is carried out prior to feature extraction to select the appropriate shape model. The major contribution of this paper is a hybrid classification method that uses AdaBoost for action unit classification and temporal rule-based classification for correcting action unit errors. The experimental results show that this hybrid classification method performs better than other classifiers, which improves the overall performance of the system.

Prakash Natarajan, Vijayalakshmi Muthuswamy
A Scheme to Create Secured Random Password Using Markov Chain

Typically, access to computer systems is based on the use of alphanumeric passwords. Passwords afford the foremost line of protection against illicit access to computers, and although password security is only one factor of overall system security, it is an essential component; passwords are regarded as the fragile link in computer security. Most users choose simple passwords, which are easy to memorize but equally easy to crack, and they often reuse the same password for different logins, making them vulnerable to various types of cyber attacks. To generate secure random passwords, this paper describes a new scheme for creating passwords using the Markov chain technique; the tree structure of password generation with the Markov chain is also specified.
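
The sketch below illustrates character-level Markov-chain password generation: the next character is drawn from a transition table keyed by the state of the current character. The transition table and states here are a tiny illustrative example, not the tree structure proposed in the paper.

```python
import secrets

# Minimal Markov-chain password sketch: the next character depends only on
# the state (vowel/consonant/digit) of the current character. The transition
# table below is an invented example for illustration.
TRANSITIONS = {
    "start": "abcdefghjkmnpqrstuvwxyz",
    "vowel": "bcdfghjkmnpqrstvwxyz23456789",
    "consonant": "aeiou23456789",
    "digit": "abcdefghjkmnpqrstuvwxyz",
}

def state_of(ch: str) -> str:
    if ch.isdigit():
        return "digit"
    return "vowel" if ch in "aeiou" else "consonant"

def markov_password(length: int = 12) -> str:
    ch = secrets.choice(TRANSITIONS["start"])
    pwd = [ch]
    for _ in range(length - 1):
        ch = secrets.choice(TRANSITIONS[state_of(ch)])
        pwd.append(ch)
    return "".join(pwd)

print(markov_password())      # e.g. a pronounceable-looking yet random string
```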

S. Vaithyasubramanian, A. Christy
Analysis of Shortest Route for Heterogeneous Node in Wireless Sensor Network

In this paper, we investigate in depth a routing model for heterogeneous nodes in wireless sensor networks using Voronoi cells. We estimate the actual traffic among the sensor nodes, defined as the traffic packets controlled at each server. The network load is monitored using traffic inbound rules, and the estimation is defined in a circular pattern in the form of a Voronoi cell. Functional traffic patterns are classified as source and destination under an asymptotic rule. Each sensor node's traffic is redirected to the centralized server, where the sensor data are patched periodically and the data packets traveling from node to node are updated. Traffic patterns and sensor nodes are classified, and the node communication regions are made known to the base station by drawing the Voronoi pattern. The experimental results show the actual working model, and our routing model yields 78 % accuracy.

L. Lakshmanan, T. C. Tomar
Design of a Triple Band Slit-loaded Patch Antenna with DGS for Bandwidth Enhancement

This paper presents the design and results of a low-profile, wideband, miniaturized slit-loaded microstrip patch antenna with defected ground structures (DGS) for wireless applications. The proposed antenna is coaxially probe-fed, operates at 5.5 GHz, and has an array of rectangular slits etched on the patch and a pair of square-shaped dumbbell DGS loaded into the ground plane. The bandwidth of the slit-loaded microstrip patch antenna is 12 %, increasing to 17 % when the DGS is loaded into the ground plane. The proposed compact antenna resonates at multiple frequencies in the 1–9 GHz range, at 4.84, 5.96, and 6.34 GHz, with bandwidths of 330, 150, and 460 MHz and return losses of −17.2, −18.5, and −27.3 dB, respectively. All antenna geometries in this paper are simulated using the electromagnetic simulation tool CADFEKO. The proposed multiband antenna finds useful applications, with improved performance, simultaneously in the W-LAN (4.8–5.8 GHz) and satellite communication (5.92–6.42 GHz) bands. Plots of antenna parameters such as return loss, voltage standing wave ratio (VSWR), gain, and bandwidth have been examined with and without DGS, and analysis of the simulated results shows that the proposed slit-loaded antenna with DGS is more efficient than conventional rectangular microstrip patch antennas (CRMPA) in terms of operating bandwidth and miniaturization.

Pristin K. Mathew, Sneha Mohan
Segmentation of Nuclei from Breast Histopathology Images Using PSO-based Otsu’s Multilevel Thresholding

Automated histopathology image analysis involves segmentation of nuclei from the surrounding tissue structures to develop a computer-aided diagnosis (CAD) system. In this paper, we propose the use of particle swarm optimization (PSO)-based Otsu’s multilevel thresholding technique to automatically segment the nuclei from hematoxylin and eosin (H&E)-stained breast histopathology images. Otsu’s threshold selection problem is modeled as an optimization problem by designating the discriminant criterion as the objective or fitness function that has to be maximized. PSO is used to compute the optimal threshold value that maximizes the objective function. This paper studies the effectiveness of the proposed technique to segment nuclei from breast histopathology images.
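
A minimal sketch of the PSO-driven Otsu formulation described above is given below: each particle encodes a vector of candidate thresholds, and the fitness is Otsu's between-class variance computed from the image histogram. The PSO parameters and the synthetic three-mode "image" are illustrative assumptions, not the paper's settings or data.

```python
import numpy as np

# Minimal PSO-based Otsu multilevel thresholding sketch.
rng = np.random.default_rng(0)

def between_class_variance(hist, thresholds):
    p = hist / hist.sum()
    levels = np.arange(len(hist))
    bounds = [0] + sorted(int(t) for t in thresholds) + [len(hist)]
    mu_total = (p * levels).sum()
    var = 0.0
    for lo, hi in zip(bounds[:-1], bounds[1:]):
        w = p[lo:hi].sum()
        if w > 0:
            mu = (p[lo:hi] * levels[lo:hi]).sum() / w
            var += w * (mu - mu_total) ** 2   # Otsu's between-class variance term
    return var

def pso_otsu(hist, n_thresholds=2, n_particles=20, iters=60):
    pos = rng.uniform(1, 254, size=(n_particles, n_thresholds))
    vel = np.zeros_like(pos)
    pbest = pos.copy()
    pbest_fit = np.array([between_class_variance(hist, x) for x in pos])
    gbest = pbest[pbest_fit.argmax()].copy()
    for _ in range(iters):
        r1, r2 = rng.random((2, n_particles, n_thresholds))
        vel = 0.7 * vel + 1.5 * r1 * (pbest - pos) + 1.5 * r2 * (gbest - pos)
        pos = np.clip(pos + vel, 1, 254)
        fit = np.array([between_class_variance(hist, x) for x in pos])
        improved = fit > pbest_fit
        pbest[improved], pbest_fit[improved] = pos[improved], fit[improved]
        gbest = pbest[pbest_fit.argmax()].copy()
    return sorted(int(t) for t in gbest)

# toy "stained tissue" intensities with three populations
img = np.concatenate([rng.normal(60, 10, 4000),
                      rng.normal(130, 12, 4000),
                      rng.normal(200, 10, 4000)]).clip(0, 255)
hist, _ = np.histogram(img, bins=256, range=(0, 256))
print("thresholds:", pso_otsu(hist))      # roughly separates the three modes
```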

J. Angel Arul Jothi, V. Mary Anita Rajam
Wavelet Transform-Based Land Cover Classification of Satellite Images

Detecting urban expansion from land cover remote sensing images is a challenge due to the complexity of urban landscapes. Initially, the original satellite image is preprocessed and then segmented into regions from different land classes, such as hilly regions, vegetation, built-up areas, and water bodies. Different feature extraction methods, namely first-order statistics, the gray-level co-occurrence matrix (GLCM), and a wavelet transform-based technique, are applied and their results compared. The features of the segmented areas are extracted, and the final classification is carried out using the proposed probabilistic neural network (PNN) classifier. The classified satellite image is obtained and compared with the original image. The proposed technique is evaluated by means of the accuracy parameter and produces better results using wavelet-transform first-order statistics combined with the PNN classifier.

D. Menaka, L Padma Suresh, S. Selvin Premkumar
Tamil Speech Recognizer Using Hidden Markov Model for Question Answering System of Railways

Research on speech and natural language has been in progress for more than two decades, and researchers have recently focused on developing speech interfaces to automated systems. A voice-based question answering system requires a speech recognizer, and although automatic speech recognition (ASR) systems already exist, most of them have been built for English. This paper aims to build a Tamil speech recognizer for a railway information question answering system using natural language processing. The most feasible approach to speech recognition so far has been the hidden Markov model (HMM), which is implemented in this research work; the HMM-based recognition component is trained automatically and is computationally practical to use. The recognition accuracy of the system using HMM reaches up to 85 % for different speakers.

G. Vignesh, S. Sankar Ganesh
Constrained Optimal Bidding Strategy in Deregulated Electricity Market

A precise and comprehensive model is designed for optimal power trading among a number of generating companies (GenCos) with thermal generating units. These GenCos offer different bidding prices to gain maximum profit, and there may be different categories of bidding situations. Possible bidding profile combinations, along with several constraints, are considered in the case studies: the power balance constraint covers generation, demand, and transmission loss, together with power generation limit and fuel cost constraints. Since scheduling is interconnected with bidding, economic power scheduling is solved using the Newton–Raphson method for the standard IEEE 9-bus test system, and the bidders' profits and the market price are calculated in each case. To optimize the power bidding problem, the theory of dominance from game theory is applied and justified for finding the Nash equilibrium.

D. Palit, N. Chakraborty
Metadata
Title
Artificial Intelligence and Evolutionary Algorithms in Engineering Systems
Edited by
L Padma Suresh
Subhransu Sekhar Dash
Bijaya Ketan Panigrahi
Copyright Year
2015
Publisher
Springer India
Electronic ISBN
978-81-322-2135-7
Print ISBN
978-81-322-2134-0
DOI
https://doi.org/10.1007/978-81-322-2135-7
