
2018 | Book

Progress in Intelligent Computing Techniques: Theory, Practice, and Applications

Proceedings of ICACNI 2016, Volume 2

Editors: Dr. Pankaj Kumar Sa, Dr. Manmath Narayan Sahoo, Dr. M. Murugappan, Dr. Yulei Wu, Dr. Banshidhar Majhi

Publisher: Springer Singapore

Book Series: Advances in Intelligent Systems and Computing

About this book

The book focuses on both theory and applications in the broad areas of communication technology, computer science and information security. This two-volume set contains the proceedings of the 4th International Conference on Advanced Computing, Networking and Informatics. It brings together academic scientists, professors, research scholars and students to share and disseminate knowledge and scientific research related to computing, networking, and informatics, and to discuss the practical challenges encountered and the solutions adopted. The book also promotes the translation of basic research into applied investigation and the conversion of applied investigation into practice.

Table of Contents

Frontmatter
Erratum to: A Browser-Based Distributed Framework for Content Sharing and Student Collaboration
Shikhar Vashishth, Yash Sinha, K. Haribabu
Erratum to: Progress in Intelligent Computing Techniques: Theory, Practice, and Applications
Pankaj Kumar Sa, Manmath Narayan Sahoo, M. Murugappan, Yulei Wu, Banshidhar Majhi

Cloud Computing, Distributed Systems, Social Networks, and Applications

Frontmatter
Review of Elasticsearch Performance Variating the Indexing Methods

In today’s world, data is increasing rapidly and users mostly turn to the internet for information. A recent study shows that most users reach other sites through a search engine, so search has become an inseparable part of internet use. Elasticsearch is a Java-based search engine that works efficiently in cloud environments. It mainly addresses the scalability, real-time search and efficiency requirements that relational databases were not able to meet. In this paper, we present our experience with Elasticsearch, an open-source, Apache Lucene-based, full-text search engine that offers near real-time search along with a RESTful API for simplicity of access to users in various fields such as education and research.

Urvi Thacker, Manjusha Pandey, Siddharth S Rautaray
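As a point of reference for the review above, the following minimal Python sketch shows how a document might be indexed and queried through Elasticsearch's RESTful API; the local endpoint, index name and field names are illustrative assumptions, not details taken from the chapter.

# Minimal sketch: index one document and run a full-text query against a local
# Elasticsearch node via its RESTful API. Index and field names are assumed.
import requests

ES = "http://localhost:9200"                     # assumed local Elasticsearch endpoint

requests.put(f"{ES}/articles")                   # create an index with default settings
requests.post(
    f"{ES}/articles/_doc?refresh=true",          # refresh so the doc is searchable at once
    json={"title": "Near real-time search",
          "body": "Elasticsearch is built on Apache Lucene."},
)

# Full-text match query; Elasticsearch returns ranked hits in near real time.
resp = requests.get(
    f"{ES}/articles/_search",
    json={"query": {"match": {"body": "lucene search"}}},
)
for hit in resp.json()["hits"]["hits"]:
    print(hit["_score"], hit["_source"]["title"])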
Understanding Perception of Cache-Based Side-Channel Attack on Cloud Environment

Multitenancy is the biggest advantage of cloud computing, where physical resources are shared among multiple clients. Virtualization facilitates multitenancy with the help of the hypervisor: cloud providers use it to virtualize resources such as the CPU, network interfaces, peripherals, hard drives, and memory. In a virtualized environment, many virtual machines (VMs) can run on the same core by sharing these resources. VMs running on the same core are targets for malicious attacks such as side-channel attacks, of which the cache-based attack is one example in the cloud. The cache is one of the resources shared among different VMs on the same core, and an attacker can exploit cache behavior to perform a cache-based side-channel attack on the victim. In this paper, we explore different types of cache designs, categories of cache-based side-channel attacks, and existing detection and mitigation techniques for such attacks.

Bharati S. Ainapure, Deven Shah, A. Ananda Rao
A Semantic Approach to Classifying Twitter Users

Social media has grown rapidly in the past several years. Twitter in particular has seen a significant rise in its user audience because of its short and compact Tweet format (140 characters). As more users come on board, it provides a large market for companies to advertise and find prospective customers by classifying users into different market categories. Traditional classification methods use TF–IDF and the bag-of-words concept as the feature vector, which is inevitably of large dimension. In this paper we propose improving classification by using semantic information to reduce the dimensionality of the feature vectors, and we validate the approach by feeding the resulting features into multiple learning algorithms and evaluating the results.

Rohit John Joseph, Prateek Narendra, Jashan Shetty, Nagamma Patil
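The chapter's semantic feature construction is not reproduced here; the sketch below only illustrates the baseline pipeline it improves upon, with TF-IDF features, a generic dimensionality-reduction step (truncated SVD as a stand-in) and one interchangeable learner. The tweets and market categories are toy placeholders.

# Sketch of a baseline pipeline: TF-IDF features, dimensionality reduction and a
# classifier. TruncatedSVD only stands in for "fewer dimensions"; it is not the
# semantic method proposed in the chapter.
from sklearn.pipeline import make_pipeline
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.decomposition import TruncatedSVD
from sklearn.linear_model import LogisticRegression

tweets = [
    "new smartphone camera review",   "cheap flight deals this weekend",
    "galaxy phone battery life tips", "best hotels for a summer holiday",
]
labels = ["tech", "travel", "tech", "travel"]        # toy market categories

model = make_pipeline(
    TfidfVectorizer(),                # high-dimensional bag-of-words weights
    TruncatedSVD(n_components=2),     # reduce the feature dimensionality
    LogisticRegression(),             # any learner can be swapped in here
)
model.fit(tweets, labels)
print(model.predict(["battery drains fast on my phone"]))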
Timeline-Based Cloud Event Reconstruction Framework for Virtual Machine Artifacts

Traditionally, scaling resources to meet the highly dynamic needs of consumers is a challenge for organizations; alongside cost, maintenance overheads and availability issues must also be considered. Cloud computing is a systematic solution that addresses all of these, and recent advancements in the cloud have also attracted many small and medium scale enterprises. However, the extent of security and privacy provided to tenants’ data is not apparent, and contemporary attacks on the cloud strengthen this argument. A reactive approach to handling an incident in the cloud is to perform forensics, but the domain of cloud forensics is still in its infancy. In mid-2014, the National Institute of Standards and Technology (NIST) released a draft describing various legal, organizational, architectural, and technical challenges of performing forensics in the cloud environment. In this paper, our focus is on one of the technical challenges, namely event reconstruction, and we consider cloud virtual machine artifacts to achieve it.

B. K. S. P. Kumar Raju, G. Geethakumari
eCloud: An Efficient Transmission Policy for Mobile Cloud Computing in Emergency Areas

Because of the limited resources and battery power of mobile devices, and the intermittent connectivity between mobile devices and the cloud, users may face serious difficulties, and very few works have been proposed to solve these problems. To select the best cloud for mobile devices and to minimize transmission latency under node mobility, this work proposes an efficient transmission policy for Mobile Cloud Computing (MCC) named eCloud. In eCloud, mobile nodes can select the best cloud for sending requests as they move. eCloud can address battlefield situations or emergency conditions such as earthquakes or terrorist attacks.

Bibudhendu Pati, Joy Lal Sarkar, Chhabi Rani Panigrahi, Shibendu Debbarma
A Multidimensional Approach to Blog Mining

Blogs are textual web documents published by bloggers to share their experience or opinion about particular topics, and they are frequently retrieved by readers in need of such information. Existing techniques for text mining and web document mining can be applied to blogs to ease blog retrieval, but they consider only the content of the blogs or the tags associated with them when mining topics. This paper proposes a Multidimensional Approach to Blog Mining, which defines a method to combine blog content and blog tags to obtain blog patterns. These blog patterns represent a blog better than blog content patterns or blog tag patterns alone, and they can be used either for blog clustering or by blog retrieval engines to compare against user queries. The proposed approach has been implemented and evaluated on real-world blog data.

K. S. Sandeep, Nagamma Patil
Medicinal Side-Effect Analysis Using Twitter Feed

As the use of social media networks increases, people tend to share health-related information on social sites. Twitter is used by a large number of users and is a rich source of information for analyzing drug-related side effects. In this paper, we develop an approach that analyzes the contents of tweets to identify the adverse effects of a drug. An annotated dataset is used to train an SVM classifier to identify tweets that mention medicinal side effects. The use of feature selection and dimensionality reduction techniques has allowed us to enhance the performance of the classifier, improving accuracy by 10.34% and efficiency by nearly 66.31% compared to previous similar approaches.

Priyanka S. Mane, Manasi S. Patwardhan, Ankur V. Divekar
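A hedged sketch of the general recipe described above, using TF-IDF features, chi-squared feature selection and a linear SVM; the tweets, labels and feature counts are toy placeholders, not the annotated dataset or the exact feature set used in the chapter.

# Sketch: text features, feature selection and an SVM for adverse-effect tweets.
from sklearn.pipeline import make_pipeline
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.feature_selection import SelectKBest, chi2
from sklearn.svm import LinearSVC

tweets = [
    "this drug gave me a terrible headache", "felt dizzy after the new medication",
    "the medication worked fine for me",     "picked up my prescription today",
]
labels = [1, 1, 0, 0]          # 1 = mentions an adverse effect, 0 = does not

clf = make_pipeline(
    TfidfVectorizer(ngram_range=(1, 2)),
    SelectKBest(chi2, k=10),   # keep only the most informative features
    LinearSVC(),
)
clf.fit(tweets, labels)
print(clf.predict(["severe nausea since starting this drug"]))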
A Study of Opinion Mining in Indian Languages

Opinion mining is an area of natural language processing concerned with determining the opinion conveyed in a document by a computer rather than through human intervention. This is extremely beneficial since it saves the resources required to inspect the document manually. Most research in this field is restricted to the English language, and opinion mining in Indian languages poses several challenges to researchers. This paper outlines some of the work done in this field for Indian languages and provides a brief description of each effort.

Diana Terezinha Miranda, Maruska Mascarenhas

Applications of Informatics

Frontmatter
A Pragmatics-Oriented High Utility Mining for Itemsets of Size Two for Boosting Business Yields

The retail market has grown at an enormous rate, sprawling its effect across nations. B2C companies put forward lucrative offers and schemes to attract customers in the hope of raising business profits, but often without a principled basis for choosing those offers. Knowledge discovery through data mining can be harnessed to achieve these profit benefits. This article proposes a novel way of determining which items to put on sale together, as logical clubs of items, by extending the Apriori algorithm. It proposes the high-utility mining for itemsets of size two (HUM-IS2) algorithm, which works on the transactional logs of superstores. Pruning strategies are introduced to remove unnecessary formation of clubs, and the effectiveness of the algorithm is demonstrated by experiments on various datasets.

Gaurav Gahlot, Nagamma Patil
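An illustrative Python sketch of the underlying bookkeeping, not the chapter's HUM-IS2 algorithm itself: compute the utility of every itemset of size two from a transactional log and keep the pairs whose utility crosses a minimum threshold. The items, profits and threshold are made up, and the pruning strategies are omitted.

# Utility of each item pair across a toy transactional log (quantity x unit profit).
from itertools import combinations
from collections import defaultdict

profit = {"bread": 2, "butter": 5, "milk": 3, "jam": 6}      # per-unit profit
transactions = [                                             # item -> quantity bought
    {"bread": 2, "butter": 1},
    {"bread": 1, "butter": 1, "milk": 2},
    {"milk": 1, "jam": 1},
]

pair_utility = defaultdict(int)
for t in transactions:
    for a, b in combinations(sorted(t), 2):
        # utility of the pair in this transaction = sum of the two item utilities
        pair_utility[(a, b)] += t[a] * profit[a] + t[b] * profit[b]

MIN_UTIL = 10
high_utility_pairs = {p: u for p, u in pair_utility.items() if u >= MIN_UTIL}
print(high_utility_pairs)      # candidate "clubs" of two items to put on sale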
ILC-PIV Design for Improved Trajectory Tracking of Magnetic Levitation System

This paper puts forward a hybrid control algorithm that integrates an iterative learning control (ILC) scheme with proportional integral velocity (PIV) control for improved trajectory tracking of a magnetic levitation system. ILC is a type of model-free controller used for systems that perform repetitive tasks. By adjusting the control inputs based on the error information obtained during previous iterations, ILC improves the transient response of the closed-loop system. One of its striking features is that, even without a full dynamic model of the plant, it can yield accurate trajectory tracking by learning the plant dynamics through iterations. Adopting this learning feature, this paper synthesizes ILC with PIV for both improved tracking and better robustness compared to conventional PIV. The efficacy of the proposed ILC-PIV controller is assessed through a simulation study on the magnetic levitation plant for a reference-following application.

Vinodh Kumar Elumalai, Joshua Sunder David Reddipogu, Santosh Kumar Vaddi, Gowtham Pasumarthy
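A minimal sketch of the learning idea described above: a P-type iterative learning control update, u_{k+1}(t) = u_k(t) + L*e_k(t+1), applied over repeated trials of a toy first-order plant. The maglev dynamics and the PIV inner loop from the chapter are not modelled; the plant, learning gain and reference are illustrative.

# P-type ILC on a toy plant: the control signal is refined trial after trial
# using the previous trial's tracking error.
import numpy as np

def plant(u):
    """Toy discrete first-order plant simulated over one trial."""
    y = np.zeros_like(u)
    for t in range(1, len(u)):
        y[t] = 0.9 * y[t - 1] + 0.5 * u[t - 1]
    return y

N, L = 100, 0.3                                  # trial length; gain small enough to converge
r = np.sin(np.linspace(0, 2 * np.pi, N))         # reference trajectory to track
u = np.zeros(N)                                  # control input refined across trials

for k in range(50):                              # repeated trials of the same task
    e = r - plant(u)                             # tracking error of trial k
    u[:-1] += L * e[1:]                          # u_{k+1}(t) = u_k(t) + L * e_k(t+1)

print("final RMS tracking error:", np.sqrt(np.mean((r - plant(u)) ** 2)))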
Performance Analysis and Optimization of Spark Streaming Applications Through Effective Control Parameters Tuning

High-speed data stream processing is in demand, and performance analysis and optimization of streaming applications are active research areas. Apache Spark is one of the most extensively used frameworks for in-memory data stream computing and is capable of handling high-speed data streams. In streaming applications, controlling and processing data streams for optimized and stable performance within the available resources is of utmost importance. Various parameters can be tuned to achieve the optimum performance of streaming applications deployed on Spark. This work explores the performance of streaming applications in the light of these tunable parameters and establishes a relationship between the performance response and the controlling parameters using linear regression. The regression model enables prediction of the performance response before the actual deployment of a streaming application. The work also determines an interrelationship between block interval and number of threads for optimized streaming performance.

Bakshi Rohit Prasad, Sonali Agarwal
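A sketch of the modelling step described above: fit a linear regression that relates tunable Spark Streaming parameters to an observed performance response, then use it to predict the response of an untried configuration. The parameter choices and measurements below are made-up placeholders, not the chapter's experimental data.

# Relate control parameters (batch interval, block interval, threads) to a
# performance response (processing delay) with ordinary linear regression.
import numpy as np
from sklearn.linear_model import LinearRegression

# columns: batch interval (ms), block interval (ms), worker threads
X = np.array([
    [1000,  50, 2], [1000, 100, 4], [2000,  50, 4],
    [2000, 200, 2], [4000, 100, 8], [4000, 200, 4],
])
y = np.array([820, 610, 540, 930, 450, 700])     # observed processing delay (ms)

model = LinearRegression().fit(X, y)
print("coefficients:", model.coef_, "intercept:", model.intercept_)

# Predict the response for a configuration before actually deploying it.
print("predicted delay:", model.predict([[3000, 150, 6]])[0])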
A Browser-Based Distributed Framework for Content Sharing and Student Collaboration

The utilization of networks in education systems has become increasingly widespread in recent years. WebRTC has been one of the hottest recent topics in Web technologies for distributed systems, as it enables peer-to-peer (P2P) connectivity between machines with higher reliability and better scalability, without the overhead of resource management. In this paper, we propose a browser-based, asynchronous framework for a P2P network using the distributed lookup protocol Chord, NodeJS and RTCDataChannel, which is scalable and lightweight. The design combines the advantages of P2P networks for better and more sophisticated education delivery. The framework enables students to share course content and discuss with fellow students without requiring any centralized infrastructure support.

Shikhar Vashishth, Yash Sinha, K. Haribabu
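To illustrate the Chord-style lookup underlying such a framework, here is a small Python sketch: node and content identifiers are hashed onto a ring, and a key is owned by its successor node. Peer names, the identifier space and the file name are placeholders; the WebRTC/NodeJS machinery is not shown.

# Chord-style responsibility lookup: hash peers and keys onto one ring and find
# the successor peer for a key.
import hashlib
from bisect import bisect_left

M = 2 ** 16                          # small identifier space for illustration

def chord_id(name: str) -> int:
    return int(hashlib.sha1(name.encode()).hexdigest(), 16) % M

peers = sorted(chord_id(p) for p in ["alice", "bob", "carol", "dave"])

def successor(key: str) -> int:
    """Return the identifier of the peer responsible for `key`."""
    k = chord_id(key)
    i = bisect_left(peers, k)        # first peer with id >= key
    return peers[i % len(peers)]     # wrap around the ring

print(successor("lecture-3-slides.pdf"))   # peer that should store/serve this content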
Seed Point Selection Algorithm in Clustering of Image Data

Massive amounts of data are being collected in almost all sectors of life due to recent technological advancements. Data mining tools, including clustering, are often applied to huge data sets to extract hidden and previously unknown information that can be helpful in future decision-making. Clustering is an unsupervised technique that separates data points into homogeneous groups. The seed point, the core of a cluster, is an important feature of a clustering technique, and the performance of seed-based clustering depends on the choice of initial cluster centers. Initial seed point selection is a challenging task because it must yield a good cluster partition with rapid convergence. In this work, we propose a seed point selection algorithm, applied to image data using the RGB features of color images as well as to 2D data, based on the maximization of Shannon’s entropy with a distance restriction criterion. The algorithm converges in a minimum number of steps and forms better clusters. We have applied it to different image data as well as discrete data, and the results appear satisfactory. We have also compared the results with other seed selection methods applied through the K-means algorithm, in terms of the number of iterations and CPU time.

Kuntal Chowdhury, Debasis Chaudhuri, Arup Kumar Pal
Comparative Analysis of AHP and Its Integrated Techniques Applied for Stock Index Ranking

Selection of a stock index is a crucial task in the financial decision-making process, especially when the selection criteria are conflicting in nature. Multicriteria decision-making (MCDM) methods such as the analytical hierarchy process (AHP) are among the most widely used methods that may be utilized in the financial domain. This paper applies AHP and its integrated approaches with the technique for order preference by similarity to ideal solution (TOPSIS) and simple additive weighting (SAW) to the ranking of stock indices. Three financial years of data for six indices with six criteria are considered in the selection process. Experimental results reveal that the S&P BSE SENSEX index performs consistently well across all three financial years for all the techniques.

H. S. Hota, Vineet Kumar Awasthi, Sanjay Kumar Singhai
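A worked Python sketch of the TOPSIS step used in such an integrated AHP-TOPSIS approach: rank alternatives by their closeness to the ideal solution. The decision matrix, the AHP-derived weights and the benefit/cost directions below are illustrative only, not the chapter's six indices and criteria.

# TOPSIS: normalise, weight, find ideal and anti-ideal points, rank by closeness.
import numpy as np

D = np.array([                    # rows: stock indices, columns: criteria
    [0.12, 1.8, 15.0],
    [0.10, 1.2, 18.0],
    [0.15, 2.1, 14.0],
])
w = np.array([0.5, 0.2, 0.3])             # criterion weights (e.g. from AHP)
benefit = np.array([True, False, True])   # True: larger is better

R = D / np.linalg.norm(D, axis=0)         # vector normalisation
V = R * w                                 # weighted normalised matrix
ideal = np.where(benefit, V.max(axis=0), V.min(axis=0))
anti  = np.where(benefit, V.min(axis=0), V.max(axis=0))

d_plus  = np.linalg.norm(V - ideal, axis=1)
d_minus = np.linalg.norm(V - anti,  axis=1)
closeness = d_minus / (d_plus + d_minus)
print("ranking (best first):", np.argsort(-closeness))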
A Proposed What-Why-How (WWH) Learning Model for Students and Strengthening Learning Skills Through Computational Thinking

In spite of the facilities and ambiance available in an organization, novice students often face challenges in developing the right attitude and framework for productive learning. Through a case study in our university, we explore some difficulties and challenges faced by freshers in strengthening and enhancing their learning skills. We focus on generating questions, from trivial to nontrivial, in a systematic way to explore learning patterns. We propose a learning model, which we call the What-Why-How (WWH) model, to provide a framework for strengthening learning skills. Computational thinking will be a fundamental skill that everyone can use in the future to strengthen and enhance learning; it is a thought process that involves formulating problems so that solutions can be represented as computational steps and algorithms. In our work, we integrate the computational thinking approach into the proposed WWH model of learning and develop a novel framework to resolve some of the challenges associated with the learning skills of freshers in educational institutions.

Rakesh Mohanty, Sudhansu Bala Das
Band Power Tuning of Primary Motor Cortex EEG for Continuous Bimanual Movements

Comprehending how natural intelligent systems exhibit bimanual coordination facilitates the process of building systems with better coordination dynamics. The dark side of bimanual coordination, ‘bimanual interference’, is that the execution of continuous bimanual movements is heavily constrained by spatiotemporal coupling. The ability of callosotomy patients to draw circle and square patterns simultaneously with the two hands, with perfect uncoupling after surgical removal of the corpus callosum, makes it clear that the corpus callosum plays a key role in bimanual interference. This paper introduces a new viewpoint on this phenomenon. While right-handed subjects drew asymmetric clockwise and anticlockwise circles and symmetric circle-square patterns, the neural activity of the primary motor cortex, which controls movement execution, was recorded. The major emphasis is on how the powers of different frequency bands of EEG signals from the primary motor cortex change with respect to different continuous bimanual movements. The results demonstrate the importance of understanding the feedback loop connections between the corpus callosum and the primary motor cortex.

Manikumar Tellamekala, Shaik Mohammad Rafi
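A sketch of the band-power computation implied above: estimate the power spectral density of a motor-cortex EEG channel with Welch's method and integrate it over standard frequency bands. The signal here is synthetic, and the sampling rate and band limits are assumptions rather than the chapter's recording setup.

# Band power per frequency band from a (synthetic) EEG channel via Welch's PSD.
import numpy as np
from scipy.signal import welch
from scipy.integrate import trapezoid

fs = 256                                                     # sampling rate (Hz), assumed
t = np.arange(0, 10, 1 / fs)
eeg = np.sin(2 * np.pi * 10 * t) + 0.5 * np.random.randn(t.size)   # toy channel

f, psd = welch(eeg, fs=fs, nperseg=fs * 2)

bands = {"theta": (4, 8), "alpha/mu": (8, 13), "beta": (13, 30)}
for name, (lo, hi) in bands.items():
    mask = (f >= lo) & (f < hi)
    power = trapezoid(psd[mask], f[mask])                    # integrate PSD over the band
    print(f"{name:9s} band power: {power:.3f}")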

Authentication Methods, Cryptography and Security Analysis

Frontmatter
Probabilistically Generated Ternary Quasigroup Based Stream Cipher

Crypto-research based on n-quasigroups for n = 3 or higher is presently at a nascent stage. The recent ternary quasigroup cipher was illustrated using ternary quasigroups of order 4, but in practice the ternary quasigroup needs to be of order 256: stream ciphers for real-world applications require a ternary quasigroup of this order. The present paper extends the ternary quasigroup stream cipher toward practical applicability by introducing the concept of a probabilistically generated quasigroup. A probabilistically generated ternary quasigroup of order 256 improves the cryptographic strength of the cipher. Ternary quasigroups are more desirable than ordinary quasigroups, but they impose serious memory constraints, particularly for large orders such as 256 or more. The current study dynamically generates 3-quasigroups of order 256 without any need to store them, and employs a selection criterion to choose suitable probabilistically generated ternary quasigroups with improved cryptographic strength.

Deepthi Haridas, K. C. Emmanuel Sanjay Raj, Venkataraman Sarma, Santanu Chowdhury
Dynamic Access Control in a Hierarchy with Constant Key Derivation Cost

While providing access control in a hierarchical access structure, a partially ordered set of security classes can be used to depict an access hierarchy. Data accessible to descendants of a particular security class should also be accessible to the users of that security class. Towards this, an access control scheme is proposed for providing dynamic hierarchical access control. In the proposed solution, the storage at the users is constant. The public key storage is equal to the size of the hierarchy. Also, deriving the decryption key of a descendant class involves constant cost at the users in the security class.

Nishat Koti, B. R. Purushothama
An Efficient LWE-Based Additively Homomorphic Encryption with Shorter Public Keys

Public key encryption schemes based on the learning with errors (LWE) problem have become popular for homomorphic encryption and are proven secure based on the worst-case hardness of short vector problems. Homomorphic encryption allows computations over the ciphertext without decryption. Implementations of such schemes are not considered practical because of their large public keys and large ciphertexts, which require huge space in cloud storage. However, several approaches have been proposed to shorten the ciphertexts and public keys of LWE-based homomorphic encryption schemes. The objective of this paper is to introduce an idea for shortening the public keys and ciphertexts in storage while still supporting homomorphic addition on the reduced ciphertexts. The paper presents a practical implementation of the standard LWE-based homomorphic addition operation on the reduced ciphertexts, along with the performance of the proposed scheme.

Ratnakumari Challa, VijayaKumari Gunta
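A toy Python sketch, not the chapter's construction, of how LWE-style ciphertexts support additive homomorphism: messages are scaled by Delta = q // t, and the coordinate-wise sum of two ciphertexts decrypts to the sum of the messages as long as the accumulated noise stays below Delta/2. The parameters are deliberately tiny and insecure.

# Additively homomorphic toy LWE: Enc(m) = (a, a.s + Delta*m + e) mod q.
import numpy as np

rng = np.random.default_rng(0)
n, q, t = 32, 2**16, 256                 # dimension, ciphertext and plaintext moduli
Delta = q // t
s = rng.integers(0, q, n)                # secret key

def encrypt(m):
    a = rng.integers(0, q, n)
    e = rng.integers(-3, 4)              # small noise
    b = (int(a @ s) + Delta * m + e) % q
    return a, b

def decrypt(ct):
    a, b = ct
    v = (b - int(a @ s)) % q
    return round(v / Delta) % t          # rounding removes the accumulated noise

c1, c2 = encrypt(7), encrypt(35)
c_sum = ((c1[0] + c2[0]) % q, (c1[1] + c2[1]) % q)   # homomorphic addition
print(decrypt(c_sum))                    # 42, provided the noise stays below Delta/2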
An Enhanced Remote User Authentication Scheme for Multi-server Environment Using Smartcard

Authentication is required to permit authorized users to access resources and to restrict unauthorized users from accessing any legal resources. Earlier schemes addressed security-related issues for a single-server environment, but nowadays more than one server provides services to users, so authentication protocols for the multi-server domain are used in real-time applications. The authentication scheme proposed by Lee et al. is susceptible to various attacks, namely forgery and server spoofing, and also fails to provide mutual authentication properly. Li et al. overcame the flaws of Lee et al.’s scheme, but unfortunately their scheme is not secure against forgery and replay attacks. To address these weaknesses and enhance security, we propose a more practical scheme for authenticating a remote user in an environment consisting of multiple servers. We use a smart card and the dynamic identity of the user to fulfil all the requirements of the multi-server architecture. The server requires no password table for verifying user credentials, and the password can be chosen freely by the user. Furthermore, performance analysis shows that our scheme provides comparatively high performance.

Ashish Kumar, Hari Om
A Proposed Bucket Based Feature Selection Technique (BBFST) for Phishing e-Mail Classification

Phishing e-mail is a common problem faced by e-mail users nowadays: it is an attempt to acquire sensitive information such as passwords and credit card details by sending malicious e-mail to users. Classifying such e-mail is necessary to protect users from harmful activities. This paper develops a classification model with the help of a new feature selection technique (FST), known as the bucket-based feature selection technique (BBFST), in combination with C4.5. As the name suggests, this FST removes features one by one from the original feature space of the phishing e-mail data and places them into three buckets according to their importance, as relevant, less relevant and irrelevant features, creating a new feature subset. The C4.5 classification technique is then applied to the data with the new feature subsets and compared with existing FSTs. The results reveal that BBFST is superior to existing FSTs, achieving 99.008% accuracy with 12 features of the phishing e-mail data.

H. S. Hota, Akhilesh Kumar Shrivas, Rahul Hota
A Novel Security Mechanism in Symmetric Cryptography Using MRGA

Cryptography is a primary requirement in virtually every area; it is used to secure data and the communication between two parties. Some organizations have large data sets and some have small ones, and sometimes large data needs low security while small data needs high security. For this purpose, various symmetric and asymmetric algorithms such as DES, 3DES, AES, BLOWFISH, IDEA and RSA are used for encryption and decryption, and their performance and throughput are measured in terms of speed, time, and memory. The proposed algorithm uses a novel security mechanism for increased security and throughput, combining arrays, some arithmetic and logical operations, and the Magic Rectangle Generation Algorithm (MRGA). The MRGA table is of size 16 × 24. Finally, we perform encryption and decryption using MRGA and compare its throughput for different database sizes.

Bhoomika Modi, Vinitkumar Gupta
Techniques for Enhancing the Security of Fuzzy Vault: A Review

Biometric systems are personal identification systems that use the behavioral and physiological characteristics of a person. One of the main concerns in biometric systems is template security. The fuzzy vault, a bio-cryptosystem, is used to provide security to the stored templates. The fuzzy vault has proven to be a very good security technique; nonetheless, it lacks revocability and security against correlation attacks. To enhance the security of the fuzzy vault and to overcome the limitation of correlation attacks, techniques such as the hybrid model and multimodal biometrics can be used. This paper reviews these techniques, viz. hybrid and multimodal, and discusses how they can be effective in enhancing the security of the system.

Abhay Panwar, Parveen Singla, Manvjeet Kaur
An Efficient Vector Quantization Based Watermarking Method for Image Integrity Authentication

This paper presents a two-stage watermarking technique for image authentication that exploits the advantages of vector quantization (VQ). In the present algorithm, a robust watermark and a semifragile watermark are embedded independently into the VQ-compressed image in two successive stages. The robust watermark and VQ enhance the security of the system by providing double protection to the designed system. A quantitative threshold approach using the pixels surrounding an error pixel is suggested for classifying attacks as acceptable or malicious. Experimental results demonstrate the capability of the method to classify attacks and correctly locate tampered regions; tampering can be detected and localized with very high sensitivity. The present scheme outperforms previous algorithms, distinguishes malicious tampering from acceptable changes, and localizes tampered regions accurately.

Archana Tiwari, Manisha Sharma
XSS Attack Prevention Using DOM-Based Filter

Cross-site scripting (XSS) is one of the most critical vulnerabilities found in web applications. An XSS vulnerability is present in a web application that takes untrusted data and sends it to a web browser without proper input validation. An XSS attack allows the adversary to execute scripts in the victim’s browser, which can deface web sites, hijack user sessions, or redirect the user to malicious content. Some of the proposed defenses against XSS attacks use regular expressions to identify the presence of malicious content; however, these can be bypassed using parsing quirks, as can client-side filtering mechanisms such as the NoScript and Noxes tools. The existing solutions are comparatively slow and cannot withstand all attack vectors, while some are too restrictive and result in loss of functionality. In this paper, an API for server-side response filtering has been developed. The proposed method allows HTML to pass through but blocks harmful scripts, and unlike other approaches it requires only a minor modification to the existing web application. The performance evaluation shows that the proposed technique has high fidelity and comparatively low response time.

Asish Kumar Dalai, Shende Dinesh Ankush, Sanjay Kumar Jena
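A hedged Python sketch in the spirit of the filter described above, not the chapter's API: HTML passes through, but script elements and inline event-handler attributes are dropped before the response is sent. The class and function names are hypothetical.

# Minimal server-side response filter: keep markup, strip <script> and on* handlers.
from html.parser import HTMLParser

class ScriptStripper(HTMLParser):
    BLOCKED = {"script"}

    def __init__(self):
        super().__init__()
        self.out, self._skip = [], 0

    def handle_starttag(self, tag, attrs):
        if tag in self.BLOCKED:
            self._skip += 1
            return
        if self._skip:
            return
        safe = " ".join(k if v is None else f'{k}="{v}"'
                        for k, v in attrs
                        if not k.lower().startswith("on"))   # drop onclick, onload, ...
        self.out.append(f"<{tag}{' ' + safe if safe else ''}>")

    def handle_endtag(self, tag):
        if tag in self.BLOCKED:
            self._skip = max(0, self._skip - 1)
        elif not self._skip:
            self.out.append(f"</{tag}>")

    def handle_data(self, data):
        if not self._skip:
            self.out.append(data)

def filter_response(html: str) -> str:
    p = ScriptStripper()
    p.feed(html)
    return "".join(p.out)

print(filter_response('<p onclick="steal()">Hi <b>there</b></p><script>alert(1)</script>'))
# prints: <p>Hi <b>there</b></p>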

Big Data and Recommendation Systems

Frontmatter
Friendship Recommendation System Using Topological Structure of Social Networks

The popularity and importance of recommendation systems are increasing day by day in both the commercial and research communities. Social networks (SNs) like Facebook, Twitter, and LinkedIn draw particular attention since a large number of connections are established without any previous knowledge. The creation of relationships between users is the key feature of a social network, so it is important for researchers to look for new ways of providing more relevant recommendations. This paper proposes two algorithms for recommending a new friend in online social networks: the first is based on the number of mutual friends, and the second on an influence score. These recommendation algorithms use collaborative filtering, the idea behind many recommendation services (e.g., Facebook recommends friends, Netflix suggests movies, Amazon recommends products). The results and analysis indicate that influence-based recommendation is more accurate than mutual friend-based recommendation. The proposed algorithms can be used in the development of an effective social networking or e-commerce site, thereby providing a better experience to users.

Praveen Kumar, G. Ram Mohana Reddy
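A minimal Python sketch of the first idea above: recommend the non-friends who share the most mutual friends with a given user. The toy friendship graph is illustrative, and the influence-score variant from the chapter is not reproduced here.

# Mutual-friend recommendation over a small undirected friendship graph.
from collections import Counter

friends = {
    "asha":  {"bala", "chen", "dev"},
    "bala":  {"asha", "chen"},
    "chen":  {"asha", "bala", "esha"},
    "dev":   {"asha", "esha"},
    "esha":  {"chen", "dev"},
}

def recommend(user, k=3):
    scores = Counter()
    for friend in friends[user]:
        for fof in friends[friend]:                     # friends of friends
            if fof != user and fof not in friends[user]:
                scores[fof] += 1                        # one more mutual friend
    return scores.most_common(k)

print(recommend("bala"))   # e.g. [('dev', 1), ('esha', 1)]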
Personalized Recommendation Approach for Academic Literature Using High-Utility Itemset Mining Technique

As the size of digital academic libraries increases enormously, it has become arduous for researchers to identify papers of interest in these repositories. This has drawn researchers’ attention toward the implementation of recommender systems (RS) in the academic literature domain. Content-based and collaborative filtering-based techniques, when applied in this domain, fail to reflect the researcher’s personalized preferences in terms of recentness, popularity, etc. This article presents a personalized recommendation approach for academic literature based on the high-utility itemset mining (HUIM) technique. The approach uses the content of a paper along with the user’s personalized preferences to make recommendations. We utilize a highly efficient HUIM algorithm, EFIM, which has recently been introduced in the literature, to mine the papers having higher utility to the user. Experimental evaluation shows that our work satisfies the researcher’s personalized requirements and also outperforms existing personalized research paper recommender systems in terms of time and space complexity.

Mahak Dhanda, Vijay Verma
An Estimation of User Preferences for Search Engine Results and its Usage Patterns

An exhaustive understanding of user preferences can be applied to web page ranking, web search personalization, and search engine adaptation. The most important use of understanding how people use search engines, and what they want from them, is the immense scope it creates for system improvement, that is, evolving search engines to constantly exceed user expectations. Various approaches, such as creating user profiles, saving logs of user search patterns, and evaluating users’ browsing behavior, have been used to determine user preferences. This research work aims to determine different aspects of user preferences with respect to search engines. The authors present a method to analyze and evaluate user preferences for search engines based on an experiment conducted on a sample of 120 working professionals employed in various domains such as software companies, law firms, banks, educational institutes, and government.

Nidhi Bajpai, Deepak Arora
A Comparative Analysis of Various Spam Classifications

Bandwidth, time, and storage space are the three major assets in the computational world. Spam emails affect all three and thus degrade the overall efficiency of the system, and spammers keep using new tricks and traps to land these frivolous mails in our inboxes. To make mailboxes more intelligent, our effort is to devise a new algorithm that helps classify emails in a smarter and more efficient way. This paper analyzes various spam classification techniques and thereby puts forward a new way of classifying spam emails, thoroughly comparing the results that various authors have obtained while simulating their architectures. Our classification approach works efficiently and more accurately on datasets of varied length and type during the training and testing phases. We try to minimize the error ratio and increase classifier efficiency by implementing a genetic algorithm.

Nasir Fareed Shah, Pramod Kumar
Review Spam Detection Using Opinion Mining

Nowadays, with the increasing popularity of the Internet, online marketing is becoming more and more popular because a lot of products and services are easily available online. Hence, reviews of these products and services are very important for customers as well as organizations. Unfortunately, driven by the desire for profit or promotion, fraudsters produce fake reviews. These fake reviews prevent customers and organizations from reaching accurate conclusions about the products, so such review spam must be detected and eliminated to prevent the deception of potential customers. In this paper, we apply supervised learning techniques to detect review spam. The proposed work uses different sets of features along with a sentiment score to build models, and their performance is evaluated using different classifiers.

Rohit Narayan, Jitendra Kumar Rout, Sanjay Kumar Jena
Review Spam Detection Using Semi-supervised Technique

Today, because of the popularity of e-commerce sites, spammers have made these sites their target for review spam, in addition to other spam such as email spam or web spam. Fake reviews written by fraudsters prevent customers and organizations from reaching accurate conclusions about the products; hence, they must be detected and eliminated to prevent the deception of potential customers. In this paper, we use a semi-supervised learning technique to detect review spam. The proposed work is based on the PU-learning algorithm, which learns from very few positive examples and an unlabeled data set. The maximum accuracy achieved is 78.12% with an F-score of 76.67, using only 80 positive examples as the training set.

Rohit Narayan, Jitendra Kumar Rout, Sanjay Kumar Jena
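A hedged sketch of a generic two-step PU-learning loop in the spirit of the approach above, not the exact algorithm, features or dataset from the chapter: treat unlabeled reviews as negatives, extract reliable negatives with a first classifier, then retrain on positives plus reliable negatives. The reviews are toy placeholders.

# Two-step PU-learning sketch with TF-IDF features and naive Bayes.
import numpy as np
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.naive_bayes import MultinomialNB

positive = ["best product ever buy now", "amazing amazing must buy five stars"]
unlabeled = [
    "works as described, decent value",
    "battery died after two weeks",
    "unbelievable deal buy it now best best",
    "packaging was damaged but item is fine",
]

docs = positive + unlabeled
y0 = np.array([1] * len(positive) + [0] * len(unlabeled))   # step 1: treat U as negative

vec = TfidfVectorizer()
X = vec.fit_transform(docs)
step1 = MultinomialNB().fit(X, y0)

# Step 2: keep as "reliable negatives" the unlabeled reviews the first model
# scores as least spam-like, then retrain on positives + reliable negatives.
proba_u = step1.predict_proba(X[len(positive):])[:, 1]
reliable_neg = np.argsort(proba_u)[: len(unlabeled) // 2]

keep = list(range(len(positive))) + list(len(positive) + reliable_neg)
final = MultinomialNB().fit(X[keep], np.array([1] * len(positive) + [0] * len(reliable_neg)))
print(final.predict(vec.transform(["buy now best price ever"])))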

Emerging Techniques in Computing

Frontmatter
Dimensionality Reduction Using Decision-Based Framework for Classification: Sky and Ground

In this paper, we address the problem of dimensionality reduction for classification, which is challenging when the dimension of the data is high. We propose a decision-based framework for dimensionality reduction that uses a confidence factor as an evaluation measure for generating a relevant feature subset for a specific target. A confidence factor is generated for every feature competent for classification, using evidence parameters. The evidence parameters are computed from the intersection of classes in the distribution of the feature vector and the distance between the peaks of that distribution, and are combined using the Dempster-Shafer combination rule. We demonstrate the results of the proposed framework for sky and ground classification using various datasets. Classification in the low-dimensional space is performed while retaining the classification accuracy and reducing computational time.

Ramesh Ashok Tabib, Ujwala Patil, T. Naganandita, Vinita Gathani, Uma Mudenagudi
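A worked Python sketch of Dempster's rule of combination as referenced above, for a frame of discernment with the two classes sky and ground; 'both' denotes the ignorance mass assigned to {sky, ground}. The mass values attributed to the two evidence parameters are illustrative, not the chapter's computed evidence.

# Dempster's rule for the frame {sky, ground}; 'both' is the ignorance mass.
def combine(m1, m2):
    frame = ("sky", "ground", "both")
    joint = {k: 0.0 for k in frame}
    conflict = 0.0
    for a in frame:
        for b in frame:
            p = m1[a] * m2[b]
            if a == b:
                joint[a] += p
            elif a == "both":
                joint[b] += p
            elif b == "both":
                joint[a] += p
            else:                       # {sky} and {ground} do not intersect
                conflict += p
    return {k: v / (1.0 - conflict) for k, v in joint.items()}

m_overlap  = {"sky": 0.6, "ground": 0.1, "both": 0.3}   # from class-overlap evidence
m_distance = {"sky": 0.5, "ground": 0.2, "both": 0.3}   # from peak-distance evidence
print(combine(m_overlap, m_distance))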
A Decision Tree-Based Middle Ware Platform for Deploying Fog Computing Services

Cloud computing has become a cost-effective and reliable distributed computing model in which end users share IT resources from a pool of computational resources based on their real-time demand, with rapid provisioning and release of resources requiring minimal effort. While cloud computing offers cost-effective access to sophisticated hardware and software, the turnaround time is a hindrance because of unreliable physical-layer connectivity, especially for businesses located in remote areas. Not all business solutions need cloud services to the same extent at all times, as the locally available IT resources may be sufficient to handle a task such as performing analytics on locally stored data; fog computing and grid computing help in achieving this. Fog computing aims to bring computational power to the edge of the network, thereby reducing operational cost and execution time at the cost of accuracy of results. However, a middleware is required to determine whether an information query needs to be executed in the cloud or can be executed on a group of computers that are geographically closer to the business. In this paper, we propose and develop an intelligent middleware platform based on decision trees to optimize the execution of information queries.

Mahesh Sunkari, Raghu Kisore Neelisetti
Development of 3D FULL HD Endoscope Capable of Scaling View of the Selected Region

The stereo vision ability of human beings provides the surgeon with depth perception and thus allows for organ localization in the human body. In this work, we propose an approach to developing a 3D prototype for stereo endoscopes. Stereo calibration and rectification have been implemented in order to nullify the effects of lens distortion. The processed output is in interlaced format and can be visualized in 3D on a passively polarized monitor using polarized glasses. The proposed system also provides snapshot, video write, and retrieval in individual left and right streams along with a live process display. The scaling feature provides a detailed view of the selected region of interest.

Dhiraj, Priyanka Soni, Jagdish Lal Raheja
An $\ell_1$-Norm Based Optimization Approach for Power Line Interference Removal in ECG Signals

Accurate analysis and proper interpretation of electrophysiological recordings such as the ECG is a real necessity in medical diagnosis. The presence of artifacts and other noise can corrupt ECG signals and lead to improper disease diagnosis. Power line interference (PLI) occurring at 50/60 Hz is a major source of noise corrupting ECG signals, which motivates its removal as a foremost preprocessing task in ECG signal analysis. In this paper, we present an $\ell_1$-norm based optimization approach for PLI removal in ECG signals. The sparsity-inducing property of the $\ell_1$ norm is used for efficient removal of power noise. The effectiveness of this approach is evaluated on ECG signals corrupted with power line interference and random noise.

Neethu Mohan, S. Sachin Kumar, K. P. Soman
Exploration of Many-Objective Feature Selection for Recognition of Motor Imagery Tasks

Brain–computer interfacing helps create a communication pathway between the brain and an external device such that the biological modality of performing the task can be bypassed. This necessitates fast and reliable decoding of brain signals, in which feature selection plays a crucial role. The literature shows improved performance of left/right motor imagery signal classification with many-objective feature selection, where several classification performance metrics are maximized to obtain a good-quality feature set. This work analyses the classification performance while varying the feature dimension and the number of objectives. A recent many-objective optimization algorithm coupled with objective reduction, viz. $\alpha$-DEMO, has been used to model feature selection as an optimization problem with six objectives. The results obtained in this work have been statistically validated using the Friedman test.

Monalisa Pal, Sanghamitra Bandyopadhyay
Euler-Time Diagrams: A Set Visualisation Technique Analysed Over Time

Over the years, a large amount of data has been generated in almost all fields, such as engineering, medicine, bioscience and social networks. Visualising set relationships for large data is always a challenge, especially when the data evolves over time, and no existing tool supports set visualisation represented over time. We address this gap by developing a novel visual method and a software tool to generate Euler-time diagrams, which represent set relations with respect to time. The idea comes from two well-known visualisations: Euler diagrams, which represent set relations, and time-series, which represent a sequence of events happening over time; we merged the two ideas to spark the novelty. Pattern discovery plays an important role not only in graph analysis but also in set analysis. In this paper, we take a case study from the World Health Organisation (WHO), which constantly tries to understand relationships between the various diseases people are affected by over a period of time. This motivated us to develop the set relationship time tool, considering two levels, aggregation and relationships, using data-driven documents (D3) and the Google developer tool kit. This prototype tool can be enhanced by considering gestalt principles, topological properties, and perceptual and cognitive theories, which will help in analysing and interpreting data efficiently.

Mithileysh Sathiyanarayanan, Mohammad Alsaffar
A Framework for Goal Compliance of Business Process Model

In this paper, we propose a framework toward formal representation and validation of goal compliance for a business process model. All the tasks, postconditions, constraints, and goals are captured using first-order logic (FOL). We have used theorem prover (Prover9) for goal entailment. An experimental validation for goal compliance is presented considering a use case on health care domain. We start with an exhaustive solution space of all possible business process models for all possible activities on a particular domain and derive a reduced solution space of goal complied process models.

Dipankar Deb, Nabendu Chaki
Comparative Analysis of Adaptive Beamforming Techniques

Adaptive beamforming (ABF) techniques produce higher gain in the user directions and lower gain in the interferer directions by calculating the excitation weights; they try to reduce the difference between the desired and actual signal and maximize the signal-to-interference ratio (SIR). In a severe interference environment where the actual signal is weak, the effect of SIR on the radiation pattern needs to be considered. This paper describes the effect of the signal-to-interference ratio on different adaptive beamforming techniques, namely the non-blind least mean square (LMS) algorithm and evolutionary particle swarm optimization (PSO). The performance and validation of the beamforming algorithms are studied through MATLAB simulation by varying the SIR parameter for the desired and interference directions. The different weights obtained by these beamforming algorithms optimize the radiation pattern. The parameters for comparison are main beam and null placement, keeping the signal-to-noise ratio (SNR) constant for the specified user and interferer.

Smita Banerjee, Ved Vyas Dwivedi
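A Python sketch of the non-blind LMS beamformer mentioned above for a uniform linear array: the weights are adapted from a known reference signal so that the desired direction is passed and the interferer direction is suppressed. The array geometry, signal model and step size are illustrative assumptions, and the chapter's MATLAB/PSO study is not reproduced.

# LMS adaptive beamforming on a simulated uniform linear array.
import numpy as np

rng = np.random.default_rng(1)
M, N = 8, 2000                          # array elements, snapshots
d, mu = 0.5, 0.01                       # element spacing (wavelengths), LMS step size

def steering(theta_deg):
    theta = np.deg2rad(theta_deg)
    return np.exp(-2j * np.pi * d * np.arange(M) * np.sin(theta))

s = np.exp(1j * 2 * np.pi * 0.05 * np.arange(N))            # desired (reference) signal
i = rng.standard_normal(N) + 1j * rng.standard_normal(N)    # interferer waveform
noise = 0.1 * (rng.standard_normal((M, N)) + 1j * rng.standard_normal((M, N)))

X = np.outer(steering(10), s) + np.outer(steering(-40), i) + noise   # array snapshots

w = np.zeros(M, dtype=complex)
for n in range(N):                      # LMS: w <- w + mu * conj(e) * x, per snapshot
    x = X[:, n]
    e = s[n] - np.conj(w) @ x           # error against the known reference
    w = w + mu * np.conj(e) * x

# After adaptation the pattern should show higher gain toward +10 deg (user)
# than toward -40 deg (interferer).
angles = np.arange(-90, 91)
pattern = np.array([abs(np.conj(w) @ steering(a)) for a in angles])
print("gain at +10 deg:", pattern[angles == 10][0], "gain at -40 deg:", pattern[angles == -40][0])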

Internet, Web Technology, Web Security, and IoT

Frontmatter
Securing an External Drive Using Internet with IOT Concept

In the computer world, data security is the main issue, and many attacks and techniques are used to steal data. Beyond protection on the internet, offline storage such as hard disks should also be secured so that, in case of theft, user information cannot be accessed. Until now, third-party software has been available for this purpose, but it is easily cracked. To secure a hard disk so that an attacker cannot obtain information from the device, a small chip is mounted inside the hard disk; it is always linked to the user’s email and continuously sends the location and information of the attached machine. It also provides double password security and can receive messages from the linked email account to block or encrypt the data on the disk. This approach applies the IoT concept to securing data on a hard disk.

Rajneesh Tanwar, K. Krishnakanth Gupta, Purushottam Sharma
An Intelligent Algorithm for Automatic Candidate Selection for Web Service Composition

Web services have become an important enabling paradigm for distributed computing. Some deterrents to the continued popularity of the web service technology currently are the nonavailability of large-scale, semantically enhanced service descriptions and limited use of semantics in service life cycle tasks like discovery, selection, and composition. In this paper, we outline an intelligent semantics-based web service discovery and selection technique that uses interfaces and text description of services to capture their functional semantics. We also propose a service composition mechanism that automatically performs candidate selection using the service functional semantics, when one web service does not suffice. These techniques can aid application designers in the process of service-based application development that uses multiple web services for its intended functionality. We present experimental and theoretical evaluation of the proposed method.

Ashish Kedia, Ajith Pandel, Adarsh Mohata, S. Sowmya Kamath
A Quality-Centric Scheme for Web Service Ranking Using Fuzzified QoS Parameters

Service composition, the process of combining already available basic services to provide a new, enhanced functionality, helps in serving diverse user requirements and promotes rapid application deployment. One of the premises for achieving service composition is to consider the quality of service parameters like availability, response times etc., of the constituent services, so that effective ranking can be obtained. However, based on user need, multiple criteria may need to be considered during QoS-based ranking, due to which it may be difficult to provide accurate and precise values with respect to a particular QoS parameter. In this paper, we address this problem by incorporating the theory of fuzzy logic using fuzzy variables. We propose a new scheme that focuses on computing the combined values of various QoS parameters, for enhancing web service recommendation. The proposed scheme has been applied to the real-world datasets, with encouraging results.

Mandar Shaha, S. Sowmya Kamath
Enhancing Web Service Discovery Using Meta-heuristic CSO and PCA Based Clustering

Web service discovery is one of the crucial tasks in service-oriented applications and workflows. For a targeted objective to be achieved, it is still challenging to identify all appropriate services from a repository containing diverse service collections. To identify the most suitable services, it is necessary to capture service-specific terms that comply with a service’s natural language documentation. Clustering the available Web services by domain, based on functional similarities, would enhance a service search engine’s ability to recommend relevant services. In this paper, we propose a novel approach for automatically categorizing the Web services available in a repository into functionally similar groups. Our approach is based on the meta-heuristic Cat Swarm Optimization (CSO) algorithm, further optimized by the Principal Component Analysis (PCA) dimension reduction technique. Experimental results show that the proposed approach enhanced the service discovery process when compared to traditional approaches.

Sunaina Kotekar, S. Sowmya Kamath
Advancement in Personalized Web Search Engine with Customized Privacy Protection

Technologies are blooming, needs are growing, and ever larger amounts of user data are being aggregated, so privacy becomes a matter of concern in this fast-paced, technology-driven environment. People rely on the Internet for almost everything they work on or experience, yet web search engines sometimes confuse us with mixed results: different people may have different requirements, but search engines return the same results for the same queries to different people. In this paper, we intend to solve this problem by generating online user profiles before any query is fired. The user profile stores the user’s details, and the search engine displays results according to this generated profile. We use collaborative filtering and a ranking function to filter pages according to the preferences of the user. We also add a feature that lets users control their degree of privacy through two friendly buttons, “Private” and “Public”, which decide whether the user wants to share his details with other users or not. A combination of personalization and privacy would surely be of good use to Internet users.

Jeena Mariam Saji, Kalyani Bhongle, Sharayu Mahajan, Soumya Shrivastava, Ashwini Jarali

Research on Optical Networks, Wireless Sensor Networks, VANETs, and MANETs

Frontmatter
Traffic Classification Analysis Using OMNeT++

There has been much research on effective monitoring and management of network traffic, where the large volume of internet traffic requires more accurate and efficient traffic classification methods and approaches, with the aim of improving network performance. In our research, we introduce packet classification for IP traffic analysis with a simple technique that relies on a prototype classifier built in OMNeT++ (Objective Modular Network Testbed in C++), which opens a new possibility for online classification focusing on application detection in the absence of payload information. We evaluate our novel IATP (inter-arrival time and precision) clustering algorithm with the help of the OMNeT++ scheduler for classification of network traffic. The analysis is based on a measure combining inter-arrival time and precision, which was able to distinguish a small set of different clusters fairly well. With our implementation over a range of flow attributes, the simulation results demonstrate 100% accuracy in classifying packets, but the same level of accuracy is not reached by a real-time traffic classifier, which operates under certain constraints; accuracy for real-time traffic normally varies from 80 to 95% and depends on the type of each application. Further study and heuristics are required to develop better methodologies for detecting applications from real-time traffic measurements.

Deeraj Achunala, Mithileysh Sathiyanarayanan, Babangida Abubakar
Irregular-Shaped Event Boundary Estimation in Wireless Sensor Networks

In a wireless sensor network (WSN), sensor nodes are deployed to monitor a region. When an event occurs, it is important to detect and estimate the boundary of the affected area and to gather the information at the sink node in real time. If all the affected nodes are allowed to send data, congestion may occur, increasing path delay and exhausting the energy of the nodes that forward a large number of packets. Hence, it is a challenging problem to select a subset of affected nodes, and allow only them to forward their data, so as to define the event region boundary while satisfying the precision requirement of the application. Given a random uniform node distribution over a 2-D region, this paper proposes three simple localized methods to estimate the event boundary, based on the local convex hull, the minimum enclosing rectangle, and the angle of arrival of the signal, respectively. Simulation studies show that the angular method performs significantly better in terms of area estimation accuracy and number of nodes reporting, even for sparse networks.

Srabani Kundu, Nabanita Das, Sasanka Roy, Dibakar Saha
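A Python sketch of the convex-hull idea above: given the coordinates of the nodes that detected the event, estimate the event boundary (and its area) from their convex hull, so that only the hull nodes need to report. The node deployment and event region are randomly generated placeholders, and the localized, distributed aspects of the proposed methods are not modelled here.

# Boundary estimate from the convex hull of the affected sensor nodes.
import numpy as np
from scipy.spatial import ConvexHull

rng = np.random.default_rng(7)
nodes = rng.uniform(0, 100, size=(500, 2))            # sensor field, 100 x 100 units

centre, radius = np.array([40.0, 60.0]), 20.0         # circular "event" region
affected = nodes[np.linalg.norm(nodes - centre, axis=1) <= radius]

hull = ConvexHull(affected)
boundary_nodes = affected[hull.vertices]              # only these nodes need to report
print("reporting nodes:", len(boundary_nodes), "of", len(affected), "affected")
print("estimated area:", hull.volume, "true area:", np.pi * radius**2)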
A New Two-Dimensional Mesh Topology with Optical Interlinks

The performance of a parallel computer heavily depends on the topology of its interconnection network. The two-dimensional mesh is a well-known topology for processor arrays; however, its large diameter increases execution time when the parallel algorithm requires communication between arbitrary pairs of nodes. Wraparound connections between end nodes reduce the diameter but increase the complexity of designing parallel algorithms. In this paper, we propose an intermediate approach in which additional links are used to reduce the diameter without increasing the design complexity. These additional optical links provide high-speed communication between nodes that are separated by half the number of nodes in each dimension. We also present efficient parallel algorithms for some elementary problems on the proposed system.

Amritanjali
Enhanced TCP NCE: A Modified Non-Congestion Events Detection, Differentiation and Reaction to Improve the End-to-End Performance Over MANET

Characteristics of Mobile Ad hoc Networks (MANETs) such as errors and reordering degrade the performance of TCP-based applications. Among the many proposals to reduce the impact of non-congestion events, TCP-NCE was designed as a unified solution to discriminate between non-congestion and congestion events and to respond to them. Our initial analysis of TCP-NCE and other schemes (TCP-DCR and SACK-TCP) showed that the existing schemes, including TCP-NCE, fail to improve end-to-end performance in the presence of congestion, error, and reordering due to mobility and multipath routing. To overcome this problem, we designed the Enhanced TCP NCE protocol to reduce false differentiation of non-congestion events and to optimize the response procedure to those events. Our simulation results show that the enhancement increases performance by 15–20% over TCP-NCE, and our protocol consistently yields higher performance throughout the simulation.

J. Govindarajan, N. Vibhurani, G. Kousalya
Intelligent Building Control Solution Using Wireless Sensor—Actuator Networking Framework

An intelligent building is a residence automated through a network of electronic devices that cooperate transparently to provide protection and comfort to the residents and drastically minimize energy consumption. In this paper, a novel approach is taken to design an intelligent building control solution using wireless sensor actuator networks (WSAN) in the hardware domain. The solution automates most household operations such as intrusion alarms, environment monitoring and control, and asset tracking. Real-time sensing is established with the help of sensor modules, whereas real-time tracking is achieved by deploying an active radio frequency identification (RFID) module. The WSAN mainly consists of a gateway, routers, and end devices; its main controlling unit is an AVR microcontroller, which manages the transceiver section and processes the received data accordingly. This paper focuses on a remote monitoring and management system for a partly automated building equipped with sensors and actuators. We also explore the prospects of wireless mesh networks and design a framework for developing an intelligent and smart environment.

Anindita Mondal, Sagar Bose, Iti Saha Misra
Security Framework for Opportunistic Networks

Opportunistic networks have evolved as a special class of mobile ad hoc and delay-tolerant networks with a vast range of applications. In opportunistic networks, permanent links among the nodes are absent and delay is high. Due to their self-organized nature, these networks face many security threats, e.g., how to protect data confidentiality, integrity and privacy as well as the trust among the nodes. In this article, we present the specific security challenges of opportunistic networks and analyze the related security requirements. Based on these discussions, we propose a general security framework for opportunistic networks and point out future research directions.

Prashant Kumar, Naveen Chauhan, Narottam Chand
Replica-Based Efficient Data Accessibility Technique for Vehicular Ad Hoc Networks

Vehicular ad hoc networks are an emerging field in which data is carried and forwarded until it is delivered to the destination. The proposed mechanism provides a scheme for replica augmentation when a new replica is required, and for replica abandonment when a replica has remained unused for a minimum fixed time period. The proposed scheme also arranges data into the most relevant replica depending on the size of the data stored, the buffer space left at each replica, the deadline, and the popularity of the data, and it uses probability functions to fairly select the most suitable replica for increasing accessibility.

Brij Bihari Dubey, Rajeev Kumar, Naveen Chauhan, Narottam Chand
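
A minimal sketch of the kind of replica scoring described above, assuming a weighted combination of free buffer space, deadline slack, and data popularity with probabilistic (fair) selection; the weights and score form are illustrative, not the authors' exact probability functions.

```python
# Hypothetical replica-selection sketch: score each candidate replica by the
# factors named in the abstract (free buffer space, deadline slack, data
# popularity) and pick probabilistically so selection stays fair.
import random

def select_replica(replicas, data_size, now, weights=(0.4, 0.3, 0.3)):
    """Return one replica chosen with probability proportional to its score."""
    w_buf, w_dead, w_pop = weights
    max_slack = max(r["deadline"] - now for r in replicas) or 1.0
    max_pop = max(r["popularity"] for r in replicas) or 1.0
    scores = []
    for r in replicas:
        fits = 1.0 if r["buffer_free"] >= data_size else 0.0
        buf = r["buffer_free"] / r["buffer_total"]
        slack = max(0.0, r["deadline"] - now) / max_slack
        pop = r["popularity"] / max_pop
        scores.append(fits * (w_buf * buf + w_dead * slack + w_pop * pop))
    if sum(scores) == 0:
        return None                      # no replica can hold the data
    return random.choices(replicas, weights=scores, k=1)[0]

if __name__ == "__main__":
    replicas = [
        {"id": "R1", "buffer_free": 40, "buffer_total": 100, "deadline": 120, "popularity": 7},
        {"id": "R2", "buffer_free": 80, "buffer_total": 100, "deadline": 60,  "popularity": 3},
        {"id": "R3", "buffer_free": 10, "buffer_total": 100, "deadline": 200, "popularity": 9},
    ]
    chosen = select_replica(replicas, data_size=20, now=0)
    print("chosen replica:", chosen["id"] if chosen else None)
```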
Reinforcement Based Optimal Routing Algorithm for Multiple Sink Based Wireless Sensor Networks

Routing in wireless sensor networks has been a recent area of research because of its increased use in diverse application environments. Sensor nodes are energy constrained, which poses a formidable challenge in designing efficient routing algorithms for them. Most scenarios where sensor networks are used, such as battlefield surveillance and health monitoring, are delay sensitive in nature. To mitigate these problems, we propose two routing algorithms in this paper: one for a multiple static sink scenario and the other for a multiple mobile sink scenario. Both protocols use reinforcement learning in order to solve the routing problem in an intelligent and efficient way.

Suraj Sharma, Azad Kumar Patel, Ratijit Mitra, Reeti Jauhari
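
As a hedged sketch of the reinforcement learning methodology mentioned above, the code below applies a standard Q-learning update to next-hop selection toward multiple static sinks; the reward shape (a bonus for reaching a sink, a penalty proportional to per-hop energy cost) and all parameters are assumptions, not the authors' protocol.

```python
# Minimal Q-learning sketch of reinforcement-based next-hop selection for a
# multi-sink WSN. Reward and parameters are illustrative assumptions.
import random

def q_route(neighbors, sinks, energy_cost, episodes=500, alpha=0.5, gamma=0.9, eps=0.2):
    """Learn Q[node][next_hop] values; return them after training."""
    Q = {n: {m: 0.0 for m in neighbors[n]} for n in neighbors}
    nodes = [n for n in neighbors if n not in sinks]
    for _ in range(episodes):
        node = random.choice(nodes)
        for _ in range(20):                              # bounded episode length
            if node in sinks or not neighbors[node]:
                break
            if random.random() < eps:                    # explore
                nxt = random.choice(list(neighbors[node]))
            else:                                        # exploit
                nxt = max(Q[node], key=Q[node].get)
            reward = 10.0 if nxt in sinks else -energy_cost[(node, nxt)]
            future = max(Q[nxt].values()) if Q.get(nxt) else 0.0
            Q[node][nxt] += alpha * (reward + gamma * future - Q[node][nxt])
            node = nxt
    return Q

if __name__ == "__main__":
    neighbors = {"A": {"B", "C"}, "B": {"S1"}, "C": {"B", "S2"}, "S1": set(), "S2": set()}
    cost = {("A", "B"): 2.0, ("A", "C"): 1.0, ("B", "S1"): 1.0, ("C", "B"): 1.5, ("C", "S2"): 1.0}
    Q = q_route(neighbors, sinks={"S1", "S2"}, energy_cost=cost)
    for n in ("A", "C"):
        print(n, "-> preferred next hop:", max(Q[n], key=Q[n].get))
```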
Study and Impact of Relay Selection Schemes on Performance of an IEEE 802.16j Mobile Multihop Relay (MMR) WiMAX Network

This paper investigates the effect of two efficient relay selection algorithms on the performance of an IEEE 802.16j MMR WiMAX network. The relay selection algorithms considered here are max-min relay selection and harmonic mean of SNR relay selection. The WiMAX transmitter employs the OFDMA transmission technique, and a receiver with the maximum likelihood (ML) detection method is adopted for reliable detection of the transmitted bits. The relays used here are the amplify-and-forward (AF) relay and the decode-and-forward (DF) relay. Both selection combining (SC) and maximal ratio combining (MRC) techniques are applied, and the performance metrics are symbol error rate (SER) and channel capacity. The analysis of the network is performed using MATLAB. It is observed that the DF relay along with the MRC combining technique outperforms the other relay-diversity combining techniques in terms of providing improved SER and channel capacity. In terms of providing lower SER, the harmonic mean of SNR relay selection algorithm is superior to the max-min relay selection algorithm. Further, the max-min relay selection algorithm provides better channel capacity than the harmonic mean of SNR relay selection algorithm for the considered MMR WiMAX network.

Chaudhuri Manoj Kumar Swain, Susmita Das
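
The two selection rules compared in the study can be summarized in a few lines: max-min selection picks the relay whose weaker hop is strongest, while harmonic-mean selection balances the two hop SNRs. The per-hop SNR values below are hypothetical; the paper evaluates the rules via MATLAB simulation of the full MMR WiMAX link.

```python
# Sketch of the two relay-selection rules: max-min of the two hop SNRs
# versus their harmonic mean. SNR values are illustrative only.

def max_min_select(relays):
    """relays: dict relay_id -> (snr_source_relay, snr_relay_dest), linear scale."""
    return max(relays, key=lambda r: min(relays[r]))

def harmonic_mean_select(relays):
    return max(relays, key=lambda r: 2 * relays[r][0] * relays[r][1]
                                     / (relays[r][0] + relays[r][1]))

if __name__ == "__main__":
    relays = {"R1": (9.0, 30.0), "R2": (11.0, 11.0), "R3": (20.0, 6.0)}
    print("max-min choice:      ", max_min_select(relays))       # R2: strongest weak hop
    print("harmonic-mean choice:", harmonic_mean_select(relays)) # R1: best balanced SNR
```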
Predicting Link Failure in Vehicular Communication System Using Link Existence Diagram (LED)

Link failure is a major problem in vehicular communication systems because of the high mobility of vehicles in the network. By predicting link failure at an early stage, the performance of the system can be improved. In this paper, we propose a data forwarding technique that predicts link failure in a route by generating a Link Existence Diagram (LED). The LED shows whether a link exists between the vehicles or not. Second, we calculate the Link Expiration Time (LET) between two vehicles in a route. Third, we generate a Time-Link diagram to synchronize the LET with the data transmission time. Finally, by updating the LET and checking its existence, we generate the LED. Simulations are performed to evaluate the performance of the proposed protocol.

Sourav Kumar Bhoi, Munesh Singh, Pabitra Mohan Khilar
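
The sketch below computes a Link Expiration Time using the commonly cited mobility-based expression for two nodes with known positions, speeds, and headings, and then derives a Link Existence Diagram entry from it; the authors' exact LET formulation and LED construction may differ.

```python
# Commonly used mobility-based LET expression for two vehicles with known
# position, speed and heading; the paper's exact formulation may differ.
import math

def link_expiration_time(p1, v1, theta1, p2, v2, theta2, r):
    """LET in seconds for transmission range r (same distance/speed units)."""
    a = v1 * math.cos(theta1) - v2 * math.cos(theta2)
    b = p1[0] - p2[0]
    c = v1 * math.sin(theta1) - v2 * math.sin(theta2)
    d = p1[1] - p2[1]
    if a == 0 and c == 0:                       # identical velocities: link persists
        return math.inf if b * b + d * d <= r * r else 0.0
    disc = (a * a + c * c) * r * r - (a * d - b * c) ** 2
    if disc < 0:                                # vehicles never come within range
        return 0.0
    return (-(a * b + c * d) + math.sqrt(disc)) / (a * a + c * c)

def link_exists(let, now, tx_time):
    """LED entry: True if the link outlives the scheduled transmission."""
    return let > (now + tx_time)

if __name__ == "__main__":
    # Two vehicles 100 m apart driving toward each other at 20 and 15 m/s, range 250 m.
    let = link_expiration_time((0, 0), 20, 0.0, (100, 0), 15, math.pi, r=250)
    print(f"LET = {let:.1f} s, usable for a 5 s transfer: {link_exists(let, 0, 5)}")
```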

Communication Systems, Antenna Research, and Cognitive Radio

Frontmatter
An M-Shaped Microstrip Antenna Array for WLAN, WiMAX and Radar Applications

A compact multiband antenna is designed in HFSS on an FR4_epoxy substrate. The substrate dimensions are chosen to be 50 × 75 × 1.6 mm³, with a permittivity of 4.4. An M-shaped antenna element with symmetrical slots is used to form the array proposed in this paper. Inset feeding is used to keep the design simple and compact. Multiple bands are obtained in the S11 versus frequency plots. The effect of forming an array of the antenna elements has been evaluated in order to obtain improved performance of the presented antenna in the WLAN, WiMAX, and radar ranges. Lower bands were obtained with operating frequencies at 2.4, 3.6, and 5 GHz, suitable for WLAN and WiMAX applications. For operation in the K-band radar range, interesting results were obtained in the 11.3–14.4 GHz band.

Aastha Gupta, Vipin Choudhary, Malay Ranjan Tripathy
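
For orientation, the standard transmission-line design equations give starting dimensions for a plain rectangular patch on the stated substrate (εr = 4.4, h = 1.6 mm) at the quoted operating frequencies; the M-shaped slotted geometry and the array itself are designed and refined in HFSS, so these textbook values are only an initial sizing, not the paper's final dimensions.

```python
# Textbook transmission-line sizing of a plain rectangular microstrip patch
# on FR4 (eps_r = 4.4, h = 1.6 mm). Starting-point values only.
import math

C = 3e8  # speed of light, m/s

def patch_dimensions(f0, eps_r, h):
    """Return (width, length) in metres for a rectangular microstrip patch."""
    W = C / (2 * f0) * math.sqrt(2 / (eps_r + 1))
    eps_eff = (eps_r + 1) / 2 + (eps_r - 1) / 2 * (1 + 12 * h / W) ** -0.5
    dL = 0.412 * h * ((eps_eff + 0.3) * (W / h + 0.264)
                      / ((eps_eff - 0.258) * (W / h + 0.8)))
    L = C / (2 * f0 * math.sqrt(eps_eff)) - 2 * dL
    return W, L

if __name__ == "__main__":
    for f0 in (2.4e9, 3.6e9, 5.0e9):
        W, L = patch_dimensions(f0, eps_r=4.4, h=1.6e-3)
        print(f"{f0/1e9:.1f} GHz: W = {W*1e3:.1f} mm, L = {L*1e3:.1f} mm")
```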
Gain Enhancement of Microstrip Patch Antenna Using H-Shaped Defected Ground Structure

This paper presents the gain enhancement of an inset-fed microstrip slot antenna using an H-shaped Defected Ground Structure (DGS). The DGS reduces higher-order harmonics, which increases the gain of the antenna. A critical comparative analysis is made between the antenna with and without the DGS. Simulation results reveal a remarkable increase in gain. The proposed design is applicable to the commercial frequency band of 900 MHz–2.4 GHz.

Asmita Rajawat, P. K. Singhal, Sindhu Hak Gupta, Chavi Jain, Praneet Tomar, Kartik Kapur
Robust Acoustic Echo Suppression in Modulation Domain

The presence of acoustic echo deteriorates the quality of speech transmission in mobile communication systems. In a conventional acoustic echo suppression (AES) set-up, the echo path effect is modelled either in the time domain or in the frequency domain, and to cancel the echo, a replica of the echo is created by estimating the echo path response adaptively. Recently, modulation domain analysis, which captures human perceptual properties, has been widely used in speech processing. The modulation domain conveys the temporal variation of the acoustic magnitude spectra. In this work, a novel method for modelling the echo path and estimating the echo in the modulation domain is developed and implemented. Echo cancellation is performed effectively using modulation spectral manipulation. To the best of our knowledge, no work on echo suppression in the modulation domain has been reported so far. The output quality of the proposed system is found to be better than that of conventional AES systems.

P. V. Muhammed Shifas, E. P. Jayakumar, P. S. Sathidevi
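
To make the term concrete, the sketch below computes a modulation-domain representation: an acoustic STFT followed by a second transform along each frequency bin's magnitude trajectory. It only illustrates the analysis step on a synthetic amplitude-modulated tone; the authors' echo-path model and suppression rule are not reproduced here.

```python
# Modulation-domain analysis sketch: STFT magnitudes, then a transform along
# time for each acoustic bin. Not the authors' AES system.
import numpy as np
from scipy.signal import stft

def modulation_spectrum(x, fs, frame_len=512, hop=128):
    """Return (mod_freqs, M): |modulation spectrum|, acoustic bins x mod bins."""
    _, _, X = stft(x, fs=fs, nperseg=frame_len, noverlap=frame_len - hop)
    mag = np.abs(X)                                    # acoustic magnitude spectra
    mag = mag - mag.mean(axis=1, keepdims=True)        # remove per-bin DC offset
    n_frames = mag.shape[1]
    M = np.abs(np.fft.rfft(mag, axis=1))               # transform along time
    mod_freqs = np.fft.rfftfreq(n_frames, d=hop / fs)  # modulation-frequency axis
    return mod_freqs, M

if __name__ == "__main__":
    fs = 8000
    t = np.arange(0, 2.0, 1 / fs)
    # 1 kHz tone amplitude-modulated at 4 Hz: the 4 Hz line should dominate.
    x = (1 + 0.5 * np.sin(2 * np.pi * 4 * t)) * np.sin(2 * np.pi * 1000 * t)
    mod_freqs, M = modulation_spectrum(x, fs)
    peak = mod_freqs[np.argmax(M.sum(axis=0))]
    print(f"dominant modulation frequency ~ {peak:.1f} Hz")
```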
FPGA-Based Equalizer Design Using a Novel Adaptive Reward-Punishment VSSLMS Algorithm for Rayleigh Fading Channel

In this paper, a novel Reward-Punishment-based Variable Step Size Least Mean Square (RP-VSSLMS) algorithm is proposed, and a novel methodology is used to construct a Rayleigh fading channel adaptive equalizer employing the proposed algorithm in the hardware domain. The Rayleigh fading channel is chosen here because it reflects the properties of a real-time wireless communication environment. A Spartan 6 FPGA board is configured to model the digital circuitry of the proposed RP-VSSLMS algorithm using a novel "Hardware Co-simulation" technique. The hardware co-simulation analysis showed that the proposed RP-VSSLMS algorithm has faster convergence speed, smaller steady-state misadjustment, and lower computational complexity than the existing LMS and VSSLMS algorithms. The performance of the proposed algorithm is evaluated by calculating the Bit Error Rate (BER) of different modulated signals under the Rayleigh fading channel.

Sudipta Bose, Iti Saha Misra
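
The abstract does not spell out the reward-punishment rule, so the following software model is an assumption for illustration: the LMS step size is multiplicatively increased when the training error magnitude falls (reward) and decreased when it grows (punishment), within fixed bounds. The FPGA realisation and the paper's exact RP-VSSLMS update are not reproduced.

```python
# Illustrative variable step-size LMS equalizer with a simple reward-punishment
# step adaptation. Adaptation constants are assumptions, not the paper's rule.
import numpy as np

def rp_vsslms(rx, train, taps=7, mu0=0.01, mu_min=1e-3, mu_max=0.05,
              reward=1.05, punish=0.8):
    """Train an adaptive equalizer; return weights and the |error| history."""
    w = np.zeros(taps, dtype=complex)
    mu, prev_err, err_hist = mu0, None, []
    for n in range(taps - 1, len(train)):
        x = rx[n - taps + 1:n + 1][::-1]              # tap-delay-line input
        e = train[n] - np.vdot(w, x)                  # training error
        w = w + mu * np.conj(e) * x                   # complex LMS update
        if prev_err is not None:                      # reward/punish the step size
            mu = mu * (reward if abs(e) < prev_err else punish)
            mu = min(max(mu, mu_min), mu_max)
        prev_err = abs(e)
        err_hist.append(abs(e))
    return w, np.array(err_hist)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    sym = rng.choice([-1.0, 1.0], 2000) + 1j * rng.choice([-1.0, 1.0], 2000)  # QPSK
    h = np.array([0.9, 0.4 + 0.2j, 0.1])              # static multipath stand-in
    rx = np.convolve(sym, h)[:len(sym)]
    rx += 0.05 * (rng.standard_normal(len(sym)) + 1j * rng.standard_normal(len(sym)))
    _, err = rp_vsslms(rx, sym)
    print(f"mean |error| over first 200 updates: {err[:200].mean():.3f}")
    print(f"mean |error| over last  200 updates: {err[-200:].mean():.3f}")
```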
Phase Reversal and Suppressed Carrier Characteristics of Neo-Cortical Electroencephalography Signals

The transmission networks of the human nervous system carry both sensory and motor neural signals simultaneously. For instance, the median nerve carries sensory information from the middle finger, index finger, and thumb to the primary motor cortex. In communication engineering, double-sideband suppressed carrier (DSB-SC) modulation is a prominent power-efficient analog modulation technique. This paper sheds some light on investigating the modulation technique followed by the neural networks of the human nervous system. EEG signals from the motor cortex are applied as inputs to a narrow band-pass filter with a bandwidth of 0.4 Hz. The characteristics of the output signals from the filter are similar to those of amplitude-modulated signals. In the time domain, a 180° phase reversal is observed in all outputs. Suppression of power at a particular frequency, and boosting of the power at frequencies equidistant from the suppressed frequency, are the key features of the filter output signal. The results of this study are prima facie evidence suggesting that the human nervous system might be following double-sideband suppressed carrier modulation, the most power-efficient of the available analog modulation techniques, to avoid spectral overlapping.

Manikumar Tellamekala, Shaik Mohammad Rafi
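
A synthetic illustration of the DSB-SC signature the paper looks for in narrow-band-filtered EEG is given below: a suppressed carrier line, power at two sidebands equidistant from it, and 180° phase reversals at message zero crossings. The frequencies and sample rate are arbitrary example values, not EEG parameters.

```python
# Synthetic DSB-SC signal: suppressed carrier, equidistant sidebands, and
# carrier phase reversals at message zero crossings. Example values only.
import numpy as np

fs = 250.0                                      # sample rate, Hz
t = np.arange(0, 20, 1 / fs)
fc, fm = 10.0, 0.2                              # example "carrier" and message frequencies
dsb_sc = np.sin(2 * np.pi * fm * t) * np.sin(2 * np.pi * fc * t)

spec = np.abs(np.fft.rfft(dsb_sc)) / len(t)
freqs = np.fft.rfftfreq(len(t), 1 / fs)

def magnitude_at(f):
    return spec[np.argmin(np.abs(freqs - f))]

print(f"spectral magnitude at carrier {fc} Hz:       {magnitude_at(fc):.4f} (suppressed)")
print(f"spectral magnitude at sideband {fc - fm} Hz: {magnitude_at(fc - fm):.4f}")
print(f"spectral magnitude at sideband {fc + fm} Hz: {magnitude_at(fc + fm):.4f}")

# Phase reversal: the carrier flips sign each time the message crosses zero.
zero_crossings = np.where(np.diff(np.sign(np.sin(2 * np.pi * fm * t))))[0]
print(f"message zero crossings (phase reversals) in 20 s: {len(zero_crossings)}")
```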
Short-Range Frequency-Modulated Continuous Wave (FMCW) Radar Using Universal Software-Defined Radio Peripheral (USRP)

In this paper, we design a prototype FMCW (frequency-modulated continuous wave) radar for short distances using the universal software-defined radio peripheral (USRP). The USRP is an ideal platform for all types of telecommunication research, as long as the bandwidth requirement can be met by the universal serial bus (USB). The limited bandwidth of the USB bus restricts the realization of a short-distance radar on this platform using GNU Radio signal processing blocks. To avoid this bandwidth limitation, we build the transmit section independently inside the FPGA, which frees the bandwidth to be fully utilized by the receive section. The prototype performance test is carried out at a center frequency of 2.4 GHz with a bandwidth of 15 MHz. To avoid environmental clutter, a directional antenna is used for better target detection at certain distances. The returned target echo is further processed using MATLAB.

Munesh Singh, Sourav Kumar Bhoi, Pabitra Mohan Khilar
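
As a worked example of the range relation such a prototype relies on, the beat frequency of a target at range R under a linear sweep satisfies f_b = 2RB/(cT_sweep), so R = c·f_b·T_sweep/(2B) and the range resolution is c/(2B). The 15 MHz bandwidth is taken from the abstract; the 1 ms sweep time and the beat frequencies are assumed values.

```python
# FMCW range relation: R = c * f_beat * T_sweep / (2 * B); resolution c / (2 * B).
C = 3e8          # speed of light, m/s
B = 15e6         # sweep bandwidth, Hz (from the abstract)
T_SWEEP = 1e-3   # sweep duration, s (assumed)

def beat_to_range(f_beat):
    return C * f_beat * T_SWEEP / (2 * B)

def range_resolution():
    return C / (2 * B)

if __name__ == "__main__":
    print(f"range resolution with 15 MHz bandwidth: {range_resolution():.1f} m")
    for fb in (1e3, 3e3, 5e3):
        print(f"beat {fb/1e3:.0f} kHz -> target range {beat_to_range(fb):.1f} m")
```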
Backmatter
Metadata
Title
Progress in Intelligent Computing Techniques: Theory, Practice, and Applications
Editors
Dr. Pankaj Kumar Sa
Dr. Manmath Narayan Sahoo
Dr. M. Murugappan
Dr. Yulei Wu
Dr. Banshidhar Majhi
Copyright Year
2018
Publisher
Springer Singapore
Electronic ISBN
978-981-10-3376-6
Print ISBN
978-981-10-3375-9
DOI
https://doi.org/10.1007/978-981-10-3376-6
