
About this Book

This book presents studies involving algorithms in machine learning paradigms. It discusses a variety of learning problems with diverse applications, including prediction, concept learning, explanation-based learning, case-based (exemplar-based) learning, statistical rule-based learning, feature extraction-based learning, optimization-based learning, quantum-inspired learning, multi-criteria-based learning and hybrid intelligence-based learning.

Table of Contents

Frontmatter

Hesitant-Intuitionistic Trapezoidal Fuzzy Prioritized Operators Based on Einstein Operations with Their Application to Multi-criteria Group Decision-Making

Abstract
In this article, a ranking method for hesitant-intuitionistic trapezoidal fuzzy (H–ITF) numbers (H–ITFNs) is proposed. After introducing H–ITFNs, the concepts of the score function and accuracy function of H–ITFNs are defined, and H–ITF prioritized weighted averaging and geometric operators based on Einstein operations are developed. Some desirable properties of the proposed operators are investigated in detail. A method for ordering the alternatives in multi-criteria group decision-making problems with H–ITF information, based on different priority levels of decision-makers and criteria, is presented. An illustrative example concerning academic resource deployment in educational institutions, studied previously, is considered and solved. The comparison of the results with earlier methods reflects the superiority of the proposed methodology.
Arun Sarkar, Animesh Biswas
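The chapter's H–ITF prioritized operators are considerably richer than what fits here; as a minimal illustration of the underlying Einstein operations, the sketch below applies the Einstein sum, Einstein product, and a common score function to ordinary intuitionistic fuzzy pairs (μ, ν) in [0, 1]. This is only a simplified stand-in for the H–ITFN aggregation developed in the chapter.

```python
# Minimal illustration of Einstein operations on ordinary intuitionistic
# fuzzy pairs (mu, nu); the chapter's H-ITF prioritized operators extend
# these to hesitant-intuitionistic trapezoidal fuzzy numbers.

def einstein_sum(p, q):
    """Einstein sum of two intuitionistic fuzzy pairs (mu, nu)."""
    (m1, n1), (m2, n2) = p, q
    mu = (m1 + m2) / (1 + m1 * m2)              # Einstein t-conorm on memberships
    nu = (n1 * n2) / (1 + (1 - n1) * (1 - n2))  # Einstein t-norm on non-memberships
    return mu, nu

def einstein_product(p, q):
    """Einstein product of two intuitionistic fuzzy pairs (mu, nu)."""
    (m1, n1), (m2, n2) = p, q
    mu = (m1 * m2) / (1 + (1 - m1) * (1 - m2))
    nu = (n1 + n2) / (1 + n1 * n2)
    return mu, nu

def score(pair):
    """Common score function mu - nu used to rank intuitionistic fuzzy values."""
    mu, nu = pair
    return mu - nu

a, b = (0.6, 0.3), (0.7, 0.2)
print(einstein_sum(a, b), einstein_product(a, b), score(a))
```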

Unsupervised Feature Selection Using Information-Theoretic Graph-Based Approach

Abstract
Feature selection is a critical part of any machine learning project involving data sets with high dimensionality. Selecting an optimal subset consisting of important features reduces execution time and increases the predictive ability of the machine learning model. This paper presents a novel graph-based feature selection algorithm for unsupervised learning. Unlike many algorithms that use correlation as a measure of dependency between features, the proposed algorithm derives feature dependency using an information-theoretic approach. The proposed algorithm, Graph-based Information-Theoretic Approach for Unsupervised Feature Selection (GITAUFS), generates multiple minimal vertex covers (MVC) of the feature graph and evaluates them to find the one best suited to the learning task. In our experimental setup comprising 13 benchmark data sets, GITAUFS has shown a 10% increase in silhouette width along with a significant feature reduction of 90.62% compared to the next best performing algorithm.
Sagarika Saroj Kundu, Abhirup Das, Amit Kumar Das
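As an illustrative sketch (not the exact GITAUFS procedure), the snippet below builds a feature dependency graph from pairwise mutual information on discretized features and extracts a greedy vertex cover as a candidate feature subset; the binning, MI threshold, and greedy 2-approximation are assumptions made for brevity.

```python
# Illustrative sketch: feature graph from pairwise mutual information,
# then a greedy vertex cover as a candidate feature subset.
import numpy as np
from sklearn.metrics import mutual_info_score

def feature_graph(X, n_bins=10, threshold=0.2):
    """Edges connect feature pairs whose mutual information exceeds threshold."""
    n_features = X.shape[1]
    binned = [np.digitize(X[:, j], np.histogram_bin_edges(X[:, j], n_bins))
              for j in range(n_features)]
    edges = set()
    for i in range(n_features):
        for j in range(i + 1, n_features):
            if mutual_info_score(binned[i], binned[j]) > threshold:
                edges.add((i, j))
    return edges

def greedy_vertex_cover(edges):
    """Classic 2-approximation: repeatedly take both endpoints of an uncovered edge."""
    cover, remaining = set(), set(edges)
    while remaining:
        u, v = remaining.pop()
        cover.update((u, v))
        remaining = {e for e in remaining if u not in e and v not in e}
    return cover

X = np.random.rand(200, 8)  # stand-in data matrix
selected = sorted(greedy_vertex_cover(feature_graph(X)))
print("candidate feature subset:", selected)
```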

Fact-Based Expert System for Supplier Selection with ERP Data

Abstract
For any business enterprise, supply chain management (SCM) plays an important role in the organization's decision- and profit-making process. A very crucial step in SCM is supplier selection. It is such a pivotal step because it deploys a large amount of a firm's financial resources. In return, firms expect significant returns from contracting with suppliers offering higher value. Any discrepancy in this process can lead to low SCM performance, which in turn may cause financial losses as well as a decline in the firm's market performance. This paper deals with the development of a strictly fact-based expert system for appropriate supplier selection and shows how rules can be broken down into atomic clauses.
Kartick Chandra Mondal, Biswadeep Deb Nandy, Arunima Baidya
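A hypothetical illustration of breaking a composite selection rule into atomic clauses evaluated over ERP facts is sketched below; the field names, thresholds, and rule are invented for the example and are not taken from the chapter.

```python
# Hypothetical decomposition of a composite supplier-selection rule into
# atomic clauses evaluated over ERP facts; field names are assumptions.
supplier = {"on_time_rate": 0.96, "defect_rate": 0.01,
            "unit_price": 42.0, "capacity": 12000}

# Composite rule: "select if delivery is reliable AND quality is high
# AND price is competitive AND capacity is sufficient"
atomic_clauses = [
    ("reliable_delivery",   lambda s: s["on_time_rate"] >= 0.95),
    ("high_quality",        lambda s: s["defect_rate"] <= 0.02),
    ("competitive_price",   lambda s: s["unit_price"] <= 45.0),
    ("sufficient_capacity", lambda s: s["capacity"] >= 10000),
]

fired = {name: clause(supplier) for name, clause in atomic_clauses}
select = all(fired.values())  # conjunction of the atomic clauses
print(fired, "-> select supplier" if select else "-> reject supplier")
```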

Handling Seasonal Pattern and Prediction Using Fuzzy Time Series Model

Abstract
Seasonal variation is one of the important components of a time series. Many techniques are available in the literature to deal with the problem of seasonality. A few hybrid fuzzy time series models have investigated the problem of forecasting in the presence of seasonal variation, but these techniques follow complex computational procedures. The aim of the present study is to develop a new fuzzy time series forecasting model that can process seasonal patterns present in the data directly, without any seasonal adjustment through additional mathematical techniques. The proposed neuro-fuzzy model is capable of extracting the seasonal pattern from the training set and forecasting the future pattern. The model makes use of the self-organizing map (SOM) for clustering similar patterns. Performance of the model is evaluated using rainfall data and milk production data.
Mahua Bose, Kalyani Mali
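A minimal sketch of the SOM-based pattern clustering step is given below, using a tiny one-dimensional self-organizing map written from scratch in NumPy; the season length, map size, and training schedule are assumptions, and the chapter's full neuro-fuzzy forecasting model is not reproduced.

```python
# Minimal sketch: cluster seasonal sub-sequences with a tiny 1-D
# self-organizing map; not the chapter's full neuro-fuzzy model.
import numpy as np

def train_som(patterns, n_units=4, epochs=200, lr0=0.5, sigma0=1.5):
    rng = np.random.default_rng(0)
    weights = patterns[rng.choice(len(patterns), n_units)]  # init from data
    for t in range(epochs):
        lr = lr0 * (1 - t / epochs)
        sigma = sigma0 * (1 - t / epochs) + 1e-3
        for x in patterns:
            bmu = np.argmin(np.linalg.norm(weights - x, axis=1))  # best matching unit
            dist = np.abs(np.arange(n_units) - bmu)
            h = np.exp(-(dist ** 2) / (2 * sigma ** 2))           # neighborhood function
            weights += lr * h[:, None] * (x - weights)
    return weights

series = np.sin(np.linspace(0, 8 * np.pi, 96)) + 0.1 * np.random.randn(96)
patterns = series.reshape(-1, 12)  # 12-step seasonal windows (assumed season length)
som = train_som(patterns)
labels = [int(np.argmin(np.linalg.norm(som - p, axis=1))) for p in patterns]
print("cluster of each seasonal window:", labels)
```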

Automatic Classification of Fruits and Vegetables: A Texture-Based Approach

Abstract
Fruits and vegetables are important food products in daily human life. Classification of fruits and vegetables is needed in every aspect of the agricultural industry. It is quite challenging to automatically classify fruits and vegetables from digital images, and the task becomes more difficult when the image is captured from a different viewing angle. This paper proposes a complete texture-based approach for addressing the effect of viewing-angle change to classify fruits and vegetables automatically. At first, a grayscale image is generated from the input color image. The grayscale version of the input image is used to extract multiple threshold values using the multilevel Otsu thresholding technique. Those threshold values are used to generate a set of binary images. The binary images pass through a border extraction process to generate the border image of every binary image. Finally, each border image is processed to calculate the fractal dimension. In a parallel flow, the same grayscale image is processed to compute gray-level co-occurrence matrix (GLCM) based features. The fractal dimension and GLCM-based features are combined into a feature vector for classifying the fruit and vegetable classes. Images are collected by covering the entire range of 0°–360° viewing angles for each class in our dataset. In total, 1656 images of 23 classes of fruits and vegetables are used for experimentation. The maximum accuracy of the system is 98.33% with the Naive Bayes classifier.
Susovan Jana, Ranjan Parekh, Bijan Sarkar
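The sketch below illustrates a feature-extraction stage of the kind described, combining multilevel Otsu thresholds, box-counting fractal dimensions of simple border images, and GLCM texture properties; it assumes scikit-image, and the parameter choices (number of levels, box sizes, GLCM distances) are illustrative, not the chapter's.

```python
# Sketch of a multilevel-Otsu + fractal-dimension + GLCM feature extractor;
# parameter choices are assumptions for illustration.
import numpy as np
from skimage.color import rgb2gray
from skimage.filters import threshold_multiotsu
from skimage.feature import graycomatrix, graycoprops

def box_counting_dimension(border, sizes=(2, 4, 8, 16, 32)):
    """Fractal dimension of a binary border image via box counting."""
    counts = []
    for s in sizes:
        h, w = border.shape[0] // s * s, border.shape[1] // s * s
        blocks = border[:h, :w].reshape(h // s, s, w // s, s)
        counts.append(max(np.count_nonzero(blocks.any(axis=(1, 3))), 1))
    slope, _ = np.polyfit(np.log(1.0 / np.array(sizes)), np.log(counts), 1)
    return slope

def extract_features(rgb_image):
    gray_u8 = (rgb2gray(rgb_image) * 255).astype(np.uint8)
    thresholds = threshold_multiotsu(gray_u8, classes=4)  # multilevel Otsu
    fractal = []
    for t in thresholds:
        binary = gray_u8 > t
        # crude border: pixels where the binary mask changes along rows or columns
        border = (binary ^ np.roll(binary, 1, axis=0)) | (binary ^ np.roll(binary, 1, axis=1))
        fractal.append(box_counting_dimension(border))
    glcm = graycomatrix(gray_u8, distances=[1], angles=[0, np.pi / 2],
                        levels=256, symmetric=True, normed=True)
    texture = [graycoprops(glcm, p).mean()
               for p in ("contrast", "homogeneity", "energy", "correlation")]
    return np.array(fractal + texture)

# usage: features = extract_features(skimage.io.imread("sample_fruit.jpg"))
```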

Deep Learning-Based Early Sign Detection Model for Proliferative Diabetic Retinopathy in Neovascularization at the Disc

Abstract
Patients who have had diabetes for many years are prone to Diabetic Retinopathy (DR), one of the leading causes of blindness. Proliferative Diabetic Retinopathy (PDR) is the most advanced of the four major progressive stages of DR, carrying a high risk of visual impairment. This work presents a deep learning-based automated method for detecting early signs of Proliferative Diabetic Retinopathy at the optic disc area in the human retina. Here, we propose the design and implementation of a deep neural network model as a replacement for semi-automated and automated retinal vascular feature extraction methods. Finding the optic disc (OD) center, followed by artery and vein classification from segmented images, is essential for focusing on Neovascularization at the Disc (NVD). A count of the major vessels and a measurement of their widths around the OD center are the two indicative parameters for disease diagnosis. Finally, the major vessels are classified into artery and vein sets to differentiate them from newly generated, unwanted blood vessels. The network was trained with the training and testing images of the DRIVE/RITE database for segmentation and artery-vein classification. Some of our previously published results on automated optic disc center detection on the DRIVE dataset have also been used to train the model on an NVIDIA Titan Xp 8 GB GPU. Finally, images from the MESSIDOR and DIARETDB0 databases were used for testing the detection of Neovascularization at the Disc.
Nilanjana Dutta Roy, Arindam Biswas
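As a rough illustration of the kind of deep segmentation network that could replace hand-crafted vessel extraction, the sketch below defines a compact encoder-decoder in tf.keras that produces a vessel probability map; the architecture and hyperparameters are assumptions for illustration, not the authors' model.

```python
# Compact encoder-decoder vessel-segmentation sketch in tf.keras;
# architecture is an assumption, not the chapter's model.
import tensorflow as tf
from tensorflow.keras import layers, Model

def tiny_unet(input_shape=(256, 256, 3)):
    inp = layers.Input(shape=input_shape)
    c1 = layers.Conv2D(16, 3, padding="same", activation="relu")(inp)
    p1 = layers.MaxPooling2D()(c1)
    c2 = layers.Conv2D(32, 3, padding="same", activation="relu")(p1)
    p2 = layers.MaxPooling2D()(c2)
    b = layers.Conv2D(64, 3, padding="same", activation="relu")(p2)
    u2 = layers.Concatenate()([layers.UpSampling2D()(b), c2])   # skip connection
    c3 = layers.Conv2D(32, 3, padding="same", activation="relu")(u2)
    u1 = layers.Concatenate()([layers.UpSampling2D()(c3), c1])  # skip connection
    c4 = layers.Conv2D(16, 3, padding="same", activation="relu")(u1)
    out = layers.Conv2D(1, 1, activation="sigmoid")(c4)         # vessel probability map
    model = Model(inp, out)
    model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
    return model

model = tiny_unet()
model.summary()
```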

A Linear Regression-Based Resource Utilization Prediction Policy for Live Migration in Cloud Computing

Abstract
Cloud computing has emerged as a challenging, state-of-the-art research area. It is a service- and delivery-oriented paradigm, distributed over the Internet and governed by an appropriate set of protocols. Over the last few decades the Internet has grown rapidly, and as a result cloud computing has also expanded exponentially. Cloud computing provides resources such as software, platform, and infrastructure as services, namely Software as a Service (SaaS), Platform as a Service (PaaS), and Infrastructure as a Service (IaaS). The cloud provisions infrastructure resources like CPU, bandwidth, and memory to its end users as part of its IaaS offering. To meet the end users' heterogeneous resource needs, it provisions and deprovisions resources dynamically, with minimal management effort from the service providers over the Internet, thus eliminating the need for companies and institutes to manage expensive hardware resources. However, to satisfy users' resource needs on time, the Cloud Service Provider (CSP) must maintain Quality of Service (QoS). A Service Level Agreement (SLA) is established between the datacenters and their end users, and minimizing SLA violations ensures better QoS. The research community has identified inefficient load-balancing approaches in hosts as one of the main reasons for SLA violations, since such approaches fail to ensure QoS by distributing the dynamic workload evenly without missing deadlines. In this paper, we extend our previous work on simulated annealing-based optimized load balancing [1] by adding a policy for migrating VMs from one host to another on the basis of a linear regression-based prediction of future resource utilization. In our approach, we predict short-term future resource utilization using linear regression on each host's history of resource utilization. We further use this prediction in the migration process to move VMs from predicted overloaded hosts to underloaded ones. Experiments were simulated in CloudAnalyst, and the results are quite encouraging and outperform some existing load-balancing strategies in ensuring QoS.
Gopa Mandal, Santanu Dam, Kousik Dasgupta, Paramartha Dutta
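A minimal sketch of the prediction idea follows: fit an ordinary least-squares line to each host's recent utilization history, extrapolate one interval ahead, and flag hosts whose predicted utilization crosses an overload threshold; the 80% threshold and the synthetic histories are assumptions for illustration.

```python
# Minimal sketch: predict a host's next-interval CPU utilization from its
# recent history with least-squares linear regression, then flag overload.
import numpy as np

def predict_next_utilization(history):
    """Fit u = a*t + b over the observed intervals and extrapolate one step."""
    t = np.arange(len(history))
    a, b = np.polyfit(t, history, 1)
    return a * len(history) + b

hosts = {
    "host-1": [0.52, 0.58, 0.63, 0.70, 0.76],  # steadily rising utilization
    "host-2": [0.41, 0.39, 0.42, 0.40, 0.38],  # stable utilization
}
OVERLOAD_THRESHOLD = 0.80  # assumed threshold for illustration
for name, history in hosts.items():
    predicted = predict_next_utilization(history)
    status = ("overloaded soon -> candidate source for VM migration"
              if predicted > OVERLOAD_THRESHOLD else "ok")
    print(f"{name}: predicted {predicted:.2f} ({status})")
```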

Tracking Changing Human Emotions from Facial Image Sequence by Landmark Triangulation: An Incircle-Circumcircle Duo Approach

Abstract
Intelligent recognition of human emotions from face images is a challenging proposition in the field of affective computing, and it becomes even more difficult when one has to characterize the nature of the transition of human emotion from a relevant sequence of face images. In the present scope, we consider a triangulation mechanism derived from the landmark points of the face images, resulting in a number of triangles that are found to be sensitive to different emotions such as anger, disgust, fear, happiness, sadness, and surprise. Accordingly, a pair of circles, viz., the incircle and the circumcircle corresponding to these triangles, is taken into account, and geometric features arising from this pair are utilized for classifying different emotional transitions from various face image sequences. Results of the proposed method, obtained by application to various benchmark image databases, are found to be quite impressive and encouraging compared to existing state-of-the-art techniques.
Md Nasir, Paramartha Dutta, Avishek Nandi
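The sketch below computes, for a single landmark triangle, the incircle and circumcircle centers and radii from which ratio-type geometric features can be formed; the landmark coordinates are toy values, and the exact features used in the chapter may differ.

```python
# Incircle and circumcircle of one landmark triangle; an example ratio-type
# feature is the inradius/circumradius ratio. Toy coordinates only.
import numpy as np

def incircle_circumcircle(A, B, C):
    A, B, C = map(np.asarray, (A, B, C))
    a, b, c = np.linalg.norm(B - C), np.linalg.norm(C - A), np.linalg.norm(A - B)
    s = (a + b + c) / 2.0
    area = np.sqrt(max(s * (s - a) * (s - b) * (s - c), 0.0))  # Heron's formula
    incenter = (a * A + b * B + c * C) / (a + b + c)
    r_in = area / s
    r_circ = (a * b * c) / (4.0 * area)
    # circumcenter via the standard perpendicular-bisector formula
    d = 2.0 * (A[0] * (B[1] - C[1]) + B[0] * (C[1] - A[1]) + C[0] * (A[1] - B[1]))
    ux = ((A @ A) * (B[1] - C[1]) + (B @ B) * (C[1] - A[1]) + (C @ C) * (A[1] - B[1])) / d
    uy = ((A @ A) * (C[0] - B[0]) + (B @ B) * (A[0] - C[0]) + (C @ C) * (B[0] - A[0])) / d
    return incenter, r_in, np.array([ux, uy]), r_circ

inc, r_in, circ, r_circ = incircle_circumcircle((0, 0), (4, 0), (1, 3))
print("inradius/circumradius ratio:", r_in / r_circ)  # one candidate feature
```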

Recognizing Human Emotions from Facial Images by Landmark Triangulation: A Combined Circumcenter-Incenter-Centroid Trio Feature-Based Method

Abstract
Human emotion reflected in facial expression is generated by the coordinated movement of facial muscles and tissue, which is associated with the emotional state of the human subject. Facial expression is one of the most significant non-articulated forms of social communication, and it is widely adopted by the scientific community for automated emotion analysis. In the present scope, a triangular structure is induced from landmark points, and three derived points, viz., the circumcenter, incenter, and centroid, are considered as the geometric primitives for extracting relevant features. Information extracted from these features is utilized to discriminate one expression from another using a Multilayer Perceptron (MLP) classifier on images containing facial expressions from various benchmark databases. Results obtained by applying this method are found to be extremely encouraging.
Avishek Nandi, Paramartha Dutta, Md Nasir
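As a simplified sketch, the snippet below derives the centroid, incenter, and circumcenter of a landmark triangle, uses their pairwise distances (normalized by the perimeter) as a feature vector, and feeds it to an MLP classifier from scikit-learn; the landmark points, normalization, and classifier settings are assumptions, not the chapter's exact features.

```python
# Toy trio features (centroid, incenter, circumcenter distances) fed to an MLP.
import numpy as np
from sklearn.neural_network import MLPClassifier

def trio_features(A, B, C):
    A, B, C = map(np.asarray, (A, B, C))
    a, b, c = np.linalg.norm(B - C), np.linalg.norm(C - A), np.linalg.norm(A - B)
    centroid = (A + B + C) / 3.0
    incenter = (a * A + b * B + c * C) / (a + b + c)
    d = 2.0 * (A[0] * (B[1] - C[1]) + B[0] * (C[1] - A[1]) + C[0] * (A[1] - B[1]))
    circumcenter = np.array([
        ((A @ A) * (B[1] - C[1]) + (B @ B) * (C[1] - A[1]) + (C @ C) * (A[1] - B[1])) / d,
        ((A @ A) * (C[0] - B[0]) + (B @ B) * (A[0] - C[0]) + (C @ C) * (B[0] - A[0])) / d,
    ])
    perimeter = a + b + c  # scale normalization
    return np.array([np.linalg.norm(centroid - incenter),
                     np.linalg.norm(centroid - circumcenter),
                     np.linalg.norm(incenter - circumcenter)]) / perimeter

# toy training set: one triangle per face, two emotion classes
X = np.array([trio_features((0, 0), (4, 0), (1, 3)),
              trio_features((0, 0), (5, 1), (2, 2))])
y = np.array([0, 1])
clf = MLPClassifier(hidden_layer_sizes=(8,), max_iter=2000, random_state=0).fit(X, y)
print(clf.predict(X))
```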

Stable Neighbor-Node Prediction with Multivariate Analysis in Mobile Ad Hoc Network Using RNN Model

Abstract
In mobile ad hoc networks (MANETs), mobile nodes communicate with each other without the use of any fixed infrastructure. Each node works as both a receiver and a transmitter in the network. The network maintains wireless connections with neighbor nodes and establishes a connecting link between a source–destination (s-d) pair. Routes in this type of network are highly unstable due to the mobility of the nodes, so to construct a steady path between an s-d pair it is natural to build the path through stable neighbor nodes. In this article, we propose a stability index (SIN) that depends on various node parameters such as past SIN values in different time intervals and node velocity. We establish a time series prediction model with multivariate analysis for predicting the stability index (SIN) of a node with respect to its neighbor nodes for a future time frame, based on their past observations. For this purpose, we use an Elman recurrent neural network (ERNN)-based learning tool to determine the behavior of the mobile nodes of the network in the future time frame.
Arindrajit Pal, Paramartha Dutta, Amlan Chakrabarti, Jyoti Prakash Singh
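A minimal Elman-style sketch is given below: a Keras SimpleRNN predicts a node's next stability index from a window of multivariate history (past SIN values and node velocity); the window length, feature set, and synthetic data are assumptions made for illustration.

```python
# Minimal Elman-style RNN sketch: predict the next stability index (SIN)
# from a window of multivariate node history; data here is synthetic.
import numpy as np
import tensorflow as tf

WINDOW, FEATURES = 5, 2  # 5 past intervals, features = [SIN, node velocity]
model = tf.keras.Sequential([
    tf.keras.layers.SimpleRNN(16, input_shape=(WINDOW, FEATURES)),  # Elman-style recurrence
    tf.keras.layers.Dense(1, activation="sigmoid"),                 # predicted SIN in [0, 1]
])
model.compile(optimizer="adam", loss="mse")

rng = np.random.default_rng(1)
X = rng.random((200, WINDOW, FEATURES))      # toy windows of (SIN, velocity) history
y = X[:, -1, 0] * 0.8 + 0.1                  # synthetic target correlated with last SIN
model.fit(X, y, epochs=5, batch_size=16, verbose=0)
print(model.predict(X[:3], verbose=0).ravel())
```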

A New Approach for Optimizing Initial Parameters of Lorenz Attractor and Its Application in PRNG

Abstract
The Lorenz attractor is sensitive to its initial parameter values. The values of the initial parameters, or seeds, are very important both for simulating the strange chaotic behavior and for the security of applications that use it. Using a single initial seed or a fixed set of seed values for an application is a threat to its security; a unique seed for each execution makes the system more secure and robust. In the last two decades, much research has been carried out to estimate the initial parameters of different chaotic maps. These works solve the problem by checking the similarity between an observed (benchmark) attractor and an experimental attractor (generated with the estimated seed), in either the time domain or the state space. The time-domain solutions cannot account for the attractor's sensitivity to the seed, whereas the state-space solutions do; however, the latter still require an observed attractor against which to check similarity. This research overcomes this constraint and flaw of past models. A Real-Coded Genetic Algorithm (RCGA)-based optimization technique for optimizing the initial parameter values of the Lorenz attractor is proposed, together with a Pseudorandom Number Generator (PRNG) based on the Lorenz attractor. The optimized seed for the Lorenz attractor is used in the PRNG, and the NIST-specified statistical tests for randomness are applied to the output bit sequence of the PRNG.
Ramen Pal, Somnath Mukhopadhyay
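The sketch below shows a minimal Lorenz-based PRNG: integrate the Lorenz system from a given seed and extract one bit per step from the trajectory after a burn-in period; the Euler integration step and the bit-extraction rule are assumptions, and the RCGA seed optimization from the chapter is not shown.

```python
# Minimal Lorenz-based PRNG sketch: integrate the Lorenz system from a seed
# (x0, y0, z0) and extract bits from the trajectory after a burn-in period.
import numpy as np

def lorenz_bits(seed, n_bits=64, sigma=10.0, rho=28.0, beta=8.0 / 3.0,
                dt=0.01, burn_in=1000):
    x, y, z = seed
    bits = []
    for step in range(burn_in + n_bits):
        # forward-Euler integration of the Lorenz equations
        dx = sigma * (y - x)
        dy = x * (rho - z) - y
        dz = x * y - beta * z
        x, y, z = x + dt * dx, y + dt * dy, z + dt * dz
        if step >= burn_in:
            # take one low-order bit of the scaled x-coordinate per step
            bits.append(int(abs(x) * 1e6) & 1)
    return bits

print("".join(map(str, lorenz_bits((0.1, 0.0, 0.0), n_bits=32))))
```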

Backmatter
