
2020 | Book

Intelligent Technologies and Applications

Second International Conference, INTAP 2019, Bahawalpur, Pakistan, November 6–8, 2019, Revised Selected Papers

Edited by: Prof. Imran Sarwar Bajwa, Prof. Dr. Tatjana Sibalija, Dayang Norhayati Abang Jawawi

Publisher: Springer Singapore

Book series: Communications in Computer and Information Science


About this book

This book constitutes the refereed proceedings of the Second International Conference on Intelligent Technologies and Applications, INTAP 2019, held in Bahawalpur, Pakistan, in November 2019.

The 60 revised full papers and 6 revised short papers presented were carefully reviewed and selected from 224 submissions. Additionally, the volume presents 1 invited paper. The papers of this volume are organized in topical sections on AI and health; sentiment analysis; intelligent applications; social media analytics; business intelligence; natural language processing; information extraction; machine learning; smart systems; semantic web; decision support systems; image analysis; and automated software engineering.

Table of Contents

Frontmatter

Internet of Things

Frontmatter
Design of Wearable Prototype Smart Wristband for Remote Health Monitoring Using Internet of Things

Remote health monitoring of patients is increasing with the advancement of various types of health-related mobile applications. Vital signs such as pulse rate, blood pressure, and temperature are the basic parameters used for monitoring a patient’s health. The designed healthcare kit consists of different sensors that are used for sensing and monitoring a patient’s health. The data from the healthcare kit are transmitted via the internet and saved on a cloud-based server, which helps to monitor the patient’s health status consistently. The sensed information is collected constantly over specified intervals of time and is used to alert the patient to any concealed problem so that a diagnosis can be pursued. The upper and lower ranges of temperature, blood pressure, and heartbeat can be defined by the doctor according to the patient’s health. Afterward, the system starts monitoring the patient and sends an alert to the concerned doctor as soon as the sensed parameters cross the defined limits. The recorded values are transmitted via the internet to the cloud server, and a notification in the form of a tweet is also displayed on the doctor’s Twitter account as well as the patient’s account.

Ayesha Masood, Khan Bahadar Khan, Talha Younas, Ali Raza Khalid
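The doctor-defined-limit alerting the abstract describes can be sketched in a few lines. This is an illustrative sketch, not the authors' implementation; the parameter names and example ranges are assumptions.

```python
# Illustrative sketch of threshold-based vital-sign alerting.
# Parameter names and ranges are assumed, not taken from the paper.

def check_vitals(reading, limits):
    """Return the vital signs whose reading falls outside the
    doctor-defined [low, high] range."""
    alerts = []
    for sign, value in reading.items():
        low, high = limits[sign]
        if not (low <= value <= high):
            alerts.append(sign)
    return alerts

# Doctor-defined ranges per patient (example values).
limits = {"pulse": (60, 100), "temperature": (36.1, 37.5), "systolic_bp": (90, 140)}

reading = {"pulse": 112, "temperature": 37.0, "systolic_bp": 135}
alerts = check_vitals(reading, limits)  # pulse exceeds its upper limit
```

In the described system, a non-empty alert list would trigger the cloud notification and the tweet to the doctor's and patient's accounts.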
FireNot - An IoT Based Fire Alerting System: Design and Implementation

Internet of Things (IoT) based systems have revolutionised the way real-world systems are interconnected through the internet. At present, the application of IoT-based systems extends to real-time detection and warning systems. However, cost has been a major factor in the development and implementation of IoT systems. Considering cost and ease of implementation, this paper proposes a low-cost yet efficient IoT system called FireNot for warning and alerting about fire incidents. FireNot is a cloud-based system that uses sensors (hardware) to detect fire and alert the user through the internet, and it is maintained and monitored using a simple Android app. The FireNot system uses a Raspberry Pi programmed in Python and utilises the Google API for location detection. This paper practically demonstrates the FireNot system through extensive testing of various operations, and the system proves to be efficient.

Bahman A. Sassani, Noreen Jamil, M. G. Abbas Malik, S. S. Tirumala
Smart Intelligent System for Mobile Travelers Based on Fuzzy Logic in IoT Communication Technology

The Internet of Things (IoT) is vulnerable to many identity-based attacks. Moreover, due to the substantial increase in the density of users and nodes, mobile travelers are especially vulnerable to such attacks in IoT communication technology. In this paper, we explore the potential security weaknesses of a smart intelligent system environment for mobile travelers, mainly under handover conditions between nodes. We propose a novel scheme to protect against spoofing that utilizes probability distributions of received power based on the regions of a mobile (moving) user. We further investigate the impact on the secrecy rate of the targeted user in the absence and presence of an eavesdropper. We also evaluate our method through simulation results for sensitive-region information based on fuzzy logic.

Imran Memon, Hadiqua Fazal, Riaz Ahmed Shaikh, Ghulam Ali Mallah, Rafaqat Hussain Arain, Ghulam Muhammad
Perception of Wearable Intelligent Devices: A Case of Fitbit-Alta-HR

Several wearable devices are available in the market, and they are becoming increasingly popular. These devices have been used for scheduling, time management, and energy prediction, as well as for health management. Our thorough analysis of the literature has revealed that the majority of studies focus on investigating the usefulness, accuracy, or adoption of the devices. However, no study has been conducted to evaluate the perception of these wearable devices among their users. To that end, in this study, we use end-user feedback in natural language to evaluate the perception of the Fitbit-Alta-HR, an award-winning and widely used wearable device. The study reveals that the majority of users talk positively about the device. Furthermore, we perform experiments using three supervised learning techniques to evaluate their effectiveness for classification. The experiments show that a state-of-the-art artificial neural network is the most effective technique, and unigrams are the most suitable feature.

Khurram Shahzad, Muhammad Kamran Malik, Khawar Mehmood
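The unigram-feature classification the abstract refers to can be illustrated with a minimal sketch. This is not the authors' pipeline (they use neural networks among other classifiers); it only shows what unigram features are and how a trivially simple classifier can use them. The example reviews and the centroid-overlap scoring are assumptions.

```python
from collections import Counter

def unigram_features(text):
    """Lower-case the text and count each word (unigram)."""
    return Counter(text.lower().split())

def train_centroids(labelled):
    """Sum unigram counts per class to form a simple class profile."""
    centroids = {}
    for text, label in labelled:
        centroids.setdefault(label, Counter()).update(unigram_features(text))
    return centroids

def classify(text, centroids):
    """Assign the class whose profile shares the most unigram mass with the text."""
    feats = unigram_features(text)
    def overlap(profile):
        return sum(min(feats[w], profile[w]) for w in feats)
    return max(centroids, key=lambda c: overlap(centroids[c]))

# Tiny made-up training set of device reviews.
train = [("love the heart rate tracking", "pos"),
         ("battery life is great love it", "pos"),
         ("strap broke and sync fails", "neg"),
         ("sync fails often terrible app", "neg")]
centroids = train_centroids(train)
label = classify("love the battery", centroids)
```

A real setup would replace the centroid-overlap score with a trained classifier over the same unigram vectors.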
Automatic Sun Tracking Smart Wheel Chair with Real Time Navigation

Aging is a natural process that prompts changes in the quality and quantity of skeletal muscle, causing muscle weakness and impaired mobility in the aging population. Chronic pain in the arms and legs is a main cause of restricted movement, and disability commonly originates in the spinal cord, cauda equina, or peripheral neuropathy. Because of their merits in such serious conditions, electric wheelchairs are increasingly in demand in the present era. Our proposed work is a smart electric wheelchair controlled by both a joystick and a speaker-dependent voice recognition system. To facilitate patients in outdoor environments, an automatic sun-tracking solar panel is provided that charges the battery and follows the sun’s position, providing not only power backup but also shade when moving in sunlight. A navigation device is installed to provide real-time navigation, which helps the user select the shortest and easiest route. The proposed system is cost-effective compared with existing systems available in the market and is also energy efficient.

Muhammad Rizwan Masood, Ali Sufyan, Usama Khalid, Muhammad Makki Khan, Sundus Amin
A Blockchain-Based Framework for Information Security in Intelligent Transportation Systems

Efficient utilization of IoT-generated data helps the decision-making process and turns the concept of smart cities into a reality. Smart cities utilize Intelligent Transportation Systems (ITS) to enable smart vehicles to operate independently without much human intervention. The ITS utilizes the data produced by the smart infrastructure to intelligently operate smart vehicles and make highly precise decisions in the whole smart-city ecosystem. Although ITS is still in its evolutionary phase, this is the right time to recognize the vulnerabilities present in the ITS infrastructure that can be exploited by adversaries to launch devastating attacks. The ITS network involves highly sophisticated infrastructure where any anomaly poses a threat to human lives. ITS data can be manipulated by adversaries for attacks such as password theft, information spoofing, data manipulation, information loss, eavesdropping, and key-fob attacks. Blockchain provides a distributed and tamper-resistant technology by utilizing a privacy-preserving ledger to secure data in an efficient manner. Blockchain technology eliminates the adversarial impact of attackers by leveraging the computational capabilities of the trusted nodes in the network. In this research, we provide a secure framework for the effective implementation of ITS using Blockchain technology in the centralized management of software-defined networking. The framework utilizes edge computing, with Blockchain providing the security services. It is secure, provides interoperability, reduces information leakage, and enables seamless integration between versatile smart vehicles and the service infrastructure.

Wajid Rafique, Maqbool Khan, Xuan Zhao, Nadeem Sarwar, Wanchun Dou
Home Automation Security System Based on Face Detection and Recognition Using IoT

In the modern world, security is one of the major issues, and as technology advances, many new security issues arise. Existing security methods have some flaws and can be hacked. The proposed system for resolving this security issue is based on face detection and recognition using the Internet of Things (IoT). The face of a person is captured by the camera and compared with the acquired database. The authorized user can also use a mobile application to give an unregistered person access to the premises. In the case of unauthorized/unknown access, the face image of the person is captured, the concerned authorities are notified through an email, and an alarm is generated. The proposed system produced accurate results in both cases: authorized and unauthorized access. The introduced system provides a low-cost solution for monitoring and controlling houses and different organizations such as banks, universities, etc.

Sana Ghafoor, Khan Bahadar Khan, Muhammad Rizwan Tahir, Maryoum Mustafa
Smart Reader for Visually Impaired People Based on Optical Character Recognition

There are millions of visually impaired people in the world. According to World Health Organization (WHO) data on visual impairment, 1.3 billion people live with some kind of visual impairment, while 36 million people are completely blind. Reading is one of the major necessities of visually impaired people. Numerous researchers have worked on developing mechanisms that allow blind people to detect obstacles, read labels or specific currencies, and read written, typed, or printed text. We propose a Raspberry-Pi-based system that facilitates visually impaired people by converting text into a voice signal. An Optical Character Recognition (OCR) scheme is employed for the detection of printed text using a camera. Typed or handwritten characters and text are converted into machine-encoded text. The proposed method is developed on the Raspberry Pi, where OCR drives an image-to-audio converter that forms the output of the system. It is a smart real-time device based on OCR.

Muhammad Farid Zamir, Khan Bahadar Khan, Shafquat Ahmmad Khan, Eid Rehman

Intelligent Applications

Frontmatter
A Survey on Context-Aware Computing Frameworks for Resource-Bounded Devices

The Internet of Things (IoT) provides ubiquitous computing at any place, at any time, and in any data format to any user across a network. Context-awareness is a phenomenon where an entity can portray its behavior at a particular time based on facts, rules, and axioms, forming what is formally called a context-aware computing framework. Several frameworks exist for context-awareness, either ported from other platforms to Android or explicitly built for the Android platform. Resource-bounded devices such as tablets, smart TVs, smartphones, and wireless sensor nodes have several constraints, including memory, power, and time, that must be considered while designing a framework for them. This paper surveys various resource-bounded context-aware computing frameworks that are either ported from desktop to the Android platform or explicitly built for it. The key challenges associated with these frameworks and the portability issues from desktop to Android are also discussed in detail.

Younas Khan, Sajjad Ahmad Bhatti, Sohail Khattak
A Real-Time Driver Drowsiness Detection and Warning System Based on an Eye Blinking Rate

Every year many people lose their lives in fatal road accidents around the world, and drowsy driving is one of the primary causes of these accidents and deaths; other causes are human errors and/or mechanical failures. Driver fatigue is one of the leading causes of Road Traffic Accidents (RTA) in Pakistan. Numerous systems have been invented to minimize the impact of these accidents; research in this area began sixty years ago, aiming to determine driver drowsiness using computer-assisted techniques. In this paper, we propose a method that detects driver drowsiness from eye behavior, such as the eye blink rate and blinking patterns. The presented system detects driver drowsiness based on analysis of the eyes. It can adapt to any person, works in real driving conditions under varying lighting, and generates a drowsiness index at every moment that measures the wakefulness of the driver. In several experiments, the proposed system has shown excellent results with respect to the objectives, and the associated problems have been successfully overcome.

Mubeen Arif, Khan Bahadar Khan, Khizar Fiaz, Ayesha Niaz
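A drowsiness index derived from eye behavior, as the abstract describes, can be sketched with a PERCLOS-style measure (the fraction of frames in a window where the eyes are detected closed). This is an assumption for illustration; the paper's exact index, window length, and threshold are not given in the abstract.

```python
# Illustrative PERCLOS-style drowsiness index over a video window.
# Window size, frame rate, and threshold are assumed values.

def drowsiness_index(closed_frames, total_frames):
    """Fraction of frames in the window where the eyes are closed."""
    return closed_frames / total_frames

def is_drowsy(index, threshold=0.3):
    """Raise a warning when the eye-closure fraction crosses the threshold."""
    return index >= threshold

# 30 s window at 30 fps = 900 frames; eyes detected closed in 270 of them.
idx = drowsiness_index(closed_frames=270, total_frames=900)
alert = is_drowsy(idx)
```

In a full system, the per-frame open/closed decision would come from an eye detector running on the camera feed, and the index would be recomputed on a sliding window.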
Optimization of Bug Reporting System (BRS): Artificial Intelligence Based Method to Handle Duplicate Bug Report

Bug tracking and reporting are among the most critical activities in software engineering and implementation, with a direct impact on the quality of the tested software and the productivity of the resources allocated to it. A Bug Reporting System (BRS) plays an important role in tracking all essential bug reports during the software development life cycle (SDLC). Duplicate Bug Reports (DBR) have an adverse effect on the software quality assurance process, as they increase the processing time of the bug triager, whose job is to keep track of all bug reports, and of the application developers to whom bug tickets are assigned by the triager. Duplicate bug reports, if they remain unidentified, may increase bug handling time (rework) and decrease overall team performance. However, identification of duplicate bug reports remains a critical task, as it is difficult to manually identify every copy of an earlier-reported bug. In this paper we propose an enhancement to the existing BRS that uses artificial-intelligence-based techniques to detect the existence of a duplicate bug. Every new bug reported to the system is marked with an identification tag. A bug carrying a duplicate tag is phased out from the bug repository, which not only reduces the additional effort of bug triage but also improves the system’s performance.

Afshan Saad, Muhammad Saad, Shah Muhammad Emaduddin, Rafi Ullah
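The tagging workflow the abstract describes, marking each incoming report and phasing out duplicates, can be sketched with a simple text-similarity check. The Jaccard measure and the threshold below are illustrative assumptions; the paper proposes AI-based techniques rather than this naive measure.

```python
# Illustrative duplicate-tagging sketch; similarity measure and
# threshold are assumptions, not the paper's AI-based method.

def jaccard(a, b):
    """Token-set Jaccard similarity between two bug-report summaries."""
    A, B = set(a.lower().split()), set(b.lower().split())
    return len(A & B) / len(A | B) if A | B else 0.0

def tag_report(new_report, repository, threshold=0.5):
    """Tag an incoming report 'duplicate' if it is sufficiently similar
    to any report already in the repository, else 'new'."""
    for old in repository:
        if jaccard(new_report, old) >= threshold:
            return "duplicate"
    return "new"

repo = ["app crashes on login screen", "dark mode resets after restart"]
tag = tag_report("app crashes on the login screen", repo)
```

Reports tagged `duplicate` would then be phased out of the repository before reaching the triager, which is the effort saving the abstract claims.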
A Review of Star Schema and Snowflakes Schema

In the new age, digital data is the most important source of acquiring knowledge. For this purpose, data are collected from various sources such as websites, blogs, web pages, and, most importantly, databases. Databases and relational databases both support decision making for future work. Nowadays these approaches have become time- and resource-consuming, so a new concept, the data warehouse, is used, which can analyze many databases at a time on a common platform in a very efficient way. In this paper, we discuss the database and the migration from a database to a data warehouse. A Data Warehouse (DW) is a special type of database that stores a large amount of data. DW schemas organize data in two ways: the star schema and the snowflake schema. Fact and dimension tables are organized in them and are distinguished by the normalization of the tables. The nature of the data leads the designer to follow one of the DW schemas on the basis of data, time, and resource factors. Both design-modeling techniques are compared by experiment on the same data, applying the same query to each and comparing the results. After the performance evaluation, bitmap indexing is used to improve the schemas’ performance. We also present the design-modeling techniques with respect to data mining and an improved query-optimization technique to save time and resources in the analysis of data.

M. Zafar Iqbal, Ghulam Mustafa, Nadeem Sarwar, Syed Hamza Wajid, Junaid Nasir, Shaista Siddque
A Region-Based and a Unified Team’s Strength in the Game of Cricket Using Principal Component Analysis (PCA)

In sports, one of the primary measures for assessing the performance of a team is the team’s strength. The “team’s strength” is the combination of various factors of a team, such as its batting, bowling, and fielding strengths. In cricket, it is important to measure the strengths of teams across regions/venues because every region has different playing conditions that affect performance. The selectors should be aware of their team’s strength against the opposing team from the visiting region so that the team’s management, the coach, and the captain can select a strong combination from their team. A lot of research has been done on player performance measurement, team selection, team ranking, etc., but none on finding the strength of a team across regions. In this paper, we compute the Region-based Team’s Strength (RTS) and a Unified Team’s Strength (UTS) based on several parameters of a team using Principal Component Analysis (PCA). In addition, we also use a Weighted Average System (WAS) to compute the region-based team’s strength. The intuition is to distinguish stronger and weaker teams across regions. The results show that the proposed method provides quite promising insights into a region-based and a unified team’s strength in cricket.

Akbar Hussain, Yan Qiang
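Collapsing several team parameters into a single PCA-based strength score, as the abstract describes, can be sketched as follows. The metrics, their values, and the choice of exactly three columns are made-up assumptions; the paper's actual parameters and region handling are not reproduced here.

```python
import numpy as np

# Hypothetical per-team metrics (rows: teams; columns: batting average,
# bowling economy rate, fielding success ratio). Values are invented.
X = np.array([[42.0, 4.6, 0.92],
              [38.5, 5.1, 0.88],
              [35.0, 5.4, 0.85],
              [40.2, 4.9, 0.90]])

# Standardise each column, then use the first principal component
# (via SVD) as a single "strength" score per team.
Xs = (X - X.mean(axis=0)) / X.std(axis=0)
_, _, Vt = np.linalg.svd(Xs, full_matrices=False)
pc1 = Vt[0]
if pc1[0] < 0:       # orient the PC so a higher batting average raises strength
    pc1 = -pc1
strength = Xs @ pc1  # one scalar score per team
```

Because the sign of a principal component is arbitrary, the orientation step above is needed before the scores can be read as "higher is stronger"; a region-based variant would simply repeat this per region's data.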
A Unit Softmax with Laplacian Smoothing Stochastic Gradient Descent for Deep Convolutional Neural Networks

Several techniques have been designed during the last few years to improve the performance of deep architectures by means of appropriate loss functions or activation functions. Arguably, softmax is the traditional, convenient choice for training Deep Convolutional Neural Networks (DCNNs) for classification tasks. However, modern deep learning architectures have exposed its limitations in feature discriminability. In this paper, we offer a supervision signal for discriminative image features through a modification of softmax that boosts the power of the loss function. Amending the original softmax loss, and motivated by the A-softmax loss for face recognition, we fix the angular margin to introduce a unit-margin softmax loss. The improved alternative form of softmax is trainable, easy to optimize, and stable to use along with Stochastic Gradient Descent (SGD) and Laplacian Smoothing Stochastic Gradient Descent (LS-SGD), and it is applicable to classifying digits in images. Experimental results demonstrate state-of-the-art performance on the famous Modified National Institute of Standards and Technology (MNIST) database of handwritten digits.

Jamshaid Ul Rahman, Akhtar Ali, Masood Ur Rehman, Rafaqat Kazmi
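The margin idea behind the loss can be illustrated with an additive-margin variant of softmax cross-entropy: reduce the target logit by a fixed margin before the softmax, so the correct class must win by at least that margin. This is a simplified sketch in the spirit of the paper's unit-margin loss; the paper's actual formulation is angular (A-softmax-style), which is not reproduced here.

```python
import numpy as np

def margin_softmax_loss(logits, target, m=1.0):
    """Cross-entropy where the target logit is reduced by a fixed
    margin m before the softmax (additive-margin sketch; the paper's
    loss applies the margin to the angle instead)."""
    z = logits.astype(float).copy()
    z[target] -= m                    # make the correct class harder to win
    z -= z.max()                      # shift for numerical stability
    p = np.exp(z) / np.exp(z).sum()
    return -np.log(p[target])

logits = np.array([3.0, 1.0, 0.5])
plain = margin_softmax_loss(logits, target=0, m=0.0)    # ordinary softmax loss
with_margin = margin_softmax_loss(logits, target=0, m=1.0)
```

The margin inflates the loss even on correctly classified samples, which is what pushes features of the same class closer together during SGD training.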
Scientific Map Creator: A Tool to Analyze the Author’s Affiliation and Citations by Producing Mind Maps

Citations have been considered a parameter to gauge the importance of a research paper in the scientific community. Scientific societies consider citations to evaluate and rank authors for deserving positions, such as assigning researchers PhD supervision, presenting them awards, or hiring them as reviewers or editors of a journal. With “Scientific Map Creator”, we provide a web-based concept-mapping application with a database. Our main research focus is extracting information from the bulk of data. The data stored in several online databases of research papers, such as JUCS, IEEE Xplore, and Scopus, is intended to be managed, influential authors extracted, and that metadata presented in the form of mind maps with the help of “Scientific Map Creator”.

Sahar Maqsood ul Hassan, Khizra Tariq, Tooba Dar
Prediction and Analysis of Sun Shower Using Machine Learning

Weather is among the foremost phenomena affecting human life in every dimension, from food to flight, while on the other hand it is also among the most tragic. Prediction of weather phenomena is therefore of significant interest to human society, to avoid or limit the devastation of weather hazards. Weather forecasting is complex because of noise and missing values in datasets. Various attempts have been made to make weather prediction as accurate as possible, but the complexities of noise still affect accuracy. In this paper, a five-year rainfall record is used to predict rainfall, with performance and accuracy computed through 10-fold cross-validation. The initial step is the gathering, separation, sorting, and partitioning of datasets based on feature vectors. The classification approach offers many algorithms, among them Support Vector Machine (SVM), Naïve Bayes, Random Forest, and Decision Tree. Before executing each method, a model is built and the dataset is trained on that model; the learned model must both fit the input dataset and estimate the class labels of the records. Various classifiers, such as linear SVM, ensemble methods, and decision trees, have been applied, and their precision and time analyzed on the dataset. Finally, all results are computed and compared in terms of accuracy and execution time.

Nadeem Sarwar, Junaid Nasir, Syed Zeeshan Hussain Shah, Alishba Ahsan, Sameer Malik, Sarousha Nasir, M. Zafar Iqbal, Asma Irshad
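The 10-fold cross-validation used to score the classifiers can be sketched in plain Python. The majority-class "classifier" below is a stand-in for illustration only; the paper evaluates SVM, Naïve Bayes, random forest, and decision trees, none of which are implemented here.

```python
# Illustrative k-fold cross-validation; the toy classifier and data
# are assumptions, not the paper's models or rainfall dataset.

def k_fold_indices(n, k=10):
    """Split indices 0..n-1 into k contiguous folds of near-equal size."""
    folds, start = [], 0
    for i in range(k):
        size = n // k + (1 if i < n % k else 0)
        folds.append(list(range(start, start + size)))
        start += size
    return folds

def cross_validate(X, y, fit, predict, k=10):
    """Mean accuracy over k folds: train on k-1 folds, test on the held-out one."""
    accs = []
    for test_idx in k_fold_indices(len(X), k):
        test = set(test_idx)
        train_X = [x for i, x in enumerate(X) if i not in test]
        train_y = [t for i, t in enumerate(y) if i not in test]
        model = fit(train_X, train_y)
        correct = sum(predict(model, X[i]) == y[i] for i in test_idx)
        accs.append(correct / len(test_idx))
    return sum(accs) / len(accs)

# Toy stand-in classifier: always predict the majority training label.
fit = lambda X, y: max(set(y), key=y.count)
predict = lambda model, x: model

X = list(range(20))
y = ["rain"] * 14 + ["dry"] * 6
acc = cross_validate(X, y, fit, predict, k=10)
```

Swapping `fit`/`predict` for a real learner gives the per-classifier accuracies the paper compares; timing each `fit` call would give the execution-time comparison.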
Electroencephalography Based Machine Learning Framework for Anxiety Classification

Anxiety is a psycho-physiological phenomenon related to the mental health of a person. Persistence of anxiety for an extended period of time can manifest as an anxiety disorder, which is a root cause of multiple mental health issues. It is therefore vital to detect anxiety accurately using methods that are automated, efficient, and independent of user bias. To this end, we present an experimental study for the classification of human anxiety using electroencephalography (EEG) signals acquired from a commercially available four-channel headset. EEG data of 28 participants are acquired for a duration of three minutes. Five different feature groups in the time domain are extracted from the acquired EEG signals. The wrapper method of feature selection is applied, which selects features from two of the five feature groups initially extracted. Classification is performed using logistic regression (LR), random forest (RF), and multilayer perceptron (MLP) classifiers. We achieve a classification accuracy of 78.5% for human anxiety using the RF classifier. Our proposed scheme outperforms existing methods of anxiety/stress classification.

Aamir Arsalan, Muhammad Majid, Syed Muhammad Anwar
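Time-domain EEG feature extraction, the first stage of the pipeline above, can be sketched as follows. The four descriptors shown are common time-domain features chosen for illustration; the abstract does not list the paper's five feature groups, so these names are assumptions.

```python
import numpy as np

def time_domain_features(eeg):
    """A few standard time-domain descriptors of one EEG channel.
    (Illustrative feature set; not the paper's exact feature groups.)"""
    eeg = np.asarray(eeg, dtype=float)
    diff = np.diff(eeg)                      # sample-to-sample changes
    return {
        "mean": eeg.mean(),                  # DC level of the channel
        "std": eeg.std(),                    # overall variability
        "ptp": eeg.max() - eeg.min(),        # peak-to-peak amplitude
        "line_length": np.abs(diff).sum(),   # total first-difference magnitude
    }

feats = time_domain_features([0.0, 1.0, -1.0, 2.0, 0.0])
```

In the described setup, such a feature vector would be computed per channel of the four-channel headset, passed through wrapper feature selection, and then fed to the LR/RF/MLP classifiers.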

Decision Support Systems

Frontmatter
A Doctor Recommendation System Using Patient’s Satisfaction Analysis

The relationship between a patient and a doctor is very important in the field of healthcare; it is essential for the proper diagnosis and treatment of diseases. Substantial effort has been devoted to devising techniques to identify key opinion leaders using active surveying methods based on the acquisition and integration of secondary and primary data, with results validated by describing a physician nomination network. Recommender frameworks have been proposed in various studies for seeking doctors according to patient characteristics, including their symptoms and preferred choices. The purpose of this study is to identify doctors based on the satisfaction of patients. The data is collected using the patient satisfaction questionnaire (PSQ-3) for subjects located in Pakistan. This data is further analyzed to devise a collaborative filtering-based recommender algorithm using the K-nearest neighbor (KNN) similarity measure to recommend doctors to patients in the form of a sorted, ranked list.

Haseeb Iftikhar, Syed Muhammad Anwar, Muhammad Majid
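KNN-based collaborative filtering over a patient-by-doctor satisfaction matrix, as the abstract describes, can be sketched as follows. The rating matrix, cosine similarity, and k=2 are illustrative assumptions; the paper's actual PSQ-3-derived data and similarity choice are not reproduced.

```python
import math

def cosine(u, v):
    """Cosine similarity between two rating vectors (0 = unrated)."""
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(b * b for b in v))
    return dot / (nu * nv) if nu and nv else 0.0

def recommend(ratings, patient, k=2):
    """Rank the doctors the patient has not rated by the similarity-weighted
    average rating of the k most similar patients (KNN collaborative filtering)."""
    target = ratings[patient]
    neighbours = sorted(
        (p for p in range(len(ratings)) if p != patient),
        key=lambda p: cosine(target, ratings[p]), reverse=True)[:k]
    scores = {}
    for d in range(len(target)):
        if target[d] == 0:  # doctor not yet rated by this patient
            num = sum(cosine(target, ratings[p]) * ratings[p][d] for p in neighbours)
            den = sum(cosine(target, ratings[p]) for p in neighbours)
            scores[d] = num / den if den else 0.0
    return sorted(scores, key=scores.get, reverse=True)  # best doctor first

# Rows: patients; columns: doctors; values: satisfaction 1-5 (0 = unrated).
ratings = [[5, 0, 3, 0],
           [4, 2, 3, 1],
           [5, 1, 4, 2],
           [1, 5, 1, 4]]
ranked = recommend(ratings, patient=0, k=2)
```

The returned list is the sorted ranked list of doctors the abstract mentions, here for patient 0 over the doctors that patient has not yet rated.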
Analysis of White Blood Cells Using Hematology Counts

This research paper presents the detailed structure of white blood cells (WBCs) in the human body. The description includes the cell types, their functions, and the importance of white blood cells in the human body, along with the procedure for counting them. How many WBCs the body requires and how to count them is a very important question. A hematology analyzer analyzes the counts of blood cells and their types in human blood; its usage and types are also discussed in this article. Another important aspect covered in this article is the statistical rate of patients suffering from diseases that occur due to a deficiency or an excess of WBCs in human blood. These patients are divided into three main categories: children, adults, and old-age patients. Microscopic images were obtained from different research laboratories for a better understanding of what happens to the cells when the number of WBCs increases or decreases from the normal range in different age groups of human life.

Syeda Mariyum, Syed Gulfraz, Tayyaba Sultana, Khalid Masood
High Performance Simulation of Blood Flow Pattern and Transportation of Magnetic Nanoparticles in Capillaries

In the first place, the paper analyses the blood flow patterns in a capillary in the presence of a uniform external magnetic field using a hybrid CPU/GPU approach. The blood flowing through the capillary is assumed to be Newtonian, and the flow incompressible and laminar. Magnetic nanoparticles have been considered as a therapeutic agent for magnetically targeted drug delivery in the defence against cancer. The problem is expressed as a boundary value problem containing a system of partial differential equations in order to study the flow field and the magnetic nanoparticles. Finite element discretization is applied to solve the system of equations, which yields a large sparse system requiring heavy computation. The CPU/GPU method serves as a platform to handle the wide-ranging computations in parallel; solution times can therefore be reduced significantly compared with a CPU-only implementation, allowing more effective examination of different mathematical models and their leading parameters. The influence of the magnetic nanoparticle radius $$ R_{M} $$, capillary radius R, pressure P, and magnetic field intensity H on the velocity profiles of blood and magnetic nanoparticles has been investigated in terms of the magnetic field inputs and the model. Secondly, the numerical solutions for the velocity of blood and the velocity of particles are computed, along with the observation that an increase in the magnetic field leads to an increase in the flow pattern. Finally, the simulation concludes that the magnetic parameters play a key role in controlling the velocity profile.

Akhtar Ali, Rafaqat Kazmi
To Explore the Factors that Affect Behavior of Employee Regarding Knowledge Sharing in IT Service Companies

Context: It is vital to investigate why employees of an organization do or do not prefer to share their acquired knowledge with coworkers, and how their perceived barriers and fears influence their knowledge sharing intentions in a working environment. Objective: This study aims to understand the significant factors and motives that increase or reduce an employee’s intention to share knowledge with coworkers in an organization. Method: We planned a field study with 125 employees, including HRM and line managers, in an IT service company to test our hypotheses. The structural equation modeling (SEM) approach is employed to test whether our hypotheses hold. Results: The results supported the positive effects of ethical leadership, R&R, and technology-related constructs, and showed a positive effect of knowledge transfer on an employee’s knowledge sharing intention. Contribution: This research contributes to the concept of knowledge sharing by exploring its part in knowledge seeking and contribution. It will help managers learn how to promote knowledge sharing intentions among employees by applying knowledge-oriented and ethical leadership, R&R, and the use of technology.

Kiran Naz Abbasi, Syed Fakhar Bilal, Saqib Ali, Muhammad Naeem
Decision Support System for Dental Clinics: A Systematic Literature Review Protocol

A Decision Support System (DSS) is a new trend in technology that provides decisions to users based on information. Nowadays, it is being used in many fields and is becoming an essential part of the dental clinic. Many dental diseases are now detected with a DSS, and many treatments for dental diseases are provided through it. It has therefore become an inspiring research issue to provide the best solutions for such sensitive types of data. This paper presents a protocol for a Systematic Literature Review (SLR) on DSS for dental diseases. The paper follows a specific documented plan and provides a methodology to support performing an SLR. Five important steps are performed in this review protocol, and five research questions in the field of Dental Decision Support Systems (DDSS) are considered, along with the important attributes in the field of interest. The expected result of this review is to identify the research issues, provide different paths to researchers, and discuss the importance of and requirements for the DDSS. The results of this protocol can be helpful for researchers and will provide a way to implement an SLR more easily and effectively.

Muhammad Asim, Muhammad Arif Shah, Mumtaz Ali, Rashid Naseem
Comparison of Localization Algorithms for Unmanned Aerial Vehicles

Unmanned Aerial Vehicles (UAVs) are experiencing exponential growth these days. UAVs are used for multiple purposes, such as security, photography, weather forecasting, etc. The increase in the number of UAVs compromises the personal privacy of people and threatens the privacy of confidential areas, so determining the exact location of these UAVs is important in many respects. In this paper, several well-known localization techniques are compared: received signal strength (RSS), angle of arrival (AoA), correlative interferometry, and the Watson-Watt method. They are compared on the basis of different parameters, i.e., cost, efficiency, range, accuracy, energy consumption, and hardware size. The comparison results show that correlative interferometry is the best available solution for UAV localization. The complexity of each algorithm is also computed; Watson-Watt and AoA have lower computational complexity than the other methods, computed as O(n) for both algorithms.

Imran Qasim, Nauman Habib, Uzair Habib, Qazi Fakhar Usman, Mohsin Kamal
Metaheuristic Algorithms in Industrial Process Optimisation: Performance, Comparison and Recommendations

The process parameters design is one of the most demanding tasks in a modern industry, aiming to find optimal process parameters that meet strict requirements for process responses. Various methods have been employed to tackle this problem, including metaheuristic algorithms. The objective of this paper is twofold. Firstly, it presents a review analysis and comparison of metaheuristics’ performance in optimising industrial processes as evidenced from the literature, with a particular focus on the most commonly used algorithms: genetic algorithm (GA), simulated annealing (SA), and particle swarm optimisation (PSO). Secondly, an intelligent method for parametric multiresponse process design, based on soft computing techniques, is presented. GA, SA and PSO are used as an optimisation tool, and a comparison of their results in real case studies is performed using two criteria: (i) accuracy of the obtained optimum; (ii) the number of objective function evaluations needed to reach an optimum. Tuning of the algorithms’ own parameters is also discussed, which is especially interesting due to the different nature of the three algorithms: a single point-based algorithm (SA), a population-based algorithm (GA), and a population-based algorithm with swarm intelligence (PSO). The algorithms’ robustness, i.e. sensitivity with respect to the tuning of the algorithm-specific parameters, is studied and introduced as the third criterion in comparing the algorithms’ performances. The concluding remarks are drawn from this analysis, followed by recommendations for an efficient application of metaheuristics in optimising industrial processes.

Tatjana Sibalija

Social Media Analytics

Frontmatter
Sentimental Content Analysis and Prediction of Text

With the advancement of technology, the web era has revolutionized human life; huge amounts of data are available on the internet in the form of articles and blogs. Opinion mining is important for extracting useful information from this huge volume of raw data. Sentiment analysis provides categorization in opinion mining, labelling content as belonging to the positive or negative class. English is considered a universal language and is used in almost every part of the world, so classification of opinion is important to derive the final meaning of word phrases and comments. No literature is available for the classification of sub-opinions in text mining. SAP of text through a machine learning algorithm (KNN) is a three-step opinion mining technique. In this study, the authors first process articles by removing stop-words, tokenizing the sentences and revamping the tokens; the system then calculates the polarity of each sentence, paragraph and text from the contributing weighted words, keeping sentiment shifters and intensity clauses in consideration. Secondly, over-polarization of sentences is adjusted. Finally, the overall trend of the input text, based on tokenization and sentence polarization, is predicted with the proposed algorithm and compared with KNN. Furthermore, domain-specific analysis is a distinct feature of the proposed model, where data can be updated according to the required domain to ensure the optimal level of efficiency.
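The weighted-word polarity calculation with sentiment shifters and intensifiers described above can be sketched as follows; the lexicon, shifter list and weights here are hypothetical illustrations, not the authors' actual resources:

```python
# Toy weighted-word polarity scorer with sentiment-shifter handling.
LEXICON = {"good": 1.0, "great": 2.0, "bad": -1.0, "terrible": -2.0}
SHIFTERS = {"not", "never", "no"}              # flip the sign of the next word
INTENSIFIERS = {"very": 1.5, "slightly": 0.5}  # scale the next word

def sentence_polarity(sentence):
    tokens = sentence.lower().split()
    score, flip, scale = 0.0, 1.0, 1.0
    for tok in tokens:
        if tok in SHIFTERS:
            flip = -1.0
        elif tok in INTENSIFIERS:
            scale = INTENSIFIERS[tok]
        elif tok in LEXICON:
            score += flip * scale * LEXICON[tok]
            flip, scale = 1.0, 1.0   # shifters apply only to the next sentiment word
    return score

print(sentence_polarity("the movie was not good"))   # -1.0
print(sentence_polarity("a very great film"))        # 3.0
```

Sentence scores would then be aggregated to paragraph and text level, with over-polarized sentences adjusted before the final trend prediction.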

Ali Haider Khan, Muhammad Haroon, Osama Altaf, Shahid Mehmood Awan, Aamna Asghar
An Analysis of Depression Detection Techniques from Online Social Networks

Depression is a form of mental illness which may have a deep negative impact on individuals or on society as a whole. It is a growing and severe problem which tends to increase with time due to the extensive use of social networking websites such as Facebook, Twitter and Instagram. These websites allow users to share images, videos, expressions and emotions. Patients suffering from depression exhibit mood disorders such as low mood, high mood, lack of interest in things, etc. Machine learning techniques have been applied to text data and emojis to automatically classify a user as depressed or non-depressed, and state-of-the-art classifiers have been used by researchers to detect depressed individuals. Benchmark datasets are composed of text and emojis used on social networking websites, where classification is based on four factors: Emotional Process, Temporal Process, Linguistic Style, and the combination of these three. In the proposed method, the emotional process is combined with the respective emojis to develop an automatic system for the detection of depressed patients. Features from the emotional process and emojis will be extracted, and state-of-the-art classifiers are to be trained and evaluated using different combinations of part-of-speech tags.

Uffaq Bilal, Farhan Hassan Khan
Classification of Social Media Users Based on Disagreement and Stance Analysis

Analyzing conversational behavior is a primary task in the field of sentiment analysis, and different researchers have proposed models for the sentiment analysis of social media discussions. Existing approaches mainly study conversational behavior based on the text in the conversation. In any discussion, users may have different viewpoints about the topic of conversation: a user can agree or disagree with the topic of discussion. Agreement of a user with the topic is called a stance, and if the user disagrees with the topic, we refer to it as disagreement. The classification of users based on stance and disagreement is not a well-researched area and needs to be explored further. In this work, we propose a computational model to classify members according to stance or disagreement. The proposed model uses a novel approach: a hybrid of topic modeling and VADER (Valence Aware Dictionary and sEntiment Reasoner). To evaluate the proposed model, we conducted an experiment using WhatsApp group discussions and Facebook comments as the dataset. We also compared the proposed model with two baseline approaches, topic modeling and VADER. From the results, we conclude that the proposed model can effectively classify social media users based on disagreement and stance.

Farhad Muhammad Riaz, Nasir Mahmood Minhas, Sarfraz Bibi, Waqas Ahmed
A Framework to Strengthen up Business Interests in Students by Using Matrix Factorization on Web Log

Business and entrepreneurship play a vital role in the economy of a country. Job-hunting stress, anxiety and the prevalence of criminal thoughts are common in graduates whose minds are unclear about their careers. It is important to shape business skills in students while they are still studying, and the internet is a great source of motivation and guidance for them. Many students start searching for their career goals during school. At that time, their objectives can be transformed from job seeker to job provider, if they are steered right. In this study we analyze the web browsing data of students and observe that many students search for business, finance and entrepreneurship related content while in school. This illustrates students' deep interest in business related activities, which can be evolved and matured with appropriate supervision and direction. So, a two-phase approach is defined in this paper to strengthen business interests. The first phase involves extracting from the web log the data of students who are interested in business activities; matrix factorization (MF) algorithms are then applied to recommend supplementary interrelated knowledge from the internet. Two MF techniques, SVD++ and ASVD, are applied, and the best-performing one is selected using root mean squared error. The second phase is to notify their IP addresses about any business events happening nearby. This will strengthen their business interests, so that they come up with mature business ideas after completing their education.
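As an illustration of the matrix-factorization step, here is a minimal sketch of plain SGD-based factorization with an RMSE evaluation; SVD++ and ASVD, as used in the paper, extend this basic form with bias and implicit-feedback terms, and the student/interest ratings below are invented:

```python
import math, random

def factorize(ratings, k=2, steps=2000, lr=0.01, reg=0.02):
    """Plain SGD matrix factorization over (user, item, rating) triples."""
    rng = random.Random(0)
    users = {u for u, _, _ in ratings}
    items = {i for _, i, _ in ratings}
    P = {u: [rng.gauss(0, 0.1) for _ in range(k)] for u in users}
    Q = {i: [rng.gauss(0, 0.1) for _ in range(k)] for i in items}
    for _ in range(steps):
        for u, i, r in ratings:
            err = r - sum(pu * qi for pu, qi in zip(P[u], Q[i]))
            for f in range(k):
                pu, qi = P[u][f], Q[i][f]
                P[u][f] += lr * (err * qi - reg * pu)
                Q[i][f] += lr * (err * pu - reg * qi)
    return P, Q

def rmse(ratings, P, Q):
    se = [(r - sum(pu * qi for pu, qi in zip(P[u], Q[i]))) ** 2
          for u, i, r in ratings]
    return math.sqrt(sum(se) / len(se))

data = [("s1", "finance", 5), ("s1", "startup", 4),
        ("s2", "finance", 4), ("s2", "marketing", 2)]
P, Q = factorize(data)
print(round(rmse(data, P, Q), 3))   # training RMSE after fitting
```

RMSE over held-out interactions is what the paper uses to choose between SVD++ and ASVD; here it is computed on the training triples only for brevity.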

Mehwish Naseer, Wu Zhang, Wenhao Zhu
Improving Validity of Disaster Related Information by Identifying Correlation Among Different Social Media Streams

Social media has become an important mode of communication and content sharing in this digital world. During large-scale events, a large volume of data is usually posted by users on social media in the form of tweets, pictures and videos. The data is informative, but not all content posted on social media carries reliable information: the existence of spam, fake images and manipulation can reduce the validity of information on social media. To establish trust in information posted on social media, there is a need for a mechanism that can recognize and report questionable posts and flag them for scrutiny and verification. This research provides an approach for assessing and improving the validity of information on social media. Users will be able to identify the validity of information along with the polarity and subjectivity of tweets and videos.

Muhammad Faizan Arshad, Bakhtiar Kasi, Riaz Ul-Amin, Abdul Sattar Malik
An Approach to Map Geography Mark-up Language Data to Resource Description Framework Schema

GML serves as the premier modeling language for representing geographic information related to geographic locations. However, a problem with GML is its limited ability to integrate with a variety of geographical and GPS applications. Since GML stores data as coordinates and topology, this data must be mapped to the Resource Description Framework (RDF) and Resource Description Framework Schema (RDFS) in order to integrate it with the variety of applications on the semantic web. An approach for mapping GML metadata to RDFS is presented in this paper. This study focuses on a methodology to convert GML data into an extended and enriched semantic representation such as RDFS, as representation in plain RDF is not sufficient for the semantic web. Firstly, we take a GML script from a case study, parse it using a GML parser, and obtain an XML file. The XML file is then parsed using Java to produce a text file, from which GML features are extracted and represented in graph form. We then designed a prototype tool to map the GML features to RDFS. The tool performs feature-by-feature mapping, and the extracted results of mapping GML metadata to RDFS are presented in tabular form.

Ammara Faqir, Aqsa Mahmood, Kiran Qazi, Saleem Malik

Machine Learning

Frontmatter
Multi-aspects Intelligent Requirements Prioritization Technique for Value Based Software Systems

Requirements engineering (RE) is an important phase of software engineering, during which an important set of activities is carried out to manage requirements elicitation, verification, prioritization and validation. The dimensions and dynamics of software development are changing with the passage of time: economic growth in different sectors is increasing the demand for software development, which has introduced the concept of Value Based Software (VBS) development. Requirements prioritization plays a vital role in ordering requirements to support the release planning of software. The prioritization process is considered highly complex and depends on the nature and size of the requirements. VBS systems are entirely different from typical software development, and the prioritization process for VBS is also very challenging; there is a need for prioritization techniques that support both technical and business aspect-based prioritization, and existing techniques do not meet the expectations of industry for VBS development. Therefore, this research contributes an intelligent decision support system for requirements prioritization in the domain of VBS systems. Aspect-based requirements prioritization is applied to a large set of requirements and results are produced in two clusters, presented as prioritized lists of requirements for traditional as well as value-based systems.

Falak Sher, Dayang N. A. Jawawi, Radziah Mohammad, Muhammad Imran Babar, Rafaqat Kazmi, Muhammad Arif Shah
An Intelligent Approach for CRC Models Based Agile Software Requirement Engineering Using SBVR

In requirement engineering (RE) for agile software development, Class-Responsibility-Collaborator (CRC) models are used as an important brainstorming tool. However, manual generation of such CRC models by analyzing the requirements is a difficult and time-consuming task due to the ambiguity and informal nature of natural language-based software requirements. This paper introduces an improved requirement engineering technique based on CRC models that can help in specifying and analyzing software requirements in a better and faster way, curtailing the difficulties associated with traditional RE analysis techniques. The proposed technique employs the Semantics of Business Vocabulary and Rules (SBVR) to capture and specify software requirements in a controlled natural language. The SBVR representation is processed to extract object-oriented information and map the extracted information to CRC models in both textual and visual form. The proposed approach is implemented as an Eclipse plugin prototype, SBVR2CRC, as a proof of concept, and the results of the experiments validate the effectiveness of the presented approach. The results show that such an automated approach not only saves time and effort but also assists in the generation of better CRC models and simplifies CRC models based agile software development.

Hina Afreen, Umer Farooq
Automatic RDF, Metadata Generation from Legacy Software Models

The Resource Description Framework (RDF) is a method that helps us interchange and reuse metadata. Early metadata provides a simple method for performing and reporting data checks, which are specified in an intuitive way to find to what extent the relevant data is present. Moreover, to create vocabularies that handle the metadata of larger data sets and to capture the correct relations between objects, we need a framework that gives reliable results. RDF is an XML application which follows organized rules to give a clear method of describing semantics. RDF is a simple data model consisting of statements about resources, where a resource can be any object; it provides metadata for web data along with structural information. UML data can also be expressed in the form of RDF, which is quite fruitful in the RDF field, and transformation from UML to RDF/XML helps eminently. The metadata of a class diagram is acquired through XML; parsing the XML yields the metadata, which is converted into RDF triples following the RDF rules. A triple consists of a subject, predicate and object, which together convey the sense of a sentence. The triples are then stored in an RDF system. This technique helps keep data useful for many generations.

Amna Riaz, Imran Sarwar Bajwa, Munsub Ali
An Approach to Measure Functional Parameters for Ball-Screw Drives

Linear ball screw drives are commonly used for precise motion control of different mechanical systems including CNC machine tools, robotics and various aerospace systems. These drives are specifically designed on the basis of required operating performance parameters like output torque, rated power, slew rate, backlash, friction and required efficiency. These desired parameters significantly affect the functionality of ball-screw drive systems. Parameters like torque capability and slew rate should be tested to validate the design and ensure safety during operation. Factors like friction (dead band) and mechanical backlash have a very adverse effect on the repeatability and operation of the overall dynamic system, specifically when precise positioning accuracy is desired under extensive loading. This paper gives an approach to measure performance parameters for ball screw drives. Torque capacity and slew rate are analyzed and evaluated experimentally on a self-designed Hex Twist suspension test setup. Initially, the Hex Twist setup was calibrated to find the deflection scale factor. Complete closed-loop testing is also performed on the same setup to measure the overall functionality of the ball screw drive actuator. The results indicate that the developed testing setup is very accurate in measuring open-loop and closed-loop torque and slew rate values.

Naveed Riaz, Syed Irtiza Ali Shah, Faisal Rehman, Syed Omer Gilani, Emad-udin
Secure NoSQL Over Cloud Using Data Decomposition and Queryable Encryption

NoSQL-based databases are attractive for storing and managing big data, mainly due to their high scalability and data modeling flexibility. However, security in NoSQL-based databases is not as mature as in SQL-based relational databases, which raises concerns for users. Specifically, the security of data-at-rest is of serious concern for users who deploy their databases on the cloud, because any unauthorized access to the servers will expose the data easily. There have been some efforts to enable encryption of data at rest for NoSQL-based databases. However, most of the existing solutions do not support secure query processing and are difficult to integrate with applications. In our work, we address the NoSQL data-at-rest security issue by proposing a system which decomposes a given database into two sub-databases, encrypts the data, supports secure query processing, and enables seamless integration with NoSQL-based databases. The proposed solution for data at rest is based on a combination of Chaotic Encryption (CE), Order Preserving Encryption (OPE), and the Advanced Encryption Standard (AES). The decomposition of the database into two sub-databases helps to improve data confidentiality while providing performance comparable to the state-of-the-art baseline method.
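To illustrate the property that OPE contributes to secure query processing, here is a toy strictly increasing random mapping. It demonstrates only that ciphertext comparisons mirror plaintext order, enabling range queries over encrypted values; it is not a secure scheme and not the paper's actual construction:

```python
import random

def build_ope_table(domain, seed=42):
    """Map each plaintext value (in sorted order) to a ciphertext built
    from cumulative random gaps, so plaintext order is preserved."""
    rng = random.Random(seed)
    cipher, total = {}, 0
    for v in domain:
        total += rng.randint(1, 100)   # random positive gap per value
        cipher[v] = total
    return cipher

table = build_ope_table(range(10))
# A range predicate like "value BETWEEN 3 AND 7" can be evaluated
# directly on the ciphertexts, since comparisons carry over:
print(table[3] < table[7])   # True: order preserved
```

A real OPE scheme derives the mapping from a secret key rather than a fixed seed, and AES would protect the fields that never need to be compared server-side.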

Muhammad Ali Raza, Muhammad Usama, Waheed Iqbal, Faisal Bukhari
Pseudo Transfer Learning by Exploiting Monolingual Corpus: An Experiment on Roman Urdu Transliteration

This paper shares two things: an efficient experiment in "pseudo" transfer learning that uses a huge monolingual (mono-script) dataset in a sequence-to-sequence LSTM model; and the application of the proposed methodology to improve Roman Urdu transliteration. The research involves echoing the monolingual dataset, such that in the pre-training phase the input and output sequences are identical, in order to learn the target language. This process gives the LSTM model target-language-based initialized weights before the network is trained with the original parallel data. The method is beneficial for reducing the requirement for more training data or more computational resources, which are usually not available to many research groups. The experiment is performed for character-based transliteration of Romanized Urdu script to standard (i.e., modified Perso-Arabic) Urdu script. Initially, a sequence-to-sequence encoder-decoder model is trained (echoed) for 100 epochs on 306.9K distinct words in standard Urdu script. Then, the trained model (with the weights tuned by echoing) is used for learning transliteration. At this stage, the parallel corpus comprises 127K pairs of Roman Urdu and standard Urdu tokens. The results are quite impressive: the proposed methodology achieves a BLEU score of 80.1% in 100 epochs of training on the parallel data (preceded by echoing the mono-script data for 100 epochs), whereas the baseline model trained solely on the parallel corpus yields ≈76% BLEU in 200 epochs.
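The core "echoing" idea, preparing identical input/output pairs for pre-training before switching to the parallel corpus, can be sketched as follows; the tokens below are Latin-script placeholders for the actual Urdu-script words, and the two-stage split is an illustration of the training recipe, not the authors' code:

```python
# Stage 1 data: identical source/target sequences teach the decoder the
# target script before any transliteration pairs are seen.
monolingual = ["kitab", "qalam", "pani"]            # stand-ins for Urdu-script tokens
parallel = [("kitaab", "kitab"), ("panee", "pani")]  # (Roman, standard) pairs

def echo_pairs(words):
    """Turn a monolingual word list into (input, output) training pairs
    where input == output, for the pre-training (echo) phase."""
    return [(w, w) for w in words]

pretrain_set = echo_pairs(monolingual)   # 100 epochs on this first...
finetune_set = parallel                  # ...then fine-tune on parallel data
print(pretrain_set[0])   # ('kitab', 'kitab')
```

After the echo phase, the same encoder-decoder weights are reused unchanged as the starting point for training on the parallel pairs.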

Muhammad Yaseen Khan, Tafseer Ahmed
Classification and Prediction Analysis of Diseases and Other Datasets Using Machine Learning

Classification is one of the most used machine learning techniques, especially for prediction tasks in daily life. Its first step is the grouping, dividing, categorizing and separation of datasets based on feature vectors. The classification procedure has many algorithms, including Random Forest, Naïve Bayes, Decision Tree and Support Vector Machine. Before implementing any technique, a model is created and then trained on the dataset. The model generated by the learning algorithm must both fit the input dataset and correctly forecast the class labels of records, and many models are available for predicting the class label of previously unseen records. In this paper, different classifiers such as linear SVM, ensemble methods and decision trees are applied, and their accuracy and running time analyzed on different datasets. The Liver Patient, Wine Quality, Breast Cancer and BUPA Liver Disorder datasets are used for calculating performance and accuracy using 10-fold cross-validation. In the end, the results of all applied algorithms are calculated and compared in terms of accuracy and execution time.
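The k-fold cross-validation procedure used for evaluation can be sketched with a toy majority-class baseline standing in for the actual classifiers; the labels below are invented:

```python
# Minimal k-fold cross-validation: split labels into k folds, "train" a
# majority-class baseline on k-1 folds, test on the held-out fold,
# and average the fold accuracies (the paper uses k=10).
def k_fold_accuracy(y, k=5):
    n = len(y)
    fold = n // k
    accs = []
    for i in range(k):
        test_idx = set(range(i * fold, (i + 1) * fold))
        train_y = [y[j] for j in range(n) if j not in test_idx]
        majority = max(set(train_y), key=train_y.count)   # the "model"
        test_y = [y[j] for j in test_idx]
        accs.append(sum(t == majority for t in test_y) / len(test_y))
    return sum(accs) / k

labels = [0, 0, 1, 0, 0, 1, 0, 0, 1, 0]
print(k_fold_accuracy(labels, k=5))   # 0.7
```

Replacing the majority-class step with an SVM or decision tree fit gives the evaluation loop used for the comparisons in the paper.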

Junaid Nasir, Alishba Ahsan, Nadeem Sarwar, Wajid Rafique, Sameer Malik, Syed Zeeshan Hussain Shah, Sarousha Nasir, Asma Irshad
QoE Analysis of Real-Time Video Streaming over 4G-LTE for UAV-Based Surveillance Applications

Drones, also known as Unmanned Aerial Vehicles (UAVs), play a significant role in surveillance at remote locations by streaming real-time video from their attached cameras. A good architecture for this kind of surveillance is required to ensure real-time monitoring of targeted areas. As the streamed video is used for monitoring, it is very important to ensure its quality during transmission so that the remote client can view clear insights and take prompt action on time if required. In this paper, we propose a 4G-LTE architecture and examine the effects of different factors in such an architecture. We show a comparative analysis of two recent codec schemes for video streaming, H.264 and H.265 (HEVC). Our study is an important step towards exploring the factors that influence real-time video streaming and degrade the Quality of Experience (QoE) of video viewing in such an architecture. To examine the received video quality, two objective metrics, Peak-Signal-to-Noise-Ratio (PSNR) and the Structural-Similarity-Index (SSIM), are considered in this paper. The simulation results are obtained with the network simulator NS-3, well known in the research community. The results show that H.265 works better than H.264 under different circumstances.
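Of the two objective metrics, PSNR is straightforward to compute from the mean squared error between the original and received frames; a minimal sketch on invented 8-bit pixel data:

```python
import math

def psnr(frame_a, frame_b, max_val=255):
    """Peak signal-to-noise ratio (dB) between two equal-size 8-bit
    frames, given as flat lists of pixel intensities."""
    mse = sum((a - b) ** 2 for a, b in zip(frame_a, frame_b)) / len(frame_a)
    if mse == 0:
        return float("inf")          # identical frames
    return 10 * math.log10(max_val ** 2 / mse)

original = [52, 55, 61, 59, 79, 61, 76, 41]   # transmitted pixels
received = [50, 55, 60, 59, 80, 60, 76, 42]   # pixels after lossy delivery
print(round(psnr(original, received), 2))     # 48.13 dB
```

Higher PSNR means the received stream is closer to the source; SSIM complements it by comparing local structure rather than raw pixel error.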

Muhammad Naveed, Sameer Qazi

Natural Language Processing

Frontmatter
Urdu Natural Language Processing Issues and Challenges: A Review Study

Natural language processing is the technology used to help computers understand humans' natural language. However, teaching a machine to understand how humans communicate is not an easy task. This paper provides a summary of some speech recognition techniques in the literature for new scholars to look into. It also discusses related work along with an efficiency comparison for different natural languages. After that, a brief summary of the Urdu language and of related work on Urdu language processing issues and challenges is presented. In the last part, future work is proposed for efficient processing of the Urdu language, along with some useful techniques.

Usman Khan, Maaz Bin Ahmad, Farhan Shafiq, Muhammad Sarim
Urdu Spell Checker: A Scarce Resource Language

In the digital world of computers, several software applications have been developed to verify the spellings of words. The English language has gone far ahead in the development of spell checking applications, whilst other languages, specifically Urdu, lag behind in cherishing such technologies. We develop an "Urdu Spell Checker" which detects incorrect spellings of a word and provides a list of options containing correct spellings. The spell checker relies on the correct spellings of words residing in a predefined lexicon or corpus to determine whether an entered word is correct: if the input word matches a corpus word it is considered correct; otherwise it is treated as a misspelled word. Multiple techniques are used individually as well as in combination to check which set of methods is best in terms of output. Using multiple techniques for error correction, it is observed that Jaro distance in combination with Soundex, Shapex and n-grams provides the best results: 80.0% precision, 44.87% recall and 57.37% F-measure.
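The Jaro similarity that performed best in the authors' combination can be implemented as follows; it is shown here on Latin-script strings for readability, whereas the paper applies it to Urdu words:

```python
def jaro(s1, s2):
    """Jaro similarity: 1.0 for identical strings, 0.0 for no match."""
    if s1 == s2:
        return 1.0
    max_dist = max(len(s1), len(s2)) // 2 - 1   # matching window radius
    m1, m2 = [False] * len(s1), [False] * len(s2)
    matches = 0
    for i, c in enumerate(s1):                   # count matching characters
        lo, hi = max(0, i - max_dist), min(len(s2), i + max_dist + 1)
        for j in range(lo, hi):
            if not m2[j] and s2[j] == c:
                m1[i] = m2[j] = True
                matches += 1
                break
    if matches == 0:
        return 0.0
    t, k = 0, 0                                  # count transpositions
    for i in range(len(s1)):
        if m1[i]:
            while not m2[k]:
                k += 1
            if s1[i] != s2[k]:
                t += 1
            k += 1
    t //= 2
    return (matches / len(s1) + matches / len(s2) + (matches - t) / matches) / 3

print(round(jaro("martha", "marhta"), 4))   # 0.9444
```

Candidate corrections above a similarity threshold would then be intersected with the Soundex, Shapex and n-gram candidate lists to produce the final suggestions.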

Romila Aziz, Muhammad Waqas Anwar
Maximum Entropy Based Urdu Part of Speech Tagging

This paper presents results for a part-of-speech tagger based on the Maximum Entropy model on Urdu corpora. We discuss the specialized features/parameters of the model and their impact on tagger performance. We also discuss the complexity of the Urdu language, which makes it hard for a tagger to predict the correct tag, and the inconsistencies found in the corpora during the experiments. For the purpose of detailed experiments, two different corpora are used in this paper. The maximum accuracy recorded for the tagger is 93.24% overall and 69.89% on previously unseen data.

Usman Mohy Ud Din, Muhammad Waqas Anwar, Ghulam Ali Mallah
Towards a Generic Approach for PoS-Tagwise Lexical Similarity of Languages

Lexical similarity measures of languages are used to find genetic affinity among them, as the closer languages are in the language tree, the more cognates they tend to have in common. In this regard, this paper describes a tool to calculate the lexical similarity between pairs of languages. We use the words present in the Universal Dependencies (UD) corpora to find lexical similarities between words. Since many of the languages in the UD corpora share the same part-of-speech (PoS) tag-set scheme, we obtain lists of words corresponding to the standard set of PoS tags. The tool can compare words of particular PoS tags for two different languages; hence, we can calculate lexical similarity not only for the whole language but also for a specific PoS or a subset of PoS tags. Further, a user can compare function words to find genetic affinity, or nouns and proper nouns to find borrowings or loan-words. Moreover, this tool is more flexible than using either all of the words or a fixed list (e.g., the Swadesh list).
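A minimal sketch of PoS-tagwise lexical similarity over tagged word lists, using Jaccard overlap as an illustrative similarity measure; the exact measure and the word lists here are assumptions, not the paper's data:

```python
# Tagwise lexical similarity: share of common word forms per PoS tag,
# computed over (word, tag) lists such as those extracted from UD corpora.
def tagwise_similarity(lang_a, lang_b, tag):
    a = {w for w, t in lang_a if t == tag}
    b = {w for w, t in lang_b if t == tag}
    if not a or not b:
        return 0.0
    return len(a & b) / len(a | b)   # Jaccard overlap of the two vocabularies

urdu = [("pani", "NOUN"), ("kitab", "NOUN"), ("aur", "CCONJ")]
hindi = [("pani", "NOUN"), ("pustak", "NOUN"), ("aur", "CCONJ")]
print(tagwise_similarity(urdu, hindi, "NOUN"))    # 1 shared noun of 3 total
print(tagwise_similarity(urdu, hindi, "CCONJ"))   # function words: 1.0
```

Restricting the comparison to function words probes inherited vocabulary, while restricting it to nouns and proper nouns surfaces borrowing, exactly as described above.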

Muhammad Suffian Nizami, Muhammad Yaseen Khan, Tafseer Ahmed
Preprocessing Techniques in Text Categorization: A Survey

Text categorization is the process of categorizing or labeling unstructured natural language (NL) text with related categories from a predefined set. In text categorization, pre-processing is a crucial step used for extracting non-trivial, interesting and useful input for further stages of the categorization process. As the words in text usually contain many structural variations, pre-processing techniques are applied to the data before accessing the information in documents, to minimize the size of the data, which may increase the efficacy of the result and lead to better categorization of the text. The main objective of this research is to survey pre-processing techniques such as tokenization, stop-word removal and stemming, and to see how these techniques affect text categorization, for better or for worse.
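The three surveyed steps, tokenization, stop-word removal and stemming, chain together as in this minimal sketch; the stop-list and suffix rules are illustrative stand-ins for real resources such as a full stop-word list and the Porter stemmer:

```python
import re

STOP_WORDS = {"the", "is", "a", "of", "and"}   # tiny illustrative stop-list

def stem(word):
    """Crude suffix-stripping stemmer (a stand-in for Porter stemming)."""
    for suffix in ("ing", "ed", "es", "s"):
        if word.endswith(suffix) and len(word) > len(suffix) + 2:
            return word[: -len(suffix)]
    return word

def preprocess(text):
    tokens = re.findall(r"[a-z]+", text.lower())         # tokenization
    tokens = [t for t in tokens if t not in STOP_WORDS]  # stop-word removal
    return [stem(t) for t in tokens]                     # stemming

print(preprocess("The categories of processed documents"))
# ['categori', 'process', 'document']
```

The surviving stems become the features handed to the categorization stage, which is why each step's aggressiveness directly affects classifier accuracy.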

Sayyam Malik, Sana Ahmad Sani, Anees Baqir, Usman Ahmad, Faizan ul Mustafa
Educational Data Mining: A Review and Analysis of Student’s Academic Performance

Data mining is a technique for the extraction of valuable patterns from multiple sources; it plays an important role in marketing, electronic commerce, business intelligence, healthcare and social network analysis. With advancements in these applications, many researchers have shown interest in developing data mining applications in the educational context. Educational data mining (EDM) is defined as a scientific area concerned with making discoveries within the types of data that derive from educational surroundings. This paper reviews different case studies based on educational data mining systems; these systems and mining methods are considered for the gathering and analysis of information. Due to the huge amount of data in educational databases, it becomes very challenging to evaluate student performance. Currently in Pakistan, there is a dire need to monitor and examine students' academic progress. There are two main causes of why existing systems have not been able to analyze the performance of students. First, the study of present evaluation methods is still not sufficient to identify appropriate methods for evaluating the progress and performance of students in the institutions of Pakistan. Second, there is an absence of investigation into the parameters that affect students' success in specific courses. Thus, a comprehensive review of the evaluation of students' performance using data mining methods is proposed, with the aim of improving students' academic performance by identifying the most suitable attributes using EDM techniques.

Sadia Ijaz, Tauqeer Safdar, Muhammad Sanaullah
Multi Agents Based System Architecture for Market Research in E-Business

Companies and organizations in today's business world, especially online businesses, are in dire need of hiring employees for market research, and they pay handsome compensation in return for the services of such employees. In addition, market research involves a lot of internet surfing, studying potential markets and tapping potential customers. A few existing market research tools/platforms are able to generate potential clients and assist in running marketing campaigns, but these also require prepared data sets that need to be mapped to the software's own requirements and capacities. Our idea is to employ a Multi Agent System (MAS) to search particular markets and provide an efficient route to gather the business opportunities that are incessantly created in today's fast-paced world. Two basic methodologies for MASs are complementary to each other: one is iterative at each stage of development while the other is a cyclic process of analysis and design. An amalgamation of these two is used for designing our multi agent system for Market Research in E-Business (MREB). For the prototype, we used NetBeans and the FIPA-compliant JADE with SQL Server 5.7. The platform equips any company using it with the opportunity to get a large list of suitable projects according to its product base and/or service expertise. We have achieved considerable results with our MREB system, including some new features.

Amna Ashraf, Muhammad Aslam, Nayab Tasneem Bari

Image Processing and Analysis

Frontmatter
An Investigation on Ability of Pre-trained Convolutional Neural Networks Trained on ImageNet to Classify Melanoma Images Without Re-training

Deep learning, particularly Convolutional Neural Network (CNN) based implementations for medical diagnostics using images, is widely acclaimed for assisting doctors. Medical image processing serves as a second opinion for doctors, particularly for diseases like melanoma. Several deep learning paradigms have proved their ability and advantages in terms of reducing the training time, which is crucial for medical image processing. Using a pre-trained deep architecture is always advantageous, and one such successful and widely used CNN-based deep learning architecture is ResNet50. This paper explores the ability of ResNet to classify melanoma images, to articulate the possibility of using pre-trained deep architectures in healthcare decision making systems. The dataset used in this paper is provided by the International Skin Imaging Collaboration as part of the ISIC challenge 2019. Experiments are performed using pre-trained ResNet50 with and without retraining, and using multiple sets of test data with different sample sizes. The experimental results show that with simple retraining on a small number of samples, the classification accuracy can be improved by 13.41% for the set of experiments conducted with a sample size of 6000 images. Further, it is noteworthy that ResNet was able to provide a classification accuracy of 61.39% without any re-training.

S. S. Tirumala, Noreen Jamil, Bahman A. Sassani
Effect of Laplacian Smoothing Stochastic Gradient Descent with Angular Margin Softmax Loss on Face Recognition

An important task in deep learning for face recognition is to use proper loss functions and optimization techniques. Several loss functions using stochastic gradient descent have been proposed for this task. The main purpose of this work is to propose a strategy that combines Laplacian smoothing stochastic gradient descent with a multiplicative angular margin to enhance the performance of the angularly discriminative features of the angular margin softmax loss for face recognition. The model is trained on the popular face recognition dataset CASIA-WebFace, and it achieves state-of-the-art performance on several academic benchmark datasets such as Labeled Faces in the Wild (LFW), YouTube Faces (YTF), VGGFace1 and VGGFace2. Our method achieves a new record accuracy of 99.54% on the LFW dataset; on the YTF dataset it achieves 95.53% accuracy.

Mansoor Iqbal, Muhammad Awais Rehman, Naveed Iqbal, Zaheer Iqbal
Brain Tumor Localization and Segmentation Based on Pixel-Based Thresholding with Morphological Operation

Brain tumor localization and segmentation from brain MRI is a significant task in medical image processing. Diagnosis of brain tumors at early stages plays a vital role in successful treatment and raises the survival rate of patients. Manual separation of brain tumors from the huge quantity of MRI images is a challenging and time-consuming task, so there is a need for an automatic, efficient technique for brain tumor localization and segmentation from brain MRI images. In earlier years, improper filtration and segmentation techniques were used for brain tumor detection, which gave largely inaccurate detection of tumors in MRI images. The proposed technique is mainly based on a preprocessing step for de-noising the input MRI, thresholding, a morphological operation, and the calculation of performance parameters for validation. Firstly, an anisotropic diffusion filter is applied for noise removal, because input MRI images are mostly noisy and of inhomogeneous contrast. Secondly, the pre-processed MRI brain image is binarized using a thresholding technique. Thirdly, a region-based morphological operation is used to separate the tumorous part from the segmented image. In the end, root mean square error (RMSE), peak signal-to-noise ratio (PSNR), tumorous area in pixels and centimeters, structural similarity index measurement (SSIM), area under the curve (AUC), accuracy, sensitivity and specificity are the parameters used for evaluation of the proposed methodology. The visual and parametric results of the proposed method are compared with the existing literature.
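The thresholding and morphological steps can be illustrated on a toy grayscale grid; the pixel values, threshold and 3x3 structuring element below are invented for illustration, whereas the paper operates on full, de-noised MRI slices:

```python
# Pixel-based thresholding followed by a morphological erosion with a
# 3x3 structuring element, on a small grayscale grid (values 0-255).
def threshold(img, t):
    return [[1 if p >= t else 0 for p in row] for row in img]

def erode(binary):
    h, w = len(binary), len(binary[0])
    out = [[0] * w for _ in range(h)]
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            # a pixel survives only if its full 3x3 neighbourhood is set
            out[y][x] = int(all(binary[y + dy][x + dx]
                                for dy in (-1, 0, 1) for dx in (-1, 0, 1)))
    return out

mri = [[10, 20, 30, 20, 10],
       [20, 200, 210, 205, 20],
       [30, 215, 220, 210, 30],
       [20, 205, 215, 200, 20],
       [10, 20, 30, 20, 10]]
mask = threshold(mri, 128)       # bright "tumorous" region -> binary mask
core = erode(mask)               # erosion strips one boundary layer
print(sum(map(sum, core)))       # 1: only the centre pixel survives
```

Counting the surviving pixels (here via the summed mask) is how the tumorous area in pixels is obtained before conversion to centimeters.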

Muhammad Yousuf, Khan Bahadar Khan, Muhammad Adeel Azam, Muhammad Aqeel
Classification of Breast Lesions in Combination with Metamorphic Segmentation and Saliency Feature Block

Breast cancer is the leading disease among females, and its death toll rises gradually every year. Early diagnosis and treatment of breast cancer is an effective way to reduce the death rate among women, and the development of computer-aided diagnosis (CAD) systems has improved mortality rates by reducing false assumptions. This work presents a CAD system for early detection of tumors in digitized mammograms. A novel classification method for breast lesions is proposed using metamorphic segmentation and a saliency feature block. Experimental results show that the proposed method outperforms existing methods and provides timely diagnosis, which greatly reduces the mortality rate in medical informatics.

Bushra Mughal, Faheem Mushtaq, Attaullah Buriro
Analysis of the MIDAS and OASIS Biomedical Databases for the Application of Multimodal Image Processing

In the last two decades, significant advances have occurred in medical imaging modalities and image processing techniques. In biomedical imaging, the accuracy of a diagnosed area of interest can be increased using a multimodal dataset of patients. Many techniques have been proposed for processing and analyzing multimodal imaging, and they require datasets for benchmarking and validating their performance. In this connection, two important databases, MIDAS and OASIS, are selected and evaluated to guide researchers in producing results in the field of multimodal imaging. The diseases associated with these datasets and open issues in multimodal imaging are also discussed. The main objective of this article is to discuss current research interests and open platforms for future research in multimodal medical imaging. We derive statistical results in the form of graphs and charts using the online web analysis tool SIMILARWEB to show public interest in these databases, and we also organize these datasets according to modality, scanned body area, disease, and image classification to motivate researchers working in multimodal medical areas. The significance of these databases in the field of multimodal image processing is summarized through graphical charts and statistical results.

Muhammad Adeel Azam, Khan Bahadar Khan, Muhammad Aqeel, Abdul Rehman Chishti, Muhammad Nawaz Abbasi
Image Quality Assessment Using a Combination of Hand-Crafted and Deep Features

No-reference image quality assessment is an important area of research and has gained significant interest over the past years. Full-reference image quality assessment is not feasible in scenarios where the reference image is unavailable. Hand-crafted features are statistics of natural scene images that are used to train a regression algorithm, while deep learning based approaches learn discriminatory features from images. This paper proposes a hybrid method for Image Quality Assessment (IQA) that uses a combination of hand-crafted and deep features. The hand-crafted features are extracted in a multi-scale and color-space configuration in order to capture greater detail, and deep features are extracted using transfer learning of VGG19. Dimensionality reduction is performed with principal component analysis: features with 95% variance are retained, yielding a final set of 102 transformed features. Gaussian process regression with the squared exponential kernel is used for modeling. The final model is tested on seven benchmark databases for correlation between estimated image quality and the mean subjective score. A comparison with twelve state-of-the-art methods is performed and superior performance is achieved.
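The "retain components explaining 95% variance" step is standard PCA. A minimal numpy sketch of that reduction (the function name and SVD-based formulation are illustrative, not the paper's code):

```python
import numpy as np

def pca_reduce(X, var_keep=0.95):
    # Center the feature matrix, then project onto the smallest number of
    # principal components whose cumulative variance reaches var_keep.
    Xc = X - X.mean(axis=0)
    U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
    ratio = np.cumsum(S ** 2) / np.sum(S ** 2)
    k = int(np.searchsorted(ratio, var_keep)) + 1
    return Xc @ Vt[:k].T
```

Applied to the combined hand-crafted and deep feature matrix, this kind of projection is what reduces the feature set to the 102 transformed features used for regression.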

Nisar Ahmed, Hafiz Muhammad Shahzad Asif, Hassan Khalid
An Ensemble Classification-Based Methodology Applied to MRI Brain Images for Tumor Detection and Segmentation

Automated brain tumor detection is an important application in the medical field, and many methods have been developed for this task. In this paper, we implement an algorithm that detects the type of brain tumor from an MRI image using supervised classification techniques. The major part of the work involves feature extraction using the DWT followed by feature reduction using PCA. The reduced features are submitted to different classifiers (SVM, k-NN, Naïve Bayes and LDA), and the results from each classifier are then passed to a voting algorithm that chooses the most frequent result. The training dataset contains 160 MRI images, and the algorithm operates on 200 × 200 images to reduce processing time. The method is tested and found to be fast and effective. It can be utilized in the field of MRI classification and can assist doctors in detecting the tumor type and assessing a patient's abnormality level.
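The ensemble step at the end is plain majority voting over the per-classifier labels. A minimal stdlib sketch (the function name is illustrative; tie-breaking by first-seen label is an assumption):

```python
from collections import Counter

def majority_vote(predictions):
    """Combine the labels produced by the individual classifiers
    (e.g. SVM, k-NN, Naive Bayes, LDA) by choosing the most frequent one."""
    return Counter(predictions).most_common(1)[0][0]
```

For example, if three of the four classifiers label a scan "malignant", the ensemble output is "malignant" regardless of the fourth classifier's vote.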

Abuzar Qureshi, Khan Bahadar Khan, Hamza Ali Haider, Rameez Khawaja, Muhammad Yousuf
Efficient PatchMatch Algorithm to Detect False Crowd in Digitally Forged Images

The authenticity and reliability of digital images have recently become a major concern due to the ease of manipulating and modifying them. Such manipulation of crowded images gives rise to a false crowd, where a person or group of persons is copied and pasted within the same image; the detection of such false crowds is the focus of current research. In this paper, false crowd detection in forged images is carried out using a modified and improved PatchMatch algorithm that can detect even multiple copies of the same instance. To separate humans from non-human objects, a human detection algorithm is used in the post-processing phase. A benchmark database of false crowd images has also been developed. Experimental results confirm that the technique detects false crowds successfully and is robust to the multiple-cloning problem.
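The underlying copy-move detection problem is to find image regions that appear more than once. PatchMatch solves this efficiently with randomized search; the sketch below is a deliberately simple brute-force stand-in (exact patch hashing), useful only to illustrate what "detecting multiple copies of the same instance" means, and not the paper's algorithm:

```python
import numpy as np
from collections import defaultdict

def find_duplicate_patches(img, patch=4):
    """Brute-force stand-in for PatchMatch: hash every patch of a 2-D
    image and report coordinate groups of exact duplicates, which are
    candidate copy-move (false crowd) regions."""
    seen = defaultdict(list)
    h, w = img.shape
    for y in range(h - patch + 1):
        for x in range(w - patch + 1):
            key = img[y:y + patch, x:x + patch].tobytes()
            seen[key].append((y, x))
    return [locs for locs in seen.values() if len(locs) > 1]
```

Real forgeries are rarely bit-exact copies (compression and blending alter pixels), which is why approximate nearest-patch search such as PatchMatch is needed in practice.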

Rakhshanda Javid, M. Mohsin Riaz, Abdul Ghafoor, Naveed Iqbal
Analysis of the Lifetime and Energy Consumption of WSN Routing Protocols: LEACH, EAMMH and SEP

The lifetime of a Wireless Sensor Network (WSN) depends on the energy of each sensor. Numerous routing protocols, such as SEP (Stable Election Protocol), LEACH (Low-Energy Adaptive Clustering Hierarchy) and EAMMH (Energy-Aware Multi-hop Multi-path Hierarchical routing protocol), have therefore been introduced to extend the lifetime of a WSN through efficient utilization of sensor energy. In a WSN, data is collected and forwarded wirelessly to base stations. Routing protocols provide an efficient path for multi-hop communication because the transmission range and battery capacity of nodes are limited. As nodes are usually deployed in remote areas, they face challenges of long lifetime, network architecture, security and network coverage. Many routing protocols have been developed to increase network efficiency; among these, clustering algorithms are the best choice for increasing node lifetime, though each protocol has its limitations in different scenarios. In this paper, we discuss and compare the performance of LEACH, EAMMH and SEP. SEP is an efficient algorithm due to its weighted cluster head (CH) selection method. Simulation results are obtained using MATLAB, with energy consumption and network lifetime as the design metrics under consideration.
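LEACH's cluster-head election is built on a per-round threshold: a node that has not served as CH in the current epoch becomes CH when a uniform random draw falls below T(n) = p / (1 - p * (r mod 1/p)), where p is the desired CH fraction and r the round number. A minimal sketch (node representation and function names are illustrative assumptions):

```python
import random

def leach_threshold(p, r):
    # LEACH CH-election threshold for round r with desired CH fraction p:
    # T(n) = p / (1 - p * (r mod 1/p))
    return p / (1 - p * (r % round(1 / p)))

def elect_cluster_heads(nodes, p, r, rng=random.random):
    # Each eligible node independently draws a uniform number and becomes
    # a cluster head when the draw falls below the round's threshold.
    return [n for n in nodes if rng() < leach_threshold(p, r)]
```

The threshold grows through the epoch (reaching 1 in the last round), which guarantees every node serves as CH exactly once per 1/p rounds and so spreads the energy cost of CH duty; SEP's variation weights this election by node energy.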

Muhammad Yasir Farooq, Khan Bahadar Khan, Ghulam Mohayy ud din, Eid Rehman, Sundus Amin

Intelligent Environments

Frontmatter
Crash-Resilient Synthesizer in Logic Optimization Paradigm for Sustainable Hardware Design of ASIC and Digital Systems

Fault detection and recovery have always been a major concern in research. Any reliable fault-tolerant framework must deal with both hardware-level and software-level faults, and self-sustainable systems play an important role in diagnosing such problems through an automatic response. An ASIC (Application-Specific Integrated Circuit) is a technology whose fabrication-level design and coding, using the logic synthesis phase, can mitigate multiple faults at the hardware level. Various existing repair techniques for self-healing systems are discussed in this research. An algorithm, along with a flow model, is further explained that helps handle the physical, logical and electrical faults that can occur at chip and gate level, improving the reliability and efficiency of upcoming self-sustainable systems.

Noor-ul-Qamar, Muhammad Habib, Waseem Ahmad, Taimoor Hassan
An Improved Ensemble Based Machine Learning Technique for Efficient Malware Classification

Android smartphones have become an emerging technology due to widespread adoption. Widely used Android devices allow the installation of apps and grant privileges to access confidential information from the phone, which makes them a target for malware developers. The dramatic rise in the number of attacks has spurred interest in building a robust system that automatically identifies malicious behavior in Android applications. Previous malware detection studies comprised static and dynamic analysis techniques, extreme learning machines and virtual machine introspection, which have shortcomings in detecting data outflow, such as high computational and performance cost, low accuracy and high false positive rates. The proposed approach overcomes the problems of static and dynamic techniques in malware detection. The novel classification approach senses all kinds of source code and application behaviors. The proposed technique scans the keywords of the manifest.xml file for malicious items; by enhancing the manifest.xml feature, it can reduce app scan time compared to previously proposed malware detection frameworks. This technique also improves the security of Android users.

Farwa Maqbool Hussain, Farhan Hassan Khan
Green Computing: A Contribution Towards Better Future

Energy efficiency has become one of the most momentous factors in the development of computer systems. Power-hungry processors and memory subsystems have reinforced the need for aggressive power management. Green computing refers to environmentally viable computing: paring down electricity use and energy consumption to make the environment more eco-friendly; in essence, it is the practice of utilizing the computing resources available to us in an optimal way. Data and information must be secure during transmission from one system to another, without compromising green computational factors. Many modern techniques and experiments have been carried out to minimize hazardous environmental effects since the era of vacuum tubes. Dynamic Voltage Scaling (DVS) has become an important consideration in designing dynamic, energy-efficient processors without compromising system performance, efficiency and low-cost maintainability. Managing the energy consumption of RAM has also become critical, from small portable devices to VLSI-based systems, as applications become more database-centric and put pressure on memory systems and subsystems in fast-paced, competitive environments. This paper compares Intel and AMD processors with respect to power consumption, emphasizing DVS as a route to the next level of energy-efficient CPUs for computing.

Khawar Saleem, Nadia Rasheed, Muhammad Zonain, Salman Muneer, Abdul Rauf Bhatti, Muhammad Amjad
An Intelligent Approach to Detect Actuator Signal Errors Based on Remnant Filter

Linear electro-mechanical actuators are commonly found in a variety of critical dynamic systems, including hazardous robotic applications and various aerospace applications, and are specifically designed on the basis of required operating parameters. The performance of electro-mechanical actuators significantly affects the performance of the overall system. This paper improves the reliability of linear electro-mechanical actuators through analytical redundancy for critical applications. A fault diagnostic algorithm is designed based on a dynamic mathematical model, and a remnant filter is implemented to detect actuator faults. The remnant filter generates a residual signal proportional to the induced fault. Fault detection thresholds are set, and decision logic is established to compare the residual signal with lower and upper threshold constants. The designed diagnostic filter is tested experimentally on a fault diagnostic test setup, and the results of the designed model are validated.

Naveed Riaz, Syed Irtiza Ali Shah, Faisal Rehman, Syed Omer Gilani, Emad-udin
Driving Activity Recognition of Motorcyclists Using Smartphone Sensor

Smartphone sensors ubiquitously provide an unobtrusive opportunity to develop solutions for road anomaly detection, driving behavior analysis and activity recognition. Driver activity recognition is important for monitoring streets and narrow lanes where employed vehicles cannot get along. In this paper, a smartphone sensor is used to monitor the driving activity of motorcyclists. Motorcyclists are asked to follow a predefined path, and gyroscope data is recorded from a phone placed in the motorcyclist's pocket. From the recorded gyroscope data, twelve statistical features are extracted and a subset is selected to classify four driving activities: left turn, right turn, U-turn and straight path. Three different classifiers, Bayes Net, random forest and support vector machine, are used to classify the four driving activities. Random forest classifies the four motorcyclist driving activities with the highest accuracy of 86.51%.

Aasim Raheel, Muhammad Ehatisham-ul-Haq, Anees Iqbal, Hanan Ali, Muhammad Majid
PARK MY RIDE: Your True Parking Companion

Due to the increase in population and the number of vehicles, the parking problem is getting worse day by day in many large, crowded cities of the world. People spend more money and time finding safe parking for their vehicles, and street and roadside parking causes trouble such as fines and damage to vehicles. In this paper, an optimal solution is proposed to find available parking slots in nearby parking areas and to manage them efficiently. A mobile application is designed that helps users in a number of ways: finding the nearest parking areas, searching for available parking slots and allowing real-time reservation of these slots. Moreover, a complete navigation map is provided to help users reach the designated parking slot. An additional feature of the application is an administrator panel that allows parking owners to manage parking and collect the respective parking fees. With this application, not only is time saved but the safety of the vehicle is also ensured.

Muhammad Rizwan, Muhammad Asif, Maaz Bin Ahmad, Khalid Masood

Cloud and Data Systems

Frontmatter
Edge Caching Framework in Fog Based Radio Access Networks Through AI in Quantum Regime

Fog-computing-based Radio Access Networks (F-RANs) are a promising paradigm for 5th-generation wireless communication technology (5G). Their edge devices are endowed with some caching and storage capacity, a key component for reducing the caching burden on the cloud server and providing fast access and retrieval at F-UEs in scenarios where IoT devices requiring ultra-low latency are used extensively. The volume of static as well as dynamic data requests generated by these real-time applications will soon become unpredictable and unmanageable, causing fronthaul congestion. To avoid performance degradation of F-RANs in the near future, cache resource allocation strategies for increasing the cache hit ratio must be redefined in a better way. Quantum computing, on the other hand, appears to be the future for classical computing problems exhibiting non-linearity and exponential growth of computation and memory, since its parallelism scales with a linear increase in quantum bits. In this paper, AI is engaged in an attempt to enhance the caching capability of F-APs by updating cached content intelligently in the quantum regime, accelerating computational speed and easing limited-storage concerns. To validate the proposed framework, simulations are carried out in MATLAB; the results show promising outcomes for upgrading F-RAN performance.

Tayyabah Hassan, Wajiha Ajmal Khan, Fahad Ahmad, Muhammad Rizwan, Rabia Rehman
Challenges and Limitation of Resource Allocation in Cloud Computing

Cloud computing is an internet-based computing paradigm. The resources provided by cloud computing are easily accessible to cloud clients on demand. The infrastructure of cloud computing is dynamic in nature, and resources are optimally allocated. Because these resources are shared, resource management is, as in any other paradigm, a main issue in cloud computing. It is very challenging to provide all demanded resources as the number of available shared resources grows. This paper reviews the sharing of resources (such as servers, applications and data) over the cloud and considers techniques for building adaptive algorithms for resource management in cloud computing.

Sadia Ijaz, Tauqeer Safdar, Amanullah Khan
NFC Payment Security with Cloud Based Authentication System

Near Field Communication (NFC) is a new medium of wireless communication, and NFC technology is now widely available in smartphones, making them capable of contact-less payment at POS terminals. The security protocol used for contact and contact-less payments is EMV (Europay Mastercard Visa), which sets the security standards for online transactions. When analyzed in depth, the EMV protocol has security vulnerabilities in (1) mutual authentication and (2) the exchange of banking information between the payment device and the payment terminal. As NFC payment involves the exchange of sensitive data in an open environment within a range of 10 cm, there is a risk of data theft. We introduce a cloud-hosted security protocol to overcome the vulnerabilities in the EMV standard; its authenticity is analyzed using the Scyther tool. The protocol uses an authentication server hosted on the cloud and asymmetric encryption for mutual authentication and the exchange of banking data between the payment device and the payment terminal.
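Mutual authentication of this kind is usually built on challenge-response: each party proves knowledge of a credential by answering a fresh random challenge, so the credential itself never crosses the 10 cm air gap. The sketch below is a deliberately simplified symmetric (HMAC-based) stand-in for the paper's asymmetric, cloud-mediated scheme; all names and the pairing of challenges are illustrative assumptions:

```python
import hmac
import hashlib
import secrets

def respond(shared_key, challenge):
    # Prove knowledge of the key without transmitting it: answer a fresh
    # random challenge with a keyed MAC over it.
    return hmac.new(shared_key, challenge, hashlib.sha256).digest()

def mutual_auth(key_device, key_terminal):
    # Each side issues a nonce challenge and checks the other's response
    # against the response it expects from its own credential.
    c1, c2 = secrets.token_bytes(16), secrets.token_bytes(16)
    ok_device = hmac.compare_digest(
        respond(key_device, c1), respond(key_terminal, c1))
    ok_terminal = hmac.compare_digest(
        respond(key_terminal, c2), respond(key_device, c2))
    return ok_device and ok_terminal
```

Fresh nonces defeat replay of an eavesdropped transaction; the asymmetric version in the paper additionally avoids pre-sharing a symmetric key between device and terminal by routing trust through the cloud authentication server.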

Saira Raqib, Muhammad Rizwan
Resource Utilization in Data Center by Applying ARIMA Approach

Resource administration is an important function in the data center that affects the service level agreements (SLAs) and operating cost (OPEX) of the services a data center provides. Efficient resource prediction is the key to utilizing resources well while guaranteeing the SLA of each application and maximizing data center resources. Accurate workload prediction for each application is therefore a key requirement for efficient resource management: under-estimating or over-estimating the application workload results in resource under- or over-provisioning. In this paper, we apply the ARIMA model to application workloads in the data center. ARIMA is a forecasting technique that captures autocorrelation in a series by modeling it directly; its key operations are ordering and differencing, which capture the linear structure of the data. We apply different model configurations to fit application workload time series, and the performance of ARIMA is evaluated by MATLAB simulation. Using the fitted ARIMA model parameters, we compute daily and monthly prediction errors (7.01% and 6.73%, respectively) to quantify the accuracy of the ARIMA prediction model.
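The "capture autocorrelation by modeling it directly" idea is easiest to see in the AR part of ARIMA. A minimal numpy sketch of fitting an AR(1) model x_t = c + phi * x_{t-1} by least squares and reporting the percentage prediction error (this illustrates the principle only; the paper fits full ARIMA(p, d, q) models in MATLAB, and the function name is an assumption):

```python
import numpy as np

def ar1_forecast_errors(series):
    # Fit x_t = c + phi * x_{t-1} by least squares: this is the
    # autoregressive core of ARIMA, modeling autocorrelation directly.
    x, y = series[:-1], series[1:]
    A = np.column_stack([np.ones_like(x), x])
    (c, phi), *_ = np.linalg.lstsq(A, y, rcond=None)
    pred = c + phi * x
    # Mean absolute percentage error, comparable to the paper's
    # day/month error figures
    mape = np.mean(np.abs((y - pred) / y)) * 100
    return pred, mape
```

Differencing (the "I" in ARIMA) would be applied to the series first when it has a trend, and a moving-average term would model autocorrelation in the residuals.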

Farhan Nisar, Bilal Ahmed
Detecting Duplicates in Real-Time Data Warehouse Using Bloom Filter-Based Approach

Data warehousing has been a topic of intense research for the past few years. A data warehouse serves primarily as a central repository into which data arrives from disparate sources. Generally, fresh data is loaded into the central repository in disconnected mode through batch processing, so there is always a chance that the data available in the central warehouse is not real-time. Such stale data is not useful for most commercial real-time applications, such as real-time transport monitoring, smart cities, the semantic web, online transaction processing and sensor networks. To fully realize these applications, fresh data must be readily available for critical decision making; in particular, they demand real-time, rapid accumulation of data from diverse sources into the main data warehouse. This paper focuses on maintaining consistency and providing real-time data updates in a data warehouse. Specifically, it targets the detection of duplicates in a streaming environment with a limited amount of memory, employing a probabilistic data structure called a Bloom filter. The Bloom filter sets bits in an array when information is added to the data warehouse. The technique gives nearly 100% accuracy, with a worst-case error rate of 0.01%. For implementation, a data structure called a time-frame Bloom filter (TBF), essentially a bit map of information, is used. With this method, one can insert, update, delete and search message data in the data warehouse very quickly, and to make the filter scalable one can add more than one Bloom filter to address inconsistency issues.
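A Bloom filter answers "have I seen this record before?" in constant memory by setting k hash-derived bit positions per inserted item. A minimal stdlib sketch (the sizing, hash scheme and class name are illustrative choices, not the paper's TBF implementation):

```python
import hashlib

class BloomFilter:
    def __init__(self, size=1024, num_hashes=4):
        self.size = size
        self.num_hashes = num_hashes
        self.bits = bytearray(size)  # one flag per bit position

    def _positions(self, item):
        # Derive num_hashes positions by salting one cryptographic hash
        for i in range(self.num_hashes):
            h = hashlib.sha256(f"{i}:{item}".encode()).hexdigest()
            yield int(h, 16) % self.size

    def add(self, item):
        for p in self._positions(item):
            self.bits[p] = 1

    def might_contain(self, item):
        # May return a false positive; never returns a false negative
        return all(self.bits[p] for p in self._positions(item))
```

The asymmetry is the whole point for streaming duplicate detection: a "no" is definitive (the record is fresh), while a "yes" is true except for a tunable false-positive rate, which corresponds to the 0.01% worst-case error quoted above.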

Syed Rizwan, Syed Hasan Adil, Noman Islam
Restricted Boltzmann Machines Based Fault Estimation in Multi Terminal HVDC Transmission System

The facilitation of bulk power transmission and the non-synchronized interconnection of alternating current (AC) grids have convinced engineers and researchers to explore high voltage direct current (HVDC) transmission comprehensively, with a focus on its control and protection. Fault estimation is a core component of HVDC protection because of the sudden build-up of direct current (DC) faults. In this research, DC faults are estimated in a multi-terminal HVDC transmission system using a restricted Boltzmann machine, a generative stochastic artificial neural network that learns a probability distribution over its set of inputs. A three-terminal HVDC transmission system is simulated under normal and faulty conditions to analyze variations in electrical parameters; these variations serve as the learning parameters of the restricted Boltzmann machine. A contrastive divergence algorithm, an approximate maximum likelihood learning algorithm that follows the gradient of the difference of divergences, is developed to train the machine. It is found that faults are estimated by testing the variations within a minimum number of time steps. The simulation environment is built in Matlab/Simulink.
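A single contrastive-divergence (CD-1) step for a binary RBM alternates one Gibbs pass between visible and hidden units and updates the weights toward the data statistics and away from the reconstruction statistics. A minimal numpy sketch, with biases omitted for brevity (the function names and single-sample formulation are illustrative, not the paper's Matlab/Simulink implementation):

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def cd1_update(W, v0, rng, lr=0.1):
    """One CD-1 step for a binary RBM with weights W (visible x hidden),
    driven by a single visible vector v0 (e.g. sampled line quantities)."""
    h_prob = sigmoid(v0 @ W)                          # P(h=1 | v0)
    h0 = (rng.random(h_prob.shape) < h_prob).astype(float)
    v1 = sigmoid(h0 @ W.T)                            # reconstruction
    h1 = sigmoid(v1 @ W)
    # Positive (data) statistics minus negative (reconstruction) statistics
    return W + lr * (np.outer(v0, h_prob) - np.outer(v1, h1))
```

After training on normal and faulty waveforms, a high reconstruction error on incoming measurements flags patterns the model has not learned, which is the basis for estimating a fault within few time steps.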

Raheel Muzzammel
Multi-agent Oriented Framework for University Course Timetable Scheduling

This research proposes a hierarchical, multi-agent based framework mapped onto combinatorial real-life problems such as university course timetable scheduling at the University of Sialkot, where creating or maintaining a timetable for different courses can take many hours or even days. It is a web-based system assisted by an Android application. In this paper, we present a multi-agent, multi-layered hierarchical framework to allocate all events (instructors and courses) to fixed, predefined resources, i.e. timeslots and rooms, while satisfying all constraints within the problem. Multiple agents are used to develop the university timetable, allocate courses and manage class-held reports. The Capturing Agent (CA) takes information from the user and saves it in the database. The Monitoring Agent (MA) searches and validates the user query and passes the data to the timetable generator for timetable creation. The Distributing Agent (DA) publishes the information, which becomes available in users' inboxes and on the university's website. We design a mechanism for timetable development that presents the association and interaction of the different agents of the system. The system is implemented in the Java Agent DEvelopment (JADE) framework.
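The hard constraints such a timetable generator must satisfy boil down to: no room and no instructor may be double-booked in a timeslot. A minimal feasibility check illustrating those constraints (the tuple layout and function name are illustrative assumptions; the actual system works with JADE agents, not this helper):

```python
def is_feasible(assignments):
    """assignments: iterable of (course, instructor, room, timeslot) tuples.
    A timetable is feasible when no room and no instructor appears twice
    in the same timeslot."""
    booked_rooms, booked_instructors = set(), set()
    for course, instructor, room, slot in assignments:
        if (room, slot) in booked_rooms or (instructor, slot) in booked_instructors:
            return False
        booked_rooms.add((room, slot))
        booked_instructors.add((instructor, slot))
    return True
```

In the framework described above, a check of this kind would sit behind the Monitoring Agent's validation step before the Distributing Agent publishes a timetable.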

Sehrish Munawar Cheema, Rukhsar Shafiq, Shahid Saleem, Syed Zeeshan Hussain Shah, Anees Baqir
Backmatter
Metadata
Title: Intelligent Technologies and Applications
Edited by: Prof. Imran Sarwar Bajwa, Prof. Dr. Tatjana Sibalija, Dayang Norhayati Abang Jawawi
Copyright year: 2020
Publisher: Springer Singapore
Electronic ISBN: 978-981-15-5232-8
Print ISBN: 978-981-15-5231-1
DOI: https://doi.org/10.1007/978-981-15-5232-8